International Space Station data with Python research 🌎


Plotting ISS trajectory, calculating the velocity over the earth and more.


Plotting trajectory:

We are going to plot the ISS trajectory over a window of N minutes, where N is chosen by the user. The program will run for N minutes, recording one latitude-longitude point per minute in a list.
We will use the Open Notify API to retrieve the current ISS position as latitude and longitude:

http://open-notify.org/Open-Notify-API/ISS-Location-Now/
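For reference, the endpoint returns a small JSON document. A minimal sketch of what a response looks like (the field values here are illustrative, not real data):

import requests
resp = requests.get("http://api.open-notify.org/iss-now.json").json() # Fetch and parse the JSON
# resp is a dict roughly of the form:
# {"message": "success", "timestamp": 1620000000,
#  "iss_position": {"latitude": "-21.4937", "longitude": "112.0755"}}
# Note that latitude and longitude are returned as strings.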

First we need to import the following Python modules:

pandas to read the JSON data from the ISS API, plotly to plot the trajectory, and time for the time.sleep function:
import pandas as pd
import plotly.express as px
import time

Second, we must initialize the lists that will store the latitude and longitude points (one pair every sixty seconds). We also have to initialize the N variable with the time in minutes:

latitudes = []
longitudes = []
N = 60 # Sixty for one hour trajectory

Then we create the following for loop to keep recording latitude-longitude points separated by one minute.

We use for i in range(N), so the loop runs for N minutes in total, because there is a time.sleep(60) at the end of each iteration:
for i in range(N):  
    url = "http://api.open-notify.org/iss-now.json" # API URL

    df = pd.read_json(url) # Pandas reads the JSON data from the API
    
    latitudes.append(df["iss_position"]["latitude"])  # Append the ISS latitude to the latitudes list
    longitudes.append(df["iss_position"]["longitude"]) # Append the ISS longitude to the longitudes list
    
    time.sleep(60) # Wait one minute so the recorded points are separated by sixty seconds
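Since the loop polls a live API once per minute, a single transient network error would otherwise stop the whole recording. A minimal variation with basic error handling (the try/except structure is an addition for illustration, not part of the original script):

for i in range(N):
    try:
        df = pd.read_json("http://api.open-notify.org/iss-now.json") # Fetch the current position
        latitudes.append(df["iss_position"]["latitude"])
        longitudes.append(df["iss_position"]["longitude"])
    except Exception as err: # e.g. a temporary connection failure
        print(f"Skipping sample {i}: {err}")
    time.sleep(60)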

When the for loop finishes iterating we will have a record of N minutes of ISS trajectory. Now we can plot it with Plotly (px.line_geo):

px.line_geo will create a plot over a world map:
fig = px.line_geo(lat=latitudes, lon=longitudes) # Passing our latitudes and longitudes lists as parameters
fig.show()  
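If you want to keep the figure rather than only display it, Plotly figures can be exported to a standalone HTML file (the filename is just an example):

fig.write_html("iss_trajectory.html") # Open this file in any browser to view the interactive map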

[Figure: two-hour trajectory plot]

We can switch the plot to an orthographic projection with this code:

fig.update_geos(projection_type="orthographic")
fig.update_layout(height=300, margin={"r":0,"t":0,"l":0,"b":0})
fig.show()  
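To zoom the globe onto the recorded track, Plotly can also fit the geo axes to the plotted data; this relies on the fitbounds option of update_geos, which is assumed to be available in your Plotly version:

fig.update_geos(fitbounds="locations") # Fit the visible region to the recorded points
fig.show()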

[Figure: 30-minute trajectory plot]

[Figure: two-hour trajectory plot, GIF]

Estimating ISS velocity:

We will estimate the ISS velocity using two different latitude-longitude points separated by one minute (sixty seconds). We can get the distance between those two points and then use the physics formula velocity (m/s) = distance (in meters) / time (in seconds).
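As a rough sanity check of the orders of magnitude involved (the numbers below are illustrative, based on the ISS covering roughly 460 km per minute along its orbit):

distance = 460_000 # Meters travelled in one minute (illustrative value)
speed = distance / 60 # ~7667 m/s
print(round(speed * 3.6), "KM/H") # ~27600 km/h, close to the ISS's published orbital speed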

First, import the following Python modules:

import pandas as pd # Pandas to read API data
import time # Time for time.sleep
import geopy.distance # Geopy to get distance between two lat-lon points
import requests # Requests to query a second API for the ISS altitude
import json # json to parse that response
We need to initialize two empty lists to store the latitudes and longitudes:
lat = []
long = []
Next, we will use a for loop to get the two latitude-longitude points separated by 60 seconds (time.sleep(60)):
for i in range(2):  # for in range(2) because we want two lat-lon points

    url = "http://api.open-notify.org/iss-now.json" # API url

    df = pd.read_json(url) # Read API Json data with Pandas

    lat.append(df["iss_position"]["latitude"]) # Append latitude to lat list
    long.append(df["iss_position"]["longitude"]) # Append longitude to long list

    time.sleep(60) # Wait 60 seconds to record the second lat-lon point
When this for loop finishes, we will have a lat list with two latitude positions and a long list with two longitude positions. Together, these four numbers give us two lat-lon points at different moments in time (separated by one minute).

Then we must get the distance between this points:

We create the two points: the first one from lat[0] and long[0], and the second one from lat[1] and long[1]:
coords_1 = (lat[0], long[0]) 
coords_2 = (lat[1], long[1])
Then calculate the distance with the geopy library:
distance = geopy.distance.distance(coords_1, coords_2).m # Distance between the two points in meters
But we must make a small correction: the ISS isn't moving along the Earth's surface, it is orbiting approximately 400 km above it. The orbital radius is therefore larger, and the distance actually traveled is a little greater than the surface distance between the two points. To scale it up we need the current ISS altitude, which we get with the following code:


iss_alt_url = "https://api.wheretheiss.at/v1/satellites/25544" # 25544 is the ISS NORAD catalog number
r = requests.get(iss_alt_url) # Request the current ISS data
r = json.loads(r.text) # Parse the JSON response into a dict

iss_alt = int(r["altitude"]) # Altitude in km
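As a side note, the same wheretheiss.at response also appears to include a velocity field (in km/h with the default units), which can serve as a reference value for the estimate we are about to compute; treat the exact field name as an assumption:

print(r["velocity"], "KM/H reported by the API") # Reference value to compare against our own estimate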
Now apply the geometric correction: an arc that subtends the same angle scales linearly with the radius, so we multiply by (earth_radius + iss_alt) / earth_radius:
earth_radius = 6371 # Mean Earth radius in km
distance_corrected = distance * (earth_radius + iss_alt) / earth_radius # Surface distance scaled up to orbital altitude
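For example, with illustrative numbers (a surface distance of 440 km and an altitude of 420 km), the correction adds about 6.6% to the surface distance:

surface_distance = 440_000 # Meters, illustrative value
altitude = 420 # km, illustrative value
print(round(surface_distance * (6371 + altitude) / 6371)) # ≈ 469006 m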
Now finish the calculation with the speed formula explained above:
speed = distance_corrected / 60 # Speed in m/s (meters traveled in sixty seconds)

print(round(speed * 3.6, 3), "KM/H") # Multiplied by 3.6 to convert m/s to km/h, rounded to 3 decimal places

Output:

26367.118 KM/h
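Putting the whole estimate together, a compact version could look like the sketch below; the function name, the float conversions and the configurable sampling interval are illustration choices, not part of the original script:

import time
import json
import requests
import geopy.distance
import pandas as pd

def estimate_iss_speed(interval_s=60):
    """Estimate the ISS speed in km/h from two positions sampled interval_s seconds apart."""
    points = []
    for _ in range(2):
        df = pd.read_json("http://api.open-notify.org/iss-now.json") # Current ISS position
        points.append((float(df["iss_position"]["latitude"]),
                       float(df["iss_position"]["longitude"])))
        if len(points) < 2:
            time.sleep(interval_s) # Wait before taking the second sample
    surface_m = geopy.distance.distance(points[0], points[1]).m # Surface distance in meters
    alt_km = json.loads(requests.get("https://api.wheretheiss.at/v1/satellites/25544").text)["altitude"]
    orbital_m = surface_m * (6371 + alt_km) / 6371 # Scale up to orbital altitude
    return orbital_m / interval_s * 3.6 # m/s converted to km/h

print(round(estimate_iss_speed(), 3), "KM/H")

Values in the 26,000-28,000 KM/H range are expected; the estimate is sensitive to how the API rounds the reported positions.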
Owner
Facundo Pedaccio
Studying computer engineering and economics. I like computer science, physics, astrophysics and rocket science; or rather, the perfect combination of them.