The file busiest_airports.txt provides details of the 30 busiest airports in the world in 2014. The tab-delimited fields are: three-letter IATA code, airport name, airport location, latitude and longitude (both in degrees).
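Each airport occupies one line of the file; a hypothetical entry (illustrative values only, with tabs separating the fields) might look like:

LHR	London Heathrow Airport	London, United Kingdom	51.4706	-0.4619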
Write a program to determine the distance between two airports, identified by their three-letter IATA codes, using the Haversine formula (see, for example, Exercise 4.4.2) and assuming a spherical Earth of radius 6378.1 km.
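For two points with (latitude, longitude) coordinates $(\phi_1, \lambda_1)$ and $(\phi_2, \lambda_2)$, in radians, the Haversine formula gives the great-circle distance on a sphere of radius $R$ as

$$d = 2R\arcsin\sqrt{\mathrm{hav}(\phi_2-\phi_1) + \cos\phi_1\cos\phi_2\,\mathrm{hav}(\lambda_2-\lambda_1)}, \qquad \mathrm{hav}(\alpha) = \sin^2(\alpha/2).$$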
Solution P6.2.2
This solution uses a structured array. Note that the strings in this array are bytestrings, so for comparison and output we must encode and decode them appropriately.
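A minimal illustration of this point, independent of the data file:

import numpy as np

codes = np.array(["SFO", "LHR"], dtype="S3")  # 'S' dtypes store bytestrings
print(codes[0])                   # b'SFO'
print(codes == b"SFO")            # [ True False]: compare against bytes
print(codes[0].decode("ascii"))   # 'SFO': decode back to str for output

The full program is below.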
import sys

import numpy as np

# The haversine function, hav(alpha) = sin^2(alpha/2).
haversin = lambda alpha: np.sin(alpha / 2) ** 2
def gc_distance(airport_coords1, airport_coords2):
    """
    Return the great-circle distance between two coordinates on the Earth
    provided as two (latitude, longitude) tuples in radians.
    """
    # Earth's radius, km.
    R = 6378.1
    (phi1, lambda1), (phi2, lambda2) = airport_coords1, airport_coords2
    d = (
        2
        * R
        * np.arcsin(
            np.sqrt(
                haversin(phi2 - phi1)
                + np.cos(phi1) * np.cos(phi2) * haversin(lambda2 - lambda1)
            )
        )
    )
    return d
# The two airports are identified by IATA codes passed on the command line.
iata1, iata2 = sys.argv[1:]
dt = np.dtype(
    [
        ("IATA", "S50"),
        ("Name", "S50"),
        ("Location", "S50"),
        ("Latitude", "f8"),
        ("Longitude", "f8"),
    ]
)

# Convert the latitude and longitude fields from degrees to radians on input.
to_radians = lambda alpha: np.radians(float(alpha))

airports = np.loadtxt(
    "busiest_airports.txt",
    dtype=dt,
    delimiter="\t",
    converters={3: to_radians, 4: to_radians},
)
def get_airport(iata):
    """
    Return the airport with IATA code iata from the airports array, or
    raise an exception if it isn't recognised.
    """
    b_iata = bytes(iata, encoding="ascii")
    airport = airports[airports["IATA"] == b_iata]
    # Check we retrieved exactly one airport: NB we can't use
    # assert airport, ... because of a bug in NumPy versions older than 1.8.
    assert len(airport) == 1, "Airport not recognised: {:s}".format(iata)
    return airport
airport1 = get_airport(iata1)
airport2 = get_airport(iata2)

# Extract the (latitude, longitude) coordinates and decode the bytestring
# names for output.
airport_coords1 = airport1[["Latitude", "Longitude"]][0]
airport_coords2 = airport2[["Latitude", "Longitude"]][0]
airport_name1 = airport1["Name"][0].decode()
airport_name2 = airport2["Name"][0].decode()

d = gc_distance(airport_coords1, airport_coords2)
print(
    f"Distance from {airport_name1:s} ({iata1:3s}) to {airport_name2:s}"
    f" ({iata2:3s}) is {int(d):d} km"
)
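As a quick sanity check on gc_distance (defined above), two points one degree of longitude apart on the equator should be separated by about 2πR/360 ≈ 111.3 km:

p1 = (0.0, 0.0)               # (latitude, longitude) in radians
p2 = (0.0, np.radians(1.0))
print(gc_distance(p1, p2))    # about 111.3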
For example:
$ python airport-distances.py SFO LHR
Distance from San Francisco International Airport (SFO) to London Heathrow Airport (LHR) is 8625 km
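NumPy 1.14 and later accept an encoding argument to np.loadtxt, which allows the strings to be read into Unicode ('U') fields so that no encoding or decoding is needed. A minimal sketch of this alternative, assuming the same file layout:

import numpy as np

dt = np.dtype(
    [
        ("IATA", "U3"),
        ("Name", "U50"),
        ("Location", "U50"),
        ("Latitude", "f8"),
        ("Longitude", "f8"),
    ]
)
to_radians = lambda alpha: np.radians(float(alpha))
airports = np.loadtxt(
    "busiest_airports.txt",
    dtype=dt,
    delimiter="\t",
    converters={3: to_radians, 4: to_radians},
    encoding="utf-8",
)
# Plain str comparison and output now work directly:
print(airports[airports["IATA"] == "SFO"]["Name"][0])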