Geographers define an antipode as the point diametrically opposite a reference point, on the other side of the earth. To calculate the latitude of an antipode, reverse the sign and direction of the reference point's latitude. To calculate the longitude of an antipode, subtract the absolute value of the reference point's longitude from 180 degrees, then give the result the opposite sign and direction of the reference point's longitude.
Latitude

Latitude measures north-south position on the surface of the earth. The starting point for latitude measurements is the equator, which is designated as 0 degrees latitude. There are 90 degrees of latitude north of the equator, written as positive numbers, and 90 degrees south, written as negative numbers.

Longitude

Longitude measures east-west position on the surface of the earth. The starting point for longitude measurements is the prime meridian in Greenwich, England, which is designated as 0 degrees longitude; the choice is a historical legacy. There are 180 degrees of longitude east of Greenwich, written as positive numbers, and 180 degrees west, written as negative numbers. A measurement of 180 degrees east longitude designates the same meridian as 180 degrees west longitude.

Explanation of Calculation

It's easy to see that the antipode of the north pole, at 90 degrees north latitude, is the south pole, at -90 degrees south latitude. It's also easy to see that if we move one degree south of the north pole, to 89 degrees north latitude, the antipode of that point will be one degree north of the south pole, at -89 degrees south latitude. This pattern holds for any reference point on the face of the earth: the latitude of the antipode has the opposite sign and direction of the latitude of the reference point.

There are 360 total degrees of longitude on earth, so the longitude of the antipode will always be 180 degrees away from the longitude of the reference point. Unfortunately, we can't simply add or subtract 180 degrees to the longitude of the reference point, because of the way that geographers designate longitude: doing so can produce a value outside the range of -180 to +180 degrees. Instead, we calculate the supplement of the absolute value of the reference point's longitude, which accounts for the negative degrees of western longitudes, then give the answer the opposite sign and direction of the reference point's longitude.
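The rule above can be sketched as a short Python function. This is a minimal illustration, assuming coordinates are given in signed decimal degrees (north and east positive, south and west negative); the function name `antipode` is my own, not from the source.

```python
def antipode(lat, lon):
    """Return the antipode of a point given in signed decimal degrees.

    lat: latitude, north positive, south negative (-90 to +90).
    lon: longitude, east positive, west negative (-180 to +180).
    """
    # Latitude: reverse the sign and direction.
    anti_lat = -lat
    # Longitude: supplement of the absolute value...
    anti_lon = 180.0 - abs(lon)
    # ...with the opposite sign and direction of the reference point.
    if lon > 0:
        anti_lon = -anti_lon
    return anti_lat, anti_lon
```

For example, `antipode(90.0, 0.0)` returns the south pole, and a western (negative) longitude maps to an eastern (positive) one.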
The latitude and longitude of Tampa International Airport (TPA) are +27.97 degrees north latitude and -82.53 degrees west longitude.
To calculate the latitude of the antipode, change the sign and direction of TPA's latitude. The answer is -27.97 degrees south latitude.
To calculate the longitude of the antipode, subtract the absolute value of TPA's longitude from 180 degrees (180 - 82.53 = 97.47), and give the result the opposite sign and direction of the reference point's longitude. Since TPA's longitude is negative (west), the answer is +97.47 degrees east longitude.
The latitude and longitude of the antipode to TPA are -27.97 degrees south latitude and +97.47 degrees east longitude, a point in the Indian Ocean west of Australia.
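As a sanity check on the result above (an illustrative sketch, not part of the original method), the great-circle distance between any point and its antipode should be exactly half the circumference of the earth. A standard haversine distance function, applied to TPA and the computed antipode, confirms this; the mean earth radius of 6371 km is an assumption here.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two points, on a sphere of radius r."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    # Clamp against tiny floating-point overshoot before asin.
    return 2 * r * math.asin(min(1.0, math.sqrt(a)))

# Distance from TPA to its antipode: should equal half the
# circumference of the sphere, i.e. pi * r, about 20,015 km.
d = haversine(27.97, -82.53, -27.97, 97.47)
```

Any error in the antipode calculation would show up as a distance shorter than pi times the radius.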
- Geo Swan, Wikimedia Commons