Answer:
Applying the Pythagorean theorem, the distance from the satellite to the Earth's horizon is approximately 1,067.52 miles.
How to calculate the distance from the satellite to the Earth's horizon?
To do this, we apply the Pythagorean Theorem as follows:
We first identify the given data:
- Height of the satellite above the Earth's surface: 140 miles.
- Diameter of the Earth: 8,000 miles.
To find the distance from the satellite to the horizon, we first need the distance from the center of the Earth to the satellite. This is the Earth's radius plus the satellite's height:
- Radius of Earth = 8,000 miles / 2
- Radius of Earth = 4,000 miles
- 4,000 miles + 140 miles = 4,140 miles
With these data we can draw an imaginary right triangle with vertices at the center of the Earth, at the horizon point, and at the satellite. The line of sight from the satellite to the horizon is tangent to the Earth, so it meets the radius at the horizon point at 90°. The legs are therefore the Earth's radius (4,000 miles) and the satellite-to-horizon distance, and the hypotenuse is the center-to-satellite segment (4,140 miles).
Now, to find the distance between the horizon and the satellite, we apply the Pythagorean Theorem and solve for the unknown leg a:
- a² + b² = c²
- a² + 4,000² = 4,140²
- a² = 17,139,600 − 16,000,000
- a² = 1,139,600
- a ≈ 1,067.52
According to the above, the distance between the satellite and the horizon is approximately 1,067.52 miles (about 1,068 miles).
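As a quick check, here is a minimal Python sketch of the same calculation (the variable names are mine, chosen for illustration, not part of the original problem):

```python
import math

# Given data (miles)
earth_diameter = 8000
satellite_altitude = 140   # height of the satellite above the Earth's surface

# Leg of the right triangle: Earth's radius (center to horizon point)
earth_radius = earth_diameter / 2                         # 4,000 miles

# Hypotenuse: distance from the Earth's center to the satellite
center_to_satellite = earth_radius + satellite_altitude   # 4,140 miles

# Pythagorean theorem: horizon_distance^2 + earth_radius^2 = center_to_satellite^2
horizon_distance = math.sqrt(center_to_satellite**2 - earth_radius**2)

print(round(horizon_distance, 2))  # ≈ 1067.52 miles
```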
Learn more about Pythagorean Theorem in: https://brainly.com/question/15190643