Let's say the astronaut is 50 miles above the surface of the Earth. Draw a triangle ABC where A is the centre of the Earth, B is the astronaut, and C is the point on the surface where his line of sight just grazes the horizon. So AB is the distance from the Earth's centre to the astronaut, AC is the radius out to the horizon point, and BC is the distance from the astronaut to the horizon. The radius of the Earth is 4000 miles, so AC = 4000 miles, and AB = 4000 + 50 = 4050 miles. Because the line of sight BC is tangent to the Earth at C, the radius AC meets it at a right angle, so the triangle is right-angled at C. You can then use Pythagoras' theorem to find the length from the astronaut to the horizon: BC = √(4050² − 4000²) = √402500 ≈ 634.43. So the distance from the astronaut to the horizon is about 634 miles.
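As a sanity check (not part of the original answer), here is a minimal Python sketch that evaluates the exact tangent-line formula with the values assumed above, an Earth radius of 4000 miles and a height of 50 miles:

```python
import math

R = 4000.0  # Earth's radius in miles, as assumed in the answer above
h = 50.0    # astronaut's height above the surface in miles

# Right triangle: hypotenuse AB = R + h, leg AC = R, right angle at the
# horizon point C, so the line-of-sight leg BC follows from Pythagoras.
d = math.sqrt((R + h) ** 2 - R ** 2)
print(f"Distance to the horizon: {d:.2f} miles")  # ~634.43 miles
```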
To find the distance from the astronaut to the horizon, use Pythagoras' theorem on the right triangle whose legs are the Earth's radius (out to the horizon point) and the line-of-sight distance, with the hypotenuse running from the Earth's centre to the astronaut. Substitute the given values and solve for the line-of-sight distance.
The distance from the astronaut to the horizon is approximately 632 miles, found with the standard approximation for the distance to the horizon, d ≈ √(2rh), where r is the Earth's radius and h is the astronaut's height: d ≈ √(2 · 4000 · 50) = √400000 ≈ 632.46 miles.
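For comparison, a small sketch (again assuming r = 4000 miles and h = 50 miles) showing that this approximation lands close to the exact tangent-line value:

```python
import math

r, h = 4000.0, 50.0  # miles, as given in the problem

d_approx = math.sqrt(2 * r * h)             # horizon approximation, ~632.46
d_exact = math.sqrt((r + h) ** 2 - r ** 2)  # exact tangent-line value, ~634.43
print(f"approx: {d_approx:.2f} miles, exact: {d_exact:.2f} miles")
```

The approximation simply drops the h² term from the exact form √(2rh + h²), which is why it comes in a couple of miles short; for heights that are small compared with the radius, the two agree closely.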