At some point around 240 B.C., Eratosthenes calculated the circumference of the Earth to remarkable accuracy. To do this, Eratosthenes noted that vertical poles placed at different latitudes cast shadows at different angles. He assumed that the sun's rays were parallel and that the Earth was round.

Suppose that Eratosthenes had incorrectly assumed that the Earth was flat, and that the discrepancy was instead explained by the sun's rays not being parallel. In this scenario, two vertical poles are placed 1000 km apart. One pole casts a shadow at 0 degrees away from the normal (the sun is directly overhead), while the other casts a shadow at an angle of 8.98 degrees away from the normal. Supposing the Earth is flat, what height of the sun above the ground, in kilometers, would be necessary to produce this effect?

Assume that the heights of the poles are negligible.
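The right-triangle reasoning behind the problem can be sketched numerically. Under the flat-Earth assumption, the sun sits directly above the pole with the 0-degree shadow, so the sun, that pole's base, and the second pole form a right triangle with horizontal leg d and vertical leg h, giving tan(θ) = d/h. A minimal Python sketch (variable names are illustrative, not from the problem):

```python
import math

# Flat-Earth assumption from the problem statement:
# the pole with the 0-degree shadow sits directly below the sun.
d_km = 1000.0     # horizontal distance between the two poles
theta_deg = 8.98  # shadow angle at the second pole

# tan(theta) = d / h  =>  h = d / tan(theta)
h_km = d_km / math.tan(math.radians(theta_deg))
print(f"Height of the sun above a flat Earth: {h_km:.0f} km")
```

For these numbers, h comes out to roughly 6.3 x 10^3 km, which is no coincidence: 8.98 degrees over 1000 km is the same angular rate that, on a round Earth, corresponds to a circumference of about 40,000 km.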
