
Suppose I have a camera with a 60° field of view pointed at the horizon, and from that position I start climbing in altitude. The higher I go, the more of the Earth's curvature I begin to notice. Call the leftmost point of the horizon in my field of vision A and the rightmost point B: how would you calculate how many meters A and B dip below an imaginary straight line? Estimates are fine for my purpose.


1 Answer


[Figure: cross-section of the Earth showing its center $A$, the camera $C$ at height $h$ above the surface, the horizon (tangent) point $D$, and the horizontal line $EC$ through the camera.]

$R$: Radius of the Earth

$A$: Center of the Earth

$h$: Height of the camera $C$ above the surface

$D$: The horizon point, where the line of sight from $C$ is tangent to the surface

$t$: $\angle DAC$

$E$: The point where the horizontal line through $C$ meets the extension of $AD$

$d$: Depth of the point $D$ below the line $EC$ (the horizontal line at camera height) = length $ED$

In $\Delta DCA$, which is right-angled at $D$ because the line of sight $CD$ is tangent to the surface at $D$:

$\cos t = \frac{AD}{AC} = \frac{R}{R+h}$

In $\Delta ECA$, which is right-angled at $C$ because $EC$ is horizontal, i.e. perpendicular to the radius $AC$:

$\cos t = \frac{AC}{AE} = \frac{R+h}{R+d}$

$\therefore \frac{R}{R+h}=\frac{R+h}{R+d}$

$d = \frac{(R+h)^2}{R} - R = 2h + \frac{h^2}{R}$

For $h \ll R$, the second term is much smaller than the first, so it can be ignored.

$\therefore d \approx 2h$
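For example, taking $R \approx 6371\,\mathrm{km}$ (an assumed mean Earth radius) and a camera height of $h = 1000\,\mathrm{m}$, the exact expression gives

$d = 2h + \frac{h^2}{R} \approx 2000\,\mathrm{m} + 0.16\,\mathrm{m},$

so dropping the $h^2/R$ term costs only about 16 cm at that altitude.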

Note that the line $EC$ runs a height $h$ above the plane tangent to the Earth at the point directly beneath you, so the horizon point $D$ lies roughly $d - h \approx h$ below that tangent plane. In other words, your height above the surface of the Earth equals the depth of the farthest point you can see on the horizon, measured from the same plane your height is measured from: the plane tangent to the Earth's surface at the point underneath you.

We can reach the same answer using symmetry considerations as well.
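If you want to plug in your own numbers, here is a minimal Python sketch of the exact and approximate expressions above; the function names and the mean radius of $6371\,\mathrm{km}$ are my own choices:

```python
# Depth d of the horizon point D below the horizontal line EC through the camera,
# from the relation d = (R + h)^2 / R - R = 2h + h^2 / R derived above.

EARTH_RADIUS_M = 6_371_000.0  # assumed mean Earth radius in meters


def horizon_depth_exact(h: float, R: float = EARTH_RADIUS_M) -> float:
    """Exact depth d = (R + h)^2 / R - R, in meters."""
    return (R + h) ** 2 / R - R


def horizon_depth_approx(h: float) -> float:
    """Small-height approximation d ≈ 2h, valid for h much smaller than R."""
    return 2.0 * h


if __name__ == "__main__":
    for h in (10.0, 100.0, 1_000.0, 10_000.0):  # camera heights in meters
        print(f"h = {h:8.0f} m:  exact d = {horizon_depth_exact(h):11.2f} m,  "
              f"2h = {horizon_depth_approx(h):11.2f} m")
```

Even at $h = 10\,\mathrm{km}$ the two values differ by only about 16 m, so $d \approx 2h$ is fine for rough estimates.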

