In standard kinematics problems involving gravity, we assume a constant gravitational field strength. Suppose instead that we have a fictitious scenario in which the gravitational field strength varies as $g = g_0 + ky,$ where $y$ (in meters) is the height above the ground. Here, gravity is weakest at ground level and grows stronger with increasing altitude (while always pointing toward the ground).

Consider the time $t_f$ (in seconds) that it would take for an object to fall from an initial resting height $y_0$ to the ground. To 2 decimal places, what is the limiting value of $t_f$ as $y_0$ approaches infinity?

Note: The quantity $g$ has units of $\text{m/s}^2$, $g_0 = 10\ \text{m/s}^2$, and $k = 1\ \text{s}^{-2}$.
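One way to probe this limit is to integrate the equation of motion $\ddot{y} = -(g_0 + ky)$ numerically for progressively larger starting heights and watch where the fall time settles. Below is a minimal sketch using semi-implicit Euler integration; the function name `fall_time` and the step size are illustrative choices, not part of the problem.

```python
import math

def fall_time(y0, g0=10.0, k=1.0, dt=1e-4):
    """Integrate y'' = -(g0 + k*y), starting from rest at height y0,
    until the object reaches the ground (y = 0); return elapsed time."""
    y, v, t = y0, 0.0, 0.0
    while y > 0:
        # Semi-implicit Euler: update velocity first, then position.
        v -= (g0 + k * y) * dt
        y += v * dt
        t += dt
    return t

for y0 in (10.0, 1e3, 1e6):
    print(f"y0 = {y0:>9.0f} m  ->  t_f = {fall_time(y0):.2f} s")
```

Running this for increasing $y_0$ shows the fall time converging toward a finite value, which suggests the limit the problem asks for; the analytic solution of the same linear ODE confirms it.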
