Question #cd938

1 Answer
Jun 13, 2017

1.53 "s"

Explanation:

We're (essentially) asked to find the time at which the rock hits the bottom of the well, given that it falls from a height of 11.5 "m".

This rock is in free fall; i.e., it is under the sole influence of Earth's gravitational force, falling toward Earth's surface with an acceleration of -g = -9.8 "m/s"^2.

To find the time t when the rock reaches a position of y = -11.5 "m", we can use the equation

y = y_0 + v_(0y)t - 1/2 g t^2

where

  • y is the position at time t (-11.5 "m"),

  • y_0 is the initial position (0 "m"),

  • v_(0y) is the initial y-velocity, which is 0 "m/s" since the rock was merely dropped (not thrown),

  • t is the time, in "s" (what we must find), and

  • g is 9.8 "m/s"^2 (the minus sign in front of the 1/2 indicates that this acceleration is downward).
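
Before plugging in numbers, it may help to see this equation written out as code. The following is a minimal Python sketch, not part of the original answer; the function name position and its default arguments are my own:

    G = 9.8  # magnitude of gravitational acceleration, in m/s^2

    def position(t, y0=0.0, v0y=0.0, g=G):
        """Vertical position y (in m) at time t (in s), with up taken as positive.

        Implements y = y0 + v0y*t - (1/2)*g*t^2; for a rock merely
        dropped from the top of the well, y0 = 0 and v0y = 0.
        """
        return y0 + v0y * t - 0.5 * g * t**2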

Plugging in known values, we have

-11.5 "m" = 0 "m" + (0 "m/s)"t - (4.9"m/s"^2)t^2

(4.9"m/s"^2)t^2 = 11.5 "m"

t^2 = (11.5"m")/(4.9"m/s"^2)

t = sqrt((11.5cancel("m"))/(4.9cancel("m")"/s"^2)) = color(red)(1.53 color(red)("s"

When dropped into a well 11.5 meters deep, the rock will thus take 1.53 seconds to reach the bottom.
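
For a drop from rest (y_0 = 0, v_(0y) = 0), the same algebra rearranges to t = sqrt((2h)/g), where h is the fall height. As a quick numerical check, here is a short Python sketch of that calculation; the name fall_time is mine, not from the answer:

    from math import sqrt

    G = 9.8  # m/s^2

    def fall_time(height, g=G):
        """Time (in s) for an object dropped from rest to fall `height` meters."""
        return sqrt(2.0 * height / g)

    t = fall_time(11.5)
    print(round(t, 2))      # 1.53
    print(-0.5 * G * t**2)  # ~ -11.5, plugging t back into y = -1/2 g t^2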