Question #2e587
Explanation:
We're asked to find the time, in seconds, it takes an object to fall to Earth's surface from a height of 250 m.
To do this, we can use the kinematics equation
y = y_0 + v_{0y}t - \frac{1}{2}gt^2
where:
- y is the height at time t (here 0, i.e. ground level)
- y_0 is the initial height (given as 250 m)
- v_{0y} is the initial vertical velocity (the object is dropped from rest, so this is 0)
- t is the time (what we're trying to find)
- g = 9.81 m/s^2 is the acceleration due to gravity
Since the initial velocity is zero, the equation simplifies to

y = y_0 - \frac{1}{2}gt^2
Let's solve this for our unknown variable, t:
y - y_0 = -\frac{1}{2}gt^2

-2(y - y_0) = gt^2

t^2 = \frac{-2(y - y_0)}{g}

t = \sqrt{\frac{-2(y - y_0)}{g}}
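If you'd like to double-check the algebra, here is a minimal SymPy sketch (assuming SymPy is available; the symbol names are mine) that solves the simplified equation for t:

```python
import sympy as sp

# Declare the symbols as positive so SymPy keeps only the
# physically meaningful (positive) root for t.
t, y0, g = sp.symbols("t y_0 g", positive=True)

# 0 = y_0 - (1/2) g t^2  (ground level, dropped from rest)
solution = sp.solve(sp.Eq(0, y0 - sp.Rational(1, 2) * g * t**2), t)
print(solution)  # a single root, equivalent to sqrt(2*y_0/g)
```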
Plugging in known values:
t = \sqrt{\frac{-2(0 - 250\ \text{m})}{9.81\ \text{m/s}^2}} = \boxed{7.14\ \text{s}}
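As a quick numeric check, here is a short Python sketch of the same plug-in (the variable names are mine):

```python
import math

y0 = 250.0  # m, initial height
y = 0.0     # m, final height (ground level)
g = 9.81    # m/s^2, acceleration due to gravity

# t = sqrt(-2(y - y_0)/g), from the derivation above
t = math.sqrt(-2 * (y - y0) / g)
print(f"t = {t:.2f} s")  # t = 7.14 s
```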
So ultimately, if you're ever given a situation where you're asked to find the time it takes an object dropped from rest to fall a certain distance, you can use the equation

t = \sqrt{\frac{2 \cdot \text{height}}{g}}
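If you want that general formula as a reusable helper, here is a small sketch (the function name fall_time is mine, and air resistance is ignored):

```python
import math

def fall_time(height: float, g: float = 9.81) -> float:
    """Seconds for an object dropped from rest to fall `height` meters,
    ignoring air resistance."""
    if height < 0:
        raise ValueError("height must be non-negative")
    return math.sqrt(2 * height / g)

print(round(fall_time(250), 2))  # 7.14
```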