An object is thrown vertically from a height of 6 m at 2 m/s. How long will it take for the object to hit the ground?

1 Answer
Feb 7, 2017

The object reaches the ground after 1.33 s.

Explanation:

The convenient thing about using the equations of motion for a problem like this is that you do not have to split the motion into an "upward" part and a "downward" part. As long as the acceleration is constant (it is) and you keep track of the signs on your numbers, you can do it in a single calculation.

Here it is. Use this equation:

$$\Delta y = v_0 \Delta t + \tfrac{1}{2} a \Delta t^2$$

It is the right choice because you do not know the final velocity of the object.

In this problem, $\Delta y = -6\ \text{m}$, as the landing point is below the starting location. Also, the acceleration is $a = -9.8\ \text{m/s}^2$, but $v_0 = +2\ \text{m/s}$, as the object begins its motion in an upward direction. So,

$$-6 = 2\,\Delta t - 4.9\,\Delta t^2$$

This is a quadratic equation! Write it in standard form $ax^2 + bx + c = 0$ and use the quadratic formula.

$$4.9\,\Delta t^2 - 2\,\Delta t - 6 = 0$$

$$\Delta t = \frac{2 \pm \sqrt{(-2)^2 - 4(4.9)(-6)}}{2(4.9)}$$

$$\Delta t = \frac{2 \pm \sqrt{121.6}}{9.8}$$

$$\Delta t \approx \frac{2 \pm 11}{9.8}$$

The two answers are $\Delta t = \frac{13}{9.8} \approx 1.33\ \text{s}$ and $\Delta t = \frac{-9}{9.8} \approx -0.92\ \text{s}$.
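As a quick numerical check, here is a minimal Python sketch that applies the quadratic formula to the coefficients above ($a = 4.9$, $b = -2$, $c = -6$):

```python
import math

# Coefficients of 4.9*t^2 - 2*t - 6 = 0
a, b, c = 4.9, -2.0, -6.0

disc = b**2 - 4*a*c       # discriminant: 4 + 117.6 = 121.6
root = math.sqrt(disc)    # ~11.03

t1 = (-b + root) / (2*a)  # ~1.33 s, the physical answer
t2 = (-b - root) / (2*a)  # ~-0.92 s, extrapolated backward in time

print(f"t1 = {t1:.2f} s, t2 = {t2:.2f} s")
```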

The first answer is the one we want. The second root represents the moment the object would have left the ground if its motion were extrapolated backward in time: that is, if instead of being thrown from 6 m, it had been launched from ground level 0.92 s earlier.
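As a final sanity check, substituting either root back into the height function $y(t) = 6 + 2t - 4.9t^2$ should return approximately zero; a brief sketch:

```python
def y(t):
    # Height above the ground: y0 + v0*t + (1/2)*a*t^2
    return 6.0 + 2.0*t - 4.9*t**2

print(y(1.33))   # ~0 (tiny residual from rounding the root)
print(y(-0.92))  # ~0, the backward-extrapolated ground-level launch
```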