A projectile is shot from the ground at a velocity of 13 m/s and at an angle of (2pi)/3. How long will it take for the projectile to land?

1 Answer
Mar 18, 2018

2.3 seconds

Explanation:

Let's break down the velocity into x and y components using trigonometry.

The x component carries the cosine term and the y component carries the sine term, so
v_x = cos(2pi/3) * 13 m/s = -1/2 * 13 m/s = -6.5 m/s
v_y = sin(2pi/3) * 13 m/s = (sqrt(3)/2) * 13 m/s ≈ 11.26 m/s
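The decomposition above can be checked with a few lines of Python (variable names here are just for illustration):

```python
import math

v = 13.0                  # launch speed in m/s
theta = 2 * math.pi / 3   # launch angle in radians

v_x = v * math.cos(theta)  # horizontal component
v_y = v * math.sin(theta)  # vertical component

print(round(v_x, 2))  # -6.5
print(round(v_y, 2))  # 11.26
```

The negative v_x just means the projectile travels in the negative x direction; it does not affect the flight time.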

We don't care about the x coordinate, because we only want to know how long it takes for the y coordinate to go back to 0.

We can use basic kinematics:
y = 1/2 a t^2 + v_0 t, which here becomes y = -1/2 g t^2 + v_y t

We want to know when this hits the ground, i.e. when y=0:
y = 0 = -1/2 g t^2 + v_y t = -1/2 t (g t - 2 v_y)
which is 0 when t = 0 or t = (2 v_y)/g

The first solution makes sense: at the moment we shoot the projectile, it is still at the ground.

The second solution is the landing time:
t = (2 v_y)/g ≈ (2 * 11.26 m/s) / (9.8 m/s^2) ≈ 2.3 s
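As a quick sanity check, the whole calculation fits in a short Python sketch (using g = 9.8 m/s^2, as in the answer):

```python
import math

g = 9.8                                   # gravitational acceleration, m/s^2
v_y = 13.0 * math.sin(2 * math.pi / 3)    # vertical launch velocity, m/s

# time of flight: the nonzero root of -1/2 g t^2 + v_y t = 0
t = 2 * v_y / g

print(round(t, 1))  # 2.3
```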