Anonymous asked in Science & Mathematics › Physics · 1 month ago

# Kinematics Physics problem?

A baseball is thrown by an outfielder (O) towards the catcher (C) with an initial speed of 20 m/s at an angle of 45° with the horizontal. At the moment that the ball is thrown, C is 50 m from O. At what speed and in what direction must C run to catch the ball at the same height at which it was released? Assume that C catches the ball at the same moment that it arrives.

My answer is Vc = -3.2 m/s. Is that right?


Let h(t) = height and h₀ = launch height.

h(t) = h₀ + Vy₀t - 4.9t² = h₀ + 20sin45°·t - 4.9t², which returns to h₀ at t = 20sin45°/4.9 = 2.89 s, the flight time.

In those 2.89 s the ball travels horizontally Vx₀t = 20cos45° × 2.89 = 40.8 m from O toward C, so it lands 50 - 40.8 = 9.2 m short of C. The catcher must therefore run toward the outfielder at 9.2 m / 2.89 s ≈ 3.2 m/s.

I get 3.2 m/s too. The minus sign in -3.2 m/s just records direction: if the positive axis points from O toward C (the direction of the throw), the catcher's velocity is negative because he runs toward O.
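The numbers above can be cross-checked with the standard projectile formulas (flight time T = 2v₀sinθ/g and range R = v₀²sin2θ/g). A minimal sketch, assuming g = 9.8 m/s²:

```python
import math

g = 9.8                     # gravitational acceleration, m/s^2
v0 = 20.0                   # initial speed, m/s
theta = math.radians(45)    # launch angle
d = 50.0                    # outfielder-to-catcher distance, m

t_flight = 2 * v0 * math.sin(theta) / g     # time to return to launch height
rng = v0**2 * math.sin(2 * theta) / g       # horizontal distance covered
v_catcher = (d - rng) / t_flight            # speed the catcher must run toward O

print(round(t_flight, 2))   # ~2.89 s
print(round(rng, 1))        # ~40.8 m
print(round(v_catcher, 1))  # ~3.2 m/s
```

The positive result for `v_catcher` is the catcher's speed toward the outfielder; with the O→C axis taken as positive it would carry a minus sign, matching the asker's -3.2 m/s.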

• h = v₀ᵧt – ½gt² (measuring height from the launch point, so h = 0 when the ball is caught)

0 = 20sin45°·t - ½(9.8)t²

t = 20sin45° / [½(9.8)]

t = 2.89 s

s = vₓt

s = 20cos45°(2.89)

s = 40.8 m

50 - 40.8 = 9.2 m: the ball lands 9.2 m in front of the catcher (between O and C)

9.2 / 2.89 = 3.2 m/s

The catcher must run at 3.2 m/s toward the ball, i.e. toward the outfielder.
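The closed-form steps above can also be verified by stepping the projectile forward in time numerically. A sketch using semi-implicit Euler integration, assuming g = 9.8 m/s² and a small step of 10 µs:

```python
import math

g = 9.8                     # m/s^2
v0 = 20.0                   # m/s
theta = math.radians(45)
d = 50.0                    # m, initial O-to-C separation
dt = 1e-5                   # time step, s

# integrate until the ball falls back to launch height (y = 0)
x = y = t = 0.0
vx = v0 * math.cos(theta)
vy = v0 * math.sin(theta)
while True:
    vy -= g * dt            # gravity acts on vertical velocity
    x += vx * dt
    y += vy * dt
    t += dt
    if y <= 0.0:
        break

gap = d - x                 # ball lands this far short of the catcher
print(round(t, 2))          # flight time, ~2.89 s
print(round(x, 1))          # horizontal range, ~40.8 m
print(round(gap / t, 1))    # catcher's required speed, ~3.2 m/s toward O
```

The simulated flight time, range, and required catcher speed agree with the analytic values to well within rounding.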