19 January, 00:01

A particular baseball pitcher throws a baseball at a speed of 39.1 m/s (about 87.5 mi/hr) toward home plate. We use g = 9.8 m/s² and ignore air friction.

(a) Assuming the pitcher releases the ball 16.6 m from home plate and throws it so the ball is initially moving horizontally, how long does it take the ball to reach home plate?

Answers (1)
  1. 19 January, 00:28
    There is no acceleration in the horizontal direction (gravity acts only vertically), so horizontally the ball moves at constant speed and we can use v = d/t, where v is velocity, d is distance, and t is time. Solving for time gives t = d/v. Since the ball is released moving purely horizontally, the full 39.1 m/s is the horizontal speed (no need to break it into components with sines and cosines), so t = (16.6 m) / (39.1 m/s) ≈ 0.42 s. Keep in mind the ball only falls about ½gt² ≈ 0.88 m in that time (and we aren't told the release height anyway), so it is still in the air just above home plate when it arrives. Cheers!
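    If you want to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are mine, not from the problem; it just evaluates t = d/v and the free-fall drop ½gt² under the stated assumptions):

    ```python
    # Constant horizontal velocity, free fall in the vertical direction.
    # Numbers come from the problem statement; names are illustrative.
    v_horizontal = 39.1   # m/s, release speed (purely horizontal)
    distance = 16.6       # m, release point to home plate
    g = 9.8               # m/s^2

    t = distance / v_horizontal   # time of flight to the plate
    drop = 0.5 * g * t**2         # vertical drop during that time

    print(f"time to reach home plate: {t:.2f} s")       # ~0.42 s
    print(f"vertical drop over that time: {drop:.2f} m")  # ~0.88 m
    ```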