12 June, 03:47

A rock is thrown horizontally at a speed of 5.0 m/s from the top of a cliff 64.7 m high. The rock hits the ground 18.0 m from the base of the cliff. How would this distance change if the rock was thrown at 10.0 m/s?

Answers (1)
  1. 12 June, 04:10

    Assume g = 9.8 m/s² and ignore air resistance

    When the rock is launched from a height of 64.7 m,

    u = 5.0 m/s, the horizontal velocity

    v₀ = 0, the initial vertical velocity

    If the rock hits the ground 18.0 m from the base of the cliff, then the time of flight is

    t = (18.0 m) / (5.0 m/s) = 3.6 s
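
    As a quick numeric check of this step, here is a minimal Python sketch of the arithmetic (the variable names are illustrative, not from the problem):

    ```python
    # Horizontal motion has no acceleration, so distance = speed * time,
    # which gives the time of flight as t = x / u.
    x = 18.0  # horizontal distance from the base of the cliff, m
    u = 5.0   # horizontal launch speed, m/s

    t = x / u
    print(f"time of flight t = {t:.1f} s")  # -> 3.6 s
    ```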

    The vertical distance traveled is

    s = (1/2) * (9.8 m/s²) * (3.6 s)² = 63.504 m

    Because this computed drop is slightly less than the 64.7 m cliff height, the ground where the rock lands must be slightly higher than the base of the cliff. It is higher by

    64.7 m - 63.504 m = 1.196 m (about 1.2 m)
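
    The same consistency check can be scripted; this short sketch (assuming the 3.6 s flight time found above) computes the vertical drop and the small difference from the stated cliff height:

    ```python
    # Vertical drop during the flight, starting from zero vertical velocity:
    # s = (1/2) * g * t^2
    g = 9.8   # acceleration due to gravity, m/s^2
    t = 3.6   # time of flight, s
    h = 64.7  # stated cliff height, m

    s = 0.5 * g * t**2
    print(f"vertical drop s = {s:.3f} m")       # -> 63.504 m
    print(f"difference h - s = {h - s:.3f} m")  # -> 1.196 m
    ```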

    If the rock is thrown at 10.0 m/s, the time of flight stays the same, because it is set by the vertical motion (the drop and the acceleration due to gravity), which does not depend on the horizontal speed.

    Therefore the horizontal distance traveled is

    (10.0 m/s) * (3.6 s) = 36.0 m
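
    Putting it together, a short sketch of the whole comparison (again with illustrative variable names):

    ```python
    # The horizontal range of a horizontally launched projectile scales with launch
    # speed, because the time of flight is set by the vertical motion alone.
    x1, u1 = 18.0, 5.0  # first throw: landing distance (m) and launch speed (m/s)
    u2 = 10.0           # second throw: launch speed (m/s)

    t = x1 / u1   # time of flight, unchanged for the second throw
    x2 = u2 * t   # new horizontal distance
    print(f"new distance = {x2:.1f} m")  # -> 36.0 m
    ```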

    Answer: The distance doubles, from 18.0 m to 36.0 m.