7 February, 02:59

A radio signal travels at 3.00 • 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 • 10^7 meters? Show your work.

Would the answer be: rate = 3 * 10^8 m/s

distance = 3.6 * 10^7 m

time = distance/rate = 3.6/30?

Answers (1)
  1. 7 February, 03:11
    The answer is 0.118 seconds.

    The velocity (v) is the distance (d) divided by the time (t):

    v = d / t

    It is given:

    v = 3.00 * 10⁸ meters per second

    d = 3.54 * 10⁷ meters

    It is unknown:

    t = ?

    If:

    v = d / t

    Then:

    t = d / v

    t = (3.54 * 10⁷ meters) / (3.00 * 10⁸ meters/second)

    t = 1.18 * 10⁻¹ seconds

    t = 0.118 seconds

    Therefore, the radio signal will take 0.118 seconds to travel from the satellite to the surface of Earth.
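
    As a quick check, the same division can be done in a couple of lines of Python (a minimal sketch; the variable names are only illustrative):

    # Solve t = d / v for the given values
    speed = 3.00e8       # signal speed in meters per second
    distance = 3.54e7    # satellite height above Earth's surface in meters
    time = distance / speed
    print(round(time, 3))  # prints 0.118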