15 April, 05:54

Radio signals travel at a rate of 3*10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6*10^6 meters? (Hint: Time equals distance divided by speed.)

A) 3.2*10^2 seconds

B) 3.2*10^-2 seconds

C) 3.13*10^1 seconds

D) 2.88*10^15 seconds

Answers (1)
  1. 15 April, 06:10
    Distance = velocity x time, so time = distance / velocity.

    distance = 9.6 x 10^6 meters

    velocity = 3 x 10^8 meters/second

    time = distance / velocity = (9.6 x 10^6 m) / (3 x 10^8 m/s) = 3.2 x 10^-2 seconds

    The units check out (m divided by m/s gives s), so the answer is B) 3.2*10^-2 seconds.
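As a quick sanity check, the arithmetic above can be sketched in a few lines of Python (variable names are illustrative, not from the original problem):

```python
# time = distance / velocity, with the values given in the problem
velocity = 3e8     # speed of radio signals, meters per second
distance = 9.6e6   # satellite altitude, meters

time = distance / velocity  # seconds
print(time)  # 0.032, i.e. 3.2 x 10^-2 seconds (choice B)
```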