A boat traveled upstream a distance of 90 miles at an average speed of (v - 3) miles per hour and then traveled the same distance downstream at an average speed of (v + 3) miles per hour. If the trip upstream took half an hour longer than the trip downstream, how many hours did it take the boat to travel downstream?

Answers (1)
The trip upstream took 90/(v - 3) hours and the trip downstream took 90/(v + 3) hours. We are also told that the difference between the two times was 1/2 hour:

90/(v - 3) - 90/(v + 3) = 1/2

Combining over the common denominator:

[90(v + 3) - 90(v - 3)]/(v² - 9) = 1/2 → (90 · 6)/(v² - 9) = 1/2 → v² = 90 · 6 · 2 + 9 = 1089 = 9 · 121 → v = 3 · 11 = 33.

The trip downstream therefore took 90/(v + 3) = 90/(33 + 3) = 2.5 hours.
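
If you want to double-check the algebra, the same equation can be handed to a computer algebra system. A minimal Python sketch using sympy (assumed to be installed) might look like this:

```python
from sympy import symbols, Eq, solve, Rational

# Boat speed in still water; the current is 3 mph, so upstream speed is v - 3
# and downstream speed is v + 3. Restrict v to positive values.
v = symbols('v', positive=True)

# Upstream took half an hour longer than downstream:
#   90/(v - 3) - 90/(v + 3) = 1/2
eq = Eq(90 / (v - 3) - 90 / (v + 3), Rational(1, 2))

speed = solve(eq, v)[0]              # 33
downstream_time = 90 / (speed + 3)   # 90/36 = 5/2
print(speed, downstream_time)        # prints: 33 5/2
```

Solving for v and substituting back confirms the downstream trip takes 5/2 = 2.5 hours.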