27 June, 11:16

Roger ran 3.2 miles the first day of the fundraiser, 4.0 miles the second day, and 5.1 miles the last day. If he earned $0.15 per foot for charity, how much did he earn?

Answers (1)
  1. 27 June, 11:22
    First add the distances: 3.2 + 4.0 + 5.1 = 12.3 miles. Since Roger is paid per foot, convert miles to feet: 12.3 × 5,280 = 64,944 feet. Then multiply by the rate: 64,944 × $0.15 = $9,741.60. (Dividing 12.3 by 0.15 to get 82 is incorrect; the rate is per foot, so the total distance must be in feet and multiplied, not divided.)
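The arithmetic above can be checked with a short script (the 5,280 feet-per-mile conversion is the standard one; the distances and rate come from the question):

```python
# Total miles run over the three days
miles = 3.2 + 4.0 + 5.1       # 12.3 miles

# Convert to feet: 1 mile = 5,280 feet
feet = miles * 5280           # 64,944 feet

# Earnings at $0.15 per foot
earnings = feet * 0.15

print(f"${earnings:,.2f}")    # $9,741.60
```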