26 March, 08:42

Assume a company pays out $100 in dividends in year 1. Dividends increase by 10% a year for 4 years and thereafter stay constant. What is the total amount of dividends paid out (rounded to the nearest $1) during years 1-10?

Answers (1)
  1. 26 March, 08:59
    Dividends increase by 10% (0.1) per year for four years, i.e. the payment rises going into each of years 2 through 5 and then levels off.

    Therefore

    Dividends paid in year 1 = $100.00

    Dividends paid in year 2 = $100*1.1 = $110.00

    Dividends paid in year 3 = $110*1.1 = $121.00

    Dividends paid in year 4 = $121*1.1 = $133.10

    Dividends paid in year 5 = $133.10*1.1 = $146.41

    From year 5 onward, the dividend stays constant at $146.41.

    Dividends paid in years 6 - 10 = $146.41*5 = $732.05

    Total dividends paid in years 1-10 is

    100 + 110 + 121 + 133.10 + 146.41 + 732.05 = $1,342.56
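
    (As a check: the first five payments form a geometric series, so years 1 - 5 can also be summed directly as $100 * (1.1^5 - 1) / 0.1 = $610.51, and $610.51 + $732.05 = $1,342.56.)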

    Answer: Total dividends paid in years 1-10 = $1,343 (nearest dollar)
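
    For anyone who wants to verify the arithmetic, below is a short Python sketch of the same calculation. The function name and parameter defaults are just illustrative, not part of the original question.

    def total_dividends(first=100.0, growth=0.10, growth_years=4, horizon=10):
        # Pay `first` in year 1; raise the payment by `growth` going into
        # each of the next `growth_years` years, then hold it flat.
        dividend = first
        total = 0.0
        for year in range(1, horizon + 1):
            total += dividend
            if year <= growth_years:  # increases apply going into years 2-5
                dividend *= 1 + growth
        return total

    total = total_dividends()
    print(f"Total dividends, years 1-10: ${total:,.2f} (~ ${round(total):,})")
    # -> Total dividends, years 1-10: $1,342.56 (~ $1,343)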