16 September, 13:02

What social changes occurred in the United States as a result of the war?

Answers (1)
  1. 16 September, 13:16
    With the end of World War II, the United States entered a period of prosperity. Americans used the money they had saved during the war to buy goods that had been unavailable during the conflict. The return of soldiers to their homes produced a sharp rise in the birth rate (the "baby boom"), and as the economy grew, unemployment fell significantly.

    Several social changes followed in the subsequent years, among them an increase in the minimum wage, the expansion of Social Security, and, later, the creation of the Department of Energy.

    Over time, however, high spending on social programs and the country's involvement in the Vietnam War led to high inflation in the 1970s. The economy weakened, unemployment rose, and the nation's income became unstable. Under Reagan, inflation was gradually brought under control and unemployment declined, but the federal budget deficit grew sharply in the 1980s, the trade balance worsened, and imports came to exceed exports.

    The post-war period was also marked by a strong civil rights movement, in which various social groups, especially African Americans, sought equal treatment under the law and in society after decades of discrimination.