26 November, 23:01

What social changes took place in the United States after World War II? What role did the war play in those changes?

Answers (1)
  1. 26 November, 23:04
    There were two marked social changes that took place in the United States after World War II, both of which came about as a result of the war. The first was that women gained far more opportunities, especially in the workforce, which led to greater economic independence and an improvement in their rights. The second was that people of other races, especially African Americans, also gained rights and far more opportunities, particularly in the workforce, much as women did. The main reason these changes happened was that so many men had been sent to war, creating a severe shortage of labor. To keep the economy running and growing, company owners began to employ the people they had available in abundance: women and African Americans.