31 August, 19:59

How did the role of women in the United States change during and after World War II?

Answers (1)
  1. 31 August, 20:04
    World War II provided unprecedented opportunities for American women to enter jobs that had never before been open to them, particularly in the defense industry ... After the war, many women were fired from factory jobs. Nevertheless, within a few years, about a third of women older than 14 worked outside the home.