21 November, 16:49

Did the war change the role of women in American society?

Answers (1)
  1. 21 November, 17:09
Women's work in WWI: During WWI (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone off to fight in the war. New jobs were also created as part of the war effort, for example in munitions factories.