16 April, 20:33

How did World War 1 change women's roles in the United States?

Answers (2)
  1. 16 April, 20:49
    World War One changed women's roles by giving them the ability to take jobs outside the home. Women not only gained some of the rights that men had, but they were also able to work if they wanted to. Typically, women were expected to stay at home and handle household duties such as cooking, cleaning, and taking care of the children, but the war opened opportunities for women to get jobs and do things outside of the typical household.
  2. 16 April, 20:50
    The war gave women more jobs and a chance not to be just "stay-at-home moms"; they got to work in factories, taking over the "men's" jobs.