27 June, 02:58

How did World War I change women's roles in the United States?

Answers (1)
  1. 27 June, 03:10
Women became far more valued for their ability to work: jobs for women became easier to find, and pay increased.