5 April, 14:19

How did life change for women in the United States after World War I started?

Answers (2)
  1. 5 April, 14:38
    It changed women's lives a lot. During the war, most men were drafted into the military, which opened up a ton of jobs for women. Women were now able to volunteer for or work with the Marines, Army, and Navy, and to take the jobs the men left behind. So their lives changed because the war gave them jobs and other opportunities.
  2. 5 April, 14:42
    Because many men were sent off to war, women took over the men's positions in the workplace. As a result, after the war, women held more non-traditional roles at work.