20 November, 05:13

How did World War I change the lives of American women?

Answers (1)
  1. 20 November, 05:30
    World War One changed the lives of Americans socially, politically and economically. The war had a massive impact on almost every aspect of society, particularly on women, workers and minorities. The American public also felt a great sense of nationalism and patriotism during the war, as the country was unified against a foreign threat.