16 February, 20:20

How did women's roles in countries such as the United States and Britain change after World War I? Check all that apply.

Answers (1)
  1. 16 February, 20:33
    Society became more open, and women experienced greater freedom.

    Women began to seek out new careers.

    Women challenged old traditions by doing things such as changing their clothing style.