26 December, 03:12

What was the role of America after WWI?

Answers (1)
  1. 26 December, 03:15
    Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). The experience of World War I had a major impact on US domestic politics, culture, and society in the years that followed.