22 June, 04:56

What happened to German territory in the east after WWI?

Answers (1)
  1. 22 June, 05:09
    Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other Allied states) imposed punitive territorial, military, and economic provisions on defeated Germany. In the east, Germany ceded most of West Prussia and the province of Posen to the newly restored Poland, giving it access to the Baltic Sea through the "Polish Corridor," and later part of Upper Silesia as well.