2 February, 03:03

What happened when the First World War ended?

Answers (2)
  1. 2 February, 03:08
    The fighting stopped while the terms of peace were negotiated.
  2. 2 February, 03:29
    Several nations gained or regained territory or independence after World War I. France, for example, regained Alsace-Lorraine and also acquired various African colonies from the German Empire and Middle East territories from the Ottoman Empire. The African and Middle East gains were officially League of Nations mandates.