10 December, 10:43

Did Britain gain control of the West after the French and Indian War?

Answers (1)
  1. 10 December, 11:04
    Yes. The French and Indian War began in 1754 and ended with the Treaty of Paris in 1763. The war provided Great Britain with enormous territorial gains in North America, but disputes over subsequent frontier policy and paying the war's expenses led to colonial discontent, and ultimately to the American Revolution. - Google