31 December, 04:52

How did the United States gain the territory of Florida?

Answers (1)
  1. 31 December, 05:17
    0
    In 1763, the Treaty of Paris was signed by England, France, and Spain, and it resulted in England gaining the Florida Territory. When England formally recognized the colonies' independence (as the United States) in 1783, the Florida Territory was returned to Spain without a clear definition of its boundaries. The United States finally acquired Florida in 1819, when Spain ceded the territory under the Adams-Onís Treaty, which took effect in 1821.