19 January, 15:34

How did America get Florida?

Answers (1)
  1. 19 January, 15:57
    In 1763, the Treaty of Paris between Great Britain, France, and Spain gave Britain the Florida Territory. When Britain formally recognized the colonies' independence as the United States in 1783, Florida was returned to Spain, though its boundaries were never clearly defined. The United States ultimately acquired Florida from Spain through the Adams-Onís Treaty of 1819, which took effect in 1821.