14 August, 14:43

Why did many Americans feel it was important for the United States to gain control of Florida?

Answers (1)
  1. 14 August, 14:48
    Florida had been ruled by Britain and then Spain, and slave ships were brought in through its ports.