24 October, 15:10

Hawaii became a US territory after?

A. American planters overthrew the Hawaiian government.
B. The United States seized the islands from Spain.
C. The United States bought the islands from the royal family.

Answers (1)
  1. 24 October, 15:23
    0
    The correct answer is A. American planters overthrew the Hawaiian government.

    Explanation:

    The territory of Hawaii was officially annexed to the U.S. in 1898; this was only possible because the previous government in Hawaii, a monarchy, had been overthrown. That happened in 1893, when planters on the islands, mainly Americans, organized to end the monarchy and force Queen Liliuokalani from the throne because they opposed her policies. After the monarchy ended, a new government was formed in Hawaii, and a few years later the U.S. annexed the territory, which was considered a strategic military location. Thus, Hawaii became a US territory after American planters overthrew the Hawaiian government.