18 November, 12:33

Which event finally brought the United States into World War II?

a. Japan's attack on Pearl Harbor

b. Germany's invasion of France

c. Britain's attack on Gibraltar

d. Italy's invasion of Greece

Answers (1)
  1. 18 November, 12:53
    The correct answer is "a. Japan's attack on Pearl Harbor." The surprise attack on December 7, 1941 was a direct strike on a United States military base and an indisputable act of war; the United States declared war on Japan the following day, formally entering World War II.