7 February, 12:48

After World War II, how did Americans view the role of the United States?

Answers (2)
  1. 7 February, 13:03
    The entry of the United States into World War II caused vast changes in virtually every aspect of American life. Most Americans viewed their place in the postwar world with optimism.
  2. 7 February, 13:14
    Many wanted the U.S. to retreat from global responsibilities.