1 October, 07:53

What changes began happening for America once World War II was over?

Answers (2)
  1. 1 October, 08:12
Economic prosperity was the most significant change for the USA after the war. Other changes included minority groups, such as African Americans and Latinos, as well as women, beginning to fight for their civil rights.
  2. 1 October, 08:18
The Red Scare and the Cold War came after WW2. Anti-communist sentiment grew quickly, which led to many witch hunts for communists in the government. Famous examples include the Hollywood Ten hearings and the rise of McCarthyism.