1 February, 04:47

How did the war change life in the United States?

Answers (1)
  1. 1 February, 04:56
    After the war there was no more slavery, and women's rights evolved ...