6 August, 22:14

What did the war do to the relationship between the American colonies and England?

Answers (1)
  1. 6 August, 22:27
The American colonies eventually became independent, creating the United States of America and establishing their own laws.