18 April, 04:56

How did the definition of west change in the years 1800 to 1860?

Answers (1)
  1. 18 April, 05:14
    In 1800, the "West" meant any land between the Appalachian Mountains and the Mississippi River. After the Louisiana Purchase in 1803, the West came to mean the new territory beyond the Mississippi, bounded to the south by Spanish (and later Mexican) holdings such as Texas. By the time of the Civil War, United States territory extended all the way to the Pacific, and most of that landmass was what we now recognize as the continental USA.