30 August, 01:36

What did "the West" mean to Americans in the 1800s?

Answers (1)
  1. 30 August, 02:02
To Americans in the early 1800s, "the West" meant the area between the Appalachian Mountains and the Mississippi River, which was regarded as the western frontier.