The Western United States, commonly referred to as the American West or simply the West, traditionally refers to the region comprising the westernmost states of the United States. Because European settlement in the U.S. expanded westward after its founding, the meaning of the West has evolved over time.
