Western United States

The Western United States, also called the American West, the Western States, the Far West, and the West, is the region comprising the westernmost U.S. states. As American settlement in the U.S. expanded westward, the meaning of the term the West changed. Before around 1800, the crest of the Appalachian Mountains was seen as the western frontier. The frontier moved westward and eventually the lands west of the Mississippi River were considered the West.
  • Subregions: Mountain states · Northwestern United States · Pacific states · Southwestern United States
  • Country: United States
  • States: Alaska, Arizona, California, Colorado, Hawaii, Idaho, Montana, Nevada, New Mexico, Oregon, Utah, Washington, Wyoming (as defined by the Census Bureau; regional definitions may vary slightly from source to source)
  • Demonym: Westerner