Definition of Western United States

  • 1. (noun) The region of the United States lying to the west of the Mississippi River

Synonyms for "western united states"

Words semantically linked with "western united states"