the Wild West

noun

Definition of the Wild West

: the western United States in the past when there were many cowboys, outlaws, etc.
  "stories about the Wild West"
often used before another noun, as in "Wild West stories" and "a Wild West show"
