the Wild West

noun

Definition of the Wild West

: the western United States in the past, when there were many cowboys, outlaws, etc.
  "stories about the Wild West"
  often used before another noun: "Wild West stories," "a Wild West show"

