the Wild West

noun

Definition of the Wild West

: the western United States in the past when there were many cowboys, outlaws, etc.
  stories about the Wild West
often used before another noun
  Wild West stories
  a Wild West show
