Wild West

noun

Definition of Wild West

  1 : the western U.S. in its frontier period characterized by roughness and lawlessness

Wild West

adjective

First Known Use of Wild West

1844
