Wild West

noun

Definition of WILD WEST

: the western United States in its frontier period, characterized by roughness and lawlessness
Wild West, adjective

First Known Use of WILD WEST

1844
