Wild West

noun

Definition of WILD WEST

:  the western United States in its frontier period characterized by roughness and lawlessness
— Wild West, adjective

First Known Use of WILD WEST

1844
