Wild West

noun

Definition of Wild West

1 : the western U.S. in its frontier period characterized by roughness and lawlessness

Wild West

adjective

First Known Use of Wild West

1844
