the Wild West

Definition of the Wild West

  1. :  the western United States in the past, when there were many cowboys, outlaws, etc.
       stories about the Wild West
       — often used before another noun
       Wild West stories
       a Wild West show

Word by Word Definitions

Wild
  1. :  living in a state of nature and not ordinarily tame or domesticated

    :  growing or produced without human aid or care

    :  related to or resembling a corresponding cultivated or domesticated organism

  1. :  a sparsely inhabited or uncultivated region or tract :  wilderness

    :  a wild, free, or natural state or existence

  1. :  in a wild manner: such as

    :  without regulation or control

    :  off an intended or expected course

West
  1. :  to, toward, or in the west

  1. :  situated toward or at the west

    :  coming from the west

  1. :  the general direction of sunset :  the direction to the left of one facing north

    :  the compass point directly opposite to east

    :  regions or countries lying to the west of a specified or implied point of orientation
