take the reins

Definition of take the reins

  1. :  to take control

    The President-elect will officially take the reins in January.

Word by Word Definitions

take (verb)

  1. :  to get into one's hands or into one's possession, power, or control: such as

    :  to seize or capture physically

    :  to get possession of (fish or game) by killing or capturing

take (noun)

  1. :  something that is taken:

    :  the amount of money received :  proceeds, receipts, income

    :  share, cut

reins (noun)

  1. :  kidneys

    :  the region of the kidneys :  loins

    :  the seat of the feelings or passions
