take the reins
Definition of take the reins
: to take control
"The President-elect will officially take the reins in January."
Word by Word Definitions
take
: to get into one's hands or into one's possession, power, or control: such as
: to seize or capture physically
: to get possession of (fish or game) by killing or capturing
: something that is taken:
: the amount of money received : proceeds, receipts, income
: share, cut
reins
: the region of the kidneys : loins
: the seat of the feelings or passions