Definition of take the reins
: to take control : to assume charge or command
- The President-elect will officially take the reins in January.