take the rein(s)
To take or assume control (of something).
After the CEO announced that she had been diagnosed with dementia, her daughter gradually began taking the reins of the company.
I don't know why people are so utterly terrified of letting the federal government take the rein when it comes to things like healthcare.
See also: take