Definition of to tell (you) the truth
—used to say that one is being honest and admitting something
To tell the truth, I liked her first book better.
To tell you the truth, I don't remember when I saw her last.