to tell (you) the truth
idiom
—used to say that one is being honest and admitting something
To tell the truth, I liked her first book better.
To tell you the truth, I don't remember when I saw her last.