to tell the truth (idiom)
Definition of to tell the truth
: used to say that one is stating what one really thinks. Example: "I didn't really like the movie, to tell the truth."