to tell the truth
Definition of to tell the truth
—used to say that one is stating what one really thinks: "I didn't really like the movie, to tell the truth."
Word by Word Definitions
tell
: to relate in detail : narrate
: to give utterance to : say
: to make known : divulge, reveal
: hill, mound
: an ancient mound in the Middle East composed of remains of successive settlements
truth
: the body of real things, events, and facts : actuality
: the state of being the case : fact
: a transcendent fundamental or spiritual reality