the Deep South

noun

Definition of the Deep South

: the states in the most southern and eastern part of the U.S. and especially Georgia, Alabama, South Carolina, Louisiana, and Mississippi
