the Deep South
noun
: the states in the most southern and eastern part of the U.S. and especially Georgia, Alabama, South Carolina, Louisiana, and Mississippi