Medical Dictionary

allied health

noun al·lied health \ˈa-ˌlīd-\

Medical Definition of allied health

  1. :  a broad field of health-care professions made up of specially trained individuals (such as physical therapists, dental hygienists, audiologists, and dietitians) who are typically licensed or certified but are not physicians, dentists, or nurses—often used before another noun, as in "allied health sciences" and "the allied health professions"
