Medical Definition of allied health
: a broad field of health-care professions made up of specially trained individuals (such as physical therapists, dental hygienists, audiologists, and dietitians) who are typically licensed or certified but are not physicians, dentists, or nurses—often used before another noun ⟨allied health sciences⟩ ⟨the allied health professions⟩