Medical Definition of allied health
: a broad field of health-care professions made up of specially trained individuals (such as physical therapists, dental hygienists, audiologists, and dietitians) who are typically licensed or certified but are not physicians, dentists, or nurses —often used before another noun: allied health sciences; the allied health professions