health care

noun

: the prevention or treatment of illness by doctors, dentists, psychologists, etc.

Full Definition of HEALTH CARE

:  efforts made to maintain or restore health especially by trained and licensed professionals —usually hyphenated when used attributively

First Known Use of HEALTH CARE

1940

health care

noun (Medical Dictionary)

Medical Definition of HEALTH CARE

: the maintenance and restoration of health by the treatment and prevention of disease especially by trained and licensed professionals (as in medicine, dentistry, clinical psychology, and public health)
health-care adjective
