health care noun
: the maintenance and restoration of health by the treatment and prevention of disease, especially by trained and licensed professionals (as in medicine, dentistry, clinical psychology, and public health)