Definition of Western medicine
: the typical methods of healing or treating disease that are taught in Western medical schools
Word by Word Definitions
western
: coming from the west
: situated or lying toward the west
: of, relating to, or characteristic of a region conventionally designated West: such as
: one that is produced in or characteristic of a western region and especially the western U.S.
: a novel, story, motion picture, or broadcast dealing with life in the western U.S. especially during the latter half of the 19th century

medicine
: a substance or preparation used in treating disease
: something that affects well-being
: the science and art dealing with the maintenance of health and the prevention, alleviation, or cure of disease