Western medicine

noun

Definition of Western medicine

  1. :  the typical methods of healing or treating disease that are taught in Western medical schools

Word by Word Definitions

Western
  1. :  coming from the west

    :  situated or lying toward the west

    :  of, relating to, or characteristic of a region conventionally designated West: such as

  1. :  one that is produced in or characteristic of a western region and especially the western U.S.

    :  a novel, story, motion picture, or broadcast dealing with life in the western U.S. especially during the latter half of the 19th century

medicine
  1. :  a substance or preparation used in treating disease

    :  something that affects well-being

    :  the science and art dealing with the maintenance of health and the prevention, alleviation, or cure of disease

