Western medicine

noun

Definition of Western medicine

: the typical methods of healing or treating disease that are taught in Western medical schools

