Medical Dictionary

tropical medicine

noun

Medical Definition of tropical medicine

  1 : a branch of medicine dealing with tropical diseases and other medical problems of tropical regions
