herbal medicine noun
: the art or practice of using herbs and herbal remedies to maintain health and to prevent, alleviate, or cure disease—called also herbalism