Definition of Tropical Medicine

  • 1. (noun) The branch of medicine that deals with the diagnosis and treatment of diseases found most often in tropical regions

Words semantically linked with "tropical medicine"

Hyponyms of "tropical medicine"