Definition of Deutschland

  • Noun — 1. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
Synonyms for the word "Deutschland"

Words semantically linked with "Deutschland"