German President Steinmeier in Tanzania
Image: Bernd von Jutrczenka/dpa/picture alliance

German colonialism

Until 1918, the German Empire ruled over colonies in Africa, Asia and Oceania. The systematic, violent oppression of the people living there is only gradually being confronted.