Women Doctors in Nazi Germany: Opportunists or Engaged Physicians?

For Jewish doctors, Nazism meant emigration or death, but many non-Jewish German female physicians had careers that spanned the Weimar Republic and the Nazi regime. In continuing to practice, they upheld the aims of a repressive dictatorship.