I know “not all Germans were Nazis.”
I’ve been trying to read up on whether most people in Germany knew what was happening during the Holocaust. From what I remember from school, most did not know about the concentration camps (or believed, because of propaganda, that they were a different kind of camp).
I guess my question is this: Germany was defeated, and afterwards we hear how Germany apologized for and acknowledged the atrocities that happened. But (here is where I’m confused) if so many soldiers were involved in the concentration camps, and there was some public support for Hitler’s genocidal views (again, not every German, but presumably a substantial number did support them), wouldn’t those people still hold the same views after the war, and eventually pass that resentment down to their children? Or did most of the dirty work happen without the majority of the public and military knowing what was actually going on?
Source: reddit post