After Japan bombed Pearl Harbor, the United States declared war on Japan, and then Germany declared war on the United States. Hitler didn't want the United States in the war in the first place, for obvious reasons. Shouldn't he have just let Japan suffer for its own mistake and stayed out of it? Sure, Japan and Germany were "allies", but it's not like Hitler had a great record of honoring treaties (see the non-aggression pact with the Soviet Union).

And the American people didn't really care about the Nazis; to Americans, that was just Europe being Europe. I know Americans hated the Japanese, and FDR wanted to fight the Nazis, but the public only wanted to fight Japan. So why did Hitler feel the need to bring the wrath of the United States into Europe?

EDIT: Thanks to all who replied! I enjoyed all the replies and plan to dive deeper into the links you've shared. r/history is easily my favorite subreddit! You guys are awesome! I'll probably be back with more discussions, questions, and hopefully answers to your future posts!

Source: Reddit post


