
In my opinion, WWII was unavoidable. Germany had been wronged, in many people's view, in the aftermath of WWI; it needed a war both to climb out of one of the worst depressions in history and to reclaim the territory it believed rightfully belonged to it. Many people forget that most of Europe and America did not want war, fearing a repeat of the devastation of WWI. Chamberlain and other leaders adopted a policy of appeasement to avoid going to war. Hitler took advantage of this and annexed many of the smaller countries surrounding Germany. But although Hitler claimed he was only taking land that "rightfully belonged to Germany," he soon invaded and conquered France. Only then did Britain put Churchill in charge, and he, along with Roosevelt and Stalin, went on to liberate France and most of Europe from the Third Reich's grip.

Wiki User

16y ago