America's entry into the war on the side of the Allies ensured the defeat of Germany and thus the end of the war. It also turned America's attention toward the wider world and led it, temporarily, to abandon its isolationist policies. That cosmopolitan outlook didn't last, however, and was only revived by World War II.

Wiki User

16y ago
