Q: Did World War 1 help the United States or hurt it?

It's not black and white. After WWI, the U.S. emerged as a world power. However, the aftermath of the war left many disillusioned with the nature of the U.S. government (hence the Lost Generation of the 1920s). Despite this, before the Great Depression the U.S. became one of the world's foremost countries.

Wiki User · 14y ago

More answers

Yes.

Wiki User · 13y ago
