
It's not black and white. After WWI, the U.S. emerged as a world power. However, the aftermath of the war left many disillusioned with the nature of the U.S. government (hence the Lost Generation of the 1920s). Despite this, in the years before the Great Depression the U.S. became one of the world's foremost countries.


Wiki User

14y ago

More answers

Yes.


Wiki User

13y ago
Q: Did World War 1 help the United States or hurt it?