Q: Why did American society change so dramatically after World War 1?
Because after WWI the United States became an international manufacturer of a huge variety of goods, and it could compete more easily than the Europeans because it had not suffered the same damage during the first war. After WWII, America also made a fortune from the sale of weapons and equipment to its allies. All this wealth was bound to have an effect.

Wiki User (15y ago)

More answers

Because so many men enlisted in the army, the women back home took on many of the men's roles. Women generally gained more rights, and a few years after WWI, women in many areas gained the right to vote.

Wiki User (16y ago)
Which change in U.S. society was a result of World War I?

Wiki User (16y ago)
