
Because after WWI the United States became an international manufacturer of a huge variety of goods and could compete more easily than the Europeans, since it had not suffered the same damage during the first war. After WWII, it made a fortune selling weapons and equipment to its allies. All of this wealth was bound to have an effect on American society.


Wiki User

15y ago

More answers

Because so many men enlisted in the army, women back home took on roles traditionally held by men. Women generally gained more rights, and a few years after WWI, women in many areas gained the right to vote.


Wiki User

16y ago

Q: Why did American society change so dramatically after World War 1?