Because after WWI the United States became an international manufacturer of a huge variety of goods and could compete more easily than the Europeans, who had suffered far greater damage during the first war. After WWII, America also made a fortune from the sale of weapons and equipment to the rest of its allies, and all this wealth was bound to have an effect.
Because so many men enlisted in the army, the women back home assumed some of the men's roles. Women generally gained more rights, and a few years after WWI, women in many areas gained the right to vote.
I believe it was from the Baby Boom that the country's population grew dramatically. -slim
Because most of America's soldiers returned and took back their jobs, the women who had taken over those jobs went back to their homes and did what they had done before the war. The United States also no longer had to produce war equipment and returned to its regular production.
It increased significantly.
What do you call the society of the post-WWI era?
The Progressive Era was characterized by attempts to embrace, accommodate, reach a new balance with, or fight against life