At the time, most Americans did not think that women should serve in wars. If women were involved at all, they were expected to work in support roles in the background and not take part in the fighting.
Yes
Education, healthcare, and social work
Men were away fighting, so the only people left to work were women and minorities.
Most work in occupations such as factory work had been done by men. When they left to fight in the war, large numbers of women took their places in these factories. This led to an increase in women workers and a shift in the social status of women after the war.
During World War One, many women shared something in common with African-Americans: they took on the jobs that had previously been done by the men who were now at war.