Q: When did women get the right to have an education in America?

Best Answer

Oberlin College, the first coeducational college in the United States, began admitting women in 1837.

Related questions

Is education important for women?

Yes. Education is obviously important for women.


What access to education do women have in Scotland?

Exactly the same access as men. Education is a right in Scotland.


In the 1800s did women have the right to an education?

Yes; some women were even able to become doctors.


When did the women of America gain the right to vote?

Women in Oregon gained the right to vote in 1912; nationwide, American women gained the right to vote in 1920 with the 19th Amendment.


Can women vote?

Yes, women in America have the right to vote. They were granted the right to vote by the 19th Amendment to the US Constitution, which was passed by Congress in 1919 and ratified in 1920.


How did the 19th amendment change life in America?

It gave women the right to vote (women's suffrage).


What was the role of women in America in 1930?

The role of black women in America in 1930 depended on the kind of life they lived and the education they had. Most were either live-in or live-out "house servants".


Women in America obtained the right to vote in national elections in?

1920


What is one basic right which most American women were denied in the 1800s?

One basic right that most American women were denied in the 1800s was the right to vote; they were also denied many educational opportunities.


What was the role of black women in America in 1930?

The role of black women in America in 1930 depended on the kind of life they lived and the education they had. Most were either live-in or live-out "house servants".


What year and what country did women first gain the right to vote?

New Zealand, in 1893, was the first country to grant women the right to vote. In America, women gained the right to vote in 1920, established by the 19th Amendment.


What year did women start working?

In the 1940s, when large numbers of women entered the workforce because of WWII.