To find better jobs and get away from segregation. Also, Black Americans were still bitterly treated after the war by the Southern elite and poor whites, so they likely wanted to live among the people who had fought a war on their behalf.
It was also partly a matter of perception. Even after the Civil War ended in the North's victory and slavery was abolished, racism persisted. For many African Americans, the North represented a place with a more open-minded mentality compared with the old-fashioned thinking of the South.
Because they were seeking better lives.
