Q: Did Africans influence American culture?
Medical aid for diseases like HIV/AIDS, help with education in Africa, pop culture affecting teens in Africa, and people wearing shirts and pants instead of traditional clothing.

Of course; every group that immigrates changes the culture. African immigrants (whether slaves or not) brought many changes to American culture. The most obvious are probably in food and music; in both areas, we can still see African influences today.

To some degree, you might attribute some of the changes in American culture that came from slavery to Africans, although the same changes might have occurred regardless of where the slaves came from (for example, Asia rather than Africa). These changes include the rise of plantations and all the other issues, large and small, that came with having slaves.
