
The war didn't really end in Africa. The fighting there came toward the end of the campaign, but it didn't end the war itself.

Operation Torch (1942) was actually part of World War II, not World War I. It was conducted by both American and British forces. Hitler wanted to control North Africa in order to control the Mediterranean, while the US aimed to take Casablanca and remove the Axis threat from the area. The American troops pushed hard, and the French commander General Juin surrendered at Algiers on the day of the landings, so the Allied troops ended that phase of the fighting in Africa almost as soon as it began.

Wiki User

14y ago

More answers

World War 1 mostly took place along the eastern and western borders of Germany and in Belgium, in Europe, so no, it did not primarily take place in Africa.

Wiki User

16y ago
They didn't.

Wiki User

12y ago
Q: Did World War I take place in Africa?