Q: Did World War I take place in Africa?
It didn't really end in Africa; the fighting there came towards the end, but it didn't end the war.

Operation Torch was conducted by both American and British forces. Hitler wanted to control all of Africa in order to control the Mediterranean, while the US aimed to take Casablanca and remove any threat from the area. The American troops ultimately pushed hard, and General Juin surrendered later that day, so the American troops ended the fighting in that area almost before it could begin. (Note that Operation Torch was a World War II operation in 1942, not part of World War I.)

— Wiki User, 14y ago

More answers

World War I mostly took place in Europe, along the eastern and western borders of Germany and in Belgium, so no, it did not take place in Africa.

— Wiki User, 16y ago

They didn't.

— Wiki User, 12y ago
