Q: Did World War I take place in Africa?
It didn't really end in Africa; the fighting there came toward the end, but it didn't end the war.

Operation Torch was conducted by both American and British forces. Hitler wanted to control all of Africa in order to control the Mediterranean, and the US aimed to take Casablanca and remove any remaining threat from the area. The American troops ultimately pushed hard, and General Juin surrendered later that day. Therefore the American troops ended the conflict in Africa before it could even begin.

Wiki User · 13y ago
More answers

World War 1 mostly took place along the eastern and western borders of Germany and Belgium in Europe, so no, it did not take place in Africa.

Wiki User · 16y ago

They didn't.

Wiki User · 12y ago
