Q: Did World War I take place in Africa?

Best Answer

The war didn't really end in Africa; the campaign there came towards the end, but it did not end the war.

Operation Torch (November 1942, during World War II) was conducted by both American and British forces. Hitler wanted to control all of North Africa in order to control the Mediterranean. The US aimed to take Casablanca and to remove any threat from the area as well. The American troops pushed hard, and General Juin surrendered Algiers later that same day, so the American troops ended the conflict in the area almost before it could begin.


Wiki User

13y ago
More answers

Wiki User

15y ago

World War 1 mostly took place along the eastern and western borders of Germany and Belgium, in Europe, so no, it did not take place in Africa.


Wiki User

11y ago

They didn't.

