Q: Did France have colonies during World War 1?

No. The colonies were in the 1600s and 1700s. World War 1 happened from 1914 to 1918.

— Wiki User, 16y ago

More answers

Yes. If you have studied imperialism, you would know that before WW1 the European powers divided up Africa and carved out spheres of influence in China, and nearly all of them, including France, took a share.

— Wiki User, 13y ago

During World War I, the U.S. held several territories (essentially colonies), including the Philippines, Guam, American Samoa, Hawaii, Puerto Rico, and Alaska.

— Anonymous, 5y ago
