Q: Did France have colonies during World War 1?

No. France's colonies date from the 1600s and 1700s; World War 1 happened from 1914 to 1918.

Wiki User, 16y ago

More answers

Yes. If you have learned about imperialism, you would know that before WW1 the European powers divided up Africa and China, and everyone, including France, got a share.

Wiki User, 13y ago

During World War I, the U.S. held several territories (essentially colonies), including the Philippines, Guam, American Samoa, Hawaii, Puerto Rico, and Alaska.

Anonymous, 4y ago
