Q: Was World War 1 the end of European domination of the world?

Best Answer

Yes, because the Americans invaded, and that's just stupid LOL :p ;)

The rise of the US onto the world stage weighed heavily on the European powers. Britain and France had to pour a great deal of energy, manpower and economic resources into the war against Germany, and Russia was left impoverished as well. I suppose Britain's high-water mark as an empire came at the beginning of WW1. I think the straight answer to the question is yes, which I am sure is what the first respondent was going on to say.... Weren't you, Packard.....

Technically, no. Europe never literally dominated the world, and it is impossible to end something that never even began.

Wiki User, 12y ago