Q: Who was the real winner of World War 1?

After WWI, the Treaty of Versailles was signed by Germany, a settlement forced upon it by the victorious nations. The treaty required Germany to accept full responsibility for causing the war and to pay reparations for the damage done during it. So peace was never truly made after WWI; 'compromises' were made instead. Inevitably this humiliated Germany and severely damaged its economy, which is why Hitler dedicated many years of his life to making Germany 'great' again. He blamed Germany's problems on the Jews, which eventually brought about WWII.

While the above comments are accurate, they do not answer the question that was asked, about who came out victorious from WWI. The victorious nations were England, France, and the US. Russia had originally been part of that alliance as well, but it pulled out of the war in 1917 as a result of the Bolshevik Revolution.

Wiki User ∙ 15y ago

More answers

Britain in particular extended its vast empire, as did France; but the whole thing was really a disaster.

Wiki User ∙ 16y ago

The Allied forces won World War 1; they consisted primarily of Britain, France, and the United States.

Wiki User ∙ 15y ago

The Allies (Britain, France, the USA, Italy, and so on)

Wiki User ∙ 13y ago
