They went back to normal and became horrible

If you mean after the Communists gave up their claim on East Germany, then that was the reunification of East Germany and West Germany after the Berlin Wall fell, making Germany whole again.

Wiki User

13y ago

More answers

After the armistice of November 11, 1918, the Allied powers organized a peace conference in Paris in 1919. There, Germany signed the Treaty of Versailles, under which it accepted blame for the war and had to pay reparations to the victors.

Wiki User

16y ago
It was divided into Communist East Germany and free, democratic West Germany.

Wiki User

16y ago
The war with Germany ended after Hitler killed himself and Germany surrendered; the war with Japan ended after America dropped atomic bombs on Japan and Japan surrendered, giving Britain and the other Allies victory.

Wiki User

13y ago
Q: What happened to Germany when the war ended in Europe?