Q: What changed in England after World War I?
This is an enormous question!

In some respects not that much changed. The lack of territorial acquisitions in World War I meant that the Empire was shrinking. England ended up losing most of Ireland and many of its territories in Africa. The war had detrimental effects on the economy, resulting in some of the highest unemployment rates the country had seen.

The post-war years also saw growing support for, and the rising power of, the Labour Party. Many of the rich tried to carry on as if nothing had changed.

Lloyd George's promises about turning the country into a 'land fit for heroes' turned out to be nonsense, and the 1920s saw a remarkable growth of socialism in Britain.

Pacifism became widespread and profoundly affected British policy in the 1930s.

Women aged 30+ were given the vote in 1918, and ten years later it was extended to women aged 21+. Apart from this, the post-1918 period saw only very limited improvements in the position of women.

Among many intellectuals the confidence of the period before 1914 had gone, though this change is hard to pin down precisely.

Joncey

Wiki User · 13y ago

More answers

Charles was beheaded, so England had a new King (once the war was over, obviously).

Wiki User · 14y ago

Well, it changed in that Parliament held the power rather than the king, and that is still the case today. That is how it changed England; otherwise we would still be largely under the power of the king or queen.

Wiki User · 12y ago

Parliament took charge, as it has remained ever since.

Anonymous · 4y ago
