This is an enormous question!
In some respects not that much changed. Britain gained little new territory from World War 1, and the Empire soon began to contract: the United Kingdom lost most of Ireland when the Irish Free State was established in 1922 (its African territories were not lost until after World War 2). The war also had detrimental effects on the economy, producing some of the highest unemployment rates Britain had seen.
The post-war years also saw the Labour Party grow in support and power, largely at the expense of the Liberals. Many of the rich tried to carry on as if nothing had changed.
Lloyd George's promises about turning the country into a 'land fit for heroes' turned out to be nonsense and the 1920s saw a remarkable growth in socialism in Britain.
Pacifism became widespread and profoundly affected British policy in the 1930s.
Women aged 30+ were given the vote in 1918, and ten years later it was extended to women aged 21+. Apart from this, the post-1918 period saw only very limited improvements in the position of women.
Among many intellectuals the confidence of the period before 1914 had gone - but this is very diffuse.
Joncey
Yes, the uniforms changed from world war to world war: the field uniform of those who served in the First World War went from khaki to olive drab, the helmets changed shape, field packs changed, dress uniforms changed, etc. See the link below.
I am not sure, but I think it is England.
World War I changed the lives of many women in England. Many entered the workplace while their men were in service. They had to deal with rationing.
Japan
Revolutionary War: France
War of 1812: no one
World War 1: France and England
World War 2: France, England, Italy, and Russia
World War 1 affected England and Germany.
Leslie Howard immigrated from Prussia to England. His family was Jewish but changed their name during the First World War.
It changed the world of war and made the world a little safer.
The Jews.
World War 2
In France they planted a tree when World War 1 ended, and the tree fell during World War 2, so what changed? No more tree.
England and Germany did fight in both World War One and World War Two.
World War I and World War II