Q: What effect did the American Civil War have upon the nation?

The Civil War changed the nation in many ways. In the North, the war changed the way people thought about the country: people in the Union began to think of the United States as a single nation rather than a collection of states. The war also caused the national government to expand.

Wiki User ∙ 15y ago

More answers

During the US Civil War, the individual Southern states had internal disputes over states' rights. This element, however, was not a leading cause of the South's defeat. In fact, it had a positive effect, in that the individual states aided the war effort, using their capabilities at the local and state level to supplement the work of the central government in Richmond.

Wiki User ∙ 8y ago

The United States divided, and that division eventually led to the Civil War, in which more Americans died than in any other war.

Wiki User ∙ 14y ago