
No, the United States is a former colony that rebelled against the British Empire. You may consider this an opinion rather than a fact, but many historians agree that the United States is not now, and never has been, an empire, since the generally accepted definition of an empire requires a monarch.


Wiki User

16y ago

More answers

Yes, the U.S.A. is indeed an empire. America became an empire before the turn of the century, in 1898, during the Spanish-American War. During the war and even after it, the U.S. held power over multiple islands, such as Cuba, the Philippines, and Puerto Rico.


Wiki User

14y ago

No, it has been a democratic republic since 1789, when the Constitution was ratified. The people of the United States vote for representatives to govern on their behalf. An empire involves a monarch, who is not selected by the people.


Wiki User

8y ago

Q: Is the US an empire