
No, the United States began as colonies that rebelled against the British Empire. You may believe this to be an opinion rather than fact, but many historians agree that the United States is not now, nor has it ever been, an empire, since the generally accepted definition of an empire requires a monarch.

Wiki User

16y ago

More answers

Yes, the U.S.A. is indeed an empire. America became an empire before the turn of the century, in 1898, during the Spanish-American War. During the war, and even after it, the U.S. held power over multiple islands, including Cuba, the Philippines, and Puerto Rico.

Wiki User

14y ago
No, it has been a democratic republic since 1789, when the Constitution took effect. The people of the United States vote for representatives to govern on their behalf. In an empire, a monarch is involved, and that monarch is not selected by the people.

Wiki User

8y ago
Q: Is the US an empire?