The US acquired Florida from Spain.
Spain needed money to fight its war against Napoleon.
1848
Most of the present-day West (Arizona, California, Nevada, etc.) came from Mexico; the Louisiana Purchase from France extended US territory all the way to the Rockies; and the Oregon expansion under Polk came from the British.
Florida was not originally part of the United States or Britain; it was a Spanish colony that the US acquired. The US gained Florida through the Adams-Onis Treaty, signed in 1819.
Spain
John Quincy Adams was the Secretary of State who helped acquire Florida for the United States.
1819
Florida
The US acquired the territory of Florida when Spain ceded it to the US.
Through a treaty with Spain.
The United States acquired Florida from Spain; at the time, the territory was split into East and West Florida.
The US bought the state of Florida from the Spanish.