If you mean in WWII, here's what I've got.

Germany did in fact declare war on the United States.

After Japan attacked Pearl Harbor on 7th December 1941, the Americans declared war on Japan on 8th December. Germany (together with Italy) then declared war on the US on 11th December 1941, and the US declared war on them the same day. The British did persuade the Americans to concentrate on defeating Germany first, but that was the "Europe First" strategy, not a matter of who declared war on whom.

Hope that helps! =)

Wiki User

14y ago

More answers

The US first declared war on Japan, on 8th December 1941. Germany then declared war on the US on 11th December 1941, and the US declared war on Germany in return that same day.

Wiki User

16y ago

Q: What date did Germany declare war on the US?