
If you mean in WWII, then this is what I have got.

Germany did in fact declare war on the Americans.

After the Japanese attacked Pearl Harbor in Hawaii on 7 December 1941, the United States declared war on Japan the next day. Germany, as Japan's Axis partner, then declared war on the United States on 11 December 1941, and the United States declared war on Germany in return that same day.

Hope that helps! =)


Wiki User

14y ago

More answers

The US did not declare war on Germany first: the US declared war on Japan on 8 December 1941, then Germany declared war on the US on 11 December 1941, and the US reciprocated the same day.


Wiki User

16y ago

Q: What date did Germany declare war on the US?