If you mean in WWII then this is what I have got.
Germany actually declared war on the Americans first.
When the Japanese bombed Pearl Harbor, the Americans declared war on Japan the next day, 8th December 1941. Germany and Italy, as Japan's Axis partners, then declared war on the Americans on 11th December 1941, and the Americans declared war on Germany and Italy that same day.
Hope that helps! =)
Germany didn't declare war on the US in World War 1. It was the US that declared war on Germany, on April 6th 1917, as a result of the unrestricted submarine warfare Germany announced in January that year. - I Warner
The US did not declare war on Germany first in 1941. Germany declared war on the US as a measure of solidarity with Japan after its attacks on Pearl Harbor and the Philippines, and Congress reciprocated the same day.
The USA didn't declare war on Germany first; Germany declared war on the USA. Germany did so because the USA had declared war on Japan after the Japanese attacked Pearl Harbor in Hawaii.
Germany, Italy and the other minor Axis nations. The first nation the US declared war on was Japan, the day after Japan attacked the US Naval Base at Pearl Harbor. Three days after the US declared war on Japan, Germany and Italy declared war against the US, and Congress responded with declarations of war against Germany and Italy that same day, 11th December 1941.
According to Article 1, Section 8 of the Constitution, Congress has the power to declare war. But the President is the commander-in-chief of the armed forces.