It's not black and white. After WWI, the U.S.A. emerged as a world power. However, the aftermath of WWI left many disillusioned with the nature of the U.S. government (hence the Lost Generation of the 1920s). Despite this, before the Great Depression the U.S. became one of the foremost world powers.
The Treaty of Alliance of 1778 made France and the United States allies against British attacks indefinitely. By 1793, however, the United States had renewed trade with Britain and could not honor the treaty with France without damaging its own economic health.
The New Deal was a set of domestic programs in the United States in response to the Great Depression. A few of these programs attempted to give the African American minority some help, but they did nothing to address the racism and segregation already in place, and some of the programs actually hurt the African American community further.
No!
They were nurses and helped the wounded soldiers.
During World War I, German submarines destroyed much American shipping on the Atlantic Ocean. While pleas from the European Allies were part of the equation, the destruction of American shipping was the main factor in the United States entering the war.