What was the role of the United States in World War I?


The United States played a significant role in World War I. Initially, the U.S. adopted a policy of neutrality, but several factors drew it into the conflict as the war progressed. The sinking of the British passenger liner Lusitania by a German submarine in 1915, which killed American citizens, and the interception of the Zimmermann Telegram in 1917, which revealed Germany's proposal of a military alliance with Mexico against the U.S., turned American public opinion against Germany. Germany's resumption of unrestricted submarine warfare in early 1917 further escalated tensions. In April 1917, the U.S. declared war on Germany and joined the Allied Powers. The American Expeditionary Forces, led by General John J. Pershing, were sent to Europe to fight alongside the Allies. The U.S. played a crucial role in the final year of the war, contributing to the Allied victory with its troops, resources, and economic support.