America becomes a world power: Imperialism

By the 1880s, many American leaders had become convinced that the United States should join the imperialist powers of Europe and establish colonies.