The Rise of American Imperialism. Imperialism – Defined: The period at the end of the 19th century when the United States extended its economic, political,