American imperialism
During the 19th century, American imperialism reached one of its strongest points in history. Among the most prominent events of this era was the annexation of territories that proved critical to the United States' rise as a world power.