Answer:
American imperialism refers to the economic, military, and cultural influence of the United States over other countries. The idea of an American Empire, first popularized during the presidency of James K. Polk, became a reality over the latter half of the 1800s. During this period, industrialization pushed American businessmen to seek new international markets for their goods. At the same time, the growing influence of social Darwinism fostered the belief that the United States had a duty to bring industry, democracy, and Christianity to societies it viewed as less developed or "savage." The combination of these economic pressures and ideological attitudes led the United States toward imperialism.