History & Social Studies

What was American imperialism in the nineteenth century?

Last updated by Jill W
1 Answer

“American imperialism” refers to the expansion of the United States’ economic, military, and cultural influence internationally. In the nineteenth century, the term is most closely associated with the country’s late-century overseas expansion, including the annexation of Hawaii and the territories gained in the Spanish-American War of 1898.