Tag: American imperialism
American imperialism, as taught in high school textbooks, was a brief period from 1898 to 1946 during which the United States acquired overseas territories such as the Philippines and Puerto Rico. This era ended in 1946, when the Philippines gained full independence.