American Expansionism After The 1890s
American expansionism after the 1890s marked a turning point in the nation's history, when the United States began to move beyond its continental borders and assert itself as a global power. Prior to this era, much of the country's growth was focused on westward expansion across the North American continent. However, by the late 19th century …