colonialism, Western
a political-economic phenomenon whereby various European nations explored, conquered, settled, and exploited large areas of the world. The purposes of colonialism included economic exploitation of the colony's natural resources, creation of new markets for the colonizer, and extension of the colonizer's way of life beyond its national borders. In the years 1500–1900, Europe colonized all of North and South America and Australia, most of Africa, and much of Asia by sending settlers to populate the land or by taking control of governments. The first colonies were established in the Western Hemisphere by the Spanish and Portuguese in the 15th–16th centuries. The Dutch colonized Indonesia in the 16th century, and Britain colonized North America and India in the 17th–18th centuries. Later British settlers colonized Australia and New Zealand. Colonization of Africa began in earnest only in the 1880s, but by 1900 virtually the entire continent was controlled by European powers. The colonial era ended gradually after World War II; the only territories still governed as colonies today are small islands. See also decolonization, dependency, imperialism.

This entry comes from Encyclopædia Britannica Concise.