imperialism


im·pe·ri·al·ism

noun \im-ˈpir-ē-ə-ˌli-zəm\

: a policy or practice by which a country increases its power by gaining control over other areas of the world

: the effect that a powerful country or group of countries has in changing or influencing the way people live in other, poorer countries

Full Definition of IMPERIALISM

1 :  imperial government, authority, or system
2 :  the policy, practice, or advocacy of extending the power and dominion of a nation especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly :  the extension or imposition of power, authority, or influence <union imperialism>
im·pe·ri·al·ist \-list\ noun or adjective
im·pe·ri·al·is·tic \-ˌpir-ē-ə-ˈlis-tik\ adjective
im·pe·ri·al·is·ti·cal·ly \-ti-k(ə-)lē\ adverb

Examples of IMPERIALISM

  1. British imperialism created the enormous British Empire.

First Known Use of IMPERIALISM

1800

Other Government and Politics Terms

agent provocateur, agitprop, autarky, cabal, egalitarianism, federalism, hegemony, plenipotentiary, popular sovereignty, socialism

imperialism

noun    (Concise Encyclopedia)

State policy, practice, or advocacy of extending power and dominion, especially by direct territorial acquisition or by gaining political and economic control of other areas. Because imperialism always involves the use of power, often in the form of military force, it is widely considered morally objectionable, and the term accordingly has been used by states to denounce and discredit the foreign policies of their opponents. Imperialism in ancient times is clear in the unending succession of empires in China, western Asia, and the Mediterranean. Between the 15th century and the middle of the 18th, England, France, the Netherlands, Portugal, and Spain built empires in the Americas, India, and the East Indies. Russia, Italy, Germany, the United States, and Japan became imperial powers in the period from the middle of the 19th century to World War I. The imperial designs of Japan, fascist Italy, and Nazi Germany in the 1930s culminated in the outbreak of World War II. After the war the Soviet Union consolidated its military and political control of the states of eastern Europe (see Iron Curtain). From the early 20th century the U.S. was accused of imperialism for intervening in the affairs of developing countries in order to protect the interests of U.S.-owned international corporations (see United Fruit Co.). Economists and political theorists have debated whether imperialism benefits the states that practice it and whether such benefits or other reasons ever justify a state in pursuing imperialist policies. Some theorists, such as Niccolò Machiavelli, have argued that imperialism is the justified result of the natural struggle for survival among peoples. Others have asserted that it is necessary in order to ensure national security. A third justification for imperialism, offered only infrequently after World War II, is that it is a means of liberating peoples from tyrannical rule or bringing them the blessings of a superior way of life. See also colonialism; sphere of influence.
