Anticolonialism "is the doctrine that rich countries of the West got rich by invading, occupying and looting poor countries of Asia, Africa and South America. Anticolonialism was the rallying cry of Third World politics for much of the second half of the 20th century."
The concept has historically been called anti-imperialism. In the 1870s, critics of British Prime Minister Benjamin Disraeli used the term "imperialism" to describe his foreign policy of seeking to exert political and economic control over what was then called the British Empire. In the 19th century, advocates of imperialism believed that by colonizing other nations, the native peoples of those nations would gain the benefits of science, technology, religion, and political order. Other advocates viewed the wealth and commerce that came from holding a large network of colonies as the key to winning the competition with other European nations for economic supremacy.
Following World War II, the world faced the question of how to handle the areas colonized by Germany and Japan, as well as the countries that the Axis had occupied during the war. This question evolved into a principle of national self-determination, and by the 1960s most colonial powers had converted their networks of colonies into independent nations.
As for the United States, it converted the Philippines into an independent nation, made Puerto Rico a self-governing commonwealth, returned the Panama Canal Zone to Panama, and made Alaska and Hawaii states. The United States still administers some very small islands in the South Pacific and the United States Virgin Islands in the Caribbean. Most people would characterize United States foreign policy since World War II as opposing imperialism.
- "How Obama Thinks," Dinesh D'Souza, Forbes, September 27, 2010