Colonialism is the expansion of a nation's sovereignty beyond its borders through the establishment of colonies. It is usually seen as a substantial part of imperialism, especially when involving sovereignty over foreign cultures or peoples. The word today is usually tied to the European expansion during and after the Age of Exploration, which typically involved lands far away from the nation ruling them.
Colonialism typically involves the spread of a nation or people's culture through the establishment of new settlements (e.g. the English settlement of the New England states) or population transfers to existing settlements (e.g. Dutch dominance of South Africa).
Misuse of the Term
Anti-American commentators, and liberals in general, commonly deride American foreign policy as colonial or imperial. This is deceitful, as the United States has no colonies: it relinquished the Philippines (which had previously been Spanish, not sovereign), made Hawaii a state, holds smaller lands (Guam, etc.) as territories or protectorates, and purchased or legally acquired other holdings (Guantanamo Bay, the Panama Canal Zone). Similarly, liberals generally imply that colonialism was parasitic to the original inhabitants, stealing their resources, when that was actually rare in colonization.