Colonialism has historically meant one nation forcing its dominance on another, exerting rule over a distant territory. Proponents argued that it sought to assimilate native peoples into a greater culture, but more often than not it was oppressive. Colonization was the catalyst for the apartheid regime in South Africa, which was blatantly ruthless and bigoted.
North America itself was colonized by many of the colonial powers. I don't think we colonize in the classical sense anymore, but neo-colonialism is definitely alive. Corporations are the new colonial powers: they move to areas of cheap labour, which results in disproportionate profit sharing between the corporation and its employees.