Western colonizers call themselves 'civilized' even after wiping out rich cultures and deep-rooted civilizations.