Colonizers (the West) call themselves 'civilized' after wiping out real cultures and deep-rooted civilizations.
If you believe that framing, I have bad news for you about those 'real cultures and deep-rooted civilisations.'
Tell me the history of the human race.