Relays see everything. You can't be "cancelled" but you can most certainly be deplatformed.
I'm wondering how they technically handle it. I doubt a human is manually combing through all of it. I've wondered if they apply existing filter technology or something. I've had a difficult time figuring out relays in general.
You better believe relay owners don't want bad actors distributing illegal content on their relays. In addition to filters etc., don't think for a second they don't manually dig in when necessary, because they do.
I'm comforted by that, but it also makes me less bullish on the idea of regular people running nodes, which in my opinion is the whole point of decentralization. I have concerns about potentially storing or relaying child porn, for example. Maybe social media companies deserve a slight break in this regard. It's a big job to police such things, and I've had a difficult time figuring out how to run a node even for myself without screwing up and accidentally ending up with bad stuff on my computer. That's the full context of my post. Maybe I'm overthinking it, but over is usually better than under.
There is adult content on the relays, but it is filtered by clients. If you search for the right hashtags you'll find it in most clients. On slidestr I filter by content warning, hashtags, and some "known npubs", which works surprisingly well. It needs some manual maintenance, though.
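The filtering approach described here could be sketched roughly like this in Python. The event shape and `"t"` hashtag tag follow NIP-01/NIP-12, and the `"content-warning"` tag follows NIP-36; the blocklists and event values are hypothetical placeholders, and a real client would compare against hex pubkeys (an npub is just the bech32 encoding of one). This is a minimal sketch, not slidestr's actual implementation.

```python
# Hypothetical sketch of client-side event filtering: hide an event if it
# carries a NIP-36 content-warning tag, uses a blocked hashtag, or comes
# from a manually maintained blocklist of authors.

BLOCKED_HASHTAGS = {"nsfw"}            # assumption: example hashtag list
BLOCKED_PUBKEYS = {"deadbeef" * 8}     # assumption: placeholder hex pubkey

def is_filtered(event: dict) -> bool:
    """Return True if the event should be hidden from the feed."""
    if event.get("pubkey") in BLOCKED_PUBKEYS:
        return True
    for tag in event.get("tags", []):
        if not tag:
            continue
        if tag[0] == "content-warning":            # NIP-36 sensitive content
            return True
        if tag[0] == "t" and len(tag) > 1 and tag[1].lower() in BLOCKED_HASHTAGS:
            return True
    return False
```

The "manual maintenance" mentioned above would amount to curating `BLOCKED_HASHTAGS` and `BLOCKED_PUBKEYS` over time.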
It's the illegal non-adult stuff I'm mostly worried about. I want to run my own node, but I'm in the process of considering the possible legal implications of doing so. Lots to learn. Thanks!