You are supporting it. It's already on the public relays people are using.
Nostr's design holds that content must not be removed, because removal is treated as "censorship". As soon as you start saying it's fine to remove content, the purpose of the protocol ceases to matter and you may as well use HTTP.
As a result, the relays used by the existing apps are not moderated, so you literally share a platform with illegal content. This is not theoretical: illegal content exists on the platform, and short of using only moderated relays that remove content, nothing can be done to stop it.
It should also be noted that, from a legal standpoint, it's not just the relays at risk: if the front ends and apps provide access to it, they also have a legal duty to remove illegal content in any jurisdiction they operate in. I doubt "it's just a protocol" will carry much weight in court when developers are defending apps that allow access to child sexual assault images.