"Administrative moderation tooling is also fairly limited: for example, while Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism to report CSAM to the relevant child safety organizations. It also has no tooling to help moderators in the event of being exposed to traumatic content—for example, grayscaling and fine-grained blurring mechanisms."

Source: Stanford Internet Observatory, fediverse CSAM report (July 2023), https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
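As a rough illustration of the kind of desensitizing tooling the report says is missing, here is a minimal, dependency-free sketch of a grayscale-plus-blur preview for a moderation queue. All names are hypothetical and nothing here is Mastodon code; the image is modeled as a 2-D grid of (R, G, B) tuples to keep the example self-contained.

```python
# Hypothetical sketch: precompute a desensitized preview (grayscale + box
# blur) so moderators see a muted thumbnail by default and must explicitly
# click through to the original. Not part of Mastodon; illustrative only.

def to_grayscale(img):
    """Replace each RGB pixel with its Rec. 601 luma, kept as an RGB triple."""
    return [
        [(int(0.299 * r + 0.587 * g + 0.114 * b),) * 3 for (r, g, b) in row]
        for row in img
    ]

def box_blur(img, radius=1):
    """Naive box blur: average each pixel with its neighbors within `radius`."""
    h, w = len(img), len(img[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = [0, 0, 0], 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc = [a + c for a, c in zip(acc, img[yy][xx])]
                        n += 1
            out[y][x] = tuple(v // n for v in acc)
    return out

def desensitize(img, radius=1):
    """Grayscale first, then blur, yielding a muted moderation preview."""
    return box_blur(to_grayscale(img), radius)
```

A real implementation would do this on upload (e.g. with an image library rather than pure Python) and serve the muted version in the moderation UI by default.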