I guess the mainstream image hosts could use AI to flag and take down images. That would push the bad material onto hosts that tolerate child porn, which would potentially 1) remove the majority of the bad stuff from the mainstream web, and 2) concentrate the material and the bad actors onto a handful of image hosts, making it simpler for law enforcement to act (if they even do anything).
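For what it's worth, the flagging side usually isn't generative AI so much as hash matching against databases of known material (Microsoft's PhotoDNA is the well-known example). A minimal sketch of that idea using perceptual hashes, assuming the third-party Pillow and imagehash libraries; the blocklist file format and the distance threshold here are hypothetical placeholders, not any real host's pipeline:

```python
from PIL import Image
import imagehash

MAX_DISTANCE = 8  # hypothetical Hamming-distance threshold for a "match"

def load_blocklist(path: str) -> list[imagehash.ImageHash]:
    # Hypothetical format: one hex-encoded perceptual hash per line.
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def should_take_down(image_path: str, blocklist: list[imagehash.ImageHash]) -> bool:
    # Flag an upload if its perceptual hash lands near any known-bad hash.
    # Subtracting two ImageHash values gives their Hamming distance, so
    # small distances mean visually similar images (resized, recompressed, etc.).
    upload_hash = imagehash.phash(Image.open(image_path))
    return any(upload_hash - bad < MAX_DISTANCE for bad in blocklist)
```

The point of perceptual (rather than cryptographic) hashes is that minor edits like cropping or recompression still land within the distance threshold, which is what makes the takedown-at-upload approach workable at all.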