To be honest, Daniel, I don’t know. This shouldn’t be happening: the reason we never had this problem before (assuming I’m right) is that iOS handles color management automatically.
I’m not an expert, but I’ve seen very similar problems in desktop apps like Photoshop, when you edit an image in a color space different from the one used by the app that displays it afterwards. The image looks fine in Photoshop while you work on it, but when you display it on the web, for instance, the colors come out different.
I wonder, then, if something has changed in iOS in how it handles color spaces. Maybe Apple has now exposed them to the user for finer control, like in Photoshop, but left confusing or wrong defaults? 🤔
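For what it’s worth, if it really is a color-space mismatch, one workaround you could try is forcing your images into sRGB before displaying or exporting them. Here’s a rough, untested sketch (my own helper, not an Apple API) that redraws a CGImage into an sRGB bitmap context, so the pixel data is actually converted rather than just retagged:

```swift
import CoreGraphics

/// Hypothetical helper: returns a copy of `image` converted to sRGB.
/// Drawing into an sRGB context performs a color-managed conversion;
/// merely calling copy(colorSpace:) would only retag the pixels.
func convertedToSRGB(_ image: CGImage) -> CGImage? {
    guard let srgb = CGColorSpace(name: CGColorSpace.sRGB) else { return nil }

    // Already sRGB? Nothing to do.
    if let name = image.colorSpace?.name,
       (name as String) == (CGColorSpace.sRGB as String) {
        return image
    }

    guard let context = CGContext(
        data: nil,
        width: image.width,
        height: image.height,
        bitsPerComponent: 8,
        bytesPerRow: 0,  // let Core Graphics pick the stride
        space: srgb,
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    ) else { return nil }

    // The draw call converts from the source color space into sRGB.
    context.draw(image, in: CGRect(x: 0, y: 0,
                                   width: image.width,
                                   height: image.height))
    return context.makeImage()
}
```

No idea if that addresses your exact case, but it would at least tell you whether the colors shift because of the embedded color space.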
Maybe someone more knowledgeable than me can help you out better.