 📝 MoCaE: Mixture of Calibrated Experts Significantly Improves Object Detection 🔭

"Mixture of Calibrated Experts (MoCaE) consists of a set of expert models trained with different configurations and/or data augmentation strategies that are then calibrated." [gal30b+] 🤖 #CV

🔗 https://arxiv.org/abs/2309.14976v1 #arxiv

https://creative.ai/system/media_attachments/files/111/141/664/895/056/635/original/8ed304d2eeb7b853.jpg

https://creative.ai/system/media_attachments/files/111/141/664/957/127/786/original/e3c4fb5d0e69a598.jpg

https://creative.ai/system/media_attachments/files/111/141/665/019/179/589/original/d920c0e805ea2f0c.jpg

https://creative.ai/system/media_attachments/files/111/141/665/072/055/744/original/a6514eed9469938e.jpg