Oddbean new post about | logout
 📝 Exploring Sparse MoE in GANs for Text-Conditioned Image Synthesis 🔭

"A mixture-of-experts (MoE) based generative text-to-image (T2I) model that employs a collection of experts to process the feature, together with a sparse router to help select the most suitable expert for each feature point." [gal30b+] 🤖 #CV

🔗 https://arxiv.org/abs/2309.03904v1 #arxiv

https://creative.ai/system/media_attachments/files/111/034/940/928/681/904/original/9dffa549f79e198c.jpg

https://creative.ai/system/media_attachments/files/111/034/940/980/698/084/original/7656c69ef7e6947a.jpg

https://creative.ai/system/media_attachments/files/111/034/941/037/667/729/original/a0e80249799bf85a.jpg

https://creative.ai/system/media_attachments/files/111/034/941/114/177/816/original/f4494cc81cf9f690.jpg