📝 PPT: Token Pruning and Pooling for Efficient Vision Transformers
🔭 "By heuristically integrating both token pruning and token pooling techniques in ViTs without additional trainable parameters, PPT reduces the model complexity while maintaining its predictive accuracy on vision tasks." [gal30b+] 🤖 #CV
🔗 https://arxiv.org/abs/2310.01812v1 #arxiv
https://creative.ai/system/media_attachments/files/111/178/673/754/572/961/original/bcab5ebce41584ae.jpg
https://creative.ai/system/media_attachments/files/111/178/673/803/345/270/original/58d91cb6ec19e128.jpg
https://creative.ai/system/media_attachments/files/111/178/673/875/923/314/original/2ee27515f00eb146.jpg
https://creative.ai/system/media_attachments/files/111/178/673/947/666/063/original/ab1d69598b881459.jpg
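To make the pruning-plus-pooling idea concrete, here is a minimal, hypothetical sketch (not the paper's actual algorithm): tokens with high [CLS]-attention are kept (pruning), and the remaining tokens are merged into one attention-weighted token (pooling), with no trainable parameters involved. The scoring rule and pooling scheme here are illustrative assumptions.

```python
import numpy as np

def prune_and_pool(tokens, cls_attn, keep_ratio=0.5):
    """Illustrative token reduction for a ViT layer.

    tokens:   (N, D) patch-token embeddings (excluding [CLS])
    cls_attn: (N,)   attention weight of the [CLS] token to each patch token
    Keeps the top-scoring tokens and merges the rest into one pooled token.
    """
    n_keep = max(1, int(len(tokens) * keep_ratio))
    order = np.argsort(cls_attn)[::-1]          # highest attention first
    kept, rest = order[:n_keep], order[n_keep:]
    out = tokens[kept]                          # pruning: keep informative tokens
    if len(rest) > 0:
        w = cls_attn[rest] / cls_attn[rest].sum()
        pooled = (w[:, None] * tokens[rest]).sum(axis=0, keepdims=True)
        out = np.concatenate([out, pooled], axis=0)  # pooling: merge the rest
    return out

rng = np.random.default_rng(0)
toks = rng.normal(size=(8, 4))
attn = rng.random(8)
reduced = prune_and_pool(toks, attn, keep_ratio=0.5)
print(reduced.shape)  # (5, 4): 4 kept tokens + 1 pooled token
```

Because both steps use only the attention scores already computed by the transformer, the sequence length shrinks at inference time without adding parameters, which is the source of the complexity reduction the abstract describes.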