📝 Can Pre-Trained Models Assist in Dataset Distillation?
🔭 "Pre-trained models transfer knowledge to synthetic datasets, guiding dataset distillation by selecting optimal options, including initialization parameters, model architecture, training epochs, and domain knowledge." [gal30b+] 🤖
#CV
⚙️ https://github.com/yaolu-zjut/DDInterpreter
🔗 https://arxiv.org/abs/2310.03295v1 #arxiv
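Dataset distillation optimizes a small synthetic set so that models trained on it behave like models trained on the full data. A minimal sketch of the core idea, here phrased as distribution matching against a frozen "pre-trained" feature extractor — all names, sizes, and the random linear extractor are illustrative assumptions, not taken from the linked repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "pre-trained" feature extractor: a frozen random linear map
# from 64-dim inputs to 16-dim features (illustrative only).
W = rng.standard_normal((64, 16))

real = rng.standard_normal((128, 64))     # real dataset
synthetic = rng.standard_normal((4, 64))  # small distilled set (learned)

target = (real @ W).mean(axis=0)          # mean real features to match
initial_loss = float((((synthetic @ W).mean(axis=0) - target) ** 2).sum())

lr = 0.01
for _ in range(200):
    diff = (synthetic @ W).mean(axis=0) - target   # feature-mean gap
    # Gradient of ||diff||^2 w.r.t. each synthetic row (same for all rows).
    grad_row = (2.0 / len(synthetic)) * W @ diff
    synthetic -= lr * grad_row                     # broadcast over rows

final_loss = float((((synthetic @ W).mean(axis=0) - target) ** 2).sum())
```

After optimization the tiny synthetic set reproduces the real data's mean features under the frozen extractor; real methods extend this with richer matching objectives (gradients, trajectories) and, per the paper, with careful choice of the pre-trained model's architecture, initialization, and training stage.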