Few Shot Learning via Multiple Teacher Knowledge Distillation

Yuwen Jie

2024

Abstract

Machine learning enables computers to discover patterns in large datasets and use them for recognition or prediction. A model's performance typically depends on the amount of data available: sufficient training samples ensure good generalization. However, in many applications, acquiring large amounts of labeled data is costly and time-consuming, and in fields such as healthcare or autonomous driving, labeled samples can be scarce. This raises the important challenge of training effective models with only a few samples. Few-shot learning addresses this challenge by aiming to perform tasks with very few training examples, learning and generalizing from just a few or even a single sample. This paper proposes a novel method that combines knowledge distillation and few-shot learning to improve model performance with limited data. By leveraging intermediate features from the teacher models within a multiple teacher-student architecture, the proposed approach enhances feature extraction and adaptation in few-shot scenarios. The method achieves excellent results on various datasets, demonstrating the effectiveness of feature distillation in few-shot learning tasks.
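The abstract does not give implementation details, but the core idea it names, distilling intermediate features from several teachers into one student, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the per-teacher linear projections, MSE feature-matching loss, and all dimensions are assumptions introduced here for demonstration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTeacherFeatureDistillation(nn.Module):
    """Match a student's intermediate features to several teachers' features."""
    def __init__(self, student_dim: int, teacher_dims: list[int]):
        super().__init__()
        # One linear projection per teacher, so teachers with different
        # feature sizes can all supervise the same student representation.
        self.projections = nn.ModuleList(
            nn.Linear(student_dim, d) for d in teacher_dims
        )

    def forward(self, student_feat: torch.Tensor,
                teacher_feats: list[torch.Tensor]) -> torch.Tensor:
        # Average the MSE between the projected student features and each
        # teacher's (frozen, hence detached) intermediate features.
        losses = [
            F.mse_loss(proj(student_feat), t.detach())
            for proj, t in zip(self.projections, teacher_feats)
        ]
        return torch.stack(losses).mean()

# Hypothetical usage: two teachers with 512- and 640-dim features,
# a 256-dim student, and a batch of 4 few-shot support samples.
distill = MultiTeacherFeatureDistillation(256, [512, 640])
student_feat = torch.randn(4, 256)
teacher_feats = [torch.randn(4, 512), torch.randn(4, 640)]
loss = distill(student_feat, teacher_feats)

In practice this distillation term would be added to the student's task loss (e.g., a few-shot classification loss); the weighting between the two is a design choice the abstract does not specify.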

Paper Citation


in Harvard Style

Jie Y. (2024). Few Shot Learning via Multiple Teacher Knowledge Distillation. In Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML; ISBN 978-989-758-754-2, SciTePress, pages 239-245. DOI: 10.5220/0013515000004619


in Bibtex Style

@conference{daml24,
author={Yuwen Jie},
title={Few Shot Learning via Multiple Teacher Knowledge Distillation},
booktitle={Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML},
year={2024},
pages={239-245},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013515000004619},
isbn={978-989-758-754-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML
TI - Few Shot Learning via Multiple Teacher Knowledge Distillation
SN - 978-989-758-754-2
AU - Jie Y.
PY - 2024
SP - 239
EP - 245
DO - 10.5220/0013515000004619
PB - SciTePress
ER -