in the same way as the not-yet-labeled data, and each test sample is assigned the closest class as its prediction; the algorithm thereby converts the feature distribution of each individual class into an approximately Gaussian one.
Figure 4: Mapping Process (Tomáš, Daniel, Pavel, 2021).
The performance of the method was evaluated on the small-sample datasets CIFAR-FS and CUB. The proposed method was highly accurate in both the 1-shot and 5-shot settings, outperforming other forms of feature preprocessing, and it took second place in the 2020 MetaDL competition.
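The mapping-then-nearest-class idea described above can be sketched as follows. This is a minimal illustration, assuming a simple power (square-root) transform to make class feature distributions more Gaussian-like and Euclidean nearest-class-mean prediction; the exact mapping in the cited work may differ.

```python
import numpy as np

def power_transform(x, beta=0.5, eps=1e-6):
    """Map non-negative features toward a more Gaussian-like distribution."""
    return np.power(x + eps, beta)

def nearest_class_mean_predict(support_x, support_y, query_x):
    """Assign each query sample the label of the closest class mean."""
    # Labeled (support) and unlabeled (query) data are mapped the same way.
    support_x = power_transform(support_x)
    query_x = power_transform(query_x)
    classes = np.unique(support_y)
    # Class means computed from the few labeled samples per class.
    means = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    # Euclidean distance from each query sample to each class mean.
    dists = np.linalg.norm(query_x[:, None, :] - means[None, :, :], axis=-1)
    return classes[np.argmin(dists, axis=1)]

# Toy example: two classes with well-separated non-negative features.
rng = np.random.default_rng(0)
support_x = np.abs(np.vstack([rng.normal(1.0, 0.1, (5, 8)),
                              rng.normal(5.0, 0.1, (5, 8))]))
support_y = np.array([0] * 5 + [1] * 5)
query_x = np.abs(rng.normal(5.0, 0.1, (3, 8)))  # drawn near class 1
print(nearest_class_mean_predict(support_x, support_y, query_x))  # [1 1 1]
```

Because the class means are estimated from only a handful of samples, making the per-class distributions closer to Gaussian reduces the bias of these estimates, which is what the preprocessing step targets.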
5 CONCLUSION
The transferable knowledge in small sample learning
is highly dependent on the distribution of training
data. If the distribution of the training data differs significantly from that of the new classes, the model may acquire a biased transfer ability. Therefore, studying the impact of data bias on model performance is of great significance. In addition, after summarizing and analyzing different methods, it was found that better transferable knowledge can be learned from training data that is related in content; densely related instances and diverse categories can further enrich the diversity and relevance of the data. With the continuous
development of artificial intelligence deep learning,
small sample recognition will be significantly
improved. Deep learning models, especially convolutional neural networks and Transformers, will be better able to learn and extract effective feature
representations from limited data. The key to small
sample recognition lies in the model's generalization
ability, that is, whether the model can accurately
recognize unseen images when only a small number
of samples are seen. Future research will focus on
developing models with stronger generalization
ability. By introducing techniques such as meta
learning and data augmentation, the model's ability to
adapt to new samples will be improved. The core
challenge of small sample learning lies in the scarcity
of data. To address this challenge, researchers need to develop more effective mechanisms for data collection, annotation, and sharing.
Meanwhile, transfer learning and other techniques
can also be used to transfer models trained on large
datasets to small sample tasks.
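The transfer-learning route mentioned above can be illustrated with a short sketch: freeze a backbone pretrained on a large dataset and train only a small new classification head on the few labeled samples. The backbone below is a hypothetical stand-in (in practice it would be, e.g., a torchvision ResNet with loaded weights), and the sizes and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a backbone pretrained on a large dataset;
# in practice this would be a real pretrained network with loaded weights.
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))

# Freeze the pretrained parameters so only the new head is trained.
for p in backbone.parameters():
    p.requires_grad = False

head = nn.Linear(16, 5)  # new classifier head for a 5-class small-sample task
opt = torch.optim.Adam(head.parameters(), lr=1e-2)

torch.manual_seed(0)
x = torch.randn(20, 32)        # 20 labeled samples, 4 per class
y = torch.arange(5).repeat(4)  # their class labels

loss0 = nn.functional.cross_entropy(head(backbone(x)), y).item()
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(head(backbone(x)), y)
    loss.backward()
    opt.step()
print(loss0, loss.item())  # the head's loss drops while the backbone is untouched
```

Training only the head keeps the number of learned parameters small, which is exactly what makes this approach viable when labeled data is scarce.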
In the future, artificial intelligence small sample
recognition technology will play an important role in
many fields, promoting the intelligent transformation
of related industries. With the continuous expansion
of application scenarios, the authors believe that small
sample image recognition technology will achieve
even more brilliant achievements. At the same time,
researchers also need to pay attention to issues such as data security and privacy protection to ensure the healthy
development of technology. In summary, artificial
intelligence small sample image recognition
technology has broad development prospects and
enormous application potential. Future research will
be dedicated to solving technical challenges, expanding application scenarios, and better serving human
society.
AUTHORS' CONTRIBUTIONS
All the authors contributed equally and their names
were listed in alphabetical order.
REFERENCES
Alec, R., Jong, W. K., Chris, H., Aditya, R., Gabriel, G.,
Sandhini, A., Girish, S., Amanda, A., Pamela, M., Jack,
C., Gretchen, K., Ilya, S., 2021. Learning Transferable
Visual Models From Natural Language Supervision.
arXiv:2103.00020.
Antreas, A., Harrison, E., Amos, S., 2018. How to train your MAML. arXiv:1810.09502.
Arnav, C., Zhuang, L., Deepak, G. et al., 2023. One-for-all: Generalized LoRA for parameter. arXiv:2306.07967.
Brandon, T., Kyle, D., Max, G., Ruslan, S., 2023. Effective Data Augmentation with Diffusion Models. arXiv:2302.07944.
Gao, P., Han, J. M., Zhang, R. R. et al., 2023. Llama-adapter v2: Parameter-efficient visual instruction model. arXiv:2304.15010.
Ge, Y. Z., Liu, H., Wang, Y. et al., 2022. Survey on Deep Learning Image Recognition in Dilemma of Small Samples. Software.