Future possible directions: domains such as
comics and animation. Style transfer technology
may see significant innovation in these animation
domains; anticipating this trend, Wang and Yu
proposed a method for cartoonizing images (Wang
and Yu, 2020). These fields have a strong demand
for style conversion: people who enjoy them often
want works reinterpreted in their favorite style.
Applying deep learning to style transfer can better
learn and capture each style's unique features,
especially line work and color choices; it can
improve the preservation of detail across artistic
styles, so that converted images remain recognizable
in the target style; and, combined with current
Transformer-based large models that generate
stylized images from text, it can greatly enrich
users' personalized experience.
5 CONCLUSION
As style transfer technology continues to develop,
its application value is constantly being explored,
offering a new path for the pursuit of aesthetics.
This study focuses on achieving style transfer with
CNN techniques, mainly using the VGG model for
exploration. Feeding images into the VGG model
allows style and content features to be extracted,
from which a perceptual loss function is computed
and optimized. Using these features, style transfer
is then performed by a feed-forward network,
yielding fairly good results: FID values are
generally below 5, indicating that the generated
images remain very close in content to the original
images. Future development prospects are then
discussed, and it can be seen that style transfer can
be optimized in several ways, such as introducing
attention mechanisms or adopting other network
structures. The technology is expected to continue
developing and improving, bringing users richer
and more personalized experiences.
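As a simplified sketch of the loss described above: the perceptual loss combines a content term (feature-map distance) and a style term (Gram-matrix distance) over VGG layers. Here plain NumPy arrays stand in for the VGG activations, and the weights alpha and beta are illustrative assumptions, not the values used in this study.

```python
import numpy as np

def gram_matrix(feat):
    """Normalized C x C Gram matrix of a (C, H, W) feature map."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(f_gen, f_content):
    # Mean squared distance between feature maps of one layer.
    return float(np.mean((f_gen - f_content) ** 2))

def style_loss(f_gen, f_style):
    # Mean squared distance between Gram matrices of one layer.
    return float(np.mean((gram_matrix(f_gen) - gram_matrix(f_style)) ** 2))

def perceptual_loss(gen_feats, content_feats, style_feats,
                    alpha=1.0, beta=10.0):
    """Weighted sum of per-layer content and style terms.

    Each argument is a list of (C, H, W) arrays, one per chosen layer;
    in a real pipeline these would be intermediate VGG activations.
    """
    lc = sum(content_loss(g, c) for g, c in zip(gen_feats, content_feats))
    ls = sum(style_loss(g, s) for g, s in zip(gen_feats, style_feats))
    return alpha * lc + beta * ls

# Example with random stand-in activations for a single layer.
rng = np.random.default_rng(0)
gen = rng.standard_normal((4, 8, 8))
content = rng.standard_normal((4, 8, 8))
style = rng.standard_normal((4, 8, 8))
loss = perceptual_loss([gen], [content], [style])
```

In the feed-forward setting, this loss is minimized over the transformation network's parameters during training, so that at inference a single forward pass produces the stylized image.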
REFERENCES
Gatys, L.A., Ecker, A.S. and Bethge, M., 2016. Image style
transfer using convolutional neural networks. In: 2016
IEEE Conference on Computer Vision and Pattern
Recognition. IEEE Press, pp. 2414-2423.
Gatys, L., Ecker, A. and Bethge, M., 2016. A neural
algorithm of artistic style. Journal of Vision.
Johnson, J., Alahi, A. and Fei-Fei, L., 2016. Perceptual
losses for real-time style transfer and super-resolution.
In: European Conference on Computer Vision. Cham:
Springer, pp. 694-711.
Jing, Y.C., Mao, Y.N., Yang, Y.D. et al., 2022. Learning
graph neural networks for image style transfer. In:
European Conference on Computer Vision. Cham:
Springer, pp. 111-128.
Liu, J.X. and Zhou, J., 2023. A review of style transfer
methods based on deep learning. School of Computer
and Information Science / Software College, Southwest
University.
Si, Z.Y. and Wang, J.H., 2024. Research on image style
transfer method based on improved generative
adversarial networks. Journal of Fuyang Teachers
College (Natural Science Edition), 41(02), pp. 30-37.
Sun, F.W., Li, C.Y., Xie, Y.Q., Li, Z.B., Yang, C.D. and Qi,
J., 2022. A review of deep learning applications in
occluded target detection algorithms. Computer
Science and Exploration, 16(06), pp. 1243-1259.
Wang, X.R. and Yu, J.Z., 2020. Learning to cartoonize
using white-box cartoon representations. In: 2020
IEEE/CVF Conference on Computer Vision and
Pattern Recognition (CVPR). Seattle: IEEE, pp. 8087-
8096.
Wang, Z.P., 2023. Research on image style transfer
algorithms based on deep learning. East China Normal
University.
Zhou, Z., Wu, Y. and Zhou, Y., 2023. Consistent arbitrary
style transfer using consistency training and self-
attention module. IEEE Transactions on Neural
Networks and Learning Systems, [Epub ahead of print].
Zhu, J.Y., Park, T., Isola, P. et al., 2017. Unpaired image-
to-image translation using cycle-consistent adversarial
networks. In: 2017 IEEE International Conference on
Computer Vision (ICCV). New York: IEEE, pp. 2242-
2251.
Zhang, H.H. and Huang, L.J., 2023. Efficient style transfer
based on convolutional neural networks. Guangdong
University of Business and Technology.