6 CONCLUSION
FedDyn possesses significant algorithmic advantages for federated learning, addressing the optimization issues caused by data heterogeneity and handling non-IID data efficiently. By employing a second-order approximation through Taylor expansion, it better captures the local characteristics of client data, thereby enhancing model performance. Its dynamic regularization design gradually aligns local models with the global model during training, reducing the deviation between clients. The method is broadly applicable across federated learning scenarios and excels in particular when data distributions are highly heterogeneous. Its limitations should also be noted: the second-order Taylor expansion introduces additional computational overhead, and although the algorithm improves convergence speed, its computational efficiency still requires further optimization in scenarios with limited communication bandwidth. Experimental results verify the superior performance of FedDyn on the CIFAR-10 and FEMNIST datasets: it not only significantly improves final model accuracy but also accelerates convergence. The method is robust and efficient under heterogeneous, non-IID data, indicating broad application prospects.
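The dynamic-regularization mechanism summarized above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's implementation: it assumes the standard FedDyn local objective of Acar et al. (2021), L_k(w) − ⟨h_k, w⟩ + (α/2)‖w − w_global‖², and substitutes illustrative quadratic client losses with distinct optima to stand in for non-IID data; all names (feddyn_local_update, ALPHA, etc.) are ours.

```python
import numpy as np

# Toy FedDyn round (after Acar et al., 2021). Each client k minimizes
#   L_k(w) - <h_k, w> + (ALPHA/2) * ||w - w_global||^2
# by gradient descent, then updates its state h_k so the linear term
# keeps pulling the local model toward consensus in later rounds.
# L_k(w) = 0.5 * ||w - c_k||^2 is an illustrative stand-in for a real
# client loss; its optimum c_k differs per client (non-IID setting).

ALPHA = 0.1   # dynamic-regularization strength
LR = 0.05     # local gradient-descent step size
STEPS = 200   # local steps per communication round

def local_grad(w, c_k):
    """Gradient of the illustrative local loss 0.5*||w - c_k||^2."""
    return w - c_k

def feddyn_local_update(w_global, c_k, h_k):
    """One client's dynamically regularized local solve + state update."""
    w = w_global.copy()
    for _ in range(STEPS):
        g = local_grad(w, c_k) - h_k + ALPHA * (w - w_global)
        w -= LR * g
    h_k = h_k - ALPHA * (w - w_global)  # dual/state variable update
    return w, h_k

# Two clients with conflicting optima; the optimum of the averaged
# loss is the midpoint of the centers.
centers = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
w_global = np.zeros(2)
h = [np.zeros(2) for _ in centers]

for _ in range(50):  # communication rounds
    locals_w = []
    for k, c_k in enumerate(centers):
        w_k, h[k] = feddyn_local_update(w_global, c_k, h[k])
        locals_w.append(w_k)
    # Server step: average client models, corrected by the mean state.
    w_global = np.mean(locals_w, axis=0) - np.mean(h, axis=0) / ALPHA
```

In this toy run, w_global converges to the minimizer of the averaged loss (the mean of the client optima) rather than drifting toward whichever client trains longer, which is the client-drift correction the conclusion refers to.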