Table 4: Two-Network Approach Performance Measurement.

Benchmark      Example 1     Example 2
t_min (s)      32.43         16.77
t_mean (s)     33.90         18.33
t_std (s)      2.33          1.74
L_min          5.4 × 10⁻³    2.53 × 10⁻²
L_mean         5.8 × 10⁻³    2.91 × 10⁻²
L_std          3.6 × 10⁻⁴    2.1 × 10⁻³
D_max (%)      77.1          86.2
D_mean (%)     24.7          44.3
D_std (%)      15.8          24.5
The comparison with a Hessian-based approach confirms the efficiency of the proposed training scheme. The ability to validate Lyapunov functions without explicitly computing the Hessian provides a valuable alternative, especially in cases where a differentiable expression of the system dynamics is not readily available.
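To make this Hessian-free validation concrete, the following is a minimal sketch in Python/PyTorch, not the paper's implementation; the names V, f, and lyapunov_violations are hypothetical. It checks the standard Lyapunov conditions V(x) > 0 and V̇(x) = ∇V(x)ᵀf(x) < 0 on a batch of sampled states, using only a first-order gradient of the candidate network and treating the dynamics f as a black box that is never differentiated.

import torch

def lyapunov_violations(V, f, x):
    # V : candidate Lyapunov network, maps (N, n) states to (N, 1) values
    # f : black-box system dynamics, maps (N, n) states to (N, n) derivatives
    # x : (N, n) batch of sampled states away from the origin
    x = x.clone().requires_grad_(True)
    v = V(x)
    # Only a first-order gradient of V is needed; no Hessian is ever formed.
    grad_v = torch.autograd.grad(v.sum(), x)[0]
    with torch.no_grad():
        xdot = f(x)  # dynamics evaluated as a black box (simulator, table, ...)
    vdot = (grad_v * xdot).sum(dim=1)  # Vdot(x) = <grad V(x), f(x)>
    fail_pos = v.squeeze(-1) <= 0      # positivity condition V(x) > 0 violated
    fail_dec = vdot >= 0               # decrease condition Vdot(x) < 0 violated
    return int((fail_pos | fail_dec).sum())

Because f enters only through its evaluations, such a check remains applicable when the dynamics come from a simulator or measured data rather than a closed-form differentiable expression.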
This research opens up possibilities for a broader application of the Lyapunov principle, making it more accessible and applicable across various domains. Further investigations could extend these approaches to more complex systems and practical scenarios, ensuring their robustness and reliability, or could focus on hyperparameter optimization to increase the probability of finding the largest Region of Attraction for the system considered.