Fooling Neural Networks for Motion Forecasting via Adversarial Attacks

Edgar Medina, Leyong Loh

2024

Abstract

Human motion prediction remains an open problem and is extremely important for autonomous driving and safety applications. Despite great advances in this area, the widely studied topic of adversarial attacks has not been applied to multi-regression models such as GCNs and MLP-based architectures for human motion prediction. This work aims to close this gap through extensive quantitative and qualitative experiments on state-of-the-art architectures, similar to the initial studies of adversarial attacks in image classification. The results suggest that these models are susceptible to attacks even at low levels of perturbation. We also present experiments with 3D transformations that degrade model performance; in particular, we show that most models are sensitive to simple rotations and translations that do not alter joint distances. We conclude that, much like early CNN models, motion forecasting models are susceptible to small perturbations and simple 3D transformations.
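The abstract's claim about 3D transformations rests on a simple geometric fact: a rigid rotation plus translation changes every raw joint coordinate a model consumes, while leaving all inter-joint distances intact. The sketch below (not code from the paper; the 17-joint, 3D pose layout and NumPy implementation are illustrative assumptions) verifies this property on a random motion sequence:

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pairwise_distances(pose):
    """All joint-to-joint Euclidean distances for one frame of shape (J, 3)."""
    diff = pose[:, None, :] - pose[None, :, :]
    return np.linalg.norm(diff, axis=-1)

rng = np.random.default_rng(0)
seq = rng.normal(size=(10, 17, 3))  # 10 frames, 17 joints, 3D coordinates

# Apply a rigid transform: 30-degree rotation about z plus a translation.
R = rotation_z(np.pi / 6)
t = np.array([0.5, -0.2, 0.1])
transformed = seq @ R.T + t

# Joint distances within every frame are preserved exactly...
for f in range(seq.shape[0]):
    assert np.allclose(pairwise_distances(seq[f]),
                       pairwise_distances(transformed[f]))

# ...yet the raw coordinates fed to a forecasting model differ substantially,
# which is why models trained on absolute coordinates can be fooled.
assert not np.allclose(seq, transformed)
```

Because the skeleton's internal geometry is unchanged, a human observer would consider the two sequences equivalent motions, making sensitivity to such transforms a meaningful robustness failure.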



Paper Citation


in Harvard Style

Medina E. and Loh L. (2024). Fooling Neural Networks for Motion Forecasting via Adversarial Attacks. In Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP; ISBN 978-989-758-679-8, SciTePress, pages 232-242. DOI: 10.5220/0012562800003660


in Bibtex Style

@conference{visapp24,
author={Edgar Medina and Leyong Loh},
title={Fooling Neural Networks for Motion Forecasting via Adversarial Attacks},
booktitle={Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP},
year={2024},
pages={232--242},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012562800003660},
isbn={978-989-758-679-8},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP
TI - Fooling Neural Networks for Motion Forecasting via Adversarial Attacks
SN - 978-989-758-679-8
AU - Medina E.
AU - Loh L.
PY - 2024
SP - 232
EP - 242
DO - 10.5220/0012562800003660
PB - SciTePress