Adversarial Evasion Attacks to Deep Neural Networks in ECR Models

Shota Nemoto, Subhash Rajapaksha, Despoina Perouli

2022

Abstract

Evasion attacks produce adversarial examples by adding human-imperceptible perturbations that cause a machine learning model to label the input incorrectly. These black-box attacks require neither knowledge of the internal workings of the model nor access to inputs. Although such adversarial attacks have been shown to be successful in image classification problems, they have not been adequately explored in health care models. In this paper, we produce adversarial examples based on successful algorithms in the literature and attack a deep neural network that classifies heart rhythms in electrocardiograms (ECGs). Several batches of adversarial examples were produced, with each batch subject to a different limit on the number of queries. The adversarial ECGs at the median distance from their original counterparts showed slight but noticeable perturbations when compared side by side with the originals. However, the adversarial ECGs at the minimum distance in each batch were practically indistinguishable from the originals.
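To illustrate the setting the abstract describes, the following is a minimal sketch of a query-limited black-box evasion attack. It is not the algorithm used in the paper: it simply resamples small random perturbations, queries the model only for its predicted label (no gradients or internals), stops at the first misclassified candidate or when the query budget runs out, and reports the L2 distance to the original signal. The classifier here is a hypothetical toy stand-in, not an ECG model.

```python
import numpy as np

def black_box_evasion(classify, x, query_budget=500, step=0.05, seed=0):
    """Random-search evasion attack using only label queries.

    Returns (perturbed signal, queries used). If no adversarial
    example is found within the budget, the original is returned.
    """
    rng = np.random.default_rng(seed)
    original_label = classify(x)
    for queries in range(1, query_budget + 1):
        # Propose a small random perturbation of the original input.
        candidate = x + rng.normal(0.0, step, size=x.shape)
        # One query: only the predicted label is observed.
        if classify(candidate) != original_label:
            return candidate, queries
    return x, query_budget

# Toy stand-in "model": labels a signal by the sign of its mean amplitude.
toy_classify = lambda s: int(s.mean() > 0.0)

signal = np.full(100, 0.002)  # mean sits just above the decision boundary
adv, used = black_box_evasion(toy_classify, signal)
l2_distance = np.linalg.norm(adv - signal)  # distance to the original
```

Recording `l2_distance` for each successful attack is what allows the batch-level comparison described above, where examples at the median and minimum distances are inspected side by side.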

Paper Citation


in Harvard Style

Nemoto S., Rajapaksha S. and Perouli D. (2022). Adversarial Evasion Attacks to Deep Neural Networks in ECR Models. In Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2022) - Volume 5: HEALTHINF; ISBN 978-989-758-552-4, SciTePress, pages 135-141. DOI: 10.5220/0010848700003123


in Bibtex Style

@conference{healthinf22,
author={Shota Nemoto and Subhash Rajapaksha and Despoina Perouli},
title={Adversarial Evasion Attacks to Deep Neural Networks in ECR Models},
booktitle={Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2022) - Volume 5: HEALTHINF},
year={2022},
pages={135-141},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010848700003123},
isbn={978-989-758-552-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2022) - Volume 5: HEALTHINF
TI - Adversarial Evasion Attacks to Deep Neural Networks in ECR Models
SN - 978-989-758-552-4
AU - Nemoto S.
AU - Rajapaksha S.
AU - Perouli D.
PY - 2022
SP - 135
EP - 141
DO - 10.5220/0010848700003123
PB - SciTePress