Building Synthetic Simulated Environments for Configuring and Training Multi-camera Systems for Surveillance Applications

Topics: Categorization and Scene Understanding; Deep Learning for Visual Understanding; Event and Human Activity Recognition; Image Formation, Acquisition Devices and Sensors; Object Detection and Localization

Authors: Nerea Aranjuelo 1, 2; Jorge García 2; Luis Unzueta 2; Sara García 2; Unai Elordi 1, 2 and Oihana Otaegui 2

Affiliations: 1 Basque Country University (UPV/EHU), San Sebastian, Spain; 2 Vicomtech, Basque Research and Technology Alliance (BRTA), San Sebastian, Spain

Keyword(s): Simulated Environments, Synthetic Data, Deep Neural Networks, Object Detection, Video Surveillance.

Abstract: Synthetic simulated environments are gaining popularity in the Deep Learning era, as they can alleviate the effort and cost of two critical tasks in building multi-camera systems for surveillance applications: setting up the camera system to cover the use cases and generating the labeled dataset to train the required Deep Neural Networks (DNNs). However, no simulated environments are ready to solve these tasks for all kinds of scenarios and use cases. Typically, ‘ad hoc’ environments are built, which cannot be easily applied to other contexts. In this work, we present a methodology to build synthetic simulated environments with sufficient generality to be usable in different contexts with little effort. Our methodology tackles the challenges of appropriately parameterizing scene configurations, the strategies to randomly generate a wide and balanced range of situations of interest for training DNNs with synthetic data, and quick image capture from virtual cameras considering the rendering bottlenecks. We show a practical implementation example for the detection of incorrectly placed luggage in aircraft cabins, including a qualitative and quantitative analysis of the data generation process and its influence on DNN training, and the modifications required to adapt it to other surveillance contexts.
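To illustrate the kind of scene parameterization and balanced random sampling the abstract describes, the following minimal Python sketch draws cabin scene configurations (camera pose, passenger count, luggage placements) while forcing a minimum share of incorrectly placed luggage. The parameter names, value ranges, and placement classes are assumptions made for illustration and are not taken from the paper.

import random
from dataclasses import dataclass

@dataclass
class SceneConfig:
    """One randomly sampled cabin scene configuration (illustrative only)."""
    camera_position: tuple      # (x, y, z) in metres, cabin coordinate frame
    camera_tilt_deg: float      # downward tilt of the surveillance camera
    num_passengers: int
    luggage_placements: list    # per-item placement label

# Hypothetical placement classes; the paper's actual taxonomy may differ.
CORRECT = ["overhead_bin", "under_seat"]
INCORRECT = ["aisle", "emergency_exit_row"]

def sample_scene(rng: random.Random) -> SceneConfig:
    """Sample one scene from a parameterized distribution.

    Balancing idea: draw 'incorrect' placements with a fixed minimum
    probability so the generated dataset is not dominated by correctly
    stowed luggage.
    """
    n_items = rng.randint(0, 6)
    placements = []
    for _ in range(n_items):
        if rng.random() < 0.4:  # force a share of rare/incorrect cases
            placements.append(rng.choice(INCORRECT))
        else:
            placements.append(rng.choice(CORRECT))
    return SceneConfig(
        camera_position=(rng.uniform(0.0, 30.0),   # along the aisle
                         rng.uniform(-1.0, 1.0),   # lateral offset
                         rng.uniform(1.8, 2.2)),   # near ceiling height
        camera_tilt_deg=rng.uniform(20.0, 60.0),
        num_passengers=rng.randint(0, 120),
        luggage_placements=placements,
    )

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed for reproducible generation
    for cfg in (sample_scene(rng) for _ in range(5)):
        print(cfg)

Each sampled configuration would then drive the rendering engine to produce an image and its labels; the concrete parameters, their ranges, and the balancing strategy are what the paper's methodology defines per use case.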

CC BY-NC-ND 4.0

Paper citation in several formats:
Aranjuelo, N.; García, J.; Unzueta, L.; García, S.; Elordi, U. and Otaegui, O. (2021). Building Synthetic Simulated Environments for Configuring and Training Multi-camera Systems for Surveillance Applications. In Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP; ISBN 978-989-758-488-6; ISSN 2184-4321, SciTePress, pages 80-91. DOI: 10.5220/0010232400800091

@conference{visapp21,
author={Nerea Aranjuelo and Jorge García and Luis Unzueta and Sara García and Unai Elordi and Oihana Otaegui},
title={Building Synthetic Simulated Environments for Configuring and Training Multi-camera Systems for Surveillance Applications},
booktitle={Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP},
year={2021},
pages={80--91},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010232400800091},
isbn={978-989-758-488-6},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP
TI - Building Synthetic Simulated Environments for Configuring and Training Multi-camera Systems for Surveillance Applications
SN - 978-989-758-488-6
IS - 2184-4321
AU - Aranjuelo, N.
AU - García, J.
AU - Unzueta, L.
AU - García, S.
AU - Elordi, U.
AU - Otaegui, O.
PY - 2021
SP - 80
EP - 91
DO - 10.5220/0010232400800091
PB - SciTePress
ER -