
Representational Capacity of Deep Neural Networks: A Computing Study

Authors: Bernhard Bermeitinger ¹, Tomas Hrycej ² and Siegfried Handschuh ¹

Affiliations: ¹ Institute for Computer Science, University of St.Gallen, St.Gallen, Switzerland, and Faculty for Computer Science, University of Passau, Passau, Germany; ² Institute for Computer Science, University of St.Gallen, St.Gallen, Switzerland

Keyword(s): Deep Learning, Artificial Neural Networks, Optimization, Conjugate Gradient.

Related Ontology Subjects/Areas/Topics: Artificial Intelligence; Computational Intelligence; Evolutionary Computing; Knowledge Discovery and Information Retrieval; Knowledge-Based Systems; Machine Learning; Soft Computing; Symbolic Systems

Abstract: There is some theoretical evidence that deep neural networks with multiple hidden layers have a potential for more efficient representation of multidimensional mappings than shallow networks with a single hidden layer. The question is whether this theoretical advantage can be exploited for finding such representations with the help of numerical training methods. Tests using prototypical problems with a known mean square minimum did not confirm this hypothesis. Minima found with the help of deep networks have always been worse than those found using shallow networks. This does not directly contradict the theoretical findings: it is possible that the superior representational capacity of deep networks is genuine, while finding the mean square minimum of such deep networks is a substantially harder problem than it is for shallow ones.
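The comparison described in the abstract can be illustrated with a small experiment sketch, assuming a synthetic regression problem whose noise-free targets are generated by a fixed random mapping, so the attainable training mean square error is known to be essentially zero. The sketch below uses Keras with the Adam optimizer as a stand-in for the conjugate gradient training used in the paper; the layer widths, dataset size, and training length are illustrative assumptions, chosen only so that the shallow and the deep network have roughly matched parameter counts.

import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)

# Synthetic regression problem: the targets come from a fixed, noise-free
# "teacher" mapping, so the attainable training MSE is essentially zero.
n_samples, n_in, n_out = 2000, 20, 5
X = rng.standard_normal((n_samples, n_in)).astype("float32")
W = rng.standard_normal((n_in, n_out)).astype("float32")
y = np.tanh(X @ W)

def make_mlp(hidden_layers, width):
    # Fully connected network with the given number of hidden layers.
    layers = [tf.keras.Input(shape=(n_in,))]
    for _ in range(hidden_layers):
        layers.append(tf.keras.layers.Dense(width, activation="tanh"))
    layers.append(tf.keras.layers.Dense(n_out))
    return tf.keras.Sequential(layers)

# Shallow (one hidden layer) vs. deep (four hidden layers) network with
# roughly matched parameter counts; the widths are illustrative choices.
configs = {"shallow": (1, 64), "deep": (4, 19)}

for name, (depth, width) in configs.items():
    model = make_mlp(depth, width)
    # Adam is a stand-in here; the study itself relies on conjugate
    # gradient optimization, which Keras does not provide out of the box.
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=200, batch_size=64, verbose=0)
    mse = model.evaluate(X, y, verbose=0)
    print(f"{name}: {depth} hidden layer(s) of width {width}, "
          f"{model.count_params()} parameters, training MSE = {mse:.6f}")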

CC BY-NC-ND 4.0


Paper citation in several formats:
Bermeitinger, B.; Hrycej, T. and Handschuh, S. (2019). Representational Capacity of Deep Neural Networks: A Computing Study. In Proceedings of the 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K 2019) - KDIR; ISBN 978-989-758-382-7; ISSN 2184-3228, SciTePress, pages 532-538. DOI: 10.5220/0008364305320538

@conference{kdir19,
author={Bernhard Bermeitinger and Tomas Hrycej and Siegfried Handschuh},
title={Representational Capacity of Deep Neural Networks: A Computing Study},
booktitle={Proceedings of the 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K 2019) - KDIR},
year={2019},
pages={532-538},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0008364305320538},
isbn={978-989-758-382-7},
issn={2184-3228},
}

TY - CONF
JO - Proceedings of the 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K 2019) - KDIR
TI - Representational Capacity of Deep Neural Networks: A Computing Study
SN - 978-989-758-382-7
IS - 2184-3228
AU - Bermeitinger, B.
AU - Hrycej, T.
AU - Handschuh, S.
PY - 2019
SP - 532
EP - 538
DO - 10.5220/0008364305320538
PB - SciTePress
ER -