
Paper: Neural Sequence Modeling in Physical Language Understanding

Author: Avi Bleiweiss

Affiliation: BShalem Research, Sunnyvale, U.S.A.

Keyword(s): Kinematics, Recurrent Neural Networks, Long Short-term Memory, Sequence Model, Attention.

Related Ontology Subjects/Areas/Topics: Artificial Intelligence ; Biomedical Engineering ; Biomedical Signal Processing ; Computational Intelligence ; Health Engineering and Technology Applications ; Higher Level Artificial Neural Network Based Intelligent Systems ; Human-Computer Interaction ; Learning Paradigms and Algorithms ; Methodologies and Methods ; Neural Networks ; Neurocomputing ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Signal Processing ; Soft Computing ; Theory and Methods

Abstract: Automating the tasks of generating test questions and analyzing content for assessment of written student responses has been one of the more sought-after applications to support classroom educators. However, a major impediment to algorithmic advances in developing such tools is the lack of large, publicly available domain corpora. In this paper, we explore deep learning of physics word problems performed at scale using the transformer, a state-of-the-art self-attention neural architecture. Our study proposes an intuitive novel approach to tree-based data generation that relies mainly on physical knowledge structure and defers compositionality of natural-language clauses to the terminal nodes. Applying our method to the simpler kinematics domain, which describes motion properties of an object at a uniform acceleration rate, and using our neural sequence model pretrained on a dataset of ten thousand machine-produced problems, we achieved BLEU scores of 0.54 and 0.81 for predicting derivation expressions on real-world and synthetic test sets, respectively. Notably, increasing the number of training problems yielded diminishing returns on performance.
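The abstract describes a tree-based generator in which inner nodes encode physical knowledge structure and terminal nodes render natural-language clauses. As a hedged illustration only (the function `make_problem`, the templates, and the parameter ranges below are hypothetical, not the paper's actual generator), a minimal uniform-acceleration sketch might pair a generated problem text with its derivation expression:

```python
import random

# Minimal sketch of template-based kinematics problem generation,
# assuming uniform acceleration: s = u*t + 0.5*a*t**2.
# All names and ranges here are illustrative assumptions.

def make_problem(rng):
    u = rng.randint(0, 10)       # initial velocity, m/s
    a = rng.randint(1, 5)        # uniform acceleration, m/s^2
    t = rng.randint(1, 10)       # elapsed time, s
    s = u * t + 0.5 * a * t**2   # displacement, m
    # Terminal nodes would render clauses like these; inner nodes
    # decide which quantities are given and which are asked.
    text = (f"A body starts at {u} m/s and accelerates uniformly "
            f"at {a} m/s^2 for {t} s. How far does it travel?")
    derivation = f"s = {u}*{t} + 0.5*{a}*{t}**2 = {s}"
    return text, derivation

problem, answer = make_problem(random.Random(7))
```

In this reading, the model is trained to map the problem text to the derivation string, and BLEU is computed between predicted and reference derivation expressions.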

CC BY-NC-ND 4.0


Paper citation in several formats:
Bleiweiss, A. (2019). Neural Sequence Modeling in Physical Language Understanding. In Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - NCTA; ISBN 978-989-758-384-1; ISSN 2184-3236, SciTePress, pages 464-472. DOI: 10.5220/0008071104640472

@conference{ncta19,
author={Avi Bleiweiss},
title={Neural Sequence Modeling in Physical Language Understanding},
booktitle={Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - NCTA},
year={2019},
pages={464-472},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0008071104640472},
isbn={978-989-758-384-1},
issn={2184-3236},
}

TY - CONF
JO - Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - NCTA
TI - Neural Sequence Modeling in Physical Language Understanding
SN - 978-989-758-384-1
IS - 2184-3236
AU - Bleiweiss, A.
PY - 2019
SP - 464
EP - 472
DO - 10.5220/0008071104640472
PB - SciTePress
ER -