Authors: Vasileios Tsakalos and Roberto Henriques
Affiliation: NOVA IMS Information Management School, Universidade Nova de Lisboa, 1070-312 Lisboa, Portugal
Keyword(s): Recursive Neural Network, Gated Recurrent Units, Natural Language Processing, Sentiment Classification.
Related Ontology Subjects/Areas/Topics: Artificial Intelligence; Business Analytics; Clustering and Classification Methods; Computational Intelligence; Context Discovery; Data Analytics; Data Engineering; Evolutionary Computing; Knowledge Discovery and Information Retrieval; Knowledge-Based Systems; Machine Learning; Soft Computing; Symbolic Systems
Abstract: Recurrent Neural Networks (RNNs) are a good way of modeling sequences. However, this type of Artificial Neural Network (ANN) has two major drawbacks: it is not good at capturing long-range dependencies, and it suffers from the vanishing gradient problem (Hochreiter, 1998). Fortunately, RNN variants have been invented that deal with these problems, namely Gated Recurrent Unit (GRU) networks (Chung et al., 2014; Gülçehre et al., 2013) and Long Short-Term Memory (LSTM) networks (Hochreiter and Schmidhuber, 1997). Many problems in Natural Language Processing can be approximated with a sequence model, but it is known that the syntactic rules of natural language have a recursive structure (Socher et al., 2011b). Therefore, a Recursive Neural Network (Goller and Kuchler, 1996) can be a great alternative. Tai et al. (2015) proposed an architecture that brings the desirable properties of the LSTM to Recursive Neural Networks. In this report, we present another alternative: a Recursive Neural Network combined with GRUs, which performs very similarly to the N-ary Tree-Structured LSTM on binary and fine-grained Sentiment Classification (on the Stanford Sentiment Treebank dataset) but is trained faster.
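To make the idea concrete, below is a minimal PyTorch sketch of a GRU-style composition cell for a binary parse tree: two child hidden states are merged into a parent state using update and reset gates. The class name, layer names, and the exact gating equations are our own illustrative assumptions; the paper's formulation of the Tree-GRU may differ.

import torch
import torch.nn as nn

class BinaryTreeGRUCell(nn.Module):
    """Illustrative GRU-style composition of two child states into a
    parent state. Gate equations are an assumption for illustration,
    not the paper's exact formulation."""

    def __init__(self, hidden_size):
        super().__init__()
        # Every gate sees both children (concatenated).
        self.update = nn.Linear(2 * hidden_size, hidden_size)   # z
        self.reset_l = nn.Linear(2 * hidden_size, hidden_size)  # r_l, gates left child
        self.reset_r = nn.Linear(2 * hidden_size, hidden_size)  # r_r, gates right child
        self.cand_l = nn.Linear(hidden_size, hidden_size, bias=False)
        self.cand_r = nn.Linear(hidden_size, hidden_size)

    def forward(self, h_l, h_r):
        children = torch.cat([h_l, h_r], dim=-1)
        z = torch.sigmoid(self.update(children))
        r_l = torch.sigmoid(self.reset_l(children))
        r_r = torch.sigmoid(self.reset_r(children))
        # Candidate parent state from reset-gated children.
        h_tilde = torch.tanh(self.cand_l(r_l * h_l) + self.cand_r(r_r * h_r))
        # The mean of the children stands in for the GRU's "previous state";
        # this is one simple choice, the paper may combine children differently.
        h_prev = 0.5 * (h_l + h_r)
        return (1 - z) * h_prev + z * h_tilde

# Example: compose two child representations (hidden size 64 is arbitrary).
cell = BinaryTreeGRUCell(64)
parent = cell(torch.randn(1, 64), torch.randn(1, 64))  # -> shape (1, 64)

Applied bottom-up over a constituency parse, leaves take word embeddings and each internal node applies the cell to its children; one intuition for the training-speed claim is that a GRU-style cell maintains a single state vector per node rather than the LSTM's separate hidden and memory cells, so it carries fewer parameters per composition.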