
Paper: Adjusting Word Embeddings by Deep Neural Networks

Authors: Xiaoyang Gao 1 and Ryutaro Ichise 2

Affiliations: 1 Peking University, China ; 2 National Institute of Informatics and National Institute of Advanced Industrial Science and Technology, Japan

ISBN: 978-989-758-220-2

Keyword(s): NLP, Word Embeddings, Deep Learning, Neural Network.

Related Ontology Subjects/Areas/Topics: Applications ; Artificial Intelligence ; Biomedical Engineering ; Biomedical Signal Processing ; Computational Intelligence ; Health Engineering and Technology Applications ; Human-Computer Interaction ; Knowledge Engineering and Ontology Development ; Knowledge-Based Systems ; Methodologies and Methods ; Natural Language Processing ; Neural Networks ; Neurocomputing ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Signal Processing ; Soft Computing ; Symbolic Systems ; Theory and Methods

Abstract: Continuous-representation language models have gained popularity in many NLP tasks. To measure the similarity of two words, we calculate the cosine distance between their vectors. However, the quality of word embeddings depends on the selected corpus. For Word2Vec, we observe that the vectors are far apart from each other. Furthermore, synonyms with low occurrence counts, or with multiple meanings, are even further apart. In these cases, cosine similarity is not an appropriate measure of how similar the words are. Moreover, most language models are not as deep as is often supposed. Based on these observations, we implement a mixed deep neural network with two kinds of architectures. We show that word embeddings can be adjusted in both unsupervised and supervised ways. Remarkably, this approach improves the cases mentioned above by substantially increasing almost all synonym similarities. It is also easy to train and to adapt to particular tasks by changing the training target and dataset.
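The cosine measure discussed in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation; the two 4-dimensional vectors are made-up stand-ins for real learned embeddings.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors: u.v / (|u| |v|)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Illustrative (made-up) embeddings for a pair of near-synonyms.
king = np.array([0.5, 1.2, -0.3, 0.8])
monarch = np.array([0.4, 1.0, -0.2, 0.9])

print(cosine_similarity(king, monarch))  # close to 1.0 for similar words
```

As the abstract notes, this measure degrades for rare or polysemous words, whose corpus-trained vectors can drift far from their synonyms; the paper's adjustment targets exactly those cases.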

Paper citation in several formats:
Gao X. and Ichise R. (2017). Adjusting Word Embeddings by Deep Neural Networks. In Proceedings of the 9th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART, ISBN 978-989-758-220-2, pages 398-406. DOI: 10.5220/0006120003980406

@conference{icaart17,
author={Xiaoyang Gao and Ryutaro Ichise},
title={Adjusting Word Embeddings by Deep Neural Networks},
booktitle={Proceedings of the 9th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2017},
pages={398-406},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006120003980406},
isbn={978-989-758-220-2},
}

TY - CONF

JO - Proceedings of the 9th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - Adjusting Word Embeddings by Deep Neural Networks
SN - 978-989-758-220-2
AU - Gao X.
AU - Ichise R.
PY - 2017
SP - 398
EP - 406
DO - 10.5220/0006120003980406
