Improving Semantic Similarity of Words by Retrofitting Word Vectors in Sense Level

Rui Zhang, Peter Schneider-Kamp, Arthur Zimek

Abstract

This paper presents an approach for retrofitting pre-trained word representations into sense-level representations to improve the semantic distinction of words. We use semantic relations as positive and negative examples to refine the results of a pre-trained model, instead of integrating them into the objective functions used during training. We experimentally evaluate our approach on two word similarity tasks by retrofitting six datasets generated from three widely used word representation techniques using two different strategies. Our approach significantly and consistently outperforms three state-of-the-art retrofitting approaches.
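The core idea — refining pre-trained vectors with positive and negative semantic relations as a post-processing step — can be illustrated with a minimal sketch. This is not the paper's algorithm; it is a generic retrofitting-style update (in the spirit of prior retrofitting work) with hypothetical names and parameters, shown only to make the abstract's description concrete:

```python
# Hypothetical sketch (NOT the paper's method): a retrofitting-style update
# that pulls each vector toward positive examples (e.g. synonyms), pushes it
# away from negative examples (e.g. antonyms), and keeps it anchored to its
# original pre-trained position.
import numpy as np

def retrofit(vectors, positives, negatives, lr=0.1, iters=10):
    """vectors: word -> np.ndarray; positives/negatives: word -> list of words."""
    new = {w: v.astype(float).copy() for w, v in vectors.items()}
    for _ in range(iters):
        for w in new:
            grad = vectors[w].astype(float) - new[w]  # stay near the original
            for p in positives.get(w, []):
                if p in new:
                    grad += new[p] - new[w]           # attract positive examples
            for n in negatives.get(w, []):
                if n in new:
                    grad -= new[n] - new[w]           # repel negative examples
            new[w] = new[w] + lr * grad
    return new
```

On a toy vocabulary with "good"/"great" as a positive pair and "good"/"bad" as a negative pair, this update increases the cosine similarity of the synonyms and decreases that of the antonyms while the anchor term keeps the vectors near their pre-trained values.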

Paper Citation


in Harvard Style

Zhang R., Schneider-Kamp P. and Zimek A. (2020). Improving Semantic Similarity of Words by Retrofitting Word Vectors in Sense Level. In Proceedings of the 12th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART, ISBN 978-989-758-395-7, pages 108-119. DOI: 10.5220/0008953001080119


in Bibtex Style

@conference{icaart20,
author={Rui Zhang and Peter Schneider-Kamp and Arthur Zimek},
title={Improving Semantic Similarity of Words by Retrofitting Word Vectors in Sense Level},
booktitle={Proceedings of the 12th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2020},
pages={108-119},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0008953001080119},
isbn={978-989-758-395-7},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 12th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - Improving Semantic Similarity of Words by Retrofitting Word Vectors in Sense Level
SN - 978-989-758-395-7
AU - Zhang R.
AU - Schneider-Kamp P.
AU - Zimek A.
PY - 2020
SP - 108
EP - 119
DO - 10.5220/0008953001080119