Context sensitive lemmatization using two successive bidirectional gated recurrent nets

Document Type

Conference Article

Publication Title

ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)

Abstract

We introduce a composite deep neural network architecture for supervised, language-independent, context-sensitive lemmatization. The proposed method frames the task as identifying the correct edit tree representing the transformation between a word-lemma pair. To find the lemma of a surface word, we exploit two successive bidirectional gated recurrent structures: the first extracts character-level dependencies within the word, and the second captures the contextual information of the word in its sentence. The key advantages of our model over state-of-the-art lemmatizers such as Lemming and Morfette are that it is independent of hand-crafted features and that, apart from the gold lemma, no other expensive morphological attribute is required for joint learning. We evaluate the lemmatizer on nine languages: Bengali, Catalan, Dutch, Hindi, Hungarian, Italian, Latin, Romanian and Spanish. Except for Bengali, the proposed method outperforms Lemming and Morfette on all of these languages. To train the model on Bengali, we develop a gold lemma annotated dataset (having sentences with a total of word tokens), which is an additional contribution of this work.
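
The architecture described in the abstract (a character-level bidirectional GRU that builds a word representation, a second bidirectional GRU that adds sentence context, and a classifier over edit trees) can be illustrated with a minimal PyTorch sketch. The layer sizes, class and variable names, and the size of the edit-tree inventory below are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (assumed hyperparameters) of a two-level BiGRU lemmatizer:
# level 1 encodes each word from its characters, level 2 contextualizes the
# word within the sentence, and a linear layer scores candidate edit trees
# (the word-to-lemma transformation classes).
import torch
import torch.nn as nn


class TwoLevelBiGRULemmatizer(nn.Module):
    def __init__(self, n_chars, n_edit_trees, char_dim=64, char_hidden=128, ctx_hidden=256):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        # Level 1: character-level BiGRU -> fixed-size word representation
        self.char_gru = nn.GRU(char_dim, char_hidden, batch_first=True, bidirectional=True)
        # Level 2: sentence-level BiGRU over word representations -> context
        self.ctx_gru = nn.GRU(2 * char_hidden, ctx_hidden, batch_first=True, bidirectional=True)
        # Classifier over the edit-tree inventory
        self.out = nn.Linear(2 * ctx_hidden, n_edit_trees)

    def forward(self, char_ids):
        # char_ids: (batch, sent_len, word_len) padded character indices
        b, s, w = char_ids.shape
        chars = self.char_emb(char_ids.view(b * s, w))        # (b*s, w, char_dim)
        _, h = self.char_gru(chars)                           # h: (2, b*s, char_hidden)
        word_repr = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        ctx, _ = self.ctx_gru(word_repr)                      # (b, s, 2*ctx_hidden)
        return self.out(ctx)                                  # edit-tree scores per token


# Usage: scores over a hypothetical inventory of 300 edit trees for a batch
# of 2 sentences, 10 words each, 15 characters per word.
model = TwoLevelBiGRULemmatizer(n_chars=80, n_edit_trees=300)
scores = model(torch.randint(1, 80, (2, 10, 15)))
print(scores.shape)  # torch.Size([2, 10, 300])
```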

First Page

1481

Last Page

1491

DOI

10.18653/v1/P17-1136

Publication Date

1-1-2017

Comments

Open Access, Hybrid Gold, Green
