Effective Vector Representations for Variable Length Symbol Sequences

Authors

Gustavo Lado and Enrique Carlos Segura, University of Buenos Aires, Argentina

Abstract

Machine learning techniques have demonstrated their versatility and have been successfully applied to a wide variety of problems. However, one of their major limitations is the treatment of sequential information. In general, the input and output of these methods are expressed as fixed-dimension vectors, but in many problem domains, such as natural language processing, the information is represented by variable-length sequences. In most cases it is possible to transform these variable-length sequences into fixed-dimension vectors, but each of the available methods has its own disadvantages. In this paper we propose an alternative way to obtain fixed-dimension vector representations from variable-length symbol sequences, and we discuss its potential applications in natural language processing.
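
The abstract alludes to standard transformations from variable-length sequences to fixed-dimension vectors, each with drawbacks. As a point of reference only, the following minimal Python sketch shows one such baseline (mean-pooling one-hot encodings); it is not the representation proposed in the paper, and the function name and toy vocabulary are illustrative assumptions.

import numpy as np

def mean_pool_one_hot(sequence, vocab):
    """Encode each symbol as a one-hot vector over `vocab`, then average.
    The result always has dimension len(vocab), whatever the input length,
    but the pooling step discards symbol order; a typical disadvantage."""
    index = {symbol: i for i, symbol in enumerate(vocab)}
    pooled = np.zeros(len(vocab))
    for symbol in sequence:
        pooled[index[symbol]] += 1.0       # accumulate one-hot encodings
    return pooled / max(len(sequence), 1)  # average, so output size is fixed

vocab = ["a", "b", "c", "d"]               # hypothetical symbol alphabet
print(mean_pool_one_hot(["a", "b", "a"], vocab))  # [0.667 0.333 0. 0.]
print(mean_pool_one_hot(["c"], vocab))            # same shape for a shorter input

Both calls return a vector of dimension four regardless of sequence length, which illustrates the fixed-dimension requirement; the loss of ordering information is one of the disadvantages the abstract refers to.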

Keywords

Neural Networks, Natural Language Processing, Sequential Learning, Deep Architectures

Full Text: Volume 7, Number 6