Integrating Graph-Based Representations with Deep Contextual Models for Text Classification

Authors

Sumit Mamtani, New York University, USA

Abstract

Accurate text classification requires both deep contextual understanding and structural representation of language. This study explores a hybrid approach that integrates transformer-based embeddings with graph-based neural architectures to enhance text classification performance. By leveraging pre-trained language models for feature extraction and applying graph convolution techniques for relational modeling, the proposed method captures both semantic meaning and structural dependencies in text. Experimental results demonstrate improved classification accuracy over traditional approaches, highlighting the effectiveness of combining deep contextual learning with graph-based representations in NLP tasks.
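The abstract's pipeline (contextual embeddings passed through a graph convolution, then pooled for classification) can be sketched minimally as follows. This is an illustrative assumption of the architecture, not the paper's implementation: random vectors stand in for pre-trained transformer embeddings, the chain-shaped adjacency matrix is a placeholder for whatever token graph the method builds, and all layer names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tokens, d_model, n_classes = 5, 8, 3

# Stand-in for transformer (e.g. BERT) token embeddings H: n_tokens x d_model.
H = rng.standard_normal((n_tokens, d_model))

# Token adjacency A (e.g. co-occurrence or dependency edges) with self-loops;
# here a simple chain graph serves as a placeholder.
A = (np.eye(n_tokens)
     + np.diag(np.ones(n_tokens - 1), 1)
     + np.diag(np.ones(n_tokens - 1), -1))

# Symmetric normalization D^{-1/2} A D^{-1/2}, the standard GCN propagation rule.
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# One graph convolution: ReLU(A_hat @ H @ W) mixes each token's contextual
# embedding with those of its graph neighbors.
W = rng.standard_normal((d_model, d_model))
Z = np.maximum(A_hat @ H @ W, 0.0)

# Mean-pool the node features into a document vector and classify.
W_out = rng.standard_normal((d_model, n_classes))
logits = Z.mean(axis=0) @ W_out
pred = int(np.argmax(logits))
```

In a full system the random `H` would be replaced by actual transformer outputs and `W`, `W_out` would be trained end-to-end; the sketch only shows how semantic features and graph structure are combined in a single forward pass.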

Keywords

Text Classification, Graph Neural Networks, BERT, Hybrid Embeddings, Document Modeling

Volume 15, Number 14