A BERT–CNN–BIGRU HYBRID MODEL BASED ON INTEGRATION OF CONTEXTUAL AND LOCAL SEMANTIC FEATURES IN TEXT CLASSIFICATION
Keywords:
Text classification, contextual vector representation, convolutional neural network, bidirectional GRU, hybrid model, semantic integration, deep learning, natural language processing.

Abstract
In this paper, a hybrid neural architecture integrating contextual and local semantic features is proposed to improve accuracy and robustness in text classification. The proposed model combines contextual vector representations (embeddings) produced by BERT, local semantic features extracted by a Convolutional Neural Network (CNN), and global sequence dependencies learned by a Bidirectional Gated Recurrent Unit (BiGRU). A feature fusion mechanism merges the semantic features from these different levels into a single representation, and the final classification decision is produced by a softmax activation function. Experimental results show that the proposed BERT–CNN–BiGRU model achieves higher accuracy and F1-score than traditional models based on static word embeddings. The approach can be applied effectively to tasks such as sentiment analysis, topic classification, and automatic information analysis.
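The pipeline described above (BERT embeddings → parallel CNN and BiGRU branches → feature fusion → softmax) can be sketched as follows. This is a minimal illustration, not the authors' implementation: all layer sizes and pooling choices (`conv_ch`, `gru_hidden`, kernel size, max-pooling, last-step readout) are assumptions, and a random tensor stands in for the BERT output, which in practice would come from a pretrained encoder's last hidden states.

```python
import torch
import torch.nn as nn

class BertCnnBiGru(nn.Module):
    """Sketch of a BERT-CNN-BiGRU classifier; hyperparameters are hypothetical."""
    def __init__(self, hidden=768, conv_ch=128, gru_hidden=128, n_classes=2):
        super().__init__()
        # CNN branch: local semantic features over token embeddings
        self.conv = nn.Conv1d(hidden, conv_ch, kernel_size=3, padding=1)
        # BiGRU branch: global sequence dependencies in both directions
        self.gru = nn.GRU(hidden, gru_hidden, batch_first=True,
                          bidirectional=True)
        # Fusion + classification over the concatenated branch outputs
        self.fc = nn.Linear(conv_ch + 2 * gru_hidden, n_classes)

    def forward(self, bert_out):
        # bert_out: (batch, seq_len, hidden) contextual embeddings from BERT
        c = torch.relu(self.conv(bert_out.transpose(1, 2)))  # (B, C, L)
        c = c.max(dim=2).values                      # max-pool over tokens
        g, _ = self.gru(bert_out)                    # (B, L, 2 * gru_hidden)
        g = g[:, -1, :]                              # last time-step summary
        fused = torch.cat([c, g], dim=1)             # feature fusion
        return torch.softmax(self.fc(fused), dim=1)  # class probabilities

# Stand-in for BERT output (real use: a pretrained encoder's hidden states)
x = torch.randn(4, 16, 768)
probs = BertCnnBiGru()(x)
```

Each row of `probs` is a probability distribution over the classes, matching the softmax output layer described in the abstract.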
