A BERT–CNN–BIGRU HYBRID MODEL BASED ON INTEGRATION OF CONTEXTUAL AND LOCAL SEMANTIC FEATURES IN TEXT CLASSIFICATION

Authors

  • Muhamediyeva D. T.
  • Mamatov A. A., National Research University "Tashkent Institute of Irrigation and Agricultural Mechanization Engineers"; Namangan State University

Keywords:

Text classification, contextual vector representation, convolutional neural network, bidirectional GRU, hybrid model, semantic integration, deep learning, natural language processing.

Abstract

In this paper, a hybrid neural architecture integrating contextual and local semantic features is proposed to improve accuracy and robustness in text classification. The proposed model combines BERT-based contextual vector representations (embeddings), local semantic features extracted by a Convolutional Neural Network (CNN), and global sequence dependencies learned by a Bidirectional Gated Recurrent Unit (BiGRU). In the model, semantic features at different levels are combined into a single representation through a feature fusion mechanism, and the final classification result is produced by a softmax activation function. Experimental results show that the proposed BERT–CNN–BiGRU model achieves higher accuracy and F1 score than traditional word-vector-based models. This approach can be effectively applied to tasks such as sentiment analysis, topic classification, and automatic information analysis.
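The architecture described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: all hyperparameters (filter count, kernel size, GRU hidden size, number of classes) are hypothetical, and the BERT embeddings (assumed hidden size 768) are taken as precomputed inputs so the sketch stays self-contained.

```python
import torch
import torch.nn as nn

class BertCnnBiGru(nn.Module):
    """Sketch of a BERT–CNN–BiGRU hybrid classifier (illustrative only)."""

    def __init__(self, bert_dim=768, n_filters=128, kernel_size=3,
                 gru_hidden=128, n_classes=2):
        super().__init__()
        # CNN branch: extracts local (phrase-level) semantic features
        self.conv = nn.Conv1d(bert_dim, n_filters, kernel_size, padding=1)
        # BiGRU branch: captures global sequence dependencies in both directions
        self.bigru = nn.GRU(bert_dim, gru_hidden, batch_first=True,
                            bidirectional=True)
        # Fusion + classification head over the concatenated features
        self.classifier = nn.Linear(n_filters + 2 * gru_hidden, n_classes)

    def forward(self, embeddings):
        # embeddings: (batch, seq_len, bert_dim), e.g. BERT last hidden states
        c = torch.relu(self.conv(embeddings.transpose(1, 2)))
        c = c.max(dim=2).values                 # max-pool over time
        _, h = self.bigru(embeddings)           # h: (2, batch, gru_hidden)
        g = torch.cat([h[0], h[1]], dim=1)      # final forward + backward states
        fused = torch.cat([c, g], dim=1)        # feature fusion
        return torch.softmax(self.classifier(fused), dim=1)

# Example: a batch of 4 "sentences", 16 tokens each, with random embeddings
# standing in for real BERT outputs.
model = BertCnnBiGru()
probs = model(torch.randn(4, 16, 768))
print(probs.shape)  # (4, 2): one probability distribution per sentence
```

In a real pipeline, `embeddings` would come from a pretrained BERT encoder (e.g. via the `transformers` library), and the softmax would typically be folded into a cross-entropy loss during training.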

Published

2026-03-16

Section

Articles

How to Cite

A BERT–CNN–BIGRU HYBRID MODEL BASED ON INTEGRATION OF CONTEXTUAL AND LOCAL SEMANTIC FEATURES IN TEXT CLASSIFICATION. (2026). Modern American Journal of Engineering, Technology, and Innovation, 2(3), 7-17. https://usajournals.org/index.php/2/article/view/2074