Multilabel Text Classification in News Articles Using Long Short-Term Memory with Word2Vec

Klasifikasi Teks Multilabel pada Artikel Berita Menggunakan Long Short-Term Memory dengan Word2Vec

  • Winda Kurnia Sari Universitas Sriwijaya
  • Dian Palupi Rini Universitas Sriwijaya
  • Reza Firsandaya Malik Communication Network and Information Security Research Lab
  • Iman Saladin B. Azhar Universitas Sriwijaya
Keywords: recurrent neural network, long short-term memory, multi-label classification, Word2Vec

Abstract

Multilabel text classification is the task of assigning one or more categories to a text. As in other machine learning tasks, multilabel classification performance is limited by the small amount of labeled data, which makes it difficult to capture semantic relationships. This calls for a multilabel text classification technique that can assign four labels to news articles. Deep learning is the approach proposed here for multilabel text classification. Deep learning methods used for text classification include Convolutional Neural Networks, Autoencoders, Deep Belief Networks, and Recurrent Neural Networks (RNN). RNN is one of the most popular architectures in natural language processing (NLP) because its recurrent structure is well suited to processing variable-length text. The deep learning method proposed in this study is an RNN with the Long Short-Term Memory (LSTM) architecture. The models are trained through trial-and-error experiments using LSTM with 300-dimensional word-embedding features from Word2Vec. By tuning the parameters and comparing eight proposed LSTM models on a large-scale dataset, we show that LSTM with Word2Vec features can achieve good performance in text classification. The results show that the fifth model achieves the highest accuracy at 95.38%, with average precision, recall, and F1-score of 95%. In addition, the learning curves of the seventh and eighth LSTM-with-Word2Vec models are close to a good fit.
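The pipeline the abstract describes, pre-trained word vectors fed through an LSTM whose final hidden state drives one sigmoid score per label, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy dimensions (`D=8` standing in for the paper's 300-dimensional Word2Vec vectors, `H=16` hidden units), the random weights, and the 0.5 threshold are all assumptions for demonstration. The key multilabel detail is the independent sigmoid per label, rather than a softmax over labels, so an article can receive several of the four labels at once.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4H, D) input weights, U: (4H, H) recurrent
    weights, b: (4H,) bias; gates stacked as [input, forget, cell, output]."""
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b
    i = sigmoid(z[0:H])           # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell state
    o = sigmoid(z[3 * H:4 * H])   # output gate
    c_t = f * c_prev + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t

def classify(seq, W, U, b, W_out, b_out, threshold=0.5):
    """Run an embedded token sequence through the LSTM; emit one sigmoid
    score per label, assigning every label whose score exceeds threshold."""
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x_t in seq:
        h, c = lstm_step(x_t, h, c, W, U, b)
    scores = sigmoid(W_out @ h + b_out)   # independent per-label probabilities
    return scores, (scores > threshold).astype(int)

# Toy setup: D=8 stands in for 300-dim Word2Vec vectors, H=16 hidden
# units, L=4 labels as in the news-article task; weights are random.
rng = np.random.default_rng(0)
D, H, L = 8, 16, 4
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
W_out = rng.normal(0, 0.1, (L, H))
b_out = np.zeros(L)
seq = rng.normal(0, 1, (10, D))           # a 10-token "article", already embedded
scores, labels = classify(seq, W, U, b, W_out, b_out)
print(scores.shape, labels)
```

In a trained model the weights would come from backpropagation with a binary cross-entropy loss summed over the four labels; the forward pass is the same.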



Published
2020-04-19
How to Cite
Winda Kurnia Sari, Rini, D. P., Reza Firsandaya Malik, & Iman Saladin B. Azhar. (2020). Multilabel Text Classification in News Articles Using Long Short-Term Memory with Word2Vec. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 4(2), 276 - 285. https://doi.org/10.29207/resti.v4i2.1655
Section
Artikel Teknologi Informasi