A Transformer-based Approach for Persian Text Chunking
Article ID (COI): JR_JADM-10-3_007
Published in: 1401 (Solar Hijri; 2022 CE)
Authors:
P. Kavehzadeh - Computer Engineering Department, Amirkabir University of Technology, Tehran, Iran.
M. M. Abdollah Pour - Computer Engineering Department, Amirkabir University of Technology, Tehran, Iran.
S. Momtazi - Computer Engineering Department, Amirkabir University of Technology, Tehran, Iran.
Abstract:
Over the last few years, text chunking has played a significant role in sequence labeling tasks. Although a large variety of methods have been proposed for shallow parsing in English, most approaches proposed for text chunking in the Persian language rely on simple and traditional concepts. In this paper, we propose using state-of-the-art transformer-based contextualized models, namely BERT and XLM-RoBERTa, as the core structure of our models. A Conditional Random Field (CRF) layer, a combination of Bidirectional Long Short-Term Memory (BiLSTM) and CRF, and a simple dense layer are each employed on top of the transformer-based models to enhance performance in predicting chunk labels. Moreover, we provide a new dataset for noun phrase chunking in Persian, consisting of annotated Persian news text. Our experiments reveal that XLM-RoBERTa achieves the best performance among all the architectures evaluated on the proposed dataset. The results also show that a single CRF layer yields better results than a dense layer and even the combination of BiLSTM and CRF.
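As a rough illustration of the architecture described in the abstract (not the authors' released code), the following minimal sketch stacks a linear emission layer and a CRF head on top of an XLM-RoBERTa encoder for chunk-tag prediction. It uses the Hugging Face `transformers` library and the `pytorch-crf` package; the model name, tag set size, and example sentence are illustrative assumptions.

```python
# Minimal sketch of a transformer + CRF chunker (assumed setup, not the paper's code).
# Requires: pip install torch transformers pytorch-crf
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer
from torchcrf import CRF


class TransformerCRFChunker(nn.Module):
    def __init__(self, encoder_name="xlm-roberta-base", num_tags=3):
        super().__init__()
        # Contextualized encoder (BERT or XLM-RoBERTa in the paper).
        self.encoder = AutoModel.from_pretrained(encoder_name)
        # Dense layer projecting contextual embeddings to per-tag emission scores.
        self.emission = nn.Linear(self.encoder.config.hidden_size, num_tags)
        # CRF layer modeling transitions between chunk tags (e.g. B-NP, I-NP, O).
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        emissions = self.emission(hidden)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag sequence.
        return self.crf.decode(emissions, mask=mask)


tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = TransformerCRFChunker()
# Example Persian input ("This is a Persian sentence"); tags here are untrained.
batch = tokenizer(["این یک جمله فارسی است"], return_tensors="pt")
with torch.no_grad():
    print(model(batch["input_ids"], batch["attention_mask"]))
```

Swapping the CRF head for a plain softmax over the emission scores, or inserting a BiLSTM between the encoder and the emission layer, would correspond to the other two configurations the abstract compares.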
Keywords: Persian text chunking, sequence labeling, deep learning, contextualized word representation
Article page and full-text download: https://civilica.com/doc/1525740/