CIVILICA We Respect the Science
(Specialized publisher of the country's conference proceedings / publication license number from the Ministry of Culture and Islamic Guidance: 8971)

A Transformer Self-attention Model for Time Series Forecasting

Article title: A Transformer Self-attention Model for Time Series Forecasting
National article ID: JR_JECEI-9-1_001
Published in 1400 SH (2021)
Authors:

R. Mohammdi Farsani - Artificial Intelligence Department, Faculty of Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran.
E. Pazouki - Artificial Intelligence Department, Faculty of Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran.

Abstract:
Background and Objectives: Many real-world problems are time series forecasting (TSF) problems. Therefore, providing more accurate and flexible forecasting methods has always been of interest to researchers. An important issue in forecasting a time series is the length of the prediction interval.
Methods: In this paper, a new method is proposed for time series forecasting that can make more accurate predictions over longer intervals than other existing methods. Neural networks are an effective tool for estimating time series because of their nonlinearity and their ability to be applied to different time series without specific prior information about them. A variety of neural networks have been introduced so far, some of which have been used for forecasting time series. Encoder-decoder networks are one example that can be used in time series forecasting: an encoder network encodes the input data according to a particular pattern, and a decoder network then decodes the encoded input to produce the desired output. Since these networks have a better understanding of the context, they provide better performance. The transformer is an example of this type of network. A transformer neural network based on self-attention is presented that has a special capability for forecasting time series problems.
Results: The proposed model has been evaluated experimentally on two benchmark real-world TSF datasets from different domains. The experimental results indicate that, compared to other well-known methods, the proposed model is up to eight times more robust in long-term estimation and achieves about a 20 percent improvement in estimation accuracy. Computational complexity is also significantly reduced.
Conclusion: The proposed tool performs better than, or competes with, other introduced methods while having lower computational complexity and longer estimation intervals. It was also found that, with better configuration of the network and better tuning of the attention, more desirable results can be obtained for any specific problem.
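The scaled dot-product self-attention that the transformer builds on can be sketched as follows. This is a minimal NumPy illustration, not the authors' exact architecture: for simplicity it uses the input itself as queries, keys, and values (no learned projection matrices), and all names and shapes are illustrative.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a sequence.

    X: array of shape (seq_len, d_model). Here Q = K = V = X
    (projection-free simplification for illustration).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the key axis
    return weights @ X                               # weighted sum of value vectors

# Toy time series embedded into d_model = 4 dimensions
rng = np.random.default_rng(0)
series = rng.standard_normal((8, 4))
out = self_attention(series)
print(out.shape)  # (8, 4): one context-aware vector per time step
```

Because each output vector is a softmax-weighted average of all time steps, every position attends directly to every other position, which is what lets such a model capture long-range dependencies in a series without the step-by-step recurrence of an RNN.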

Keywords:
Time series forecasting (TSF), self-attention model, transformer neural network

Article page and full-text download: https://civilica.com/doc/1184642/