Effective Data Reduction for Time-Aware Recommender Systems
Publish Year: 1402 SH (2023)
Type: Journal paper
Language: English
Views: 258
This paper is 21 pages long and is available for download in PDF format.
Document National Code: JR_COAM-8-1_003
Index date: 11 June 2023
Abstract
In recent decades, the amount and variety of data have grown rapidly. As a result, data storage, compression, and analysis have become critical subjects in data mining and machine learning. It is essential to achieve accurate compression without losing important data in the process. Therefore, this work proposes an effective data compression method for recommender systems based on the attention mechanism. The proposed method performs data compression on two levels: features and records. It is time-aware and based on time windows, taking into account users' activity and preventing the loss of important data. The resulting technique can be efficiently utilized for deep networks, where the amount of data is a significant challenge. Experimental results demonstrate that this technique not only reduces the amount of data and processing time but also achieves acceptable accuracy.
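The abstract describes record-level reduction driven by time windows and user activity. The following minimal Python sketch illustrates one plausible form of that windowing step: it buckets each user's interactions into fixed-size time windows and keeps only the most active windows per user. All names and parameters (reduce_records, window_days, keep_ratio) are illustrative assumptions, not the authors' implementation, and the paper's attention-based feature-level reduction is not shown here.

```python
# Hypothetical sketch of time-window-based record reduction.
# Assumes interactions are (user_id, item_id, rating, timestamp_days) tuples.
from collections import defaultdict

def reduce_records(interactions, window_days=30, keep_ratio=0.5):
    """Keep interactions from each user's most active time windows."""
    # Bucket each user's interactions into fixed-size time windows.
    buckets = defaultdict(list)
    for user, item, rating, ts in interactions:
        window = ts // window_days
        buckets[(user, window)].append((user, item, rating, ts))

    # Group the windows by user.
    per_user = defaultdict(list)
    for (user, window), recs in buckets.items():
        per_user[user].append(recs)

    # Keep the busiest fraction of windows per user, so periods of
    # high activity are preserved while sparse periods are dropped.
    reduced = []
    for user, windows in per_user.items():
        windows.sort(key=len, reverse=True)
        n_keep = max(1, int(len(windows) * keep_ratio))
        for recs in windows[:n_keep]:
            reduced.extend(recs)
    return reduced

if __name__ == "__main__":
    data = [
        (1, 10, 5.0, 3), (1, 11, 4.0, 5), (1, 12, 3.0, 70),
        (2, 10, 2.0, 1), (2, 13, 5.0, 2), (2, 14, 4.0, 40),
    ]
    print(reduce_records(data, window_days=30, keep_ratio=0.5))
```

Under these assumptions, user 1's first window (three interactions around days 3 and 5 for the first two) outweighs the sparse later window, so the reduced set retains the dense activity period per user.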
Authors
Hadis Ahmadian Yazdi
Department of Computer Engineering, Neyshabur Branch, Islamic Azad University, Neyshabur, Iran
Seyed Javad Seyyed Mahdavi Chabok
Department of Electrical Engineering, Mashhad Branch, Islamic Azad University, Mashhad, Iran
Maryam KheirAbadi
Department of Computer Engineering, Neyshabur Branch, Islamic Azad University, Neyshabur, Iran