Improved Facial Action Unit Recognition using Local and Global Face Features
Publication year: 1402 (Solar Hijri; 2023 CE)
Type: Journal paper
Language: English
This 9-page paper is available for download in PDF format.
Document National Code: JR_JADM-11-2_004
Index date: 18 July 2023
Abstract
Every facial expression involves one or more facial action units appearing on the face. Therefore, action unit recognition is commonly used to enhance facial expression detection performance. It is important to identify the subtle changes in the face that occur when particular action units are activated. In this paper, we propose an architecture that employs local features extracted from specific regions of the face alongside global features taken from the whole face. To this end, we combine the SPPNet and FPN modules into an end-to-end network for facial action unit recognition. First, predefined regions of the face are detected. Next, the SPPNet module captures deformations in the detected regions; it focuses on each region separately and cannot account for possible changes in other areas of the face. In parallel, the FPN module extracts global features related to each facial region. By combining the two modules, the proposed architecture captures both local and global facial features, enhancing the performance of the action unit recognition task. Experimental results on the DISFA dataset demonstrate the effectiveness of our method.
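The key property the SPPNet module contributes is fixed-length pooling: a facial region of any spatial size is max-pooled over a pyramid of grids (e.g. 1×1, 2×2, 4×4), so every detected region yields a feature vector of the same length. The following is a minimal single-channel sketch of that idea in plain Python, not the authors' implementation; the function name and pyramid levels are illustrative assumptions.

```python
def spatial_pyramid_pool(fmap, levels=(1, 2, 4)):
    """Max-pool a 2-D feature map (H x W list of floats) over pyramid grids.

    Illustrative sketch of SPP-style pooling: for each level n, the map is
    split into an n x n grid and each cell is max-pooled, so the output
    length is sum(n*n for n in levels) regardless of H and W.
    Assumes H and W are at least max(levels) so no grid cell is empty.
    """
    h, w = len(fmap), len(fmap[0])
    pooled = []
    for n in levels:
        for i in range(n):          # grid row
            for j in range(n):      # grid column
                r0, r1 = i * h // n, (i + 1) * h // n
                c0, c1 = j * w // n, (j + 1) * w // n
                pooled.append(max(fmap[r][c]
                                  for r in range(r0, r1)
                                  for c in range(c0, c1)))
    return pooled

# Regions of different sizes produce vectors of identical length (1+4+16 = 21):
small = [[(r * 17 + c) % 7 / 7.0 for c in range(17)] for r in range(13)]
large = [[(r + c) % 5 / 5.0 for c in range(20)] for r in range(20)]
assert len(spatial_pyramid_pool(small)) == len(spatial_pyramid_pool(large)) == 21
```

This size invariance is what lets differently sized facial regions feed a shared classifier head; in the paper's architecture, the FPN branch supplies the complementary global context that this per-region pooling deliberately ignores.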
Keywords:
Authors
Foad Ghaderi
Human-Computer Interaction Lab., Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran.
Amin Rahmati
Human-Computer Interaction Lab., Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran.