A New Accelerated Proximal Point Algorithm For Minimizing Smooth DC Functions

Publication Year: 1400 (Solar Hijri)
Document Type: Conference Paper
Language: English
Views: 186

The full text of this paper has not been published; only the abstract or an extended abstract is available in the database.
Note: In the CIVILICA database, papers shorter than 5 pages are generally not treated as full-text papers, and registered users can download their files without any credit deduction.




National Scientific Document ID:

ICIORS14_088

Indexing Date: 12 Dey 1400

Abstract:

Many optimization schemes are known for convex optimization problems. Significant progress beyond convexity has been made by considering the class of functions representable as a difference of convex (DC) functions, which constitute the backbone of nonconvex programming and global optimization. One of the main drawbacks of proximal point methods is their slow rate of convergence. In this paper, we first introduce a new algorithm for minimizing smooth DC functions that accelerates the convergence of the classical proximal point algorithm. We then present a new accelerated proximal point algorithm that preserves the computational simplicity of the proximal point algorithm while achieving a global rate of convergence that is provably better, both theoretically and practically. Convergence of the algorithms is proved, and the rate of convergence is analyzed under the assumption that the gradient of the second term of the objective function is Lipschitz continuous.
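Since only the abstract is available, the paper's own algorithm is not reproduced here. The following is a minimal, hypothetical Python sketch of the general kind of scheme the abstract describes: a DCA-type proximal point iteration for a smooth DC objective f = g - h, with an optional Nesterov-style extrapolation step standing in for the acceleration. The choices of g, h, the step size `lam`, and the momentum schedule are illustrative assumptions, not the authors' method.

```python
# A minimal, hypothetical sketch (not the authors' algorithm): a DCA-type
# proximal point iteration for a smooth DC objective f = g - h, with an
# optional Nesterov-style extrapolation step standing in for "acceleration".
# The choices of g, h, the step size `lam`, and the momentum schedule are
# assumptions made purely for illustration.
import numpy as np


def prox_g(z, lam, b):
    """Proximal map of lam*g at z for g(x) = 0.5*||x - b||^2 (closed form)."""
    return (lam * b + z) / (lam + 1.0)


def grad_h(x, mu):
    """Gradient of the smooth subtracted part h(x) = 0.5*mu*||x||^2."""
    return mu * x


def dc_proximal_point(b, mu, lam=1.0, iters=100, accelerate=True):
    """Minimize f(x) = 0.5*||x - b||^2 - 0.5*mu*||x||^2 with 0 < mu < 1.

    Each step linearizes h at the (possibly extrapolated) point y_k and solves
    the resulting strongly convex subproblem through the prox of g:
        x_{k+1} = prox_{lam*g}(y_k + lam * grad_h(y_k)).
    """
    x_prev = np.zeros_like(b)
    x = np.zeros_like(b)
    for k in range(iters):
        if accelerate:
            beta = k / (k + 3.0)        # illustrative momentum schedule
            y = x + beta * (x - x_prev)
        else:
            y = x
        x_prev, x = x, prox_g(y + lam * grad_h(y, mu), lam, b)
    return x


if __name__ == "__main__":
    b = np.array([2.0, -1.0, 0.5])
    mu = 0.95                            # mildly ill-conditioned toy instance
    x_star = b / (1.0 - mu)              # exact minimizer of this toy problem
    for acc in (False, True):
        x = dc_proximal_point(b, mu, accelerate=acc)
        label = "with extrapolation" if acc else "plain iteration  "
        print(label, "error:", np.linalg.norm(x - x_star))
```

Running the script compares the final error of the plain and extrapolated variants on the same toy problem; how much the extrapolation helps depends on the conditioning of the instance and the momentum schedule, and this sketch makes no claim about the rates established in the paper.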

Keywords:

Authors

Amir Hamzeh Alizadeh Tabrizian

Faculty of Mathematical Sciences, Yazd University, Yazd, Iran

Narges Bidabadi

Faculty of Mathematical Sciences, Yazd University, Yazd, Iran