by 郭素妙 | 2025-02-05 11:59:20
Prof. Adil Bagirov
Unsupervised learning, semi-supervised learning, supervised learning, regression analysis and clusterwise regression analysis problems are among the most important problems in machine learning. There are various optimization models of these problems. Nonsmooth optimization approaches lead to better models with significantly fewer decision variables than those based on other optimization approaches. In this talk, we discuss nonsmooth optimization models and methods, including nonsmooth difference of convex (DC) optimization, for solving various machine learning problems. We also compare nonsmooth optimization approaches with those based on other optimization approaches.
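As one concrete illustration of the kind of nonsmooth DC model referred to here (a standard textbook formulation, not necessarily the exact one used in the talk), the minimum sum-of-squares clustering objective for data points a_1, ..., a_m and cluster centres x_1, ..., x_k is nonsmooth and admits an explicit DC decomposition:

f_k(x_1,\dots,x_k) = \frac{1}{m}\sum_{i=1}^{m} \min_{1\le j\le k} \|x_j - a_i\|^2 = f_1(x) - f_2(x),

where both components

f_1(x) = \frac{1}{m}\sum_{i=1}^{m}\sum_{j=1}^{k} \|x_j - a_i\|^2, \qquad f_2(x) = \frac{1}{m}\sum_{i=1}^{m}\max_{1\le j\le k} \sum_{s\ne j} \|x_s - a_i\|^2

are convex. The decision variables are only the k centres (k times the data dimension), which is the sense in which such models use far fewer variables than, for example, mixed-integer assignment formulations.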
Prof. Adrian Petrusel
In this talk, starting from the main metric fixed point results for multi-valued operators, we will introduce a general class of multi-valued operators, and we will prove some stability properties (data dependence, well-posedness in the sense of Reich and Zaslavski, Ulam-Hyers stability, Ostrowski stability) for the fixed point set. The coupled fixed point problem and the coincidence point problem with multi-valued operators are also discussed and some applications are pointed out.
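For readers unfamiliar with the stability notions listed above, the following standard definition of Ulam-Hyers stability for a multi-valued operator gives the flavour (the talk may work with a more general setting): if T maps a metric space (X, d) into its nonempty subsets, the fixed point inclusion x \in T(x) is Ulam-Hyers stable when there exists c > 0 such that, for every \varepsilon > 0 and every y \in X with

D(y, T(y)) = \inf\{\, d(y, z) : z \in T(y) \,\} \le \varepsilon,

there is a fixed point x^* of T with d(y, x^*) \le c\,\varepsilon.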
Dr. Duong Thi Kim Huyen
This talk provides an overview of Submodular Optimization, highlighting the importance of studying submodular optimization problems and discussing various approaches found in the literature. Additionally, we introduce several recent results from researchers at ORLab, School of Computing, PHENIKAA University, and propose a promising new problem.
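As background for the talk (the specific problems studied at ORLab are not detailed here), recall the standard definition: a set function f : 2^V \to \mathbb{R} is submodular if it has diminishing returns, i.e.

f(A \cup \{e\}) - f(A) \ge f(B \cup \{e\}) - f(B) \quad \text{for all } A \subseteq B \subseteq V,\ e \in V \setminus B.

A classical result of Nemhauser, Wolsey and Fisher states that for maximizing a monotone, non-negative submodular function subject to a cardinality constraint |S| \le k, the greedy algorithm attains a (1 - 1/e) approximation guarantee; much of the literature surveyed in such overviews builds on results of this type.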
Source URL: https://cantor.math.ntnu.edu.tw/index.php/2025/02/05/talk20250219/