The optimal value function approach provides a useful reformulation of the bilevel optimization problem, but its utility is often hindered by the nonsmoothness of the value function, even when the associated lower-level objective is smooth. In this talk, we present two smoothing strategies for the value function associated with lower-level objectives that are not necessarily smooth but are Lipschitz continuous. The first method employs quadratic regularization for partially convex lower-level functions, while the second uses entropic regularization for general lower-level objectives. The property known as gradient consistency is crucial in ensuring that a smoothing algorithm built from such approximations is globally subsequentially convergent to stationary points of the equivalent reformulation of the bilevel problem. With this in mind, we show that the proposed smooth approximations satisfy the gradient consistency property under suitable conditions on the lower-level function.
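To give a flavor of the entropic approach, the sketch below is a minimal illustration (not taken from the talk): it smooths the value function v(x) = min_y f(x, y) over a finite grid of y values via the log-sum-exp formula v_mu(x) = -mu * log(mean_y exp(-f(x, y) / mu)), which is differentiable in x for mu > 0 and tends to v(x) as mu -> 0. The objective f, the grid, and all names here are hypothetical choices for illustration only.

```python
import numpy as np

def f(x, y):
    # Hypothetical Lipschitz, nonsmooth lower-level objective.
    return np.abs(x - y)

def value(x, ys):
    # True (nonsmooth) value function: minimum of f(x, .) over the grid.
    return np.min(f(x, ys))

def smoothed_value(x, ys, mu):
    # Entropic smoothing: -mu * log of the mean of exp(-f/mu),
    # computed stably with the log-sum-exp reduction.
    z = -f(x, ys) / mu
    return -mu * (np.logaddexp.reduce(z) - np.log(len(ys)))

ys = np.linspace(-1.0, 1.0, 201)  # discretized lower-level variable
x = 0.3
for mu in (1.0, 0.1, 0.01):
    # The gap between the smoothed and true value shrinks as mu -> 0.
    print(mu, smoothed_value(x, ys, mu), value(x, ys))
```

The smoothed value always sits above the true minimum and converges to it as the temperature mu decreases, which is the basic mechanism the entropic regularization strategy exploits.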
Organizer: Jein-Shan Chen