Unsupervised learning, semi-supervised learning, supervised learning, regression analysis, and clusterwise regression analysis are among the most important problems in machine learning. Various optimization models have been proposed for these problems. Nonsmooth optimization approaches lead to better models with significantly fewer decision variables than those based on other optimization approaches. In this talk, we discuss nonsmooth optimization models and methods, including those based on nonsmooth difference-of-convex (DC) optimization, for solving various machine learning problems. We also compare nonsmooth optimization approaches with those based on other optimization techniques.
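
As an illustration of the reduction in decision variables (a standard example, not necessarily the exact formulation presented in the talk), the minimum sum-of-squares clustering problem with k clusters and data points a_1, ..., a_m in R^n can be stated as an unconstrained nonsmooth DC problem in only the k*n coordinates of the cluster centers, whereas the equivalent mixed-integer programming formulation additionally requires m*k assignment variables:

% Nonsmooth formulation: only the cluster centers x_1, ..., x_k are decision variables.
\min_{x_1,\dots,x_k \in \mathbb{R}^n} \; f(x_1,\dots,x_k)
   = \frac{1}{m} \sum_{i=1}^{m} \min_{1 \le j \le k} \| x_j - a_i \|^2 .

% A DC decomposition f = f_1 - f_2 with both components convex, using the identity
% \min_j g_j = \sum_j g_j - \max_j \sum_{l \ne j} g_l :
f_1(x) = \frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k} \| x_j - a_i \|^2 , \qquad
f_2(x) = \frac{1}{m} \sum_{i=1}^{m} \max_{1 \le j \le k} \sum_{\substack{l=1 \\ l \ne j}}^{k} \| x_l - a_i \|^2 .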