In this talk, we consider a multi-parameter (L0-L2) regularization for linear regression with sparsity constraints. A necessary and sufficient condition for a coordinate-wise minimizer takes the form of a discontinuous inclusion. A smoothing Newton algorithm with line search is proposed. Global convergence of this approach is proved, and several numerical examples are given to illustrate the efficiency of the algorithm.
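As a minimal sketch of the coordinate-wise structure (not the smoothing Newton method itself), consider the scalar L0-L2 penalized problem min_x 0.5*(x - z)^2 + lam0*1{x != 0} + lam2*x^2; comparing the shrunken nonzero candidate z/(1 + 2*lam2) against x = 0 gives a closed-form hard-threshold rule. The names `lam0`, `lam2` and the unit quadratic weight are assumptions for illustration:

```python
import math

def prox_l0_l2(z: float, lam0: float, lam2: float) -> float:
    """Coordinate-wise minimizer of 0.5*(x - z)**2 + lam0*(x != 0) + lam2*x**2.

    The nonzero candidate is x = z / (1 + 2*lam2); it beats x = 0 exactly
    when |z| exceeds sqrt(2*lam0*(1 + 2*lam2)), which makes the minimizer
    a discontinuous (hard-thresholding) map in z.
    """
    a = 1.0 + 2.0 * lam2               # curvature added by the L2 term
    threshold = math.sqrt(2.0 * lam0 * a)
    return z / a if abs(z) > threshold else 0.0
```

For example, with lam0 = 1 and lam2 = 0.5 the threshold is 2, so prox_l0_l2(3.0, 1.0, 0.5) returns 1.5 while prox_l0_l2(1.5, 1.0, 0.5) returns 0.0; the jump at the threshold is the discontinuity referenced in the abstract.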