Maximum likelihood optimal and robust Support Vector Regression with lncosh loss function


NEURAL NETWORKS, vol. 94, pp. 1-12, 2017 (journal indexed in SCI)


In this paper, a novel and continuously differentiable convex loss function based on the natural logarithm of the hyperbolic cosine function, namely the lncosh loss, is introduced to obtain Support Vector Regression (SVR) models that are optimal in the maximum likelihood sense for hyper-secant error distributions. Most current regression models assume that the error distribution is Gaussian, which corresponds to the squared loss function and has helpful analytical properties such as easy computation and analysis. However, in many real-world applications, observations are subject to unknown noise distributions, so the Gaussian assumption may not be appropriate. The developed SVR model with the parameterized lncosh loss makes it possible to learn a loss function, leading to a regression model that is maximum likelihood optimal for the specific input-output data at hand. The SVR models obtained with different parameter choices of the lncosh loss, together with an ε-insensitive zone, possess most of the desirable characteristics of well-known loss functions, recovering Vapnik's loss, the squared loss, and Huber's loss function as special cases. In other words, extensive simulations show that the lncosh loss function is entirely controlled by a single adjustable parameter and, as a result, allows switching between different losses depending on the choice of this parameter. The effectiveness and feasibility of the lncosh loss function are validated on a number of synthetic and real-world benchmark data sets for various types of additive noise distributions. (C) 2017 Elsevier Ltd. All rights reserved.
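To make the interpolation property concrete, the following is a minimal sketch of the lncosh loss described above, written in plain Python. The parameter name `a` and the helper names are my own notation (the abstract's symbol for the adjustable parameter is not given here); the ε-insensitive variant simply applies the loss outside a dead zone of width ε, which is one common way to combine a loss with Vapnik's insensitivity. For small residuals the loss behaves quadratically (like the squared loss), and for large residuals it grows linearly (like an absolute/Vapnik-type loss), with `a` controlling the transition.

```python
import math

def lncosh_loss(e, a=1.0):
    """lncosh loss ln(cosh(a*e)) / a, evaluated in a numerically
    stable way: ln(cosh z) = |z| + ln(1 + exp(-2|z|)) - ln 2.
    Small |e|: approximately a*e**2/2 (squared-loss-like).
    Large |e|: approximately |e| - ln(2)/a (absolute-loss-like)."""
    z = a * e
    return (abs(z) + math.log1p(math.exp(-2.0 * abs(z))) - math.log(2.0)) / a

def lncosh_eps_loss(e, a=1.0, eps=0.0):
    """Hypothetical epsilon-insensitive variant: residuals within the
    insensitive tube of half-width eps incur zero loss."""
    return lncosh_loss(max(0.0, abs(e) - eps), a)

# The loss is symmetric and vanishes at zero residual:
print(lncosh_loss(0.0))                      # 0.0
print(lncosh_loss(-2.0, 3.0) == lncosh_loss(2.0, 3.0))

# Quadratic regime (small residual, a = 1): close to e**2 / 2
print(lncosh_loss(1e-3, 1.0) / (0.5e-6))     # close to 1

# Linear regime (large residual, a = 10): close to |e| - ln(2)/a
print(lncosh_loss(5.0, 10.0))                # close to 5 - ln(2)/10
```

Larger values of `a` push the quadratic-to-linear transition toward zero, so the loss behaves more like a robust absolute loss, while smaller values keep it squared-loss-like over a wider range of residuals, which matches the switching behavior the abstract attributes to the single adjustable parameter.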