In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined in Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). Then we focus on two specific (approximate) implementations of SRM and Sparse Approximation, which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
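To make the Sparse Approximation side concrete, the following is a minimal sketch of the standard Basis Pursuit De-Noising problem, min_x ½‖y − Dx‖² + λ‖x‖₁, solved here by iterative soft-thresholding (ISTA) rather than by the quadratic programming formulation discussed in the paper. The dictionary `D`, signal `y`, and penalty `lam` are illustrative choices, not taken from the paper; the point is only that the L1 penalty drives most coefficients to exactly zero, i.e. produces a sparse expansion.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, y, lam, n_iter=3000):
    """Iterative soft-thresholding for min_x 0.5*||y - D x||^2 + lam*||x||_1."""
    # Step size 1/L, with L the Lipschitz constant of the quadratic term's gradient.
    L = np.linalg.norm(D, 2) ** 2
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)              # gradient of 0.5*||y - D x||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Illustrative data: a 3-sparse signal in a random 100-atom dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 40, 77]] = [1.5, -2.0, 1.0]
y = D @ x_true + 0.01 * rng.standard_normal(50)

x_hat = ista(D, y, lam=0.5)
print(np.count_nonzero(np.abs(x_hat) > 1e-2))  # only a few atoms survive
```

The equivalence result in the paper concerns the SVM regression dual and a modified version of this problem: under the stated conditions both reduce to the same quadratic program, so the sparsity pattern above mirrors the sparsity of the SVM's support vectors.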
Cited By
- Vidnerová P and Neruda R Sensor data air pollution prediction by kernel models Proceedings of the 16th IEEE/ACM International Symposium on Cluster, Cloud, and Grid Computing, (666-673)
- Hooshmand Moghaddam V and Hamidzadeh J (2016). New Hermite orthogonal polynomial kernel and combined kernels in Support Vector Machine classifier, Pattern Recognition, 60:C, (921-935), Online publication date: 1-Dec-2016.
- Kianmehr K and Alhajj R (2008). Effectiveness of support vector machine for crime hot-spots prediction, Applied Artificial Intelligence, 22:5, (433-458), Online publication date: 1-May-2008.
- Kudová P and Šámalová T Sum and product kernel regularization networks Proceedings of the 8th international conference on Artificial Intelligence and Soft Computing, (56-65)
- Altun Y, Hofmann T and Smola A Gaussian process classification for segmenting and annotating sequences Proceedings of the twenty-first international conference on Machine learning
- Kudová P and Neruda R Kernel based learning methods Proceedings of the First international conference on Deterministic and Statistical Methods in Machine Learning, (124-136)
- Gunn S and Kandola J (2002). Structural Modelling with Sparse Kernels, Machine Learning, 48:1-3, (137-163), Online publication date: 30-Sep-2002.
- Català A and Angulo C (2000). A Comparison between the Tikhonov and the Bayesian Approaches to Calculate Regularisation Matrices, Neural Processing Letters, 11:3, (185-195), Online publication date: 1-Jun-2000.
- Shashua A (1999). On the Relationship Between the Support Vector Machine for Classification and Sparsified Fisher's Linear Discriminant, Neural Processing Letters, 9:2, (129-139), Online publication date: 1-Apr-1999.
- Hearst M (1998). Support Vector Machines, IEEE Intelligent Systems, 13:4, (18-28), Online publication date: 1-Jul-1998.
- Girosi F (1998). An equivalence between sparse approximation and support vector machines, Neural Computation, 10:6, (1455-1480), Online publication date: 15-Aug-1998.