
Sparse pinball twin support vector machines

Published: 01 May 2019

Abstract

The original twin support vector machine (TWSVM) solves two smaller quadratic programming problems (QPPs), in contrast to the traditional hinge-loss SVM (C-SVM), which solves a single large QPP; this makes TWSVM training and testing faster than C-SVM. However, the TWSVM problems are based on the hinge-loss function and are therefore sensitive to feature noise and unstable under re-sampling. The pinball-loss function, on the other hand, maximizes quantile distances, which grants noise insensitivity, but this comes at the cost of sparsity because correctly classified samples are penalized as well. To overcome these limitations, we propose a novel sparse pinball twin support vector machine (SPTWSVM) based on the ϵ-insensitive zone pinball loss function, which rids the original TWSVM of its noise sensitivity while ensuring that the resulting problems retain sparsity, so that computations relating to predictions remain just as fast as in the original TWSVM. We further investigate the properties of our SPTWSVM, including sparsity, noise insensitivity, and time complexity. Exhaustive testing on several benchmark datasets demonstrates that our SPTWSVM is noise insensitive, retains sparsity and, in most cases, outperforms the original TWSVM.
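The ϵ-insensitive zone pinball loss referred to in the abstract can be sketched as follows. This is a minimal illustration of one common piecewise-linear form of the loss (the exact formulation in the paper may differ); the function name `pinball_eps` and the default parameter values are our own choices, not from the paper.

```python
# Sketch of an eps-insensitive pinball loss: zero loss inside an asymmetric
# zone around the margin (restoring sparsity), linear penalties outside it.
def pinball_eps(u, tau=0.5, eps=0.1):
    """Piecewise-linear pinball loss with an insensitive zone.

    u   : residual, e.g. 1 - y * f(x) in an SVM-style margin formulation
    tau : quantile parameter in (0, 1]; slope on the well-classified side
    eps : width parameter of the zero-loss (sparsity-inducing) zone
    """
    if u > eps:                     # misclassified / small-margin side
        return u - eps
    if u >= -eps / tau:             # inside the insensitive zone: zero loss
        return 0.0
    return -tau * (u + eps / tau)   # well-classified side, down-weighted by tau
```

With `eps = 0` this reduces to the plain pinball loss, and additionally letting `tau -> 0` recovers the hinge loss; the nonzero ϵ-zone is what restores sparsity by assigning zero loss to samples near the margin instead of penalizing every correctly classified point.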

Highlights

A novel twin support vector machine with sparse pinball loss (SPTWSVM) is proposed.
The proposed SPTWSVM is noise insensitive, retains sparsity, and is more stable under re-sampling.
The proposed SPTWSVM is approximately four times faster than Pin-SVM.
Numerical results show better generalization performance on noise-corrupted datasets.
The proposed SPTWSVM can be easily extended to other variants of TWSVM.
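The two-hyperplane structure underlying all TWSVM variants can be illustrated with a small sketch. Rather than solving the two dual QPPs of the actual method, this uses the closed-form least-squares simplification (in the spirit of least-squares twin SVM) so the two nonparallel planes can be obtained with plain linear algebra; the function names, the ridge term `reg`, and the parameter defaults are our own illustrative choices.

```python
import numpy as np

def twin_planes_ls(A, B, c1=1.0, c2=1.0, reg=1e-8):
    """Fit the two nonparallel hyperplanes of a least-squares twin SVM.

    A : samples of class +1 (rows), B : samples of class -1 (rows).
    Returns the augmented normals z = (w, b) of the two planes.
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    H = np.hstack([A, e1])           # class +1 samples with bias column
    G = np.hstack([B, e2])           # class -1 samples with bias column
    I = reg * np.eye(H.shape[1])     # small ridge term for numerical stability
    # Plane 1: lies close to class +1, pushed unit distance from class -1.
    z1 = -np.linalg.solve(G.T @ G + (1.0 / c1) * H.T @ H + I, G.T @ e2).ravel()
    # Plane 2: lies close to class -1, pushed unit distance from class +1.
    z2 = np.linalg.solve(H.T @ H + (1.0 / c2) * G.T @ G + I, H.T @ e1).ravel()
    return z1, z2

def twin_predict(x, z1, z2):
    """Assign x to the class whose hyperplane it is nearer to."""
    xa = np.append(x, 1.0)
    d1 = abs(xa @ z1) / np.linalg.norm(z1[:-1])
    d2 = abs(xa @ z2) / np.linalg.norm(z2[:-1])
    return 1 if d1 <= d2 else -1
```

Each plane requires only one linear solve in the feature dimension, which is why twin formulations scale better than a single large QPP over all samples; the SPTWSVM of the paper keeps this two-problem structure while replacing the hinge loss with the sparse pinball loss.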




Published In

Applied Soft Computing  Volume 78, Issue C
May 2019
722 pages

Publisher

Elsevier Science Publishers B. V.

Netherlands


Author Tags

  1. Optimization
  2. Convex programming
  3. Quadratic programming
  4. Support vector machine
  5. Pinball loss
  6. Classification

Qualifiers

  • Research-article

