A Dual-Norm Support Vector Machine: Integrating L1 and L∞ Slack Penalties for Robust and Sparse Classification

Xiaoyong Liu, Qingyao Liu, Shunqiang Liu, Genglong Yan, Fabin Zhang, Chengbin Zeng, Xiaoliu Yang

Published 2025 in Processes

ABSTRACT

This paper presents a novel support vector machine (SVM) classification approach that simultaneously accounts for both overall and extreme misclassification errors via a dual-norm regularization strategy. Traditional SVMs minimize the L1-norm of slack variables to control global misclassification, while least squares SVM (LSSVM) minimizes the sum of squared errors. In contrast, our method preserves the classical L1-norm penalty to maintain overall classification fidelity and incorporates an additional L∞-norm term to penalize the largest slack variable, thereby constraining the worst-case margin violation. This composite objective yields a more robust and generalizable classifier, particularly effective when occasional large deviations disproportionately affect decision boundaries. The resulting optimization problem minimizes a regularized objective combining the model norm, the sum of slack variables, and the maximum slack variable, with two hyperparameters, C1 and C2, balancing global error against extremal robustness. By formulating the problem under convex constraints, the optimization remains tractable and guarantees a globally optimal solution. Experimental evaluations on benchmark datasets demonstrate that the proposed method achieves comparable or superior classification accuracy while reducing the impact of outliers and maintaining a sparse model structure. These results underscore the advantage of jointly enforcing L1 and L∞ penalties, providing an effective mechanism to balance average performance with worst-case error sensitivity in support vector classification.
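The composite objective described above can be written in unconstrained hinge-loss form as minimizing 0.5·||w||² + C1·Σᵢ ξᵢ + C2·maxᵢ ξᵢ, where ξᵢ = max(0, 1 − yᵢ(w·xᵢ + b)). The sketch below is an illustrative subgradient-descent reconstruction of that objective, not the authors' implementation; the function name, learning rate, and epoch count are assumptions. The L1 term contributes a subgradient for every margin violator, while the L∞ term contributes only for the single worst violator.

```python
import numpy as np

def dual_norm_svm(X, y, C1=1.0, C2=1.0, lr=0.01, epochs=500):
    """Illustrative subgradient descent on the dual-norm SVM objective:
    0.5*||w||^2 + C1 * sum_i xi_i + C2 * max_i xi_i,
    with hinge slack xi_i = max(0, 1 - y_i * (w . x_i + b))."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        xi = np.maximum(0.0, 1.0 - margins)   # slack variables
        grad_w = w.copy()                     # from the 0.5*||w||^2 term
        grad_b = 0.0
        # L1 (sum-of-slacks) term: every violated point contributes.
        viol = xi > 0
        grad_w -= C1 * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b -= C1 * y[viol].sum()
        # L-infinity (max-slack) term: only the worst violator contributes.
        k = np.argmax(xi)
        if xi[k] > 0:
            grad_w -= C2 * y[k] * X[k]
            grad_b -= C2 * y[k]
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Setting C2 = 0 recovers the classical soft-margin SVM objective, while increasing C2 shifts the trade-off toward limiting the worst-case margin violation, matching the role the abstract assigns to the two hyperparameters.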
