GS-GA-SVM: A Novel Hybrid Grid Search-Genetic Algorithm Framework for SVM Parameter Optimization in Medical Diagnosis

Authors

  • Satish Kumar Kalagotla
  • Thoudam Basanta
  • Mutum Bidyarani Devi

Abstract

Background: Support Vector Machines (SVMs) are among the most powerful classifiers for medical diagnosis, but their performance critically depends on proper hyperparameter selection. Traditional grid search is computationally expensive and suffers from discretization errors, while genetic algorithms may converge prematurely and require extensive tuning. Hybrid approaches combining both methods offer a promising solution.

Objective: This paper proposes GS-GA-SVM, a novel hybrid framework that integrates Grid Search and Genetic Algorithms for efficient and effective SVM parameter optimization in medical diagnosis.

Methods: The proposed GS-GA-SVM framework operates in two stages: (1) coarse grid search (20×20) identifies promising regions of the parameter space (C ∈ [0.1, 100], γ ∈ [0.001, 10] in log scale); (2) genetic algorithm with adaptive operators performs fine-grained optimization within the identified region. The framework was evaluated on four benchmark medical datasets (Wisconsin Breast Cancer, PIMA Indian Diabetes, Hepatitis, and Mammographic Mass) and compared against standard grid search, random search, pure GA, and Bayesian optimization using 10-fold cross-validation with five repeats.
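The abstract gives only a high-level description of the two stages; the flow can be sketched as below. A toy quadratic surrogate stands in for the cross-validated SVM accuracy, and the peak location, refinement-neighborhood width, and specific GA operators are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for cross-validated accuracy at (log10 C, log10 gamma).
# The peak at (1, -1) is made up for the demo; the real objective would
# train and cross-validate an SVM.
def cv_score(log_c, log_g):
    return 1.0 - 0.02 * ((log_c - 1.0) ** 2 + (log_g + 1.0) ** 2)

# Stage 1: coarse 20x20 grid over C in [0.1, 100], gamma in [0.001, 10] (log scale).
c_grid = np.linspace(np.log10(0.1), np.log10(100), 20)
g_grid = np.linspace(np.log10(0.001), np.log10(10), 20)
scores = np.array([[cv_score(c, g) for g in g_grid] for c in c_grid])
ci, gi = np.unravel_index(scores.argmax(), scores.shape)
best_c, best_g = c_grid[ci], g_grid[gi]

# Stage 2: GA refinement inside a one-grid-step neighborhood of the best cell.
c_step, g_step = c_grid[1] - c_grid[0], g_grid[1] - g_grid[0]
lo = np.array([best_c - c_step, best_g - g_step])
hi = np.array([best_c + c_step, best_g + g_step])

pop = rng.uniform(lo, hi, size=(50, 2))           # population size 50
for gen in range(50):                             # 50 generations
    fit = np.array([cv_score(c, g) for c, g in pop])
    elite = pop[fit.argmax()].copy()              # elitism: keep the best
    # Binary tournament selection.
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Arithmetic crossover at rate 0.8.
    mates = parents[rng.permutation(len(parents))]
    w = rng.uniform(size=(len(parents), 1))
    cross = rng.uniform(size=len(parents)) < 0.8
    children = np.where(cross[:, None], w * parents + (1 - w) * mates, parents)
    # Gaussian mutation with a step size that shrinks over generations.
    sigma = 0.1 * (1 - gen / 50) + 1e-3
    children += rng.normal(0.0, sigma, children.shape)
    pop = np.clip(children, lo, hi)
    pop[0] = elite                                # reinsert best individual

fit = np.array([cv_score(c, g) for c, g in pop])
log_c, log_g = pop[fit.argmax()]
print(f"refined parameters: C = {10**log_c:.3f}, gamma = {10**log_g:.4f}")
```

In this sketch the GA never evaluates outside the neighborhood found by the grid, which is what keeps the second stage cheap relative to refining the entire 20×20 grid.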

Results: GS-GA-SVM achieved 98.24% accuracy on the Wisconsin dataset, outperforming grid search (97.12%), random search (97.58%), pure GA (97.98%), and Bayesian optimization (97.89%). The hybrid approach reduced computational time by 63.8% relative to exhaustive grid search (124.8 s vs. 345.2 s) while improving accuracy by 1.12 percentage points. On the PIMA dataset, accuracy improved from 86.34% (grid search) to 87.12% with 62.4% time savings. The Hepatitis dataset yielded 88.24% accuracy with a 14.4% time increase (overhead dominates on small datasets), and the Mammographic Mass dataset achieved 90.12% accuracy with 76.1% time savings. Parameter sensitivity analysis identified the optimal configuration as a 20×20 grid, GA population size 50, 50 generations, crossover rate 0.8, adaptive mutation, and elitism preservation. The framework demonstrated robust performance across different random seeds and data splits.
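The abstract does not say which adaptive-mutation scheme is used. One common choice (after Srinivas and Patnaik) raises the mutation probability for below-average individuals and lowers it near the population best; the sketch below uses illustrative probability bounds, not values from the paper.

```python
import numpy as np

def adaptive_mutation_prob(fitness, f_max, f_avg, p_lo=0.005, p_hi=0.05):
    """Mutation probability for one individual: p_lo for the best,
    rising linearly to p_hi at the population average and below.
    p_lo/p_hi bounds are illustrative assumptions."""
    if fitness >= f_avg and f_max > f_avg:
        return p_lo + (p_hi - p_lo) * (f_max - fitness) / (f_max - f_avg)
    return p_hi

# Example: the best individual mutates least, the weakest most.
fit = np.array([0.90, 0.95, 0.98])
probs = [adaptive_mutation_prob(f, fit.max(), fit.mean()) for f in fit]
```

Schemes like this protect good solutions late in the run while keeping enough diversity early on, which matches the exploration/exploitation balance the paper attributes to its adaptive operators.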

Conclusion: GS-GA-SVM provides an efficient and effective framework for SVM parameter optimization, achieving superior accuracy with significant computational savings compared to exhaustive grid search. The two-stage hybrid approach successfully balances exploration and exploitation, making it particularly suitable for developing high-performance diagnostic systems where both accuracy and computational efficiency are critical.


Published

2026-03-31

How to Cite

Satish Kumar Kalagotla, Thoudam Basanta, & Mutum Bidyarani Devi. (2026). GS-GA-SVM: A Novel Hybrid Grid Search-Genetic Algorithm Framework for SVM Parameter Optimization in Medical Diagnosis. Journal of Computer Based Parallel Programming, 11(1), 30–55. Retrieved from https://matjournals.net/engineering/index.php/JoCPP/article/view/3330
