Fractional Gradient Based Optimization for Nonlinear Separable Data

Dian Puspita Hapsari, Muhammad Fahrur Rozi

Abstract


The Support Vector Machine (SVM) classifier is a machine learning algorithm used to predict the class of data. Traditional SVM training has limitations on large-scale data and tends to be slow. This study aims to improve the efficiency of the SVM classifier by using a fractional gradient descent optimization algorithm, so that the training process can be accelerated on large-scale data. Ten numerical data sets are used in the simulations to test the performance of an SVM classifier optimized with the Caputo-type fractional gradient descent algorithm. In this paper, the Caputo derivative formula is used to compute the fractional-order gradient of the error function with respect to the weights, and deterministic convergence is obtained to increase the convergence speed of the Caputo-type fractional-order method. The test results show that the optimized SVM classifier converges in fewer iterations, in less time, and with a small error value. For further research, the optimized linear SVM classifier with fractional gradient descent will be applied to the problem of imbalanced class data.
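For concreteness, the sketch below shows one common way a Caputo-type fractional gradient step can be realized for a linear SVM: the alpha-order gradient of the error function with respect to the weights is approximated by the first term of its series expansion, E'(w) |w - c|^(1 - alpha) / Gamma(2 - alpha), with the previous iterate used as the lower terminal c. This is a minimal sketch under stated assumptions, not the paper's exact formulation: the hinge-loss objective, the absolute-value base, and the parameter names (alpha, eta, C) are illustrative choices.

import numpy as np
from scipy.special import gamma

# Hedged sketch: Caputo-type fractional gradient descent for a linear SVM.
# The hinge-loss objective and all hyperparameter names are assumptions.

def hinge_grad(w, b, X, y, C=1.0):
    """Integer-order gradient of the primal objective
    0.5*||w||^2 + C * sum(max(0, 1 - y*(X@w + b)))."""
    margins = y * (X @ w + b)
    active = margins < 1                      # margin-violating samples
    gw = w - C * (y[active, None] * X[active]).sum(axis=0)
    gb = -C * y[active].sum()
    return gw, gb

def caputo_step(w, w_prev, grad, alpha=0.9, eps=1e-8):
    """First-term approximation of the Caputo alpha-order gradient:
    D^alpha E(w) ~ E'(w) * |w - w_prev|**(1 - alpha) / Gamma(2 - alpha)."""
    return grad * np.abs(w - w_prev + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)

def train_fractional_svm(X, y, alpha=0.9, eta=0.01, epochs=200):
    """Train a linear SVM with Caputo-type fractional gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    w_prev = w - 1e-3                         # previous iterate as lower terminal
    for _ in range(epochs):
        gw, gb = hinge_grad(w, b, X, y)
        w_prev, w = w, w - eta * caputo_step(w, w_prev, gw, alpha)
        b -= eta * gb
    return w, b

With a NumPy feature matrix X and labels y in {-1, +1}, train_fractional_svm(X, y, alpha=0.9) returns the weight vector and bias of the trained linear classifier; setting alpha = 1 makes the fractional step collapse to ordinary gradient descent, since |w - w_prev|**0 = 1 and Gamma(1) = 1.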







DOI: https://doi.org/10.31284/j.jasmet.2022.v3i1.2881



Copyright (c) 2022 Dian Puspita Hapsari, Muhammad Fahrur Rozi

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Mailing Address: Journal of Applied Sciences, Management and Engineering Technology - ITATS, Institut Teknologi Adhi Tama Surabaya, Jl. Arief Rahman Hakim No. 100, Surabaya 60117. Email: [email protected]. Website: https://ejurnal.itats.ac.id/jasmet/index