Hyperparameter Optimization of Machine Learning Algorithms

This code provides a hyper-parameter optimization implementation for machine learning algorithms, as described in the paper "On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice".

To fit a machine learning model to different problems, its hyper-parameters must be tuned. Selecting the best hyper-parameter configuration for a machine learning model has a direct impact on its performance. In this paper, optimizing the hyper-parameters of common machine learning models is studied. We introduce several state-of-the-art optimization techniques and discuss how to apply them to machine learning algorithms. Many available libraries and frameworks developed for hyper-parameter optimization are also surveyed, and some open challenges of hyper-parameter optimization research are discussed. Moreover, experiments are conducted on benchmark datasets to compare the performance of different optimization methods and to provide practical examples of hyper-parameter optimization.

This paper and code will help industrial users, data analysts, and researchers to better develop machine learning models by identifying the proper hyper-parameter configurations effectively.
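
As a quick illustration of what such tuning looks like in practice, the sketch below (not taken from this repository; the model, dataset, and candidate values are assumptions chosen to mirror the configuration table later in this README) runs an exhaustive grid search over an SVM classifier with scikit-learn's GridSearchCV.

```python
# Illustrative sketch only: exhaustive grid search over an SVM classifier's
# hyper-parameters with scikit-learn. Not the repository's notebook code.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Candidate hyper-parameter values to evaluate exhaustively (assumed values).
param_grid = {
    "C": [0.1, 1, 10, 50],
    "kernel": ["linear", "poly", "rbf", "sigmoid"],
}

search = GridSearchCV(SVC(), param_grid, cv=3, scoring="accuracy", n_jobs=-1)
search.fit(X, y)

print("Best hyper-parameters:", search.best_params_)
print("Best cross-validated accuracy:", search.best_score_)
```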

Paper

On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice
One-column version: arXiv
Two-column version: Elsevier

Quick Navigation

Section 3: Important hyper-parameters of common machine learning algorithms
Section 4: Hyper-parameter optimization techniques introduction
Section 5: How to choose optimization techniques for different machine learning models
Section 6: Common Python libraries/tools for hyper-parameter optimization
Section 7: Experimental results (sample code in "HPO_Regression.ipynb" and "HPO_Classification.ipynb")
Section 8: Open challenges and future research directions
Summary table for Sections 3-6: Table 2: A comprehensive overview of common ML models, their hyper-parameters, suitable optimization techniques, and available Python libraries
Summary table for Section 8: Table 10: The open challenges and future directions of HPO research

Implementation

Sample code implementing hyper-parameter optimization for machine learning algorithms is provided in this repository.

Sample code for Regression problems

HPO_Regression.ipynb
Dataset used: Boston-Housing
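
A minimal sketch of the regression workflow, assuming scikit-learn's RandomizedSearchCV and the RF Regressor search space from the configuration table below; this is not the notebook's exact code. Since sklearn.datasets.load_boston was removed in scikit-learn 1.2, the Boston Housing data is fetched from OpenML here.

```python
# Sketch only (assumed setup, not the notebook itself): random search over the
# RF Regressor search space from the configuration table, on Boston Housing.
from scipy.stats import randint
from sklearn.datasets import fetch_openml
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# load_boston was removed in scikit-learn 1.2, so fetch the data from OpenML.
boston = fetch_openml(name="boston", version=1, as_frame=False)
X, y = boston.data, boston.target.astype(float)

# Ranges follow the table below; scipy's randint upper bound is exclusive,
# so 1 is added to treat the listed upper bounds as inclusive.
param_distributions = {
    "n_estimators": randint(10, 101),
    "max_depth": randint(5, 51),
    "min_samples_split": randint(2, 12),
    "min_samples_leaf": randint(1, 12),
    "max_features": randint(1, 14),
}

search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions,
    n_iter=20,
    cv=3,
    scoring="neg_mean_squared_error",
    random_state=0,
)
search.fit(X, y)
print("Best hyper-parameters:", search.best_params_)
print("Best CV MSE:", -search.best_score_)
```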

Sample code for Classification problems

HPO_Classification.ipynb
Dataset used: MNIST
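
A minimal sketch of the classification workflow, again using random search rather than the notebook's exact code; the max_features range of [1,64] in the table below suggests the 8x8 digits version of MNIST, so scikit-learn's load_digits is assumed here.

```python
# Sketch only (assumed setup): random search over the KNN Classifier search
# space from the configuration table, on the 8x8 digits dataset.
from scipy.stats import randint
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)

# n_neighbors in [1, 20] per the table (randint's upper bound is exclusive).
param_distributions = {"n_neighbors": randint(1, 21)}

search = RandomizedSearchCV(
    KNeighborsClassifier(),
    param_distributions,
    n_iter=10,
    cv=3,
    scoring="accuracy",
    random_state=0,
)
search.fit(X, y)
print("Best n_neighbors:", search.best_params_["n_neighbors"])
print("Best CV accuracy:", search.best_score_)
```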

Machine Learning & Deep Learning Algorithms

  • Random forest (RF)
  • Support vector machine (SVM)
  • K-nearest neighbor (KNN)
  • Artificial Neural Networks (ANN)

Hyperparameter Configuration Space

| ML Model       | Hyper-parameter   | Type        | Search Space                        |
|----------------|-------------------|-------------|-------------------------------------|
| RF Classifier  | n_estimators      | Discrete    | [10,100]                            |
|                | max_depth         | Discrete    | [5,50]                              |
|                | min_samples_split | Discrete    | [2,11]                              |
|                | min_samples_leaf  | Discrete    | [1,11]                              |
|                | criterion         | Categorical | 'gini', 'entropy'                   |
|                | max_features      | Discrete    | [1,64]                              |
| SVM Classifier | C                 | Continuous  | [0.1,50]                            |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid'  |
| KNN Classifier | n_neighbors       | Discrete    | [1,20]                              |
| ANN Classifier | optimizer         | Categorical | 'adam', 'rmsprop', 'sgd'            |
|                | activation        | Categorical | 'relu', 'tanh'                      |
|                | batch_size        | Discrete    | [16,64]                             |
|                | neurons           | Discrete    | [10,100]                            |
|                | epochs            | Discrete    | [20,50]                             |
|                | patience          | Discrete    | [3,20]                              |
| RF Regressor   | n_estimators      | Discrete    | [10,100]                            |
|                | max_depth         | Discrete    | [5,50]                              |
|                | min_samples_split | Discrete    | [2,11]                              |
|                | min_samples_leaf  | Discrete    | [1,11]                              |
|                | criterion         | Categorical | 'mse', 'mae'                        |
|                | max_features      | Discrete    | [1,13]                              |
| SVM Regressor  | C                 | Continuous  | [0.1,50]                            |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid'  |
|                | epsilon           | Continuous  | [0.001,1]                           |
| KNN Regressor  | n_neighbors       | Discrete    | [1,20]                              |
| ANN Regressor  | optimizer         | Categorical | 'adam', 'rmsprop'                   |
|                | activation        | Categorical | 'relu', 'tanh'                      |
|                | loss              | Categorical | 'mse', 'mae'                        |
|                | batch_size        | Discrete    | [16,64]                             |
|                | neurons           | Discrete    | [10,100]                            |
|                | epochs            | Discrete    | [20,50]                             |
|                | patience          | Discrete    | [3,20]                              |
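
As an illustration (one possible encoding, not necessarily the one used in the notebooks), a row of this table can be translated into a search space for an HPO library; the snippet below expresses the SVM Classifier and KNN Classifier rows with hyperopt's space-definition primitives.

```python
# Translating table rows into a hyperopt search space (illustrative only).
from hyperopt import hp

# SVM Classifier row: continuous C, categorical kernel.
svm_space = {
    "C": hp.uniform("C", 0.1, 50),                              # Continuous [0.1, 50]
    "kernel": hp.choice("kernel",
                        ["linear", "poly", "rbf", "sigmoid"]),  # Categorical
}

# Discrete ranges, e.g. the KNN Classifier's n_neighbors in [1, 20], can be
# written with a quantized uniform distribution.
knn_space = {"n_neighbors": hp.quniform("n_neighbors", 1, 20, 1)}
```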

HPO Algorithms

  • Grid search
  • Random search
  • Hyperband
  • Bayesian Optimization with Gaussian Processes (BO-GP)
  • Bayesian Optimization with Tree-structured Parzen Estimator (BO-TPE)
  • Particle swarm optimization (PSO)
  • Genetic algorithm (GA)
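
To show how one of these optimizers is driven end to end, the sketch below runs BO-TPE with hyperopt over the SVM classifier space from the table above; the dataset, model, and evaluation budget are assumptions, not the repository's exact settings.

```python
# Sketch only (assumed setup): BO-TPE with hyperopt tuning an SVM classifier.
from hyperopt import STATUS_OK, Trials, fmin, hp, tpe
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Search space for the SVM Classifier row of the configuration table.
space = {
    "C": hp.uniform("C", 0.1, 50),
    "kernel": hp.choice("kernel", ["linear", "poly", "rbf", "sigmoid"]),
}

def objective(params):
    # TPE minimizes the objective, so return the negative CV accuracy.
    acc = cross_val_score(SVC(**params), X, y, cv=3, scoring="accuracy").mean()
    return {"loss": -acc, "status": STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=20, trials=trials)
print("Best hyper-parameters found by BO-TPE:", best)
```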

Requirements

Contact-Info

Please feel free to contact me with any questions or collaboration opportunities. I'd be happy to help.
  • Email: [email protected]
  • GitHub: LiYangHart and Western OC2 Lab
  • LinkedIn: Li Yang
  • Google Scholar: Li Yang and OC2 Lab

Citation

If you find this repository useful in your research, please cite this article as:

L. Yang and A. Shami, “On hyperparameter optimization of machine learning algorithms: Theory and practice,” Neurocomputing, vol. 415, pp. 295–316, 2020, doi: https://doi.org/10.1016/j.neucom.2020.07.061.

@article{YANG2020295,
title = "On hyperparameter optimization of machine learning algorithms: Theory and practice",
author = "Li Yang and Abdallah Shami",
volume = "415",
pages = "295 - 316",
journal = "Neurocomputing",
year = "2020",
issn = "0925-2312",
doi = "https://doi.org/10.1016/j.neucom.2020.07.061",
url = "http://www.sciencedirect.com/science/article/pii/S0925231220311693"
}
