Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models

Overview

Hyperparameter Optimization of Machine Learning Algorithms

This code provides a hyper-parameter optimization implementation for machine learning algorithms, as described in the paper:
L. Yang and A. Shami, “On hyperparameter optimization of machine learning algorithms: Theory and practice,” Neurocomputing, vol. 415, pp. 295–316, 2020, doi: https://doi.org/10.1016/j.neucom.2020.07.061.

To fit a machine learning model into different problems, its hyper-parameters must be tuned. Selecting the best hyper-parameter configuration for machine learning models has a direct impact on the model's performance. In this paper, optimizing the hyper-parameters of common machine learning models is studied. We introduce several state-of-the-art optimization techniques and discuss how to apply them to machine learning algorithms. Many available libraries and frameworks developed for hyper-parameter optimization problems are provided, and some open challenges of hyper-parameter optimization research are also discussed in this paper. Moreover, experiments are conducted on benchmark datasets to compare the performance of different optimization methods and provide practical examples of hyper-parameter optimization.

This paper and the accompanying code help industrial users, data analysts, and researchers develop better machine learning models by identifying suitable hyper-parameter configurations efficiently.

Paper

On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice
One-column version: arXiv
Two-column version: Elsevier

Quick Navigation

Section 3: Important hyper-parameters of common machine learning algorithms
Section 4: Hyper-parameter optimization techniques introduction
Section 5: How to choose optimization techniques for different machine learning models
Section 6: Common Python libraries/tools for hyper-parameter optimization
Section 7: Experimental results (sample code in "HPO_Regression.ipynb" and "HPO_Classification.ipynb")
Section 8: Open challenges and future research directions
Summary table for Sections 3-6: Table 2: A comprehensive overview of common ML models, their hyper-parameters, suitable optimization techniques, and available Python libraries
Summary table for Section 8: Table 10: The open challenges and future directions of HPO research

Implementation

Sample code implementing hyper-parameter optimization for machine learning algorithms is provided in this repository.

Sample code for Regression problems

HPO_Regression.ipynb
Dataset used: Boston-Housing
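
For orientation, here is a minimal grid-search sketch for the RF Regressor. It is not the notebook code: scikit-learn's bundled diabetes dataset stands in for Boston-Housing, since load_boston has been removed from recent scikit-learn releases.

```python
# Minimal grid-search sketch for the RF Regressor (illustrative only; the
# notebook uses Boston-Housing, which recent scikit-learn no longer ships,
# so the bundled diabetes dataset is used here as a stand-in).
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

X, y = load_diabetes(return_X_y=True)

# A small grid drawn from the search space listed in the configuration table below.
param_grid = {
    'n_estimators': [10, 50, 100],
    'max_depth': [5, 15, 30, 50],
    'min_samples_split': [2, 5, 10],
}

search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid,
                      scoring='neg_mean_squared_error', cv=3, n_jobs=-1)
search.fit(X, y)
print(search.best_params_, -search.best_score_)  # best config and its CV MSE
```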

Sample code for Classification problems

HPO_Classification.ipynb
Dataset used: MNIST
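
Likewise, here is a minimal random-search sketch for the SVM Classifier, again not the notebook code: scikit-learn's small 8x8 digits dataset stands in for MNIST so the example runs in seconds.

```python
# Minimal random-search sketch for the SVM Classifier (illustrative only;
# scikit-learn's digits dataset stands in for MNIST here).
from scipy.stats import uniform
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Search space taken from the configuration table below.
param_distributions = {
    'C': uniform(0.1, 49.9),                        # continuous on [0.1, 50]
    'kernel': ['linear', 'poly', 'rbf', 'sigmoid'],
}

search = RandomizedSearchCV(SVC(), param_distributions, n_iter=20,
                            scoring='accuracy', cv=3, random_state=0, n_jobs=-1)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```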

Machine Learning & Deep Learning Algorithms

  • Random forest (RF)
  • Support vector machine (SVM)
  • K-nearest neighbor (KNN)
  • Artificial neural network (ANN); a Keras model-building sketch follows this list
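
As a rough illustration (assuming a Keras/TensorFlow backend, which is an assumption rather than something documented here), the ANN's tunable hyper-parameters from the configuration table below can be exposed through a model-building function:

```python
# Illustrative Keras model builder exposing the ANN hyper-parameters listed in
# the configuration space table (optimizer, activation, neurons); batch_size
# and epochs are passed to fit(), and patience controls early stopping.
from tensorflow import keras

def build_ann(neurons=32, activation='relu', optimizer='adam',
              n_features=64, n_classes=10):
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(neurons, activation=activation),
        keras.layers.Dense(n_classes, activation='softmax'),
    ])
    model.compile(optimizer=optimizer,
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)
```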

Hyperparameter Configuration Space

| ML Model       | Hyper-parameter   | Type        | Search Space                       |
| -------------- | ----------------- | ----------- | ---------------------------------- |
| RF Classifier  | n_estimators      | Discrete    | [10, 100]                          |
|                | max_depth         | Discrete    | [5, 50]                            |
|                | min_samples_split | Discrete    | [2, 11]                            |
|                | min_samples_leaf  | Discrete    | [1, 11]                            |
|                | criterion         | Categorical | 'gini', 'entropy'                  |
|                | max_features      | Discrete    | [1, 64]                            |
| SVM Classifier | C                 | Continuous  | [0.1, 50]                          |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid' |
| KNN Classifier | n_neighbors       | Discrete    | [1, 20]                            |
| ANN Classifier | optimizer         | Categorical | 'adam', 'rmsprop', 'sgd'           |
|                | activation        | Categorical | 'relu', 'tanh'                     |
|                | batch_size        | Discrete    | [16, 64]                           |
|                | neurons           | Discrete    | [10, 100]                          |
|                | epochs            | Discrete    | [20, 50]                           |
|                | patience          | Discrete    | [3, 20]                            |
| RF Regressor   | n_estimators      | Discrete    | [10, 100]                          |
|                | max_depth         | Discrete    | [5, 50]                            |
|                | min_samples_split | Discrete    | [2, 11]                            |
|                | min_samples_leaf  | Discrete    | [1, 11]                            |
|                | criterion         | Categorical | 'mse', 'mae'                       |
|                | max_features      | Discrete    | [1, 13]                            |
| SVM Regressor  | C                 | Continuous  | [0.1, 50]                          |
|                | kernel            | Categorical | 'linear', 'poly', 'rbf', 'sigmoid' |
|                | epsilon           | Continuous  | [0.001, 1]                         |
| KNN Regressor  | n_neighbors       | Discrete    | [1, 20]                            |
| ANN Regressor  | optimizer         | Categorical | 'adam', 'rmsprop'                  |
|                | activation        | Categorical | 'relu', 'tanh'                     |
|                | loss              | Categorical | 'mse', 'mae'                       |
|                | batch_size        | Discrete    | [16, 64]                           |
|                | neurons           | Discrete    | [10, 100]                          |
|                | epochs            | Discrete    | [20, 50]                           |
|                | patience          | Discrete    | [3, 20]                            |
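
As a concrete (assumed, not notebook-verbatim) example, the RF Classifier row of the table can be written as a search space compatible with scikit-learn's RandomizedSearchCV:

```python
# RF Classifier search space from the table above, written for scikit-learn's
# RandomizedSearchCV. Note: scipy's randint upper bound is exclusive; the
# values below simply mirror the table as printed.
from scipy.stats import randint

rf_clf_space = {
    'n_estimators': randint(10, 100),
    'max_depth': randint(5, 50),
    'min_samples_split': randint(2, 11),
    'min_samples_leaf': randint(1, 11),
    'criterion': ['gini', 'entropy'],
    'max_features': randint(1, 64),
}
```

Categorical hyper-parameters stay as plain lists, discrete ranges become integer distributions, and continuous ones (e.g. the SVM's C) would use scipy.stats.uniform.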

HPO Algorithms

  • Grid search
  • Random search
  • Hyperband
  • Bayesian Optimization with Gaussian Processes (BO-GP)
  • Bayesian Optimization with Tree-structured Parzen Estimator (BO-TPE); a hyperopt sketch follows this list
  • Particle swarm optimization (PSO)
  • Genetic algorithm (GA)
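
To give a feel for the less familiar methods, here is a minimal BO-TPE sketch using the hyperopt library (one common choice for TPE; the exact library calls and settings used in the notebooks may differ):

```python
# Minimal BO-TPE sketch with hyperopt (illustrative; not the notebook code).
# It tunes the SVM Classifier on the digits data from the earlier example.
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

space = {
    'C': hp.uniform('C', 0.1, 50),
    'kernel': hp.choice('kernel', ['linear', 'poly', 'rbf', 'sigmoid']),
}

def objective(params):
    # TPE minimizes the objective, so return negative CV accuracy as the loss.
    score = cross_val_score(SVC(**params), X, y, cv=3, scoring='accuracy').mean()
    return {'loss': -score, 'status': STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=20, trials=trials)
print(best)  # note: hp.choice parameters are reported by index
```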

Requirements

The exact packages needed are listed in the import cells of the two sample notebooks; they build on scikit-learn and the hyper-parameter optimization libraries surveyed in Section 6 of the paper.

Contact-Info

Please feel free to contact me with any questions or collaboration opportunities; I'd be happy to help.

Citation

If you find this repository useful in your research, please cite this article as:

L. Yang and A. Shami, “On hyperparameter optimization of machine learning algorithms: Theory and practice,” Neurocomputing, vol. 415, pp. 295–316, 2020, doi: https://doi.org/10.1016/j.neucom.2020.07.061.

@article{YANG2020295,
title = "On hyperparameter optimization of machine learning algorithms: Theory and practice",
author = "Li Yang and Abdallah Shami",
volume = "415",
pages = "295 - 316",
journal = "Neurocomputing",
year = "2020",
issn = "0925-2312",
doi = "https://doi.org/10.1016/j.neucom.2020.07.061",
url = "http://www.sciencedirect.com/science/article/pii/S0925231220311693"
}
Owner
Li Yang
Ph.D. Candidate in OC2 Lab at Western University