
A Review of Randomness Techniques in Deep Neural Networks

Published: 01 August 2024
DOI: 10.1145/3638530.3664077

Abstract

This paper investigates the effects of various randomization techniques on the learning performance of Deep Neural Networks (DNNs). We categorize existing randomness techniques into four key types, according to the stage at which noise/randomness is injected: data, model structure, optimization, or learning. Using this classification, we identify gaps in the current coverage of potential mechanisms for introducing randomness, which leads us to propose two new techniques: adding noise to the loss function and randomly masking gradient updates. We use a Particle Swarm Optimizer (PSO) for hyperparameter optimization and evaluate over 30,000 configurations across standard computer vision benchmarks. Our study reveals that randomness in data augmentation and weight initialization significantly improves performance, and that different optimizers favor distinct types of randomization. The complete implementation and dataset are available on GitHub.

This paper, for the Hot-off-the-Press track at GECCO 2024, summarizes the original work published in [2].
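The two proposed techniques can be illustrated inside a single training step. The sketch below is a minimal PyTorch interpretation, not the authors' released implementation: the hyperparameter names (`loss_noise_std`, `grad_keep_prob`) and the multiplicative form of the loss noise are assumptions (a purely additive noise term would leave the gradients unchanged), and the gradient masking follows the spirit of gradient dropout [3].

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed names and forms, not the authors' code) of the
# two proposed randomness techniques inside one training step.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

def train_step(x, y, loss_noise_std=0.05, grad_keep_prob=0.9):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # Technique 1: noise injected into the loss. The perturbation here is
    # multiplicative (an assumption): an additive constant would not change
    # the gradient, whereas this randomly rescales it each step.
    noisy_loss = loss * (1.0 + loss_noise_std * torch.randn(()))
    noisy_loss.backward()
    # Technique 2: random masking of gradient updates (cf. gradient
    # dropout [3]): each gradient entry is kept with prob. grad_keep_prob.
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                p.grad.mul_((torch.rand_like(p.grad) < grad_keep_prob).float())
    optimizer.step()
    return loss.item()

# Example usage with a random batch of flattened 28x28 images.
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
print(train_step(x, y))
```

The paper tunes such hyperparameters with a Particle Swarm Optimizer. As a rough illustration of that outer loop, the following self-contained NumPy sketch runs a textbook global-best PSO over two hyperparameters; the objective is a toy stand-in for the paper's actual train-and-validate evaluation.

```python
import numpy as np

# Illustrative global-best PSO over two hyperparameters (learning rate and
# loss-noise std); bounds, coefficients, and objective are assumptions.
rng = np.random.default_rng(0)

def objective(params):
    # Placeholder: in the paper this would train a DNN with the given
    # hyperparameters and return validation error. Here: a toy quadratic.
    lr, noise_std = params
    return (np.log10(lr) + 2.0) ** 2 + (noise_std - 0.05) ** 2

n_particles, n_iters = 10, 30
w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive, social weights
lo = np.array([1e-4, 0.0])                # lower bounds: [lr, noise_std]
hi = np.array([1e-1, 0.2])                # upper bounds

pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (lr, loss_noise_std):", gbest)
```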

References

[1]
Mohammed Ghaith Altarabichi, Sławomir Nowaczyk, Sepideh Pashami, and Peyman Sheikholharam Mashhadi. 2023. Fast Genetic Algorithm for feature selection---A qualitative approximation approach. Expert Systems with Applications 211 (2023), 118528.
[2]
Mohammed Ghaith Altarabichi, Sławomir Nowaczyk, Sepideh Pashami, Peyman Sheikholharam Mashhadi, and Julia Handl. 2024. Rolling the dice for better deep learning performance: A study of randomness techniques in deep neural networks. Information Sciences 667 (2024), 120500.
[3]
Hung-Yu Tseng, Yi-Wen Chen, Yi-Hsuan Tsai, Sifei Liu, Yen-Yu Lin, and Ming-Hsuan Yang. 2020. Regularizing meta-learning via gradient dropout. In Proceedings of the Asian Conference on Computer Vision.

Publication Information

Published in: GECCO '24 Companion: Proceedings of the Genetic and Evolutionary Computation Conference Companion, July 2024, 2187 pages
ISBN: 9798400704956
DOI: 10.1145/3638530
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

Publisher: Association for Computing Machinery, New York, NY, United States

Author Tags

  1. deep neural network
  2. hyperparameter
  3. particle swarm optimization
  4. convolutional neural network
  5. randomized neural networks

Conference: GECCO '24 Companion

Overall Acceptance Rate: 1,669 of 4,410 submissions (38%)
