Lectures on Data Science:

Hyperparameter Optimization

Tuesday, 04.06.2024 · 10 a.m. - 4 p.m.
On-site

Hyperparameters are the parameters of a machine learning model that cannot be “learned” and therefore have to be chosen before training starts. Nevertheless, they are often decisive for the model's performance. Examples include the number and size of layers in a neural network, the batch size, and the learning rate. This course gives a brief introduction to hyperparameter optimization, i.e., the systematic search for good values of these parameters. The acquired knowledge is then deepened in a three-hour practical session using Ray Tune (for the hyperparameter optimization itself) and PyTorch (for the underlying neural network).


Content

Part I: Theory

  • Wrap-up of deep learning basics: standard concepts, architectures, and optimization/learning algorithms
  • What are hyperparameters and why do we need to optimize them?
  • How to optimize them? – an overview of techniques for hyperparameter optimization
    (e.g. grid search, random search, Bayesian optimization, evolutionary optimization)
  • Concepts and usage of Ray Tune
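To give a flavor of the techniques listed above, the following sketch contrasts grid search and random search on a toy objective in plain Python (no Ray Tune required). The objective function and search ranges are invented for illustration; a real run would train and validate a model for each configuration.

```python
import itertools
import random

def objective(lr, batch_size):
    """Toy stand-in for a validation loss; invented for illustration."""
    return (lr - 0.01) ** 2 + (batch_size - 64) ** 2 / 1e4

# Grid search: evaluate every combination of a fixed set of candidate values.
grid = {"lr": [0.001, 0.01, 0.1], "batch_size": [32, 64, 128]}
grid_trials = [dict(zip(grid, values)) for values in itertools.product(*grid.values())]
best_grid = min(grid_trials, key=lambda cfg: objective(**cfg))

# Random search: sample the same budget of trials from continuous ranges instead.
random.seed(0)
random_trials = [
    {"lr": 10 ** random.uniform(-4, -1), "batch_size": random.choice([32, 64, 128])}
    for _ in range(9)
]
best_random = min(random_trials, key=lambda cfg: objective(**cfg))
```

Bayesian and evolutionary optimization refine this idea by using earlier trial results to choose the next configurations; Ray Tune provides ready-made implementations of all of these strategies.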

Part II: Hands-on "Hyperparameter Optimization with PyTorch and Ray Tune"

  • How to modify an example PyTorch code for hyperparameter optimization
  • Good practice guidelines for hyperparameter tuning
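The key structural change when adapting a PyTorch script for Ray Tune is wrapping the training code in a function that receives a hyperparameter dict and reports a metric, which the tuner (e.g. tune.Tuner / tune.run in Ray Tune) then calls once per trial. The sketch below imitates that shape in plain Python, without Ray or PyTorch; all names and the toy loss are illustrative, not part of any real API.

```python
def train_model(config):
    """Trainable in the style Ray Tune expects: hyperparameters come in as a
    dict, and a metric is reported back (here simply returned).
    A real version would build and train a PyTorch model from `config`."""
    lr = config["lr"]
    hidden = config["hidden_size"]
    # Stand-in for the validation loss after training.
    return (lr - 0.01) ** 2 + (hidden - 128) ** 2 / 1e5

# Toy driver loop standing in for the tuner: try each configuration,
# collect the reported metrics, and keep the best one.
search_space = [
    {"lr": lr, "hidden_size": h}
    for lr in (0.001, 0.01, 0.1)
    for h in (64, 128)
]
results = [(cfg, train_model(cfg)) for cfg in search_space]
best_cfg, best_loss = min(results, key=lambda item: item[1])
```

In the hands-on session, this driver loop is replaced by Ray Tune, which additionally handles parallel execution, logging, and smarter search strategies.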

Requirements for Part II: Access to a system with working Python, PyTorch, and Ray Tune installations (e.g., a workstation or cluster reachable via SSH from your notebook, or simply your notebook itself); set-up details for a suitable virtual environment will be provided before the course

Material for Part II: Templates for the example will be made available on GitHub


Trainers:           Dr. Alexander Rüttgers and Dr. Fabian Hoppe from DLR, Cologne


Prerequisites:   Course participants should be familiar with Python and have some experience coding deep learning applications in Python. Knowledge of PyTorch is helpful but not necessary, as most deep learning frameworks are broadly similar and Ray Tune can be used with any of them in much the same way.


Registration:     The course is designed for 40 participants. There are no course fees, but participants must cover their own travel costs.

Please register by May 21st, 2024.