Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as other content I found useful while learning about the JAX ecosystem.
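For orientation, here is a minimal sketch of the core JAX primitives such getting-started tutorials typically cover (`jax.grad` and `jax.jit`); the function and values below are purely illustrative:

```python
import jax
import jax.numpy as jnp

# A plain function on arrays; JAX traces and transforms it.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# Compile the gradient of the loss with respect to w (the first argument).
grad_fn = jax.jit(jax.grad(loss))

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
print(grad_fn(w, x, y))  # gradient array of shape (3,)
```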
Accelerate and optimize model performance with streamlined training and serving options in JAX.
Official repository for the ICML 2024 paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".
JAX implementations of various deep reinforcement learning algorithms.
Implementation of the PSGD optimizer in JAX
JAX/Flax implementation of finite-size scaling
Goal-conditioned reinforcement learning like 🔥
An implementation of the Adan optimizer for Optax
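Optax optimizers are exposed as `GradientTransformation`s, so an Adan implementation plugs into the standard init/update loop. A minimal sketch of that loop follows, using `optax.adam` as a stand-in since this repo's exact entry point is not given here:

```python
import jax
import jax.numpy as jnp
import optax

# Toy data; shapes are illustrative.
x = jnp.ones((8, 4))
y = jnp.zeros(8)
params = {"w": jnp.zeros(4)}

def loss(params, x, y):
    return jnp.mean((x @ params["w"] - y) ** 2)

# Any optax.GradientTransformation works here; an Optax Adan
# implementation would be swapped in for optax.adam.
opt = optax.adam(learning_rate=1e-3)
opt_state = opt.init(params)

@jax.jit
def train_step(params, opt_state, x, y):
    grads = jax.grad(loss)(params, x, y)
    updates, opt_state = opt.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state

params, opt_state = train_step(params, opt_state, x, y)
```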
H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for a glyph. Its decoder is composed of multiple independent decoders, each acting on the code vector to reconstruct a point cloud representing a glyph.
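A loose Flax sketch of the shape such a model could take; the layer sizes, 2-D points, and 64-dim code are assumptions, not taken from the repo:

```python
import jax.numpy as jnp
import flax.linen as nn

class PointEncoder(nn.Module):
    """PointNet-style per-point MLP + attention, pooled to (mean, logvar)."""
    latent_dim: int = 64  # assumed code size

    @nn.compact
    def __call__(self, points):                 # points: (num_points, 2)
        h = nn.relu(nn.Dense(128)(points))      # per-point features
        h = nn.MultiHeadDotProductAttention(num_heads=4)(h[None], h[None])[0]
        h = jnp.max(h, axis=0)                  # permutation-invariant pooling
        return nn.Dense(self.latent_dim)(h), nn.Dense(self.latent_dim)(h)

class GlyphDecoder(nn.Module):
    """One of several independent heads mapping the code to a point cloud."""
    num_points: int = 64  # assumed points per head

    @nn.compact
    def __call__(self, z):                      # z: (latent_dim,)
        h = nn.relu(nn.Dense(256)(z))
        return nn.Dense(self.num_points * 2)(h).reshape(self.num_points, 2)
```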
JAX implementation of "Classical and Quantum Algorithms for Orthogonal Neural Networks" (Kerenidis et al., 2021)
Variational Graph Autoencoder implemented using JAX & Jraph
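Jraph represents graphs as `GraphsTuple`s, which is the input a graph autoencoder in this stack would consume. A small self-contained example of building one (the toy graph is illustrative):

```python
import jax.numpy as jnp
import jraph

# A 3-node toy graph with 2 directed edges and 4-dim node features.
graph = jraph.GraphsTuple(
    nodes=jnp.ones((3, 4)),      # node feature matrix
    edges=None,                  # no edge features
    senders=jnp.array([0, 1]),   # edge sources
    receivers=jnp.array([1, 2]), # edge targets
    n_node=jnp.array([3]),
    n_edge=jnp.array([2]),
    globals=None,
)
```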
A simple trainer for Flax
An Optax-based JAX implementation of the IVON optimizer for large-scale variational inference (VI) training of neural networks (ICML 2024 spotlight)
Direct port of TD3_BC to JAX using Haiku and Optax.
dm-haiku implementation of hyperbolic neural networks