PyHessian is a PyTorch library for second-order-based analysis and training of neural networks
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
A C++ interface to formulate and solve linear, quadratic, and second-order cone problems.
PyTorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner, and more)
Distributed K-FAC Preconditioner for PyTorch
FEDL: a federated learning algorithm implemented in TensorFlow (Transactions on Networking 2021)
A PyTorch implementation of FEDL
Tensorflow implementation of preconditioned stochastic gradient descent
PyTorch implementation of the Hessian-free optimizer
Hessian-based stochastic optimization in TensorFlow and Keras
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
Second-order optimization for federated learning, implemented in PyTorch (IEEE Transactions on Parallel and Distributed Systems 2022)
An implementation of PSGD Kron second-order optimizer for PyTorch
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
Implementation of PSGD optimizer in JAX
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
Prototyping of matrix-free Newton methods in Julia
Subsampled Riemannian trust-region (RTR) algorithms
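The repositories above all revolve around the same core idea: using curvature (Hessian) information rather than gradients alone. As a minimal, library-free sketch of the technique, here is Newton's method on a hypothetical quadratic objective (the objective and all names are illustrative, not taken from any repo above); each step solves H @ step = -grad, which for a quadratic lands on the minimizer immediately:

```python
# Minimal sketch of Newton's method, the prototypical second-order
# optimizer. Hypothetical quadratic objective for illustration only.

def f(x, y):
    return (x - 1.0) ** 2 + 10.0 * (y - 2.0) ** 2

def grad(x, y):
    # Analytic gradient of f.
    return [2.0 * (x - 1.0), 20.0 * (y - 2.0)]

def hessian(x, y):
    # Constant Hessian for a quadratic: diag(2, 20).
    return [[2.0, 0.0], [0.0, 20.0]]

def newton_step(x, y):
    g = grad(x, y)
    H = hessian(x, y)
    # Solve H @ step = -g; H is diagonal here, so just divide per coordinate.
    return x - g[0] / H[0][0], y - g[1] / H[1][1]

x, y = 0.0, 0.0
x, y = newton_step(x, y)
print(x, y)  # reaches the minimizer (1.0, 2.0) in a single step
```

Most of the libraries listed here exist because forming and inverting the full Hessian is infeasible at neural-network scale, so they approximate this step with Hessian-vector products, Kronecker factorizations (K-FAC), diagonal estimates (AdaHessian), or subsampling.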