Open-source Packages

Discover essential open-source tools and libraries for optimization, deep learning, and hyperparameter tuning.

Transformers

Hugging Face

State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. Provides thousands of pre-trained models and integrated optimization tools.

TorchOpt

MetaOpt

An efficient library for differentiable optimization in PyTorch. It enables gradient-based meta-learning and hyperparameter optimization.

DeepSpeed

Microsoft

A deep learning optimization library that makes distributed training easy, efficient, and effective. Features ZeRO technology for massive model scaling.
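A minimal sketch of a DeepSpeed JSON configuration enabling ZeRO stage 2 with mixed precision; the field names follow DeepSpeed's config schema, and the values here are purely illustrative:

```json
{
  "train_batch_size": 32,
  "fp16": { "enabled": true },
  "zero_optimization": { "stage": 2 }
}
```

Such a config file is passed to the training script (e.g. via `deepspeed.initialize`), letting the same model code scale across ZeRO stages without changes.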

Optax

DeepMind

A gradient processing and optimization library for JAX. Designed to facilitate research by providing composable building blocks.

Horovod

Uber

A distributed deep learning training framework for TensorFlow, Keras, PyTorch, and Apache MXNet. Makes distributed training easy and fast.

Optuna

Preferred Networks

An automatic hyperparameter optimization software framework, particularly designed for machine learning. It features a define-by-run API.

Ray Tune

Anyscale

A scalable hyperparameter tuning library. It is compatible with any machine learning framework and provides state-of-the-art search algorithms.

OpenBox

PKU-DAIR

An efficient and generalized black-box optimization system. It supports multiple optimization tasks including hyperparameter optimization and neural architecture search.

Nevergrad

Meta Research

A gradient-free optimization platform. It contains a wide range of algorithms for parameter tuning and works well for non-differentiable or noisy functions.

JAXopt

Google Research

Hardware-accelerated, batchable, and differentiable optimizers in JAX. It enforces a clear separation between objective functions and solvers.

Cooper

Cooper Org

A toolkit for Lagrangian-based constrained optimization in PyTorch. It handles both the formulation and the solving of constrained optimization problems.

Betty

Leopard AI

An automatic differentiation library for multilevel optimization. It provides a unified framework for generalized meta-learning and hyperparameter optimization.

CVXPY

CVX Research

A Python-embedded modeling language for convex optimization problems. It allows you to express problems in a natural way that follows the math.

Higher

Meta Research

A library for higher-order optimization in PyTorch. It facilitates the implementation of "unrolled" optimization loops for meta-learning.

PyTorch Optimizer

Community

A comprehensive collection of optimizer implementations for PyTorch, including AdaBound, diffGrad, LAMB, and many others not in the official library.