Open-source Packages
Discover essential open-source tools and libraries for optimization, deep learning, and hyperparameter tuning.
Transformers
Hugging Face: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. Provides thousands of pre-trained models and integrated optimization tools.
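A minimal usage sketch of the high-level pipeline API; the task string selects a default pre-trained checkpoint, which is downloaded on first use.

```python
# Minimal sketch: a pipeline pulls a default pre-trained model for the task.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Open-source optimization tooling keeps getting better."))
```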
TorchOpt
MetaOpt: An efficient library for differentiable optimization in PyTorch. It enables gradient-based meta-learning and hyperparameter optimization.
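A hedged sketch of the Optax-style functional interface (adam, init, update, apply_updates); treat the exact entry-point names as assumptions drawn from the project's documentation.

```python
# Sketch of TorchOpt's functional API (entry-point names assumed from upstream docs).
import torch
import torchopt

params = (torch.tensor([1.0, 2.0], requires_grad=True),)
optimizer = torchopt.adam(lr=0.1)
opt_state = optimizer.init(params)

for _ in range(100):
    loss = (params[0] ** 2).sum()
    grads = torch.autograd.grad(loss, params)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = torchopt.apply_updates(params, updates)
```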
Optax
DeepMind: A gradient processing and optimization library for JAX. Designed to facilitate research by providing composable building blocks.
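A minimal sketch of the canonical init/update/apply loop on a toy quadratic objective.

```python
# Optax follows a functional pattern: the optimizer holds no parameters itself.
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    return jnp.sum((params - 3.0) ** 2)

params = jnp.zeros(2)
optimizer = optax.adam(learning_rate=0.1)
opt_state = optimizer.init(params)

for _ in range(100):
    grads = jax.grad(loss_fn)(params)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
```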
Ray Tune
Anyscale: A scalable hyperparameter tuning library. It is compatible with any machine learning framework and provides state-of-the-art search algorithms.
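A sketch using the classic tune.run interface; newer Ray releases steer users toward the Tuner API, so adjust to the version in use.

```python
# Classic Ray Tune sketch: sample learning rates and minimize a toy loss.
from ray import tune

def trainable(config):
    score = (config["lr"] - 0.01) ** 2   # stand-in for a validation metric
    tune.report(loss=score)

analysis = tune.run(
    trainable,
    config={"lr": tune.loguniform(1e-4, 1e-1)},
    num_samples=10,
)
print(analysis.get_best_config(metric="loss", mode="min"))
```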
OpenBox
PKU-DAIR: An efficient and generalized black-box optimization system. It supports multiple optimization tasks, including hyperparameter optimization and neural architecture search.
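A hedged sketch following the pattern of OpenBox's quick-start; the space and Optimizer signatures may differ between releases, so check against the version in use.

```python
# Hedged sketch of OpenBox-style black-box optimization (quick-start pattern).
from openbox import Optimizer, space as sp

space = sp.Space()
space.add_variables([sp.Real("x", -5.0, 5.0)])

def objective(config):
    x = config["x"]
    return {"objectives": [(x - 1.0) ** 2]}

opt = Optimizer(objective, space, max_runs=30, task_id="toy_quadratic")
history = opt.run()
print(history)
```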
Nevergrad
Meta Research: A gradient-free optimization platform. It contains a wide range of algorithms for parameter tuning and works well for non-differentiable or noisy functions.
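A minimal sketch with the NGOpt meta-optimizer; the objective only needs to return a value, no gradients required.

```python
import nevergrad as ng

def objective(x):
    # non-differentiable / noisy objectives are acceptable here
    return sum(abs(xi - 0.5) for xi in x)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(objective)
print(recommendation.value)
```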
JAXopt
Google Research: Hardware-accelerated, batchable, and differentiable optimizers in JAX. Enforces a clear separation between objective functions and solvers.
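A minimal sketch of that separation: the objective is a plain JAX function and the solver is instantiated separately, then run from an initial point.

```python
import jax.numpy as jnp
import jaxopt

def objective(params):
    return jnp.sum((params - 2.0) ** 2)

solver = jaxopt.LBFGS(fun=objective, maxiter=50)
result = solver.run(jnp.zeros(3))
print(result.params)   # converges toward the minimizer at 2.0
```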
Cooper
Cooper Org: A toolkit for Lagrangian-based constrained optimization in PyTorch. Handles the formulation and solving of constrained optimization problems.
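For context, the Lagrangian approach that Cooper automates alternates gradient descent on the primal variables with gradient ascent on the multipliers. The sketch below illustrates that idea in plain PyTorch and deliberately does not use Cooper's own API, which has changed across releases.

```python
# Plain-PyTorch illustration of the Lagrangian idea (not Cooper's API):
# minimize f(x) subject to g(x) <= 0 via descent on x and ascent on lambda.
import torch

x = torch.tensor([3.0], requires_grad=True)
lmbda = torch.zeros(1)                     # Lagrange multiplier, kept >= 0
primal_opt = torch.optim.SGD([x], lr=0.05)

for _ in range(200):
    f = (x ** 2).sum()                     # objective
    g = 1.0 - x                            # constraint g(x) <= 0, i.e. x >= 1
    lagrangian = f + (lmbda * g).sum()

    primal_opt.zero_grad()
    lagrangian.backward()
    primal_opt.step()                      # gradient descent on x

    with torch.no_grad():
        lmbda = (lmbda + 0.05 * g.detach()).clamp(min=0.0)  # ascent on lambda
```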
Betty
Leopard AI: An automatic differentiation library for multilevel optimization. It provides a unified framework for generalized meta-learning and hyperparameter optimization.
Higher
Meta Research: A library for higher-order optimization in PyTorch. It facilitates the implementation of "unrolled" optimization loops for meta-learning.
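A hedged sketch of the unrolled inner loop; copy_initial_weights=False lets the meta-gradient flow back to the original model's parameters, as in typical MAML-style usage.

```python
import torch
import higher

model = torch.nn.Linear(4, 1)
inner_opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

with higher.innerloop_ctx(model, inner_opt, copy_initial_weights=False) as (fmodel, diffopt):
    for _ in range(5):                           # unrolled, differentiable inner steps
        inner_loss = ((fmodel(x) - y) ** 2).mean()
        diffopt.step(inner_loss)
    meta_loss = ((fmodel(x) - y) ** 2).mean()
    meta_loss.backward()                         # meta-gradient reaches model.parameters()
```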
PyTorch Optimizer
Community: A comprehensive collection of optimizer implementations for PyTorch, including AdaBound, diffGrad, LAMB, and many others not in the official library.
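A hedged sketch assuming the package is importable as torch_optimizer (the community pytorch-optimizer project); its optimizers are intended as drop-in replacements for torch.optim classes.

```python
import torch
import torch.nn.functional as F
import torch_optimizer as optim   # assumed import name for the community package

model = torch.nn.Linear(10, 2)
optimizer = optim.AdaBound(model.parameters(), lr=1e-3, final_lr=0.1)

x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
loss = F.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```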