Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods

DiffEqFlux.jl

Join the chat at https://julialang.zulipchat.com (#sciml-bridged) · Global Docs

codecov · Build Status

ColPrac: Contributor's Guide on Collaborative Practices for Community Packages · SciML Code Style

DiffEq(For)Lux.jl (aka DiffEqFlux.jl) fuses the world of differential equations with machine learning by helping users put differential equation solvers into neural networks. The package builds on DifferentialEquations.jl and Lux.jl to support research in Scientific Machine Learning (SciML), specifically neural differential equations, which embed physical information into traditional machine learning models.

Note

We maintain backwards compatibility with Flux.jl via FromFluxAdaptor()
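
As a rough sketch of what that conversion looks like (assuming a recent Lux version, which provides the Flux-conversion extension once Flux is loaded; the model here is purely illustrative):

```julia
using Lux, Flux  # loading Flux activates Lux's Flux-to-Lux conversion extension

# A small Flux network, e.g. one you already use as an ODE right-hand side.
flux_model = Flux.Chain(Flux.Dense(2 => 16, tanh), Flux.Dense(16 => 2))

# Convert it into a Lux layer so it can be passed to DiffEqFlux layers such as NeuralODE.
lux_model = FromFluxAdaptor()(flux_model)
```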

Tutorials and Documentation

For information on using the package, see the stable documentation. Use the in-development documentation for the documentation of unreleased features.

Problem Domain

DiffEqFlux.jl is a library for implicit-layer machine learning. It provides architectures that match the interfaces of machine learning libraries such as Flux.jl and Lux.jl, making it easy to build continuous-time machine learning layers into larger machine learning applications.
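
For example, a Neural ODE layer is constructed and called like any other Lux layer. The snippet below is a minimal sketch, not the package's canonical tutorial code; the network size, time span, and solver are illustrative choices:

```julia
using DiffEqFlux, OrdinaryDiffEq, Lux, Random

rng = Random.default_rng()

# A small network defining the ODE right-hand side du/dt = f(u, p, t).
dudt = Lux.Chain(Lux.Dense(2, 16, tanh), Lux.Dense(16, 2))

# Wrap it as a continuous-time layer, solved with an adaptive explicit method.
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)

# Lux-style setup and forward pass: parameters and state are explicit.
ps, st = Lux.setup(rng, node)
u0 = Float32[2.0, 0.0]
sol, st = node(u0, ps, st)
```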

The following layer functions exist:

  • Neural Ordinary Differential Equations (Neural ODEs)
  • Collocation-Based Neural ODEs (Neural ODEs without a solver, by far the fastest way!)
  • Multiple Shooting Neural Ordinary Differential Equations
  • Neural Stochastic Differential Equations (Neural SDEs)
  • Neural Differential-Algebraic Equations (Neural DAEs)
  • Neural Delay Differential Equations (Neural DDEs)
  • Augmented Neural ODEs
  • Hamiltonian Neural Networks (with specialized second order and symplectic integrators)
  • Continuous Normalizing Flows (CNF) and FFJORD

with high-order, adaptive, implicit, GPU-accelerated, Newton-Krylov, and other methods. For examples, please refer to the release blog post. Additional demonstrations, like neural PDEs and neural jump SDEs, can be found in this blog post (among many others!).
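
As a hedged sketch of how these layers compose with ordinary network layers, a Neural ODE can sit inside a larger Lux.Chain. The `last_state` helper below is hypothetical glue code (not part of the package API) that extracts the final ODE state so downstream layers receive a plain array:

```julia
using DiffEqFlux, OrdinaryDiffEq, Lux, Random

rng = Random.default_rng()

# Continuous-time block operating on a 16-dimensional hidden state.
dudt = Lux.Chain(Lux.Dense(16, 16, tanh), Lux.Dense(16, 16))
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5();
                 save_everystep = false, save_start = false)

# Hypothetical helper: keep only the final saved state of the ODE solution.
last_state(sol) = Array(sol)[:, end]

model = Lux.Chain(Lux.Dense(2, 16, tanh),
                  node,
                  Lux.WrappedFunction(last_state),
                  Lux.Dense(16, 2))

ps, st = Lux.setup(rng, model)
y, _ = model(Float32[1.0, 2.0], ps, st)
```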

Do not limit yourself to the pre-built architectures. With this package, you can explore many ways to integrate the two methodologies:

  • Neural networks can be defined where the “activations” are nonlinear functions described by differential equations
  • Neural networks can be defined where some layers are ODE solves
  • ODEs can be defined where some terms are neural networks (a sketch follows this list)
  • Cost functions on ODEs can define neural networks
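
For instance, the third pattern, an ODE whose right-hand side combines a known mechanistic term with a learned neural-network correction, can be written directly against DifferentialEquations.jl and Lux. This is only a rough sketch; the decay rate, network size, and solver are illustrative, and for actual training you would typically flatten the parameters (e.g. into a ComponentArray) and differentiate through the solve:

```julia
using OrdinaryDiffEq, Lux, Random

rng = Random.default_rng()

# A small network providing the unknown part of the dynamics.
nn = Lux.Chain(Lux.Dense(2, 8, tanh), Lux.Dense(8, 2))
p_nn, st = Lux.setup(rng, nn)

# Known linear decay plus a learned correction term.
function rhs!(du, u, p, t)
    correction, _ = nn(u, p, st)
    du .= -0.1f0 .* u .+ correction
end

u0 = Float32[1.0, 0.5]
prob = ODEProblem(rhs!, u0, (0.0f0, 1.0f0), p_nn)
sol = solve(prob, Tsit5(); saveat = 0.1f0)
```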

Flux ODE Training Animation

Breaking Changes

v4

  • TensorLayer has been removed, use Boltz.Layers.TensorProductLayer instead.
  • Basis functions in DiffEqFlux have been removed in favor of Boltz.Basis module.
  • SplineLayer has been removed, use Boltz.Layers.SplineLayer instead.
  • NeuralHamiltonianDE has been removed, use NeuralODE with Layers.HamiltonianNN instead.
  • HamiltonianNN has been removed in favor of Layers.HamiltonianNN.
  • Lux and Boltz are updated to v1.

v3

  • The Flux dependency has been dropped. If a model that is not a Lux AbstractLuxLayer is passed, we try to automatically convert it to a Lux model with FromFluxAdaptor()(model).
  • Flux is no longer re-exported from DiffEqFlux. Instead, we re-export Lux.
  • NeuralDAE now allows an optional du0 as input.
  • TensorLayer is now a Lux Neural Network.
  • APIs for quite a few layer constructions have changed. Please refer to the updated documentation for more details.