InvertibleNetworks.jl


🎯 Overview

InvertibleNetworks.jl provides memory-efficient building blocks for invertible neural networks with hand-derived gradients, Jacobians, and log-determinants. The package is designed for high-performance scientific computing and machine learning applications.

✨ Key Features

  • Memory Efficient: Hand-derived gradients, Jacobians J, and log-determinants log|J|; activations are recomputed during backpropagation rather than stored
  • Flux Integration: Seamless integration with Flux.jl for automatic differentiation
  • AD Support: Support for Zygote and ChainRules automatic differentiation
  • GPU Support: Nvidia GPU support via CuArray
  • Comprehensive Examples: Various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification

🚀 Quick Start

Installation

In the Julia REPL,

] add InvertibleNetworks

Or, using the Pkg API:

using Pkg
Pkg.add("InvertibleNetworks")

Running Tests

using Pkg
Pkg.test("InvertibleNetworks")

Basic Usage

using InvertibleNetworks, Flux, LinearAlgebra

# Create a simple activation normalization layer
an = ActNorm(10; logdet=true)

# Forward pass (dimensions are nx, ny, n_channels, batchsize)
X = randn(Float32, 64, 64, 10, 4)
Y, logdet = an.forward(X)

# Inverse pass
X_reconstructed = an.inverse(Y)

# Test invertibility (up to Float32 round-off)
@assert norm(X - X_reconstructed) / norm(X) < 1f-5
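
Because gradients are hand-derived, the backward pass propagates the data gradient and recomputes the layer input from its output, so intermediate activations need not be stored. A minimal sketch continuing the snippet above (the squared-error loss is illustrative, and get_grads is the assumed accessor for parameter gradients):

# Gradient of the loss 0.5*||Y - Y0||^2 with respect to Y
Y0 = randn(Float32, size(Y))
ΔY = Y - Y0

# backward takes the output-side gradient and the output itself;
# it returns the input-side gradient and recomputes X on the fly
ΔX, X_ = an.backward(ΔY, Y)

# Parameter gradients are accumulated inside the layer
g = get_grads(an)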

GPU Support

using InvertibleNetworks, Flux
# With recent Flux versions, CUDA.jl must also be loaded: using CUDA

# Move data and layer to the GPU
X = randn(Float32, 64, 64, 10, 4) |> gpu
AN = ActNorm(10; logdet=true) |> gpu

# Forward pass on GPU
Y, logdet = AN.forward(X)

🧱 Building Blocks

Core Layers

  • ActNorm: Activation normalization (Kingma and Dhariwal, 2018) (example)
  • Conv1x1: 1x1 Convolutions using Householder transformations (example)
  • ResidualBlock: Invertible residual blocks (example)
  • CouplingLayerGlow: Invertible coupling layer from Dinh et al. (2017) (example)
  • CouplingLayerHINT: Invertible recursive coupling layer HINT from Kruse et al. (2020) (example)
  • CouplingLayerHyperbolic: Invertible hyperbolic layer from Lensink et al. (2019) (example)
  • CouplingLayerIRIM: Invertible coupling layer from Putzky and Welling (2019) (example)
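
These blocks compose freely. A minimal sketch chaining a 1x1 convolution with a Glow coupling layer (the constructor arguments n_in and n_hidden and their values are illustrative):

using InvertibleNetworks

n_in, n_hidden = 8, 32
C  = Conv1x1(n_in)
CL = CouplingLayerGlow(n_in, n_hidden; logdet=true)

X  = randn(Float32, 32, 32, n_in, 2)
Y1 = C.forward(X)
Y2, logdet = CL.forward(Y1)

# Invert the chain in reverse order
X_ = C.inverse(CL.inverse(Y2))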

Activation Functions

  • ReLU: Rectified Linear Unit
  • LeakyReLU: Leaky Rectified Linear Unit
  • Sigmoid: Sigmoid activation with optional scaling
  • Sigmoid2: Modified sigmoid activation
  • GaLU: Gated Linear Unit
  • ExpClamp: Exponential with clamping
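
Each activation comes with a hand-derived gradient. A minimal sketch (the companion name ReLUgrad follows the package's grad-function convention and is assumed here):

x  = randn(Float32, 16, 16, 4, 2)
y  = ReLU(x)                  # forward
Δy = randn(Float32, size(y))  # incoming gradient
Δx = ReLUgrad(Δy, x)          # gradient with respect to the input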

Utilities

  • Jacobian Computation: Hand-derived Jacobians for memory efficiency
  • Dimensionality Manipulation: squeeze/unsqueeze (column, patch, checkerboard), split/cat
  • Wavelet Transform
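
A minimal sketch of the dimensionality utilities (tensor_split and tensor_cat are the assumed names of the split/cat helpers):

X = randn(Float32, 64, 64, 4, 2)

# Checkerboard squeeze trades spatial resolution for channels:
# (64, 64, 4, 2) -> (32, 32, 16, 2)
Y  = squeeze(X; pattern="checkerboard")
X_ = unsqueeze(Y; pattern="checkerboard")

# Split and re-concatenate along the channel dimension
X1, X2 = tensor_split(X)
X_cat  = tensor_cat(X1, X2)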

🌐 Network Architectures

Pre-built Networks

  • NetworkGlow: Generative flow with invertible 1x1 convolutions (generic example, source)
  • NetworkHINT: Multi-scale HINT networks
  • NetworkHyperbolic: Hyperbolic networks
  • NetworkIRIM: Invertible recurrent inference machines (Putzky and Welling, 2019) (generic example)
  • NetworkConditionalGlow: Conditional Glow networks
  • NetworkConditionalHINT: Conditional HINT networks
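
A minimal sketch building a small Glow network with L multiscale levels and K coupling steps per level (the argument values are illustrative):

using InvertibleNetworks

n_in, n_hidden, L, K = 4, 32, 2, 4
G = NetworkGlow(n_in, n_hidden, L, K)

X = randn(Float32, 32, 32, n_in, 2)
Z, logdet = G.forward(X)
X_ = G.inverse(Z)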

🔍 Uncertainty-aware Image Reconstruction

Due to its favorable memory scaling, InvertibleNetworks.jl has been particularly successful at Bayesian posterior sampling with simulation-based inference. To get started with this application, refer to the simple example below (conditional sampling for MNIST inpainting); feel free to modify the script for your application, and please reach out to us for help.

Example: MNIST Inpainting

# See examples/applications/conditional_sampling/amortized_glow_mnist_inpainting.jl
# for a complete example of conditional sampling for MNIST inpainting

[Figure: conditional sampling results for MNIST inpainting (mnist_sampling_cond)]

Other Examples

  • Invertible recurrent inference machines (Putzky and Welling, 2019) (generic example)

  • Generative models with maximum likelihood via the change of variable formula (example)

  • Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) (generic example, source)
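
The maximum-likelihood objective behind these generative examples follows from the change of variable formula: minimize 0.5 * ||Z||^2 / batchsize - logdet, where Z = G(X). A minimal training-step sketch under that assumption (the random batch X stands in for real data):

using InvertibleNetworks, Flux, LinearAlgebra

G = NetworkGlow(4, 32, 2, 4)
X = randn(Float32, 32, 32, 4, 8)   # stand-in for a data batch
batchsize = size(X, 4)

# Change-of-variables loss: 0.5*||Z||^2 / batchsize - logdet
Z, logdet = G.forward(X)
f = 0.5f0 * norm(Z)^2 / batchsize - logdet

# Hand-derived backward pass populates parameter gradients
ΔZ = Z / batchsize
G.backward(ΔZ, Z)

# Update parameters with a Flux optimizer (Adam in newer Flux versions)
opt = Flux.ADAM(1f-3)
for p in get_params(G)
    Flux.update!(opt, p.data, p.grad)
end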

📖 Documentation

  • API Documentation: Stable | Development
  • Examples: See the examples/ directory for comprehensive usage examples
  • Tests: The test/ directory contains extensive unit tests

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

📄 Citation

If you use InvertibleNetworks.jl in your research, please cite:

@article{Orozco2024, 
    doi = {10.21105/joss.06554}, 
    url = {https://doi.org/10.21105/joss.06554}, 
    year = {2024}, 
    publisher = {The Open Journal}, 
    volume = {9}, 
    number = {99}, 
    pages = {6554}, 
    author = {Rafael Orozco and Philipp Witte and Mathias Louboutin and Ali Siahkoohi and Gabrio Rizzuti and Bas Peters and Felix J. Herrmann}, 
    title = {InvertibleNetworks.jl: A Julia package for scalable normalizing flows}, 
    journal = {Journal of Open Source Software} 
}

📚 Related Publications

The following publications use InvertibleNetworks.jl:

👥 Authors

  • Rafael Orozco - Georgia Institute of Technology [[email protected]]
  • Philipp Witte - Georgia Institute of Technology (now Microsoft)
  • Gabrio Rizzuti - Utrecht University
  • Mathias Louboutin - Georgia Institute of Technology
  • Ali Siahkoohi - Georgia Institute of Technology

🙏 Acknowledgments

This package uses functions from NNlib.jl, Flux.jl, and Wavelets.jl.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.