ZüNIS documentation

ZüNIS (Zürich Neural Importance Sampling) is a work-in-progress PyTorch-based library for Monte Carlo integration based on neural importance sampling [1], developed at ETH Zürich. In simple terms, we use artificial intelligence to compute integrals faster.

The goal is to provide a flexible library to integrate black-box functions with a level of automation comparable to the VEGAS library [2], while using state-of-the-art methods that overcome the limitations of existing tools.

Get Started

Do you need to compute an integral right now and cannot wait?

  1. go to the Installation page

  2. have a look at our Basic Example

API Overview

The ZüNIS library provides three levels of abstraction, allowing both high-level and fine-grained control:

1. Integrators are the highest level of abstraction and control function integration strategies. They can automate trainer and flow creation.

2. Trainers are one level below and steer model training through loss functions, optimizers, sampling etc. They can automate flow creation.

3. Normalizing Flows are neural-network-based bijections from the unit hypercube to itself. They are the actual trainable sampling mechanism that we use to sample points for Monte Carlo integration.

Functions

The ZüNIS library is a tool to compute integrals, and functions are therefore a central element of its API. The goal is to be as agnostic as possible about which functions can be integrated: they are always treated as a black box. In particular, they do not need to be differentiable, to run on a specific device, on a specific thread, etc.

The specifications we enforce are:

  1. integrals are always computed over a d-dimensional unit hypercube

  2. a function is a callable Python object

  3. input and output are provided by batch

In specific terms, the input will always be a torch.Tensor object x with shape \((N, d)\) and values between 0 and 1, and the output is expected to be a torch.Tensor object y with shape \((N,)\), such that y[i] = f(x[i]).
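A minimal example of a function satisfying this contract (the integrand here is an illustrative choice, not part of the library):

```python
import math

import torch


def f(x):
    """Toy integrand obeying the ZüNIS function contract.

    x: torch.Tensor of shape (N, d) with entries in [0, 1]
    returns: torch.Tensor of shape (N,), with y[i] = f(x[i])
    """
    # Product of sines over each dimension, evaluated per batch entry
    return torch.sin(math.pi * x).prod(dim=-1)


# Batched evaluation on 1000 points in 3 dimensions
x = torch.rand(1000, 3)
y = f(x)
assert y.shape == (1000,)
```

Note that the function receives a whole batch at once and returns one value per point; it never sees individual points.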

Importance sampling

ZüNIS is a tool to compute integrals by importance sampling Monte Carlo estimation. This means that we have a function \(f\) defined over some multi-dimensional space \(\Omega\) and we want to compute

\[I = \int_\Omega dx f(x)\]

The importance sampling approach is based on the observation that for any probability density function \(p\) that is non-zero everywhere on \(\Omega\),

\[I = \underset{x \sim p(x) } {\mathbb{E}}\frac{f(x)}{p(x)}\]
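This identity follows directly from rewriting the integral as an expectation under \(p\):

\[I = \int_\Omega dx\, f(x) = \int_\Omega dx\, p(x)\, \frac{f(x)}{p(x)} = \underset{x \sim p(x)}{\mathbb{E}}\left[\frac{f(x)}{p(x)}\right]\]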

We can therefore define an estimator for \(I\) by sampling \(N\) points \(x_i\) from \(p\) and averaging the ratios:

\[\hat{I}_N = \frac{1}{N}\sum_{i=1}^{N} \frac{f(x_i)}{p(x_i)}, \quad x_i \sim p\]

The standard deviation of this estimator \(\hat{I}_N\) is

\[\sigma\left[\hat{I}_N\right] = \frac{1}{\sqrt{N}}\left(\underset{x \sim p(x)}{\sigma}\left[ \frac{f(x)}{p(x)}\right]\right)\]

and the name of the game is to find a \(p(x)\) that minimizes this quantity, in order to minimize the number of times we need to sample the function \(f\) to reach a given uncertainty on our integral estimate.
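The effect of the choice of \(p\) can be seen in a simple one-dimensional sketch (plain PyTorch, not using ZüNIS itself; the integrand and sampling density are illustrative choices): we integrate \(f(x) = 3x^2\) over \([0, 1]\), whose exact value is 1, first with uniform sampling and then with the better-adapted density \(p(x) = 2x\).

```python
import torch

torch.manual_seed(0)
N = 100_000


def f(x):
    return 3 * x**2  # integral over [0, 1] is exactly 1


# Uniform sampling: p(x) = 1, so the ratio f(x)/p(x) is just f(x)
x_uni = torch.rand(N)
ratio_uni = f(x_uni)

# Importance sampling with p(x) = 2x, drawn by inverse CDF: x = sqrt(u)
x_imp = torch.rand(N).sqrt()
ratio_imp = f(x_imp) / (2 * x_imp)

for name, r in [("uniform", ratio_uni), ("importance", ratio_imp)]:
    mean = r.mean()
    err = r.std() / N**0.5  # standard deviation of the estimator
    print(f"{name}: I ~ {mean:.4f} +/- {err:.4f}")
```

Both estimators converge to 1, but the importance-sampled one has a markedly smaller standard deviation, because \(f(x)/p(x)\) is closer to constant. ZüNIS's job is to learn such a \(p\) automatically with a normalizing flow instead of requiring it to be chosen by hand.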

If this seems like a problem that machine learning should be able to solve, you are indeed onto something.