dkltrainer_integrator module¶
Survey/Refine integrator based on training models with a DKL trainer
- class DKLAdaptiveSurveyIntegrator(*args, **kwargs)[source]¶
Bases:
zunis.integration.adaptive_survey_integrator.AdaptiveSurveyIntegrator
Survey/Refine adaptive integrator based on the DKL loss. The loss is the Kullback-Leibler (D_KL) divergence between the PDF of a flow model and an un-normalized target function, up to non-trainable terms.
Explicitly:
L(f, q) = - ∫ dx f(x) log q(x)
This integrator is adaptive in the sense that survey batches are sampled from the flat distribution in the target space (the domain of f and q) until the learned q distribution is a better approximation of the normalized target function f than the flat distribution. Since our target space is the unit hypercube, this is easy:
L(f, uniform) = 0.
So as soon as the loss is negative, we sample from the flow instead of the uniform distribution.
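The effect of this criterion can be illustrated with a small, self-contained Monte Carlo sketch. The integrand `f` and the peaked comparison density below are hypothetical choices made only for illustration; they are not part of ZüNIS. A flat q has log q(x) = 0 everywhere, so the loss estimate vanishes exactly, while a density concentrated where f is large drives the loss negative:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2  # dimension of the unit hypercube (illustrative)

def f(x):
    # Hypothetical un-normalized target, concentrated at the centre of [0, 1]^d
    return np.exp(-32.0 * np.sum((x - 0.5) ** 2, axis=-1))

def dkl_loss(xs, log_q):
    # Monte Carlo estimate of L(f, q) = - int dx f(x) log q(x),
    # with xs drawn uniformly on the unit hypercube
    return -np.mean(f(xs) * log_q(xs))

xs = rng.random((100_000, d))

# Flat distribution: log q(x) = 0 everywhere, so L(f, uniform) = 0 exactly
loss_uniform = dkl_loss(xs, lambda x: np.zeros(len(x)))

# A normalized density peaked where f is large: product of 6 t (1 - t) marginals
log_q_peaked = lambda x: np.sum(np.log(6.0 * x * (1.0 - x)), axis=-1)
loss_peaked = dkl_loss(xs, log_q_peaked)

print(loss_uniform)  # 0.0: the flat baseline
print(loss_peaked)   # negative: the peaked q beats uniform, triggering the switch
```

Because the flat baseline sits exactly at zero, the sign of the loss alone is enough to decide which distribution to sample from.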
- Parameters
f (function) – the function to integrate
n_iter (int) – general number of iterations - ignored for survey/refine if n_iter_survey/n_iter_refine is set
n_iter_survey (int) – number of iterations for the survey stage
n_iter_refine (int) – number of iterations for the refine stage
n_points (int) – general number of points per iteration - ignored for survey/refine if n_points_survey/n_points_refine is set
n_points_survey (int) – number of points per iteration for the survey stage
n_points_refine (int) – number of points per iteration for the refine stage
use_survey (bool) – whether to use the points generated during the survey to compute the final integral; not recommended due to uncontrolled correlations in error estimates
verbosity (int) – verbosity level of the integrator
- survey_switch_condition()[source]¶
Check if the loss is negative. This test is used to switch from uniform sampling to sampling from the flow in the survey phase.
The loss is the distance between the target function and the flow PDF. Since the distance between the target function and the uniform distribution is zero, a negative loss indicates that the flow is doing better than uniform sampling.
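The survey logic described above can be sketched as a toy loop. This is not the ZüNIS implementation: the integrand `f`, the Beta(a, a)-style density family standing in for a flow being trained, and the schedule of `a` values are all hypothetical. The loop draws flat survey batches, estimates the loss, and flips to flow sampling once the condition fires:

```python
import numpy as np
from math import gamma, log

rng = np.random.default_rng(1)
d = 2  # dimension of the unit hypercube (illustrative)

def f(x):
    # Hypothetical un-normalized target on [0, 1]^d
    return np.exp(-32.0 * np.sum((x - 0.5) ** 2, axis=-1))

def log_q(x, a):
    # Normalized Beta(a, a) product density on the hypercube;
    # a = 1 is the flat distribution, larger a is more peaked at 0.5
    log_norm = log(gamma(2 * a)) - 2 * log(gamma(a))
    return np.sum(log_norm + (a - 1) * np.log(x * (1 - x)), axis=-1)

def survey_switch_condition(loss):
    # Mirrors the test described above: a negative loss means the model
    # beats the flat baseline, so survey sampling should switch to it
    return loss < 0.0

sampling = "uniform"
for a in (1.0, 1.5, 2.0, 2.5):  # stand-in for successive training steps
    xs = rng.random((100_000, d))  # survey batch drawn from the flat distribution
    loss = -np.mean(f(xs) * log_q(xs, a))
    if survey_switch_condition(loss):
        sampling = "flow"  # from here on, sample survey batches from q
        break

print(sampling)  # "flow" once the loss dips below zero
```

At a = 1 the density is flat and the loss is exactly zero, so the condition does not fire; as soon as the density sharpens toward f, the loss turns negative and the loop switches.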