Issue
Currently, since dimensionality is the primary bottleneck, the dimension reduction creates a low-dimensional map f from pairs (x, G(x)) such that
G(x) = D o f o E(x), for an encoder-decoder pair E, D that possibly reduces dimensionality.
When predicting at a new x', the current Emulator.predict() method performs f o E(x'), with optional decoding.
During sampling for MCMC we typically use this predict method.
For full efficiency in sampling too, we should really be projecting x |-> z = E(x) and sampling over the encoded z, then projecting back to x at the end.
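To make the composition concrete, here is a minimal toy sketch of the current setup, where a linear (PCA-like) encoder-decoder pair brackets a cheap surrogate. The names `E`, `D`, `f`, and `predict` are illustrative placeholders, not the actual Emulator API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear encoder/decoder pair (E, D), e.g. a PCA basis.
d_full, d_red = 50, 3
basis, _ = np.linalg.qr(rng.normal(size=(d_full, d_red)))  # orthonormal columns

def E(x):
    """Encode: project x from R^d_full down to z in R^d_red."""
    return basis.T @ x

def D(z):
    """Decode: lift z back up to R^d_full."""
    return basis @ z

def f(z):
    """Stand-in surrogate fitted on encoded pairs; any cheap map works here."""
    return np.sin(z)

def predict(x):
    """Current Emulator.predict(): the composition D o f o E."""
    return D(f(E(x)))

x = rng.normal(size=d_full)
y = predict(x)  # a full-dimensional prediction
```

Calling `predict` inside an MCMC loop pays for the decode `D` (and re-encode `E`) at every step, which is exactly the cost the proposal below avoids.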
Solution
- Possibly remove the `encoder_schedule` from the emulator storage - have it initialized there and passed back
- Pass the `encoder_schedule` to the `MCMCWrapper`
- Encode not only the observation in MCMC but also the prior and IC.
- Sample as normal with the reduced prior; when returning the posterior from the MCMC, project the samples back into the original space
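The steps above can be sketched end-to-end in a toy example: encode the prior mean and initial condition, run the chain entirely in the low-dimensional z-space, and decode only the returned samples. All names here (`E`, `D`, `f`, the random-walk Metropolis loop) are illustrative assumptions, not the actual `MCMCWrapper` internals:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy encoder/decoder and surrogate, as assumed placeholders.
d_full, d_red = 20, 2
basis, _ = np.linalg.qr(rng.normal(size=(d_full, d_red)))
E = lambda x: basis.T @ x          # encode to z-space
D = lambda z: basis @ z            # decode back to x-space
f = lambda z: z.sum()              # stand-in surrogate output

obs, noise = 1.0, 0.5
z_init = E(np.zeros(d_full))       # encode the prior mean / IC up front

def log_post(z):
    # Gaussian prior in z plus a Gaussian likelihood through the surrogate f;
    # no decode is needed anywhere inside the sampling loop.
    return -0.5 * np.sum((z - z_init) ** 2) - 0.5 * ((f(z) - obs) / noise) ** 2

# Random-walk Metropolis entirely in the reduced z-space.
z, lp = z_init.copy(), log_post(z_init)
samples = []
for _ in range(2000):
    prop = z + 0.3 * rng.normal(size=d_red)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z, lp = prop, lp_prop
    samples.append(z)

# Decode back to the original space only once, at the end.
posterior_x = np.array([D(zi) for zi in samples])
```

The key property is that `D` is applied once per retained sample rather than once per likelihood evaluation, which is where the efficiency gain comes from.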