Add the ability to incorporate time-varying calibration uncertainty into the inference
It would be interesting to try incorporating the effects of a time-varying amplitude/phase calibration uncertainty into the inference stage. If priors were set on amplitude scale factors and phase offsets at a set of nodes in time, the values could be interpolated between the nodes and applied to rescale/rephase the data during inference. This would mean, however, that the fast likelihood evaluation using the pre-summed dot products of the data and antenna patterns could not be used, since the rescaled data would change at every likelihood evaluation.
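A minimal sketch of the interpolate-and-apply step, assuming complex (e.g. heterodyned) data; the function name, its arguments, and the choice of linear interpolation are all illustrative assumptions, not a proposed API:

```python
import numpy as np

def apply_calibration(data, times, node_times, node_amps, node_phases):
    """Rescale/rephase complex data using calibration parameters
    interpolated between a set of nodes in time.

    data        : complex array of data samples
    times       : sample times corresponding to `data`
    node_times  : times of the calibration nodes
    node_amps   : amplitude scale factors at the nodes (drawn from priors)
    node_phases : phase offsets at the nodes, in radians (drawn from priors)
    """
    # interpolate the node values onto the data's time samples
    amp = np.interp(times, node_times, node_amps)
    phase = np.interp(times, node_times, node_phases)
    # apply the amplitude rescaling and phase rotation
    return data * amp * np.exp(1j * phase)
```

Inside a likelihood, this would be called once per sample with the node values drawn from their priors, which is why the data-dependent dot products can no longer be pre-summed.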