Distance marginalization error

When running a distance- and time-marginalized analysis, the code fails with the following error:

```
Traceback (most recent call last):
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/bilby_mcmc/sampler.py", line 1303, in call_step
    sampler = sampler.step()
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/bilby_mcmc/sampler.py", line 1210, in step
    prop[LOGLKEY] = self.log_likelihood(prop)
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/bilby_mcmc/sampler.py", line 1171, in log_likelihood
    logl = _likelihood.log_likelihood_ratio()
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/gw/likelihood.py", line 420, in log_likelihood_ratio
    h_inner_h=optimal_snr_squared)
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/gw/likelihood.py", line 714, in time_marginalized_likelihood
    d_inner_h=d_inner_h_tc_array, h_inner_h=h_inner_h)
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/gw/likelihood.py", line 701, in distance_marginalized_likelihood
    d_inner_h_ref, h_inner_h_ref)
  File "/home/gregory.ashton/.conda/envs/bilby-test-2/lib/python3.7/site-packages/bilby/core/utils.py", line 932, in __call__
    output[~bad], ier = bispeu(*self.tck, x[~bad], y[~bad])
ValueError: unexpected array size: new_size=1, got array with arr_size=0
```

If I'm reading it correctly, every entry of `bad` is True, so `~bad` is all False and `x[~bad]` and `y[~bad]` are empty arrays, which is what `bispeu` chokes on. I think this can be fixed with a simple check here: if `bad` is all True, that line isn't needed and can be skipped.
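
For concreteness, here is a minimal standalone sketch of my reading and the guard I have in mind. The toy spline, the bounds, and the nan fill are made up for the example; only `bispeu` and the `bad`/`output` pattern come from `utils.py`:

```python
import numpy as np
from scipy.interpolate import bisplrep
from scipy.interpolate.dfitpack import bispeu  # same private routine utils.py wraps

# Toy spline on [0, 1] x [0, 1], standing in for self.tck (made up here)
xg, yg = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
tck = bisplrep(xg.ravel(), yg.ravel(), (xg * yg).ravel(), kx=1, ky=1, s=0)

# A single requested point that is out of bounds, so bad is all True
x, y = np.array([2.0]), np.array([2.0])
bad = (x < 0) | (x > 1) | (y < 0) | (y > 1)

output = np.full_like(x, np.nan)
# Without the guard, x[~bad] and y[~bad] are empty and bispeu raises the
# "unexpected array size" ValueError shown in the traceback above.
if not bad.all():
    output[~bad], ier = bispeu(*tck, x[~bad], y[~bad])
print(output)  # [nan]
```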

However, I'm rather unfamiliar with this bit of the code. Does this sound correct, @colm.talbot?