Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (25)
Showing 1067 additions and 398 deletions
......@@ -38,10 +38,6 @@ containers:
.test-python: &test-python
stage: initial
image: python
before_script:
# this is required because pytables doesn't use a wheel on py37
- apt-get -yqq update
- apt-get -yqq install libhdf5-dev
script:
- python -m pip install .
- python -m pip list installed
......@@ -109,6 +105,15 @@ precommits-py3.9:
# CACHE_DIR: ".pip310"
# PYVERSION: "python310"
install:
stage: initial
parallel:
matrix:
- EXTRA: [gw, mcmc, all]
image: containers.ligo.org/lscsoft/bilby/v2-bilby-python39
script:
- pip install .[$EXTRA]
# ------------------- Test stage -------------------------------------------
.unit-tests: &unit-test
......@@ -133,7 +138,9 @@ python-3.9:
- coverage xml
artifacts:
reports:
cobertura: coverage.xml
coverage_report:
coverage_format: cobertura
path: coverage/cobertura-coverage.xml
paths:
- htmlcov/
expire_in: 30 days
......
include README.rst
include LICENSE.md
include requirements.txt
include gw_requirements.txt
include mcmc_requirements.txt
include optional_requirements.txt
include sampler_requirements.txt
recursive-include test *.py *.prior
......@@ -333,7 +333,7 @@ class Prior(object):
@classmethod
def from_repr(cls, string):
"""Generate the prior from it's __repr__"""
"""Generate the prior from its __repr__"""
return cls._from_repr(string)
@classmethod
......@@ -429,9 +429,9 @@ class Prior(object):
val = None
elif re.sub(r'\'.*\'', '', val) in ['r', 'u']:
val = val[2:-1]
elif "'" in val:
elif val.startswith("'") and val.endswith("'"):
val = val.strip("'")
elif '(' in val:
elif '(' in val and not val.startswith(("[", "{")):
other_cls = val.split('(')[0]
vals = '('.join(val.split('(')[1:])[:-1]
if "." in other_cls:
......
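The two conditions above tighten how Prior._parse_argument_string treats values: only a value wrapped end-to-end in single quotes is unquoted, and a "(" triggers nested-object parsing only when the value is not a list or dict literal. A minimal standalone sketch of that behaviour (illustrative only, not the bilby implementation):

    def classify_value(val):
        # Only strings quoted end-to-end are treated as plain strings.
        if val.startswith("'") and val.endswith("'"):
            return "string", val.strip("'")
        # A "(" signals a nested object, unless the value is a list/dict literal.
        if "(" in val and not val.startswith(("[", "{")):
            return "nested", val.split("(")[0]
        return "literal", val

    print(classify_value("'H1'"))                           # ('string', 'H1')
    print(classify_value("Uniform(minimum=0,maximum=1)"))   # ('nested', 'Uniform')
    print(classify_value("[Uniform(0,1),2]"))               # ('literal', '[Uniform(0,1),2]')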
This diff is collapsed.
import re
import numpy as np
import scipy.stats
from scipy.special import erfinv
......@@ -7,7 +9,6 @@ from ..utils import logger, infer_args_from_method, get_dict_with_properties
class BaseJointPriorDist(object):
def __init__(self, names, bounds=None):
"""
A class defining JointPriorDist that will be overwritten with child
......@@ -24,7 +25,7 @@ class BaseJointPriorDist(object):
A list of bounds on each parameter. The defaults are for bounds at
+/- infinity.
"""
self.distname = 'joint_dist'
self.distname = "joint_dist"
if not isinstance(names, list):
self.names = [names]
else:
......@@ -41,8 +42,9 @@ class BaseJointPriorDist(object):
for bound in bounds:
if isinstance(bounds, (list, tuple, np.ndarray)):
if len(bound) != 2:
raise ValueError("Bounds must contain an upper and "
"lower value.")
raise ValueError(
"Bounds must contain an upper and " "lower value."
)
else:
if bound[1] <= bound[0]:
raise ValueError("Bounds are not properly set")
......@@ -76,8 +78,7 @@ class BaseJointPriorDist(object):
Check if all requested parameters have been filled.
"""
return not np.any([val is None for val in
self.requested_parameters.values()])
return not np.any([val is None for val in self.requested_parameters.values()])
def reset_request(self):
"""
......@@ -92,8 +93,7 @@ class BaseJointPriorDist(object):
Check if all the rescaled parameters have been filled.
"""
return not np.any([val is None for val in
self.rescale_parameters.values()])
return not np.any([val is None for val in self.rescale_parameters.values()])
def reset_rescale(self):
"""
......@@ -131,8 +131,12 @@ class BaseJointPriorDist(object):
"""
dist_name = self.__class__.__name__
instantiation_dict = self.get_instantiation_dict()
args = ', '.join(['{}={}'.format(key, repr(instantiation_dict[key]))
for key in instantiation_dict])
args = ", ".join(
[
"{}={}".format(key, repr(instantiation_dict[key]))
for key in instantiation_dict
]
)
return "{}({})".format(dist_name, args)
def prob(self, samp):
......@@ -308,9 +312,17 @@ class BaseJointPriorDist(object):
class MultivariateGaussianDist(BaseJointPriorDist):
def __init__(self, names, nmodes=1, mus=None, sigmas=None, corrcoefs=None,
covs=None, weights=None, bounds=None):
def __init__(
self,
names,
nmodes=1,
mus=None,
sigmas=None,
corrcoefs=None,
covs=None,
weights=None,
bounds=None,
):
"""
A class defining a multi-variate Gaussian, allowing multiple modes for
a Gaussian mixture model.
......@@ -359,11 +371,13 @@ class MultivariateGaussianDist(BaseJointPriorDist):
for name in self.names:
bound = self.bounds[name]
if bound[0] != -np.inf or bound[1] != np.inf:
logger.warning("If using bounded ranges on the multivariate "
"Gaussian this will lead to biased posteriors "
"for nested sampling routines that require "
"a prior transform.")
self.distname = 'mvg'
logger.warning(
"If using bounded ranges on the multivariate "
"Gaussian this will lead to biased posteriors "
"for nested sampling routines that require "
"a prior transform."
)
self.distname = "mvg"
self.mus = []
self.covs = []
self.corrcoefs = []
......@@ -385,8 +399,7 @@ class MultivariateGaussianDist(BaseJointPriorDist):
if len(np.shape(sigmas)) == 1:
sigmas = [sigmas]
elif len(np.shape(sigmas)) == 0:
raise ValueError("Must supply a list of standard "
"deviations")
raise ValueError("Must supply a list of standard " "deviations")
if covs is not None:
if isinstance(covs, np.ndarray):
covs = [covs]
......@@ -404,10 +417,11 @@ class MultivariateGaussianDist(BaseJointPriorDist):
if len(np.shape(corrcoefs)) == 2:
corrcoefs = [np.array(corrcoefs)]
elif len(np.shape(corrcoefs)) != 3:
raise TypeError("List of correlation coefficients the wrong shape")
raise TypeError(
"List of correlation coefficients the wrong shape"
)
elif not isinstance(corrcoefs, list):
raise TypeError("Must pass a list of correlation "
"coefficients")
raise TypeError("Must pass a list of correlation " "coefficients")
if weights is not None:
if isinstance(weights, (int, float)):
weights = [weights]
......@@ -429,12 +443,11 @@ class MultivariateGaussianDist(BaseJointPriorDist):
sigma = sigmas[i] if sigmas is not None else None
corrcoef = corrcoefs[i] if corrcoefs is not None else None
cov = covs[i] if covs is not None else None
weight = weights[i] if weights is not None else 1.
weight = weights[i] if weights is not None else 1.0
self.add_mode(mu, sigma, corrcoef, cov, weight)
def add_mode(self, mus=None, sigmas=None, corrcoef=None, cov=None,
weight=1.):
def add_mode(self, mus=None, sigmas=None, corrcoef=None, cov=None, weight=1.0):
"""
Add a new mode.
"""
......@@ -455,8 +468,10 @@ class MultivariateGaussianDist(BaseJointPriorDist):
if len(self.covs[-1].shape) != 2:
raise ValueError("Covariance matrix must be a 2d array")
if (self.covs[-1].shape[0] != self.covs[-1].shape[1] or
self.covs[-1].shape[0] != self.num_vars):
if (
self.covs[-1].shape[0] != self.covs[-1].shape[1]
or self.covs[-1].shape[0] != self.num_vars
):
raise ValueError("Covariance shape is inconsistent")
# check matrix is symmetric
......@@ -473,23 +488,25 @@ class MultivariateGaussianDist(BaseJointPriorDist):
self.corrcoefs.append(np.asarray(corrcoef))
if len(self.corrcoefs[-1].shape) != 2:
raise ValueError("Correlation coefficient matrix must be a 2d "
"array.")
if (self.corrcoefs[-1].shape[0] != self.corrcoefs[-1].shape[1] or
self.corrcoefs[-1].shape[0] != self.num_vars):
raise ValueError("Correlation coefficient matrix shape is "
"inconsistent")
raise ValueError(
"Correlation coefficient matrix must be a 2d " "array."
)
if (
self.corrcoefs[-1].shape[0] != self.corrcoefs[-1].shape[1]
or self.corrcoefs[-1].shape[0] != self.num_vars
):
raise ValueError(
"Correlation coefficient matrix shape is " "inconsistent"
)
# check matrix is symmetric
if not np.allclose(self.corrcoefs[-1], self.corrcoefs[-1].T):
raise ValueError("Correlation coefficient matrix is not "
"symmetric")
raise ValueError("Correlation coefficient matrix is not " "symmetric")
# check diagonal is all ones
if not np.all(np.diag(self.corrcoefs[-1]) == 1.):
raise ValueError("Correlation coefficient matrix is not"
"correct")
if not np.all(np.diag(self.corrcoefs[-1]) == 1.0):
raise ValueError("Correlation coefficient matrix is not" "correct")
try:
self.sigmas.append(list(sigmas)) # standard deviations
......@@ -497,8 +514,10 @@ class MultivariateGaussianDist(BaseJointPriorDist):
raise TypeError("'sigmas' must be a list")
if len(self.sigmas[-1]) != self.num_vars:
raise ValueError("Number of standard deviations must be the "
"same as the number of parameters.")
raise ValueError(
"Number of standard deviations must be the "
"same as the number of parameters."
)
# convert correlation coefficients to covariance matrix
D = self.sigmas[-1] * np.identity(self.corrcoefs[-1].shape[0])
......@@ -515,18 +534,20 @@ class MultivariateGaussianDist(BaseJointPriorDist):
self.eigvalues.append(evals)
self.eigvectors.append(evecs)
except Exception as e:
raise RuntimeError("Problem getting eigenvalues and vectors: "
"{}".format(e))
raise RuntimeError(
"Problem getting eigenvalues and vectors: " "{}".format(e)
)
# check eigenvalues are positive
if np.any(self.eigvalues[-1] <= 0.):
raise ValueError("Correlation coefficient matrix is not positive "
"definite")
if np.any(self.eigvalues[-1] <= 0.0):
raise ValueError(
"Correlation coefficient matrix is not positive " "definite"
)
self.sqeigvalues.append(np.sqrt(self.eigvalues[-1]))
# set the weights
if weight is None:
self.weights.append(1.)
self.weights.append(1.0)
else:
self.weights.append(weight)
......@@ -537,12 +558,13 @@ class MultivariateGaussianDist(BaseJointPriorDist):
self.nmodes += 1
# add multivariate Gaussian
self.mvn.append(scipy.stats.multivariate_normal(mean=self.mus[-1],
cov=self.covs[-1]))
self.mvn.append(
scipy.stats.multivariate_normal(mean=self.mus[-1], cov=self.covs[-1])
)
def _rescale(self, samp, **kwargs):
try:
mode = kwargs['mode']
mode = kwargs["mode"]
except KeyError:
mode = None
......@@ -552,17 +574,17 @@ class MultivariateGaussianDist(BaseJointPriorDist):
else:
mode = np.argwhere(self.cumweights - np.random.rand() > 0)[0][0]
samp = erfinv(2. * samp - 1) * 2. ** 0.5
samp = erfinv(2.0 * samp - 1) * 2.0 ** 0.5
# rotate and scale to the multivariate normal shape
samp = self.mus[mode] + self.sigmas[mode] * np.einsum('ij,kj->ik',
samp * self.sqeigvalues[mode],
self.eigvectors[mode])
samp = self.mus[mode] + self.sigmas[mode] * np.einsum(
"ij,kj->ik", samp * self.sqeigvalues[mode], self.eigvectors[mode]
)
return samp
def _sample(self, size, **kwargs):
try:
mode = kwargs['mode']
mode = kwargs["mode"]
except KeyError:
mode = None
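The rescale step above first maps unit-cube samples to standard-normal draws before rotating and scaling them: erfinv(2u - 1) * sqrt(2) is the inverse CDF of a standard normal. A quick numerical check (illustrative):

    import numpy as np
    from scipy.special import erfinv
    from scipy.stats import norm

    u = np.array([0.1, 0.5, 0.9])
    # The transform used in _rescale agrees with the standard-normal quantile function.
    print(erfinv(2.0 * u - 1) * 2.0 ** 0.5)  # [-1.28155157  0.          1.28155157]
    print(norm.ppf(u))                       # identical values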
......@@ -620,12 +642,15 @@ class MultivariateGaussianDist(BaseJointPriorDist):
if sorted(self.__dict__.keys()) != sorted(other.__dict__.keys()):
return False
for key in self.__dict__:
if key == 'mvn':
if key == "mvn":
if len(self.__dict__[key]) != len(other.__dict__[key]):
return False
for thismvn, othermvn in zip(self.__dict__[key], other.__dict__[key]):
if (not isinstance(thismvn, scipy.stats._multivariate.multivariate_normal_frozen) or
not isinstance(othermvn, scipy.stats._multivariate.multivariate_normal_frozen)):
if not isinstance(
thismvn, scipy.stats._multivariate.multivariate_normal_frozen
) or not isinstance(
othermvn, scipy.stats._multivariate.multivariate_normal_frozen
):
return False
elif isinstance(self.__dict__[key], (np.ndarray, list)):
thisarr = np.asarray(self.__dict__[key])
......@@ -645,13 +670,44 @@ class MultivariateGaussianDist(BaseJointPriorDist):
return False
return True
@classmethod
def from_repr(cls, string):
"""Generate the distribution from its __repr__"""
return cls._from_repr(string)
@classmethod
def _from_repr(cls, string):
subclass_args = infer_args_from_method(cls.__init__)
string = string.replace(" ", "")
kwargs = cls._split_repr(string)
for key in kwargs:
val = kwargs[key]
if key not in subclass_args:
raise AttributeError(
"Unknown argument {} for class {}".format(key, cls.__name__)
)
else:
kwargs[key.strip()] = Prior._parse_argument_string(val)
return cls(**kwargs)
@classmethod
def _split_repr(cls, string):
string = string.replace(",", ", ")
# see https://stackoverflow.com/a/72146415/1862861
args = re.findall(r"(\w+)=(\[.*?]|{.*?}|\S+)(?=\s*,\s*\w+=|\Z)", string)
kwargs = dict()
for key, arg in args:
kwargs[key.strip()] = arg
return kwargs
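The regex in _split_repr splits key=value pairs only on commas that precede the next keyword (or the end of the string), so bracketed list and dict values survive intact. An illustrative run on an assumed repr fragment:

    import re

    pattern = r"(\w+)=(\[.*?]|{.*?}|\S+)(?=\s*,\s*\w+=|\Z)"
    string = "names=['a', 'b'], nmodes=1, mus=[[0.0, 1.0]]"  # hypothetical input
    print(re.findall(pattern, string))
    # [('names', "['a', 'b']"), ('nmodes', '1'), ('mus', '[[0.0, 1.0]]')]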
class MultivariateNormalDist(MultivariateGaussianDist):
""" A synonym for the :class:`~bilby.core.prior.MultivariateGaussianDist` distribution."""
"""A synonym for the :class:`~bilby.core.prior.MultivariateGaussianDist` distribution."""
class JointPrior(Prior):
def __init__(self, dist, name=None, latex_label=None, unit=None):
"""This defines the single parameter Prior object for parameters that belong to a JointPriorDist
......@@ -667,14 +723,23 @@ class JointPrior(Prior):
See superclass
"""
if BaseJointPriorDist not in dist.__class__.__bases__:
raise TypeError("Must supply a JointPriorDist object instance to be shared by all joint params")
raise TypeError(
"Must supply a JointPriorDist object instance to be shared by all joint params"
)
if name not in dist.names:
raise ValueError("'{}' is not a parameter in the JointPriorDist".format(name))
raise ValueError(
"'{}' is not a parameter in the JointPriorDist".format(name)
)
self.dist = dist
super(JointPrior, self).__init__(name=name, latex_label=latex_label, unit=unit, minimum=dist.bounds[name][0],
maximum=dist.bounds[name][1])
super(JointPrior, self).__init__(
name=name,
latex_label=latex_label,
unit=unit,
minimum=dist.bounds[name][0],
maximum=dist.bounds[name][1],
)
@property
def minimum(self):
......@@ -737,9 +802,11 @@ class JointPrior(Prior):
"""
if self.name in self.dist.sampled_parameters:
logger.warning("You have already drawn a sample from parameter "
"'{}'. The same sample will be "
"returned".format(self.name))
logger.warning(
"You have already drawn a sample from parameter "
"'{}'. The same sample will be "
"returned".format(self.name)
)
if len(self.dist.current_sample) == 0:
# generate a sample
......@@ -779,16 +846,22 @@ class JointPrior(Prior):
# check for the same number of values for each parameter
for i in range(len(self.dist) - 1):
if (isinstance(values[i], (list, np.ndarray)) or
isinstance(values[i + 1], (list, np.ndarray))):
if (isinstance(values[i], (list, np.ndarray)) and
isinstance(values[i + 1], (list, np.ndarray))):
if isinstance(values[i], (list, np.ndarray)) or isinstance(
values[i + 1], (list, np.ndarray)
):
if isinstance(values[i], (list, np.ndarray)) and isinstance(
values[i + 1], (list, np.ndarray)
):
if len(values[i]) != len(values[i + 1]):
raise ValueError("Each parameter must have the same "
"number of requested values.")
raise ValueError(
"Each parameter must have the same "
"number of requested values."
)
else:
raise ValueError("Each parameter must have the same "
"number of requested values.")
raise ValueError(
"Each parameter must have the same "
"number of requested values."
)
lnp = self.dist.ln_prob(np.asarray(values).T)
......@@ -798,16 +871,16 @@ class JointPrior(Prior):
else:
# if not all parameters have been requested yet, just return 0
if isinstance(val, (float, int)):
return 0.
return 0.0
else:
try:
# check value has a length
len(val)
except Exception as e:
raise TypeError('Invalid type for ln_prob: {}'.format(e))
raise TypeError("Invalid type for ln_prob: {}".format(e))
if len(val) == 1:
return 0.
return 0.0
else:
return np.zeros_like(val)
......@@ -831,14 +904,18 @@ class JointPrior(Prior):
class MultivariateGaussian(JointPrior):
def __init__(self, dist, name=None, latex_label=None, unit=None):
if not isinstance(dist, MultivariateGaussianDist):
raise JointPriorDistError("dist object must be instance of MultivariateGaussianDist")
super(MultivariateGaussian, self).__init__(dist=dist, name=name, latex_label=latex_label, unit=unit)
raise JointPriorDistError(
"dist object must be instance of MultivariateGaussianDist"
)
super(MultivariateGaussian, self).__init__(
dist=dist, name=name, latex_label=latex_label, unit=unit
)
class MultivariateNormal(MultivariateGaussian):
""" A synonym for the :class:`bilby.core.prior.MultivariateGaussian`
prior distribution."""
"""A synonym for the :class:`bilby.core.prior.MultivariateGaussian`
prior distribution."""
class JointPriorDistError(PriorException):
""" Class for Error handling of JointPriorDists for JointPriors """
"""Class for Error handling of JointPriorDists for JointPriors"""
......@@ -1849,7 +1849,7 @@ class ResultList(list):
raise ResultListError("Inconsistent parameters between results")
def check_consistent_data(self):
if not np.all([res.log_noise_evidence == self[0].log_noise_evidence for res in self])\
if not np.allclose([res.log_noise_evidence for res in self], self[0].log_noise_evidence, atol=1e-8, rtol=0.0)\
and not np.all([np.isnan(res.log_noise_evidence) for res in self]):
raise ResultListError("Inconsistent data between results")
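The consistency check above now tolerates tiny floating-point differences in the stored noise evidences instead of demanding exact equality. A small illustration of the relaxed comparison, with made-up values:

    import numpy as np

    log_noise_evidences = [-8123.456789012, -8123.456789012, -8123.4567890119]
    print(np.allclose(log_noise_evidences, log_noise_evidences[0], atol=1e-8, rtol=0.0))  # True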
......@@ -1977,6 +1977,8 @@ def make_pp_plot(results, filename=None, save=True, confidence_interval=[0.68, 0
The font size for the legend
keys: list
A list of keys to use, if None defaults to search_parameter_keys
title: bool
Whether to add the number of results and total p-value as a plot title
confidence_interval_alpha: float, list, optional
The transparency for the background condifence interval
weight_list: list, optional
......@@ -1998,11 +2000,12 @@ def make_pp_plot(results, filename=None, save=True, confidence_interval=[0.68, 0
if weight_list is None:
weight_list = [None] * len(results)
credible_levels = pd.DataFrame()
credible_levels = list()
for i, result in enumerate(results):
credible_levels = credible_levels.append(
result.get_all_injection_credible_levels(keys, weights=weight_list[i]),
ignore_index=True)
credible_levels.append(
result.get_all_injection_credible_levels(keys, weights=weight_list[i])
)
credible_levels = pd.DataFrame(credible_levels)
if lines is None:
colors = ["C{}".format(i) for i in range(8)]
......
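make_pp_plot now accumulates one dictionary of credible levels per result in a plain list and builds the DataFrame in a single call, since DataFrame.append was deprecated and later removed from pandas. The pattern on its own, with made-up numbers:

    import pandas as pd

    per_result_levels = [
        {"mass_1": 0.12, "mass_2": 0.87},  # hypothetical credible levels, one dict per result
        {"mass_1": 0.45, "mass_2": 0.33},
    ]
    credible_levels = pd.DataFrame(per_result_levels)
    print(credible_levels)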
......@@ -9,7 +9,7 @@ import multiprocessing
import pickle
import numpy as np
from pandas import DataFrame
from pandas import DataFrame, Series
from ..core.likelihood import MarginalizedLikelihoodReconstructionError
from ..core.utils import logger, solar_mass, command_line_args
......@@ -32,12 +32,16 @@ def redshift_to_comoving_distance(redshift, cosmology=None):
def luminosity_distance_to_redshift(distance, cosmology=None):
from astropy import units
cosmology = get_cosmology(cosmology)
if isinstance(distance, Series):
distance = distance.values
return z_at_value(cosmology.luminosity_distance, distance * units.Mpc)
def comoving_distance_to_redshift(distance, cosmology=None):
from astropy import units
cosmology = get_cosmology(cosmology)
if isinstance(distance, Series):
distance = distance.values
return z_at_value(cosmology.comoving_distance, distance * units.Mpc)
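Both conversions above strip the pandas index before handing distances to astropy, on the assumption that z_at_value behaves more reliably with a plain ndarray than with a Series. The new guard in isolation, with assumed sample values:

    import numpy as np
    from pandas import Series

    distance = Series([400.0, 800.0])  # hypothetical luminosity distances in Mpc
    if isinstance(distance, Series):
        distance = distance.values
    print(type(distance), distance)    # <class 'numpy.ndarray'> [400. 800.]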
......@@ -1209,9 +1213,9 @@ def compute_snrs(sample, likelihood, npool=1):
"""
if likelihood is not None:
if isinstance(sample, dict):
likelihood.parameters.update(sample)
signal_polarizations =\
likelihood.waveform_generator.frequency_domain_strain(sample)
likelihood.parameters.update(sample)
for ifo in likelihood.interferometers:
per_detector_snr = likelihood.calculate_snrs(
signal_polarizations, ifo)
......@@ -1255,10 +1259,10 @@ def _compute_snrs(args):
"""A wrapper of computing the SNRs to enable multiprocessing"""
ii, sample, likelihood = args
sample = dict(sample).copy()
likelihood.parameters.update(sample)
signal_polarizations = likelihood.waveform_generator.frequency_domain_strain(
sample
)
likelihood.parameters.update(sample)
snrs = list()
for ifo in likelihood.interferometers:
snrs.append(likelihood.calculate_snrs(signal_polarizations, ifo))
......
......@@ -39,7 +39,7 @@ def get_cosmology(cosmology=None):
if cosmology is None:
cosmology = DEFAULT_COSMOLOGY
elif isinstance(cosmology, str):
cosmology = cosmo.__dict__[cosmology]
cosmology = getattr(cosmo, cosmology)
return cosmology
......@@ -65,7 +65,7 @@ def set_cosmology(cosmology=None):
elif isinstance(cosmology, cosmo.FLRW):
cosmology = cosmology
elif isinstance(cosmology, str):
cosmology = cosmo.__dict__[cosmology]
cosmology = getattr(cosmo, cosmology)
elif isinstance(cosmology, dict):
if 'Ode0' in cosmology.keys():
if 'w0' in cosmology.keys():
......
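Replacing the cosmo.__dict__ lookup with getattr resolves the named cosmology through the normal attribute machinery, which also covers names that recent astropy versions provide lazily via a module-level __getattr__. A hedged illustration, assuming astropy is installed as the surrounding module already requires:

    from astropy import cosmology as cosmo

    name = "Planck18"
    # __dict__ only sees names already bound on the module object, so this can be
    # False if the realization has not been materialised yet.
    print(name in cosmo.__dict__)
    # getattr triggers the full lookup, including any lazy loading, so this works.
    print(getattr(cosmo, name))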
This diff is collapsed.
......@@ -309,7 +309,8 @@ def _base_lal_cbc_fd_waveform(
if pn_amplitude_order != 0:
start_frequency = lalsim.SimInspiralfLow2fStart(
minimum_frequency, int(pn_amplitude_order), approximant)
float(minimum_frequency), int(pn_amplitude_order), approximant
)
else:
start_frequency = minimum_frequency
......@@ -339,9 +340,9 @@ def _base_lal_cbc_fd_waveform(
lalsim.SimInspiralWaveformParamsInsertPNAmplitudeOrder(
waveform_dictionary, int(pn_amplitude_order))
lalsim_SimInspiralWaveformParamsInsertTidalLambda1(
waveform_dictionary, lambda_1)
waveform_dictionary, float(lambda_1))
lalsim_SimInspiralWaveformParamsInsertTidalLambda2(
waveform_dictionary, lambda_2)
waveform_dictionary, float(lambda_2))
for key, value in waveform_kwargs.items():
func = getattr(lalsim, "SimInspiralWaveformParamsInsert" + key, None)
......@@ -772,9 +773,9 @@ def _base_waveform_frequency_sequence(
lalsim.SimInspiralWaveformParamsInsertPNAmplitudeOrder(
waveform_dictionary, int(pn_amplitude_order))
lalsim_SimInspiralWaveformParamsInsertTidalLambda1(
waveform_dictionary, lambda_1)
waveform_dictionary, float(lambda_1))
lalsim_SimInspiralWaveformParamsInsertTidalLambda2(
waveform_dictionary, lambda_2)
waveform_dictionary, float(lambda_2))
for key, value in waveform_kwargs.items():
func = getattr(lalsim, "SimInspiralWaveformParamsInsert" + key, None)
......
......@@ -721,25 +721,6 @@ def gw_data_find(observatory, gps_start_time, duration, calibration,
return output_cache_file
def build_roq_weights(data, basis, deltaF):
"""
For a data array and reduced basis compute roq weights
Parameters
==========
data: array-like
data set
basis: array-like
(reduced basis element)*invV (the inverse Vandermonde matrix)
deltaF: float
integration element df
"""
weights = np.dot(data, np.conjugate(basis)) * deltaF * 4.
return weights
def convert_args_list_to_float(*args_list):
""" Converts inputs to floats, returns a list in the same order as the input"""
try:
......
......@@ -54,8 +54,8 @@ RUN git clone https://github.com/jellis18/PTMCMCSampler.git \
&& (cd PTMCMCSampler && python setup.py install)
# Install GW packages
RUN conda install -n ${{conda_env}} -c conda-forge python-lalsimulation
RUN pip install ligo-gracedb gwpy ligo.skymap bilby.cython
RUN conda install -n ${{conda_env}} -c conda-forge python-lalsimulation bilby.cython
RUN pip install ligo-gracedb gwpy ligo.skymap
# Add the ROQ data to the image
RUN mkdir roq_basis \
......@@ -64,4 +64,6 @@ RUN mkdir roq_basis \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis.hdf5 \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis_multiband.hdf5
......@@ -56,8 +56,8 @@ RUN git clone https://github.com/jellis18/PTMCMCSampler.git \
&& (cd PTMCMCSampler && python setup.py install)
# Install GW packages
RUN conda install -n ${conda_env} -c conda-forge python-lalsimulation
RUN pip install ligo-gracedb gwpy ligo.skymap bilby.cython
RUN conda install -n ${conda_env} -c conda-forge python-lalsimulation bilby.cython
RUN pip install ligo-gracedb gwpy ligo.skymap
# Add the ROQ data to the image
RUN mkdir roq_basis \
......@@ -66,4 +66,6 @@ RUN mkdir roq_basis \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis.hdf5 \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis_multiband.hdf5
......@@ -56,8 +56,8 @@ RUN git clone https://github.com/jellis18/PTMCMCSampler.git \
&& (cd PTMCMCSampler && python setup.py install)
# Install GW packages
RUN conda install -n ${conda_env} -c conda-forge python-lalsimulation
RUN pip install ligo-gracedb gwpy ligo.skymap bilby.cython
RUN conda install -n ${conda_env} -c conda-forge python-lalsimulation bilby.cython
RUN pip install ligo-gracedb gwpy ligo.skymap
# Add the ROQ data to the image
RUN mkdir roq_basis \
......@@ -66,4 +66,6 @@ RUN mkdir roq_basis \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis.hdf5 \
&& wget https://git.ligo.org/soichiro.morisaki/roq_basis/raw/main/IMRPhenomD/16s_nospins/basis_multiband.hdf5
......@@ -8,6 +8,4 @@ pandas
dill
tqdm
h5py
tables
astropy
attrs
\ No newline at end of file
attrs
......@@ -16,11 +16,5 @@ addopts =
name = bilby
license_file = LICENSE.md
[options.extras_require]
gw =
astropy
gwpy
lalsuite
[isort]
known_third_party = astropy,attr,bilby,deepdish,gwin,gwinc,gwpy,lal,lalsimulation,matplotlib,mock,nflows,numpy,packaging,pandas,past,pycbc,pymc3,pytest,scipy,setuptools,skimage,torch
......@@ -56,8 +56,12 @@ def get_long_description():
return long_description
def get_requirements():
with open("requirements.txt", "r") as ff:
def get_requirements(kind=None):
if kind is None:
fname = "requirements.txt"
else:
fname = f"{kind}_requirements.txt"
with open(fname, "r") as ff:
requirements = ff.readlines()
return requirements
......@@ -107,6 +111,16 @@ setup(
},
python_requires=">=3.8",
install_requires=get_requirements(),
extras_require={
"gw": get_requirements("gw"),
"mcmc": get_requirements("mcmc"),
"all": (
get_requirements("sampler")
+ get_requirements("gw")
+ get_requirements("mcmc")
+ get_requirements("optional")
),
},
entry_points={
"console_scripts": [
"bilby_plot=cli_bilby.plot_multiple_posteriors:main",
......
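The new extras mirror the per-feature requirements files added to MANIFEST.in and the CI install matrix (pip install .[gw], .[mcmc], .[all]). A hypothetical standalone sketch of the resulting mapping, assuming the *_requirements.txt files sit next to setup.py and it is run from the repository root:

    def get_requirements(kind=None):
        # Mirrors the helper above: no kind reads requirements.txt, otherwise <kind>_requirements.txt.
        fname = "requirements.txt" if kind is None else f"{kind}_requirements.txt"
        with open(fname, "r") as ff:
            return ff.readlines()

    extras = {
        "gw": get_requirements("gw"),
        "mcmc": get_requirements("mcmc"),
        "all": (
            get_requirements("sampler")
            + get_requirements("gw")
            + get_requirements("mcmc")
            + get_requirements("optional")
        ),
    }
    print({key: len(val) for key, val in extras.items()})

Each extra is then installable as, for example, pip install bilby[gw], matching the pip install .[$EXTRA] jobs in the CI matrix above.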