Compare revisions

Changes are shown as if the source revision was being merged into the target revision.
Commits on Source (87)
Showing with 995 additions and 125 deletions
......@@ -14,12 +14,10 @@ stages:
- deploy
# test example on python 2
python-2:
python-2.7:
stage: test
image: bilbydev/test-suite-py2
image: bilbydev/bilby-test-suite-python27
before_script:
# Source the .bashrc for MultiNest
- source /root/.bashrc
# Install the dependencies specified in the Pipfile
- pipenv install --two --python=/opt/conda/bin/python2 --system --deploy
script:
......@@ -28,12 +26,10 @@ python-2:
- pytest --ignore=test/utils_py3_test.py
# test example on python 3
python-3:
python-3.7:
stage: test
image: bilbydev/test-suite-py3
image: bilbydev/bilby-test-suite-python37
before_script:
# Source the .bashrc for MultiNest
- source /root/.bashrc
# Install the dependencies specified in the Pipfile
- pipenv install --three --python=/opt/conda/bin/python --system --deploy
script:
......@@ -49,7 +45,6 @@ python-3:
# Make the documentation
- cd docs
- conda install -y make
- make clean
- make html
......@@ -62,8 +57,8 @@ python-3:
pages:
stage: deploy
dependencies:
- python-3
- python-2
- python-3.7
- python-2.7
script:
- mkdir public/
- mv htmlcov/ public/
......
......@@ -2,12 +2,40 @@
## Unreleased
### Added
-
### Changed
-
### Removed
-
## [0.3.5] 2019-01-25
### Added
- Reduced Order Quadrature likelihood
- PTMCMCSampler
- CBC result class
- Additional tutorials on using GraceDB and experts guide on running on events in open data
### Changed
- Updated repository information in Dockerfile for PyMultinest
## [0.3.4] 2019-01-10
### Changes
- Renamed the "basic_tutorial.py" example to "fast_tutorial.py" and created a
"standard_15d_cbc_tutorial.py"
- Renamed "prior" to "priors" in bilby.gw.likelihood.GravtitationalWaveTransient
for consistency with bilby.core. **WARNING**: This will break scripts which
use marginalization.
- Added `outdir` kwarg for plotting methods in `bilby.core.result.Result`. This makes plotting
into custom destinations easier.
- Fixed definition of matched_filter_snr, the interferometer method has become `ifo.inner_product`.
### Added
- log-likelihood evaluations for pymultinest
## [0.3.3] 2018-11-08
Changes currently on master, but not under a tag.
......@@ -20,6 +48,7 @@ Changes currently on master, but not under a tag.
- Added method to result to get injection recovery credible levels
- Added function to generate a pp-plot from many results to core/result.py
- Fixed a bug which caused `Interferometer.detector_tensor` not to update when `latitude`, `longitude`, `xarm_azimuth`, `yarm_azimuth`, `xarm_tilt`, `yarm_tilt` were updated.
- Added implementation of the ROQ likelihood. The basis needs to be specified by the user.
- Extracted time and frequency series behaviour from `WaveformGenerator` and `InterferometerStrainData` and moved it to `series.gw.CoupledTimeAndFrequencySeries`
### Changes
......
include README.rst
include LICENSE.md
......@@ -210,20 +210,24 @@ class PriorDict(OrderedDict):
"""
return np.product([self[key].prob(sample[key]) for key in sample], **kwargs)
def ln_prob(self, sample):
def ln_prob(self, sample, axis=None):
"""
Parameters
----------
sample: dict
Dictionary of the samples of which we want to have the log probability of
Dictionary of the samples of which to calculate the log probability
axis: None or int
Axis along which the summation is performed
Returns
-------
float: Joint log probability of all the individual sample probabilities
float or ndarray:
Joint log probability of all the individual sample probabilities
"""
return np.sum([self[key].ln_prob(sample[key]) for key in sample])
return np.sum([self[key].ln_prob(sample[key]) for key in sample],
axis=axis)
def rescale(self, keys, theta):
"""Rescale samples from unit cube to prior
......
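For reference, a minimal usage sketch of the new `axis` keyword (assumes bilby is installed; `Uniform` is `bilby.core.prior.Uniform`):

```python
import numpy as np
from bilby.core.prior import PriorDict, Uniform

priors = PriorDict(dict(a=Uniform(0, 1), b=Uniform(0, 2)))

# A single sample: the joint log prior, returned as a float.
print(priors.ln_prob(dict(a=0.5, b=1.0)))

# Many samples at once: summing over axis=0 (the parameter axis) gives one
# joint log prior per sample, as used to fill the posterior's log_prior column.
samples = dict(a=np.random.uniform(0, 1, 100),
               b=np.random.uniform(0, 2, 100))
log_priors = priors.ln_prob(samples, axis=0)  # ndarray of shape (100,)
```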
......@@ -11,6 +11,7 @@ import corner
import scipy.stats
import matplotlib
import matplotlib.pyplot as plt
from matplotlib import lines as mpllines
from . import utils
from .utils import (logger, infer_parameters_from_function,
......@@ -74,9 +75,10 @@ class Result(object):
nested_samples=None, log_evidence=np.nan,
log_evidence_err=np.nan, log_noise_evidence=np.nan,
log_bayes_factor=np.nan, log_likelihood_evaluations=None,
sampling_time=None, nburn=None, walkers=None,
max_autocorrelation_time=None, parameter_labels=None,
parameter_labels_with_unit=None, version=None):
log_prior_evaluations=None, sampling_time=None, nburn=None,
walkers=None, max_autocorrelation_time=None,
parameter_labels=None, parameter_labels_with_unit=None,
version=None):
""" A class to store the results of the sampling run
Parameters
......@@ -102,6 +104,8 @@ class Result(object):
Natural log evidences
log_likelihood_evaluations: array_like
The evaluations of the likelihood for each sample point
log_prior_evaluations: array_like
The evaluations of the prior for each sample point
sampling_time: float
The time taken to complete the sampling
nburn: int
......@@ -116,9 +120,10 @@ class Result(object):
Version information for software used to generate the result. Note,
this information is generated when the result object is initialized
Note:
All sampling output parameters, e.g. the samples themselves are
typically not given at initialisation, but set at a later stage.
Note
---------
All sampling output parameters, e.g. the samples themselves are
typically not given at initialisation, but set at a later stage.
"""
......@@ -143,10 +148,14 @@ class Result(object):
self.log_noise_evidence = log_noise_evidence
self.log_bayes_factor = log_bayes_factor
self.log_likelihood_evaluations = log_likelihood_evaluations
self.log_prior_evaluations = log_prior_evaluations
self.sampling_time = sampling_time
self.version = version
self.max_autocorrelation_time = max_autocorrelation_time
self.prior_values = None
self._kde = None
def __str__(self):
"""Print a summary """
if getattr(self, 'posterior', None) is not None:
......@@ -269,8 +278,8 @@ class Result(object):
'log_noise_evidence', 'log_bayes_factor', 'priors', 'posterior',
'injection_parameters', 'meta_data', 'search_parameter_keys',
'fixed_parameter_keys', 'sampling_time', 'sampler_kwargs',
'log_likelihood_evaluations', 'samples', 'nested_samples',
'walkers', 'nburn', 'parameter_labels',
'log_likelihood_evaluations', 'log_prior_evaluations', 'samples',
'nested_samples', 'walkers', 'nburn', 'parameter_labels',
'parameter_labels_with_unit', 'version']
dictionary = OrderedDict()
for attr in save_attrs:
......@@ -281,7 +290,7 @@ class Result(object):
pass
return dictionary
def save_to_file(self, overwrite=False):
def save_to_file(self, overwrite=False, outdir=None):
"""
Writes the Result to a deepdish h5 file
......@@ -290,9 +299,12 @@ class Result(object):
overwrite: bool, optional
Whether or not to overwrite an existing result file.
default=False
outdir: str, optional
Path to the outdir. Default is the one stored in the result object.
"""
file_name = result_file_name(self.outdir, self.label)
utils.check_directory_exists_and_if_not_mkdir(self.outdir)
outdir = self._safe_outdir_creation(outdir, self.save_to_file)
file_name = result_file_name(outdir, self.label)
if os.path.isfile(file_name):
if overwrite:
logger.debug('Removing existing file {}'.format(file_name))
......@@ -322,10 +334,10 @@ class Result(object):
logger.error("\n\n Saving the data has failed with the "
"following message:\n {} \n\n".format(e))
def save_posterior_samples(self):
def save_posterior_samples(self, outdir=None):
"""Saves posterior samples to a file"""
filename = '{}/{}_posterior_samples.txt'.format(self.outdir, self.label)
utils.check_directory_exists_and_if_not_mkdir(self.outdir)
outdir = self._safe_outdir_creation(outdir, self.save_posterior_samples)
filename = '{}/{}_posterior_samples.txt'.format(outdir, self.label)
self.posterior.to_csv(filename, index=False, header=True)
def get_latex_labels_from_parameter_keys(self, keys):
......@@ -385,7 +397,7 @@ class Result(object):
return self.posterior_volume / self.prior_volume(priors)
def get_one_dimensional_median_and_error_bar(self, key, fmt='.2f',
quantiles=[0.16, 0.84]):
quantiles=(0.16, 0.84)):
""" Calculate the median and error bar for a given key
Parameters
......@@ -394,8 +406,8 @@ class Result(object):
The parameter key for which to calculate the median and error bar
fmt: str, ('.2f')
A format string
quantiles: list
A length-2 list of the lower and upper-quantiles to calculate
quantiles: list, tuple
A length-2 tuple of the lower and upper-quantiles to calculate
the errors bars for.
Returns
......@@ -424,8 +436,8 @@ class Result(object):
def plot_single_density(self, key, prior=None, cumulative=False,
title=None, truth=None, save=True,
file_base_name=None, bins=50, label_fontsize=16,
title_fontsize=16, quantiles=[0.16, 0.84], dpi=300):
""" Plot a 1D marginal density, either probablility or cumulative.
title_fontsize=16, quantiles=(0.16, 0.84), dpi=300):
""" Plot a 1D marginal density, either probability or cumulative.
Parameters
----------
......@@ -454,8 +466,8 @@ class Result(object):
The number of histogram bins
label_fontsize, title_fontsize: int
The fontsizes for the labels and titles
quantiles: list
A length-2 list of the lower and upper-quantiles to calculate
quantiles: tuple
A length-2 tuple of the lower and upper-quantiles to calculate
the errors bars for.
dpi: int
Dots per inch resolution of the plot
......@@ -489,7 +501,7 @@ class Result(object):
if isinstance(prior, Prior):
theta = np.linspace(ax.get_xlim()[0], ax.get_xlim()[1], 300)
ax.plot(theta, Prior.prob(theta), color='C2')
ax.plot(theta, prior.prob(theta), color='C2')
if save:
fig.tight_layout()
......@@ -504,7 +516,8 @@ class Result(object):
def plot_marginals(self, parameters=None, priors=None, titles=True,
file_base_name=None, bins=50, label_fontsize=16,
title_fontsize=16, quantiles=[0.16, 0.84], dpi=300):
title_fontsize=16, quantiles=(0.16, 0.84), dpi=300,
outdir=None):
""" Plot 1D marginal distributions
Parameters
......@@ -527,12 +540,14 @@ class Result(object):
bins: int
The number of histogram bins
label_fontsize, title_fontsize: int
The fontsizes for the labels and titles
quantiles: list
A length-2 list of the lower and upper-quantiles to calculate
The font sizes for the labels and titles
quantiles: tuple
A length-2 tuple of the lower and upper-quantiles to calculate
the errors bars for.
dpi: int
Dots per inch resolution of the plot
outdir: str, optional
Path to the outdir. Default is the one stored in the result object.
Returns
-------
......@@ -554,7 +569,8 @@ class Result(object):
truths = self.injection_parameters
if file_base_name is None:
file_base_name = '{}/{}_1d/'.format(self.outdir, self.label)
outdir = self._safe_outdir_creation(outdir, self.plot_marginals)
file_base_name = '{}/{}_1d/'.format(outdir, self.label)
check_directory_exists_and_if_not_mkdir(file_base_name)
if priors is True:
......@@ -605,7 +621,8 @@ class Result(object):
**kwargs:
Other keyword arguments are passed to `corner.corner`. We set some
defaults to improve the basic look and feel, but these can all be
overridden.
overridden. An optional 'outdir' argument can also be given to
override the outdir stored in the result object.
Notes
-----
......@@ -716,8 +733,8 @@ class Result(object):
if save:
if filename is None:
utils.check_directory_exists_and_if_not_mkdir(self.outdir)
filename = '{}/{}_corner.png'.format(self.outdir, self.label)
outdir = self._safe_outdir_creation(kwargs.get('outdir'), self.plot_corner)
filename = '{}/{}_corner.png'.format(outdir, self.label)
logger.debug('Saving corner plot to {}'.format(filename))
fig.savefig(filename, dpi=dpi)
plt.close(fig)
......@@ -748,16 +765,16 @@ class Result(object):
ax.set_ylabel(self.parameter_labels[i])
fig.tight_layout()
filename = '{}/{}_walkers.png'.format(self.outdir, self.label)
outdir = self._safe_outdir_creation(kwargs.get('outdir'), self.plot_walkers)
filename = '{}/{}_walkers.png'.format(outdir, self.label)
logger.debug('Saving walkers plot to {}'.format(filename))
utils.check_directory_exists_and_if_not_mkdir(self.outdir)
fig.savefig(filename)
plt.close(fig)
def plot_with_data(self, model, x, y, ndraws=1000, npoints=1000,
xlabel=None, ylabel=None, data_label='data',
data_fmt='o', draws_label=None, filename=None,
maxl_label='max likelihood', dpi=300):
maxl_label='max likelihood', dpi=300, outdir=None):
""" Generate a figure showing the data and fits to the data
Parameters
......@@ -783,6 +800,8 @@ class Result(object):
filename: str
If given, the filename to use. Otherwise, the filename is generated
from the outdir and label attributes.
outdir: str, optional
Path to the outdir. Default is the one stored in the result object.
"""
......@@ -804,7 +823,7 @@ class Result(object):
s = model_posterior.iloc[self.posterior.log_likelihood.idxmax()]
ax.plot(xsmooth, model(xsmooth, **s), lw=1, color='k',
label=maxl_label)
except AttributeError:
except (AttributeError, TypeError):
logger.debug(
"No log likelihood values stored, unable to plot max")
......@@ -821,8 +840,8 @@ class Result(object):
ax.legend(numpoints=3)
fig.tight_layout()
if filename is None:
utils.check_directory_exists_and_if_not_mkdir(self.outdir)
filename = '{}/{}_plot_with_data'.format(self.outdir, self.label)
outdir = self._safe_outdir_creation(outdir, self.plot_with_data)
filename = '{}/{}_plot_with_data'.format(outdir, self.label)
fig.savefig(filename, dpi=dpi)
plt.close(fig)
......@@ -855,6 +874,11 @@ class Result(object):
data_frame[key] = priors[key]
data_frame['log_likelihood'] = getattr(
self, 'log_likelihood_evaluations', np.nan)
if self.log_prior_evaluations is None:
data_frame['log_prior'] = self.priors.ln_prob(
data_frame[self.search_parameter_keys], axis=0)
else:
data_frame['log_prior'] = self.log_prior_evaluations
if conversion_function is not None:
data_frame = conversion_function(data_frame, likelihood, priors)
self.posterior = data_frame
......@@ -935,20 +959,20 @@ class Result(object):
bool: True if attribute name matches with an attribute of other_object, False otherwise
"""
A = getattr(self, name, False)
B = getattr(other_object, name, False)
logger.debug('Checking {} value: {}=={}'.format(name, A, B))
if (A is not False) and (B is not False):
typeA = type(A)
typeB = type(B)
if typeA == typeB:
if typeA in [str, float, int, dict, list]:
a = getattr(self, name, False)
b = getattr(other_object, name, False)
logger.debug('Checking {} value: {}=={}'.format(name, a, b))
if (a is not False) and (b is not False):
type_a = type(a)
type_b = type(b)
if type_a == type_b:
if type_a in [str, float, int, dict, list]:
try:
return A == B
return a == b
except ValueError:
return False
elif typeA in [np.ndarray]:
return np.all(A == B)
elif type_a in [np.ndarray]:
return np.all(a == b)
return False
@property
......@@ -957,9 +981,9 @@ class Result(object):
Uses `scipy.stats.gaussian_kde` to generate the kernel density
"""
try:
if self._kde:
return self._kde
except AttributeError:
else:
self._kde = scipy.stats.gaussian_kde(
self.posterior[self.search_parameter_keys].values.T)
return self._kde
......@@ -989,6 +1013,18 @@ class Result(object):
for s in sample]
return self.kde(ordered_sample)
def _safe_outdir_creation(self, outdir=None, caller_func=None):
if outdir is None:
outdir = self.outdir
try:
utils.check_directory_exists_and_if_not_mkdir(outdir)
except PermissionError:
raise FileMovedError("Can not write in the out directory.\n"
"Did you move the here file from another system?\n"
"Try calling " + caller_func.__name__ + " with the 'outdir' "
"keyword argument, e.g. " + caller_func.__name__ + "(outdir='.')")
return outdir
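A short sketch of the new `outdir` override, assuming `result` is a `bilby.core.result.Result` (e.g. returned by `run_sampler`); if the stored directory is not writable, `_safe_outdir_creation` raises `FileMovedError` and suggests exactly this keyword:

```python
# The path is a placeholder; file names still use the result's label.
result.save_to_file(outdir='/tmp/bilby_out')            # writes the .h5 there
result.save_posterior_samples(outdir='/tmp/bilby_out')  # <label>_posterior_samples.txt
result.plot_corner(outdir='/tmp/bilby_out')             # picked up via **kwargs
```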
def plot_multiple(results, filename=None, labels=None, colours=None,
save=True, evidences=False, **kwargs):
......@@ -1041,7 +1077,7 @@ def plot_multiple(results, filename=None, labels=None, colours=None,
hist_kwargs['color'] = c
fig = result.plot_corner(fig=fig, save=False, color=c, **kwargs)
default_filename += '_{}'.format(result.label)
lines.append(matplotlib.lines.Line2D([0], [0], color=c))
lines.append(mpllines.Line2D([0], [0], color=c))
default_labels.append(result.label)
# Rescale the axes
......@@ -1091,7 +1127,7 @@ def make_pp_plot(results, filename=None, save=True, **kwargs):
Returns
-------
fig:
Matplotlib figure
matplotlib figure
"""
fig = plt.figure()
credible_levels = pd.DataFrame()
......@@ -1113,3 +1149,11 @@ def make_pp_plot(results, filename=None, save=True, **kwargs):
filename = 'outdir/pp.png'
plt.savefig(filename)
return fig
class ResultError(Exception):
""" Base exception for all Result related errors """
class FileMovedError(ResultError):
""" Exceptions that occur when files have been moved """
......@@ -12,12 +12,14 @@ from .dynesty import Dynesty
from .emcee import Emcee
from .nestle import Nestle
from .ptemcee import Ptemcee
from .ptmcmc import PTMCMCSampler
from .pymc3 import Pymc3
from .pymultinest import Pymultinest
implemented_samplers = {
'cpnest': Cpnest, 'dynesty': Dynesty, 'emcee': Emcee, 'nestle': Nestle,
'ptemcee': Ptemcee, 'pymc3': Pymc3, 'pymultinest': Pymultinest}
'ptemcee': Ptemcee, 'ptmcmcsampler': PTMCMCSampler,
'pymc3': Pymc3, 'pymultinest': Pymultinest}
if command_line_args.sampler_help:
sampler = command_line_args.sampler_help
......@@ -39,7 +41,8 @@ if command_line_args.sampler_help:
def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
sampler='dynesty', use_ratio=None, injection_parameters=None,
conversion_function=None, plot=False, default_priors_file=None,
clean=None, meta_data=None, save=True, **kwargs):
clean=None, meta_data=None, save=True, result_class=None,
**kwargs):
"""
The primary interface to easy parameter estimation
......@@ -81,6 +84,10 @@ def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
overwritten.
save: bool
If true, save the priors and results to disk.
result_class: bilby.core.result.Result, or child of
The result class to use. By default, `bilby.core.result.Result` is used,
but objects which inherit from this class can be given, providing
additional methods.
**kwargs:
All kwargs are passed directly to the samplers `run` function
......@@ -90,6 +97,10 @@ def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
An object containing the results
"""
logger.info(
"Running for label '{}', output will be saved to '{}'".format(
label, outdir))
if clean:
command_line_args.clean = clean
if command_line_args.clean:
......@@ -122,7 +133,8 @@ def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
sampler = sampler_class(
likelihood, priors=priors, outdir=outdir, label=label,
injection_parameters=injection_parameters, meta_data=meta_data,
use_ratio=use_ratio, plot=plot, **kwargs)
use_ratio=use_ratio, plot=plot, result_class=result_class,
**kwargs)
else:
print(implemented_samplers)
raise ValueError(
......@@ -177,4 +189,3 @@ def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
result.plot_corner()
logger.info("Summary of results:\n{}".format(result))
return result
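A hedged sketch of the new `result_class` hook (the `likelihood` and `priors` objects are assumed to exist; the subclass here is invented for illustration):

```python
import bilby

class TimedResult(bilby.core.result.Result):
    """Hypothetical Result subclass adding a convenience method."""
    def sampling_time_minutes(self):
        return self.sampling_time / 60

result = bilby.run_sampler(
    likelihood=likelihood, priors=priors, sampler='dynesty',
    label='custom_result', result_class=TimedResult)
print(result.sampling_time_minutes())
```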
......@@ -35,6 +35,10 @@ class Sampler(object):
A dictionary of the injection parameters
meta_data:
A dictionary of extra meta data to store in the result
result_class: bilby.core.result.Result, or child of
The result class to use. By default, `bilby.core.result.Result` is used,
but objects which inherit from this class can be given, providing
additional methods.
**kwargs: dict
Additional keyword arguments
......@@ -81,7 +85,8 @@ class Sampler(object):
def __init__(
self, likelihood, priors, outdir='outdir', label='label',
use_ratio=False, plot=False, skip_import_verification=False,
injection_parameters=None, meta_data=None, **kwargs):
injection_parameters=None, meta_data=None, result_class=None,
**kwargs):
self.likelihood = likelihood
if isinstance(priors, PriorDict):
self.priors = priors
......@@ -108,7 +113,7 @@ class Sampler(object):
self._log_summary_for_sampler()
self.result = self._initialise_result()
self.result = self._initialise_result(result_class)
@property
def search_parameter_keys(self):
......@@ -145,7 +150,7 @@ class Sampler(object):
external_sampler_name = self.__class__.__name__.lower()
try:
self.external_sampler = __import__(external_sampler_name)
except (ImportError, SystemExit, ModuleNotFoundError):
except (ImportError, SystemExit):
raise SamplerNotInstalledError(
"Sampler {} is not installed on this system".format(external_sampler_name))
......@@ -187,20 +192,30 @@ class Sampler(object):
for key in self.__fixed_parameter_keys:
logger.info(' {} = {}'.format(key, self.priors[key].peak))
def _initialise_result(self):
def _initialise_result(self, result_class):
"""
Returns
-------
bilby.core.result.Result: An initial template for the result
"""
result = Result(label=self.label, outdir=self.outdir,
sampler=self.__class__.__name__.lower(),
search_parameter_keys=self.__search_parameter_keys,
fixed_parameter_keys=self.__fixed_parameter_keys,
priors=self.priors, meta_data=self.meta_data,
injection_parameters=self.injection_parameters,
sampler_kwargs=self.kwargs)
result_kwargs = dict(
label=self.label, outdir=self.outdir,
sampler=self.__class__.__name__.lower(),
search_parameter_keys=self.__search_parameter_keys,
fixed_parameter_keys=self.__fixed_parameter_keys,
priors=self.priors, meta_data=self.meta_data,
injection_parameters=self.injection_parameters,
sampler_kwargs=self.kwargs)
if result_class is None:
result = Result(**result_kwargs)
elif issubclass(result_class, Result):
result = result_class(**result_kwargs)
else:
raise ValueError(
"Input result_class={} not understood".format(result_class))
return result
def _check_if_priors_can_be_sampled(self):
......
......@@ -84,6 +84,8 @@ class Cpnest(NestedSampler):
out.plot()
self.result.posterior = DataFrame(out.posterior_samples)
self.result.posterior.rename(columns=dict(
logL='log_likelihood', logPrior='log_prior'), inplace=True)
self.result.log_evidence = out.NS.state.logZ
self.result.log_evidence_err = np.nan
return self.result
......
......@@ -4,7 +4,7 @@ import numpy as np
from pandas import DataFrame
from ..utils import logger, get_progress_bar
from .base_sampler import MCMCSampler
from .base_sampler import MCMCSampler, SamplerError
class Emcee(MCMCSampler):
......@@ -116,7 +116,15 @@ class Emcee(MCMCSampler):
self.calculate_autocorrelation(sampler.chain.reshape((-1, self.ndim)))
self.print_nburn_logging_info()
self.result.nburn = self.nburn
if self.result.nburn > self.nsteps:
raise SamplerError(
"The run has finished, but the chain is not burned in: "
"`nburn < nsteps`. Try increasing the number of steps.")
self.result.samples = sampler.chain[:, self.nburn:, :].reshape((-1, self.ndim))
blobs_flat = np.array(sampler.blobs)[self.nburn:, :, :].reshape((-1, 2))
log_likelihoods, log_priors = blobs_flat.T
self.result.log_likelihood_evaluations = log_likelihoods
self.result.log_prior_evaluations = log_priors
self.result.walkers = sampler.chain
self.result.log_evidence = np.nan
self.result.log_evidence_err = np.nan
......@@ -142,8 +150,9 @@ class Emcee(MCMCSampler):
for _ in range(self.nwalkers)]
def lnpostfn(self, theta):
p = self.log_prior(theta)
if np.isinf(p):
return -np.inf
log_prior = self.log_prior(theta)
if np.isinf(log_prior):
return -np.inf, [np.nan, np.nan]
else:
return self.log_likelihood(theta) + p
log_likelihood = self.log_likelihood(theta)
return log_likelihood + log_prior, [log_likelihood, log_prior]
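With the two-element return value, emcee stores the log likelihood and log prior of every step as per-walker "blobs". A toy sketch of the mechanism under the emcee 2.x API used here (the Gaussian model is invented for illustration):

```python
import emcee
import numpy as np

def lnpostfn(theta):
    # Flat prior on (-10, 10), unit Gaussian log likelihood.
    log_prior = 0.0 if -10 < theta[0] < 10 else -np.inf
    if np.isinf(log_prior):
        return -np.inf, [np.nan, np.nan]
    log_likelihood = -0.5 * theta[0] ** 2
    return log_likelihood + log_prior, [log_likelihood, log_prior]

sampler = emcee.EnsembleSampler(10, 1, lnpostfn)
sampler.run_mcmc(np.random.uniform(-1, 1, (10, 1)), 100)
# sampler.blobs has shape (nsteps, nwalkers, 2) once cast to an array.
log_likelihoods, log_priors = np.array(sampler.blobs).reshape((-1, 2)).T
```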
......@@ -2,8 +2,9 @@ from __future__ import absolute_import, division, print_function
import numpy as np
from ..utils import get_progress_bar, logger
from ..utils import get_progress_bar
from . import Emcee
from .base_sampler import SamplerError
class Ptemcee(Emcee):
......@@ -64,9 +65,13 @@ class Ptemcee(Emcee):
for _ in range(self.nwalkers)]
for _ in range(self.kwargs['ntemps'])]
for _ in tqdm(
log_likelihood_evaluations = []
log_prior_evaluations = []
for pos, logpost, loglike in tqdm(
sampler.sample(self.pos0, **self.sampler_function_kwargs),
total=self.nsteps):
log_likelihood_evaluations.append(loglike)
log_prior_evaluations.append(logpost - loglike)
pass
self.calculate_autocorrelation(sampler.chain.reshape((-1, self.ndim)))
......@@ -74,9 +79,15 @@ class Ptemcee(Emcee):
self.print_nburn_logging_info()
self.result.nburn = self.nburn
if self.result.nburn > self.nsteps:
logger.warning('Chain not burned in, no samples generated.')
raise SamplerError(
"The run has finished, but the chain is not burned in: "
"`nburn < nsteps`. Try increasing the number of steps.")
self.result.samples = sampler.chain[0, :, self.nburn:, :].reshape(
(-1, self.ndim))
self.result.log_likelihood_evaluations = np.array(
log_likelihood_evaluations)[self.nburn:, 0, :].reshape((-1))
self.result.log_prior_evaluations = np.array(
log_prior_evaluations)[self.nburn:, 0, :].reshape((-1))
self.result.betas = sampler.betas
self.result.log_evidence, self.result.log_evidence_err =\
sampler.log_evidence_estimate(
......
from __future__ import absolute_import, print_function
import glob
import shutil
import numpy as np
from .base_sampler import MCMCSampler, SamplerNotInstalledError
from ..utils import logger
class PTMCMCSampler(MCMCSampler):
"""bilby wrapper of PTMCMC (https://github.com/jellis18/PTMCMCSampler/)
All positional and keyword arguments (i.e., the args and kwargs) passed to
`run_sampler` will be propagated to `PTMCMCSampler.PTMCMCSampler`, see
documentation for that class for further help. Under Other Parameters, we
list commonly used kwargs and the bilby defaults.
Other Parameters
----------------
Niter: int (2*10**4 + 1)
The number of mcmc steps
burn: int (5 * 10**3)
If given, the fixed number of steps to discard as burn-in
thin: int (1)
The number of steps before saving the sample to the chain
custom_proposals: dict (None)
Add dictionary of proposals to the array of proposals, this must be in
the form of a dictionary with the name of the proposal, then a list
containing the jump function and the weight, e.g. {'name': [function,
weight]}; see
(https://github.com/rgreen1995/PTMCMCSampler/blob/master/examples/simple.ipynb)
and
(http://jellis18.github.io/PTMCMCSampler/PTMCMCSampler.html#ptmcmcsampler-ptmcmcsampler-module)
for examples and more info.
logl_grad: func (None)
Gradient of likelihood if known (default = None)
logp_grad: func (None)
Gradient of prior if known (default = None)
verbose: bool (True)
Update current run-status to the screen
"""
default_kwargs = {'p0': None, 'Niter': 2 * 10**4 + 1, 'neff': 10**4,
'burn': 5 * 10**3, 'verbose': True,
'ladder': None, 'Tmin': 1, 'Tmax': None, 'Tskip': 100,
'isave': 1000, 'thin': 1, 'covUpdate': 1000,
'SCAMweight': 1, 'AMweight': 1, 'DEweight': 1,
'HMCweight': 0, 'MALAweight': 0, 'NUTSweight': 0,
'HMCstepsize': 0.1, 'HMCsteps': 300,
'groups': None, 'custom_proposals': None,
'loglargs': {}, 'loglkwargs': {}, 'logpargs': {},
'logpkwargs': {}, 'logl_grad': None, 'logp_grad': None,
'outDir': None}
def __init__(self, likelihood, priors, outdir='outdir', label='label',
use_ratio=False, plot=False, skip_import_verification=False,
pos0=None, burn_in_fraction=0.25, **kwargs):
MCMCSampler.__init__(self, likelihood=likelihood, priors=priors,
outdir=outdir, label=label, use_ratio=use_ratio,
plot=plot,
skip_import_verification=skip_import_verification,
**kwargs)
self.p0 = self.get_random_draw_from_prior()
self.likelihood = likelihood
self.priors = priors
def _verify_external_sampler(self):
# PTMCMC is imported with caps so we need to overwrite the parent function,
# which forces `__name__.lower()`
external_sampler_name = self.__class__.__name__
try:
self.external_sampler = __import__(external_sampler_name)
except (ImportError, SystemExit):
raise SamplerNotInstalledError(
"Sampler {} is not installed on this system".format(external_sampler_name))
def _translate_kwargs(self, kwargs):
if 'Niter' not in kwargs:
for equiv in self.nsteps_equiv_kwargs:
if equiv in kwargs:
kwargs['Niter'] = kwargs.pop(equiv)
if 'burn' not in kwargs:
for equiv in self.nburn_equiv_kwargs:
if equiv in kwargs:
kwargs['burn'] = kwargs.pop(equiv)
@property
def custom_proposals(self):
return self.kwargs['custom_proposals']
@property
def sampler_init_kwargs(self):
keys = ['groups',
'loglargs',
'logp_grad',
'logpkwargs',
'loglkwargs',
'logl_grad',
'logpargs',
'outDir',
'verbose']
init_kwargs = {key: self.kwargs[key] for key in keys}
if init_kwargs['outDir'] is None:
init_kwargs['outDir'] = '{}/ptmcmc_temp_{}/'.format(self.outdir, self.label)
return init_kwargs
@property
def sampler_function_kwargs(self):
keys = ['Niter',
'neff',
'Tmin',
'HMCweight',
'covUpdate',
'SCAMweight',
'ladder',
'burn',
'NUTSweight',
'AMweight',
'MALAweight',
'thin',
'HMCstepsize',
'isave',
'Tskip',
'HMCsteps',
'Tmax',
'DEweight']
sampler_kwargs = {key: self.kwargs[key] for key in keys}
return sampler_kwargs
@staticmethod
def _import_external_sampler():
from PTMCMCSampler import PTMCMCSampler
return PTMCMCSampler
def run_sampler(self):
PTMCMCSampler = self._import_external_sampler()
sampler = PTMCMCSampler.PTSampler(ndim=self.ndim, logp=self.log_prior,
logl=self.log_likelihood, cov=np.eye(self.ndim),
**self.sampler_init_kwargs)
if self.custom_proposals is not None:
for proposal in self.custom_proposals:
logger.info('Adding {} to proposals with weight {}'.format(
proposal, self.custom_proposals[proposal][1]))
sampler.addProposalToCycle(self.custom_proposals[proposal][0],
self.custom_proposals[proposal][1])
sampler.sample(p0=self.p0, **self.sampler_function_kwargs)
samples, meta, loglike = self.__read_in_data()
self.result.nburn = self.sampler_function_kwargs['burn']
self.result.samples = samples[self.result.nburn:]
self.meta_data['sampler_meta'] = meta
self.result.log_likelihood_evaluations = loglike[self.result.nburn:]
self.result.sampler_output = np.nan
self.result.walkers = np.nan
self.result.log_evidence = np.nan
self.result.log_evidence_err = np.nan
return self.result
def __read_in_data(self):
""" Read the data stored by PTMCMC to disk """
temp_outDir = self.sampler_init_kwargs['outDir']
data = np.loadtxt('{}/chain_1.txt'.format(temp_outDir))
jumpfiles = glob.glob('{}/*jump.txt'.format(temp_outDir))
jumps = map(np.loadtxt, jumpfiles)
samples = data[:, :-4]
loglike = data[:, -3]
jump_accept = {}
for ct, j in enumerate(jumps):
label = jumpfiles[ct].split('/')[-1].split('_jump.txt')[0]
jump_accept[label] = j
PT_swap = {'swap_accept': data[-1]}
tot_accept = {'tot_accept': data[-2]}
log_post = {'log_post': data[:, -4]}
meta = {}
meta['tot_accept'] = tot_accept
meta['PT_swap'] = PT_swap
meta['proposals'] = jump_accept
meta['log_post'] = log_post
shutil.rmtree(temp_outDir)
return samples, meta, loglike
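A hypothetical custom proposal for the new wrapper; the jump-function convention (arguments `(x, iteration, beta)`, returning the new point and the log proposal ratio) follows the PTMCMCSampler examples linked in the docstring:

```python
import numpy as np

def small_gaussian_jump(x, iteration, beta):
    # Symmetric Gaussian jump, so the log proposal ratio is 0.
    return x + 1e-2 * np.random.randn(len(x)), 0.0

# Passed through run_sampler kwargs with a weight of 5, e.g.:
# bilby.run_sampler(likelihood, priors, sampler='ptmcmcsampler',
#                   custom_proposals={'small_jump': [small_gaussian_jump, 5]})
```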
......@@ -121,18 +121,28 @@ class Pymc3(MCMCSampler):
for key in self.__fixed_parameter_keys:
logger.info(' {} = {}'.format(key, self.priors[key].peak))
def _initialise_result(self):
def _initialise_result(self, result_class):
"""
Initialise results within Pymc3 subclass.
"""
result = Result(label=self.label, outdir=self.outdir,
sampler=self.__class__.__name__.lower(),
search_parameter_keys=self.__search_parameter_keys,
fixed_parameter_keys=self.__fixed_parameter_keys,
priors=self.priors, meta_data=self.meta_data,
injection_parameters=self.injection_parameters,
sampler_kwargs=self.kwargs)
result_kwargs = dict(
label=self.label, outdir=self.outdir,
sampler=self.__class__.__name__.lower(),
search_parameter_keys=self.__search_parameter_keys,
fixed_parameter_keys=self.__fixed_parameter_keys,
priors=self.priors, meta_data=self.meta_data,
injection_parameters=self.injection_parameters,
sampler_kwargs=self.kwargs)
if result_class is None:
result = Result(**result_kwargs)
elif issubclass(result_class, Result):
result = result_class(**result_kwargs)
else:
raise ValueError(
"Input result_class={} not understood".format(result_class))
return result
@property
......
......@@ -70,7 +70,7 @@ class Pymultinest(NestedSampler):
'The length of {} exceeds 78 characters. '
' Post-processing will fail because the file names will be cut'
' off. Please choose a shorter "outdir" or "label".'
.format(self.__kwargs['outputfiles_basename']))
.format(self.kwargs['outputfiles_basename']))
check_directory_exists_and_if_not_mkdir(
self.kwargs['outputfiles_basename'])
NestedSampler._verify_kwargs_against_default_kwargs(self)
......
from . import (calibration, conversion, detector, likelihood, prior, series, source,
utils, waveform_generator)
utils, waveform_generator, result)
from .waveform_generator import WaveformGenerator
from .likelihood import GravitationalWaveTransient
......@@ -358,8 +358,8 @@ class InterferometerStrainData(object):
-------
array_like: An array of boolean values
"""
return ((self.frequency_array > self.minimum_frequency) &
(self.frequency_array < self.maximum_frequency))
return ((self.frequency_array >= self.minimum_frequency) &
(self.frequency_array <= self.maximum_frequency))
@property
def alpha(self):
......@@ -1304,7 +1304,7 @@ class Interferometer(object):
`waveform_generator.frequency_domain_strain()`. If
`waveform_generator` is also given, the injection_polarizations will
be calculated directly and this argument can be ignored.
waveform_generator: bilby.gw.waveform_generator
waveform_generator: bilby.gw.waveform_generator.WaveformGenerator
A WaveformGenerator instance using the source model to inject. If
`injection_polarizations` is given, this will be ignored.
......@@ -2158,7 +2158,7 @@ def get_interferometer_with_fake_noise_and_injection(
`waveform_generator.frequency_domain_strain()`. If
`waveform_generator` is also given, the injection_polarizations will
be calculated directly and this argument can be ignored.
waveform_generator: bilby.gw.waveform_generator
waveform_generator: bilby.gw.waveform_generator.WaveformGenerator
A WaveformGenerator instance using the source model to inject. If
`injection_polarizations` is given, this will be ignored.
sampling_frequency: float
......
......@@ -14,8 +14,9 @@ from ..core.prior import Prior, Uniform
from .detector import InterferometerList
from .prior import BBHPriorDict
from .source import lal_binary_black_hole
from .utils import noise_weighted_inner_product
from .utils import noise_weighted_inner_product, build_roq_weights, blockwise_dot_product
from .waveform_generator import WaveformGenerator
from math import ceil
class GravitationalWaveTransient(likelihood.Likelihood):
......@@ -66,7 +67,13 @@ class GravitationalWaveTransient(likelihood.Likelihood):
self.phase_marginalization = phase_marginalization
self.priors = priors
self._check_set_duration_and_sampling_frequency_of_waveform_generator()
self.meta_data = self.interferometers.meta_data
self.meta_data = dict(
interferometers=self.interferometers.meta_data,
time_marginalization=self.time_marginalization,
phase_marginalization=self.phase_marginalization,
distance_marginalization=self.distance_marginalization,
waveform_arguments=waveform_generator.waveform_arguments,
frequency_domain_source_model=waveform_generator.frequency_domain_source_model)
if self.time_marginalization:
self._check_prior_is_set(key='geocent_time')
......@@ -379,12 +386,194 @@ class BasicGravitationalWaveTransient(likelihood.Likelihood):
return log_l.real
class ROQGravitationalWaveTransient(GravitationalWaveTransient):
"""A reduced order quadrature likelihood object
This uses the method described in Smith et al., (2016) Phys. Rev. D 94,
044031. A public repository of the ROQ data is available from
https://git.ligo.org/lscsoft/ROQ_data.
Parameters
----------
interferometers: list, bilby.gw.detector.InterferometerList
A list of `bilby.detector.Interferometer` instances - contains the
detector data and power spectral densities
waveform_generator: `bilby.waveform_generator.WaveformGenerator`
An object which computes the frequency-domain strain of the signal,
given some set of parameters
linear_matrix: str, array
Either a string pointing to the file from which to load the linear_matrix
array, or the array itself.
quadratic_matrix: str, array
Either a string pointing to the file from which to load the quadratic_matrix
array, or the array itself.
priors: dict, bilby.prior.PriorDict
A dictionary of priors containing at least the geocent_time prior
"""
def __init__(self, interferometers, waveform_generator,
linear_matrix, quadratic_matrix, priors):
GravitationalWaveTransient.__init__(
self, interferometers=interferometers,
waveform_generator=waveform_generator, priors=priors)
if isinstance(linear_matrix, str):
logger.info("Loading linear matrix from {}".format(linear_matrix))
linear_matrix = np.load(linear_matrix).T
if isinstance(quadratic_matrix, str):
logger.info("Loading quadratic_matrix from {}".format(quadratic_matrix))
quadratic_matrix = np.load(quadratic_matrix).T
self.linear_matrix = linear_matrix
self.quadratic_matrix = quadratic_matrix
self.time_samples = None
self.weights = dict()
self._set_weights()
self.frequency_nodes_linear =\
waveform_generator.waveform_arguments['frequency_nodes_linear']
def log_likelihood_ratio(self):
optimal_snr_squared = 0.
matched_filter_snr_squared = 0.
indices, in_bounds = self._closest_time_indices(
self.parameters['geocent_time'] - self.interferometers.start_time)
if not in_bounds:
return np.nan_to_num(-np.inf)
waveform = self.waveform_generator.frequency_domain_strain(
self.parameters)
if waveform is None:
return np.nan_to_num(-np.inf)
for ifo in self.interferometers:
f_plus = ifo.antenna_response(
self.parameters['ra'], self.parameters['dec'],
self.parameters['geocent_time'], self.parameters['psi'], 'plus')
f_cross = ifo.antenna_response(
self.parameters['ra'], self.parameters['dec'],
self.parameters['geocent_time'], self.parameters['psi'], 'cross')
dt = ifo.time_delay_from_geocenter(
self.parameters['ra'], self.parameters['dec'],
ifo.strain_data.start_time)
ifo_time = self.parameters['geocent_time'] + dt - \
ifo.strain_data.start_time
h_plus_linear = f_plus * waveform['linear']['plus']
h_cross_linear = f_cross * waveform['linear']['cross']
h_plus_quadratic = f_plus * waveform['quadratic']['plus']
h_cross_quadratic = f_cross * waveform['quadratic']['cross']
indices, in_bounds = self._closest_time_indices(ifo_time)
if not in_bounds:
return np.nan_to_num(-np.inf)
matched_filter_snr_squared_array = np.einsum(
'i,ji->j', np.conjugate(h_plus_linear + h_cross_linear),
self.weights[ifo.name + '_linear'][indices])
matched_filter_snr_squared += interp1d(
self.time_samples[indices],
matched_filter_snr_squared_array, kind='quadratic')(ifo_time)
optimal_snr_squared += \
np.vdot(np.abs(h_plus_quadratic + h_cross_quadratic)**2,
self.weights[ifo.name + '_quadratic'])
log_l = matched_filter_snr_squared - optimal_snr_squared / 2
return log_l.real
def _closest_time_indices(self, time):
"""
Get the time sample closest to `time` and its two neighbours
Parameters
----------
time: float
Time to check
Returns
-------
indices: list
Indices nearest to time.
in_bounds: bool
Whether the indices are for valid times.
"""
closest = np.argmin(np.absolute(self.time_samples - time))
indices = [closest + ii for ii in [-1, 0, 1]]
in_bounds = (indices[0] >= 0) & (indices[2] < self.time_samples.size)
return indices, in_bounds
def _set_weights(self):
"""
Set up the time-dependent ROQ weights.
This follows Smith et al. (2016), Phys. Rev. D 94, 044031.
The times are chosen to cover all the merger times allowed by the time
prior.
"""
for ifo in self.interferometers:
# only get frequency components up to maximum_frequency
self.linear_matrix = \
self.linear_matrix[:, :sum(ifo.frequency_mask)]
self.quadratic_matrix = \
self.quadratic_matrix[:, :sum(ifo.frequency_mask)]
# array of relative time shifts to be applied to the data
# 0.045s comes from time for GW to traverse the Earth
self.time_samples = np.linspace(
self.priors['geocent_time'].minimum - 0.045,
self.priors['geocent_time'].maximum + 0.045,
int(ceil((self.priors['geocent_time'].maximum -
self.priors['geocent_time'].minimum + 0.09) *
ifo.strain_data.sampling_frequency)))
self.time_samples -= ifo.strain_data.start_time
time_space = self.time_samples[1] - self.time_samples[0]
# array to be filled with data, shifted by discrete time_samples
tc_shifted_data = np.zeros([
len(self.time_samples),
len(ifo.frequency_array[ifo.frequency_mask])], dtype=complex)
# shift data to beginning of the prior
# increment by the time step
shifted_data =\
ifo.frequency_domain_strain[ifo.frequency_mask] * \
np.exp(2j * np.pi * ifo.frequency_array[ifo.frequency_mask] *
self.time_samples[0])
single_time_shift = np.exp(
2j * np.pi * ifo.frequency_array[ifo.frequency_mask] *
time_space)
for j in range(len(self.time_samples)):
tc_shifted_data[j] = shifted_data
shifted_data *= single_time_shift
# to not kill all computers this minimises the memory usage of the
# required inner products
max_block_gigabytes = 4
max_elements = int((max_block_gigabytes * 2 ** 30) / 8)
self.weights[ifo.name + '_linear'] = blockwise_dot_product(
tc_shifted_data /
ifo.power_spectral_density_array[ifo.frequency_mask],
self.linear_matrix, max_elements) * 4 / ifo.strain_data.duration
del tc_shifted_data
self.weights[ifo.name + '_quadratic'] = build_roq_weights(
1 / ifo.power_spectral_density_array[ifo.frequency_mask],
self.quadratic_matrix.real, 1 / ifo.strain_data.duration)
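A sketch of wiring the pieces together; the file names are the ROQ_data basis files and `ifos` and `priors` (including a `geocent_time` prior) are assumed to exist:

```python
import numpy as np
import bilby

# Per the roq source-model docstring, the frequency nodes are loaded transposed.
freq_nodes_linear = np.load('fnodes_linear.npy').T
freq_nodes_quadratic = np.load('fnodes_quadratic.npy').T

waveform_generator = bilby.gw.WaveformGenerator(
    duration=4, sampling_frequency=2048,
    frequency_domain_source_model=bilby.gw.source.roq,
    waveform_arguments=dict(frequency_nodes_linear=freq_nodes_linear,
                            frequency_nodes_quadratic=freq_nodes_quadratic,
                            reference_frequency=20., version='IMRPhenomPv2'))

likelihood = bilby.gw.likelihood.ROQGravitationalWaveTransient(
    interferometers=ifos, waveform_generator=waveform_generator,
    linear_matrix='B_linear.npy', quadratic_matrix='B_quadratic.npy',
    priors=priors)
```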
def get_binary_black_hole_likelihood(interferometers):
""" A rapper to quickly set up a likelihood for BBH parameter estimation
Parameters
----------
interferometers: list
interferometers: {bilby.gw.detector.InterferometerList, list}
A list of `bilby.detector.Interferometer` instances, typically the
output of either `bilby.detector.get_interferometer_with_open_data`
or `bilby.detector.get_interferometer_with_fake_noise_and_injection`
......
from __future__ import division
from ..core.result import Result as CoreResult
from ..core.utils import logger
class CompactBinaryCoalesenceResult(CoreResult):
def __init__(self, **kwargs):
super(CompactBinaryCoalesenceResult, self).__init__(**kwargs)
def __get_from_nested_meta_data(self, *keys):
dictionary = self.meta_data
try:
for k in keys:
item = dictionary[k]
dictionary = item
return item
except KeyError:
raise ValueError(
"No information stored for {}".format('/'.join(keys)))
@property
def time_marginalization(self):
""" Boolean for if the likelihood used time marginalization """
return self.__get_from_nested_meta_data(
'likelihood', 'time_marginalization')
@property
def phase_marginalization(self):
""" Boolean for if the likelihood used phase marginalization """
return self.__get_from_nested_meta_data(
'likelihood', 'phase_marginalization')
@property
def distance_marginalization(self):
""" Boolean for if the likelihood used distance marginalization """
return self.__get_from_nested_meta_data(
'likelihood', 'distance_marginalization')
@property
def waveform_approximant(self):
""" String of the waveform approximant """
return self.__get_from_nested_meta_data(
'likelihood', 'waveform_arguments', 'waveform_approximant')
@property
def reference_frequency(self):
""" Float of the reference frequency """
return self.__get_from_nested_meta_data(
'likelihood', 'waveform_arguments', 'reference_frequency')
@property
def frequency_domain_source_model(self):
""" The frequency domain source model (function)"""
return self.__get_from_nested_meta_data(
'likelihood', 'frequency_domain_source_model')
def detector_injection_properties(self, detector):
""" Returns a dictionary of the injection properties for each detector
The injection properties include the parameters injected, and
information about the signal to noise ratio (SNR) given the noise
properties.
Parameters
----------
detector: str [H1, L1, V1]
Detector name
Returns
-------
injection_properties: dict
A dictionary of the injection properties
"""
try:
return self.__get_from_nested_meta_data(
'likelihood', 'interferometers', detector)
except ValueError:
logger.info("No injection for detector {}".format(detector))
return None
CBCResult = CompactBinaryCoalesenceResult
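A hypothetical round trip showing how these properties read the nested meta_data that `GravitationalWaveTransient` now stores under the 'likelihood' key (all values below are invented):

```python
result = CBCResult(label='example', outdir='outdir')
result.meta_data = dict(likelihood=dict(
    time_marginalization=False,
    phase_marginalization=True,
    distance_marginalization=False,
    waveform_arguments=dict(waveform_approximant='IMRPhenomPv2',
                            reference_frequency=50.)))
print(result.phase_marginalization)  # True
print(result.waveform_approximant)   # 'IMRPhenomPv2'
```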
......@@ -12,6 +12,7 @@ from .utils import (lalsim_SimInspiralTransformPrecessingNewInitialConditions,
try:
import lal
import lalsimulation as lalsim
except ImportError:
logger.warning("You do not have lalsuite installed currently. You will"
" not be able to use some of the prebuilt functions.")
......@@ -333,3 +334,112 @@ def lal_binary_neutron_star(
h_cross = h_cross[:len(frequency_array)]
return {'plus': h_plus, 'cross': h_cross}
def roq(frequency_array, mass_1, mass_2, luminosity_distance, a_1, tilt_1,
phi_12, a_2, tilt_2, phi_jl, iota, phase, **waveform_arguments):
"""
See https://git.ligo.org/lscsoft/lalsuite/blob/master/lalsimulation/src/LALSimInspiral.c#L1460
Parameters
----------
frequency_array: np.array
This input is ignored for the roq source model
mass_1: float
The mass of the heavier object in solar masses
mass_2: float
The mass of the lighter object in solar masses
luminosity_distance: float
The luminosity distance in megaparsec
a_1: float
Dimensionless primary spin magnitude
tilt_1: float
Primary tilt angle
phi_12: float
a_2: float
Dimensionless secondary spin magnitude
tilt_2: float
Secondary tilt angle
phi_jl: float
iota: float
Orbital inclination
phase: float
The phase at coalescence
Waveform arguments
------------------
Non-sampled extra data used in the source model calculation
frequency_nodes_linear: np.array
frequency_nodes_quadratic: np.array
reference_frequency: float
version: str
Note: for the frequency_nodes_linear and frequency_nodes_quadratic arguments,
if using data from https://git.ligo.org/lscsoft/ROQ_data, this should be
loaded as `np.load(filename).T`.
Returns
-------
waveform_polarizations: dict
Dict containing plus and cross modes evaluated at the linear and
quadratic frequency nodes.
"""
if mass_2 > mass_1:
return None
frequency_nodes_linear = waveform_arguments['frequency_nodes_linear']
frequency_nodes_quadratic = waveform_arguments['frequency_nodes_quadratic']
# waveform_arguments is a dict, so optional entries are read with dict.get
reference_frequency = waveform_arguments.get('reference_frequency', 20.0)
versions = dict(IMRPhenomPv2=lalsim.IMRPhenomPv2_V)
version = versions[waveform_arguments.get('version', 'IMRPhenomPv2')]
luminosity_distance = luminosity_distance * 1e6 * utils.parsec
mass_1 = mass_1 * utils.solar_mass
mass_2 = mass_2 * utils.solar_mass
if tilt_1 == 0 and tilt_2 == 0:
spin_1x = 0
spin_1y = 0
spin_1z = a_1
spin_2x = 0
spin_2y = 0
spin_2z = a_2
else:
iota, spin_1x, spin_1y, spin_1z, spin_2x, spin_2y, spin_2z = \
lalsim.SimInspiralTransformPrecessingNewInitialConditions(
iota, phi_jl, tilt_1, tilt_2, phi_12, a_1, a_2, mass_1, mass_2,
reference_frequency, phase)
chi_1_l, chi_2_l, chi_p, theta_jn, alpha, phase_aligned, zeta =\
lalsim.SimIMRPhenomPCalculateModelParametersFromSourceFrame(
mass_1, mass_2, reference_frequency, phase, iota, spin_1x,
spin_1y, spin_1z, spin_2x, spin_2y, spin_2z, version)
waveform_polarizations = dict()
h_linear_plus, h_linear_cross = lalsim.SimIMRPhenomPFrequencySequence(
frequency_nodes_linear, chi_1_l, chi_2_l, chi_p, theta_jn,
mass_1, mass_2, luminosity_distance,
alpha, phase_aligned, reference_frequency, version, None)
h_quadratic_plus, h_quadratic_cross = lalsim.SimIMRPhenomPFrequencySequence(
frequency_nodes_quadratic, chi_1_l, chi_2_l, chi_p, theta_jn,
mass_1, mass_2, luminosity_distance,
alpha, phase_aligned, reference_frequency, version, None)
waveform_polarizations['linear'] = dict(
plus=(np.cos(2 * zeta) * h_linear_plus.data.data +
np.sin(2 * zeta) * h_linear_cross.data.data),
cross=(np.cos(2 * zeta) * h_linear_cross.data.data -
np.sin(2 * zeta) * h_linear_plus.data.data))
waveform_polarizations['quadratic'] = dict(
plus=(np.cos(2 * zeta) * h_quadratic_plus.data.data +
np.sin(2 * zeta) * h_quadratic_cross.data.data),
cross=(np.cos(2 * zeta) * h_quadratic_cross.data.data -
np.sin(2 * zeta) * h_quadratic_plus.data.data))
return waveform_polarizations
......@@ -407,6 +407,7 @@ def read_frame_file(file_name, start_time, end_time, channel=None, buffer_time=1
"""
loaded = False
strain = None
if channel is not None:
try:
strain = TimeSeries.read(source=file_name, channel=channel, start=start_time, end=end_time, **kwargs)
......@@ -414,17 +415,24 @@ def read_frame_file(file_name, start_time, end_time, channel=None, buffer_time=1
logger.info('Successfully loaded {}.'.format(channel))
except RuntimeError:
logger.warning('Channel {} not found. Trying preset channel names'.format(channel))
for channel_type in ['GDS-CALIB_STRAIN', 'DCS-CALIB_STRAIN_C01', 'DCS-CALIB_STRAIN_C02']:
for ifo_name in ['H1', 'L1']:
channel = '{}:{}'.format(ifo_name, channel_type)
if loaded:
continue
try:
strain = TimeSeries.read(source=file_name, channel=channel, start=start_time, end=end_time, **kwargs)
loaded = True
logger.info('Successfully loaded {}.'.format(channel))
except RuntimeError:
pass
while not loaded:
ligo_channel_types = ['GDS-CALIB_STRAIN', 'DCS-CALIB_STRAIN_C01', 'DCS-CALIB_STRAIN_C02',
'DCH-CLEAN_STRAIN_C02']
virgo_channel_types = ['Hrec_hoft_V1O2Repro2A_16384Hz', 'FAKE_h_16384Hz_4R']
channel_types = dict(H1=ligo_channel_types, L1=ligo_channel_types, V1=virgo_channel_types)
for detector in channel_types.keys():
for channel_type in channel_types[detector]:
if loaded:
break
channel = '{}:{}'.format(detector, channel_type)
try:
strain = TimeSeries.read(source=file_name, channel=channel, start=start_time, end=end_time,
**kwargs)
loaded = True
logger.info('Successfully read strain data for channel {}.'.format(channel))
except RuntimeError:
pass
if loaded:
return strain
......@@ -433,15 +441,17 @@ def read_frame_file(file_name, start_time, end_time, channel=None, buffer_time=1
return None
def get_gracedb(gracedb, outdir, duration, calibration, detectors):
def get_gracedb(gracedb, outdir, duration, calibration, detectors, query_types=None):
candidate = gracedb_to_json(gracedb, outdir)
trigger_time = candidate['gpstime']
gps_start_time = trigger_time - duration
cache_files = []
for det in detectors:
if query_types is None:
query_types = [None] * len(detectors)
for i, det in enumerate(detectors):
output_cache_file = gw_data_find(
det, gps_start_time, duration, calibration,
outdir=outdir)
outdir=outdir, query_type=query_types[i])
cache_files.append(output_cache_file)
return candidate, cache_files
......@@ -487,7 +497,7 @@ def gracedb_to_json(gracedb, outdir=None):
def gw_data_find(observatory, gps_start_time, duration, calibration,
outdir='.'):
outdir='.', query_type=None):
""" Builds a gw_data_find call and process output
Parameters
......@@ -502,6 +512,8 @@ def gw_data_find(observatory, gps_start_time, duration, calibration,
Use C01 or C02 calibration
outdir: string
A path to the directory where output is stored
query_type: string
The LDRDataFind query type
Returns
-------
......@@ -514,11 +526,17 @@ def gw_data_find(observatory, gps_start_time, duration, calibration,
observatory_lookup = dict(H1='H', L1='L', V1='V')
observatory_code = observatory_lookup[observatory]
dtype = '{}_HOFT_C0{}'.format(observatory, calibration)
logger.info('Using LDRDataFind query type {}'.format(dtype))
if query_type is None:
logger.warning('No query type provided. This may prevent data from being read.')
if observatory_code == 'V':
query_type = 'V1Online'
else:
query_type = '{}_HOFT_C0{}'.format(observatory, calibration)
logger.info('Using LDRDataFind query type {}'.format(query_type))
cache_file = '{}-{}_CACHE-{}-{}.lcf'.format(
observatory, dtype, gps_start_time, duration)
observatory, query_type, gps_start_time, duration)
output_cache_file = os.path.join(outdir, cache_file)
gps_end_time = gps_start_time + duration
......@@ -527,7 +545,7 @@ def gw_data_find(observatory, gps_start_time, duration, calibration,
cl_list.append('--observatory {}'.format(observatory_code))
cl_list.append('--gps-start-time {}'.format(gps_start_time))
cl_list.append('--gps-end-time {}'.format(gps_end_time))
cl_list.append('--type {}'.format(dtype))
cl_list.append('--type {}'.format(query_type))
cl_list.append('--output {}'.format(output_cache_file))
cl_list.append('--url-type file')
cl_list.append('--lal-cache')
......@@ -614,6 +632,88 @@ def plot_skymap(result, center='120d -40d', nside=512):
fig.savefig('{}/{}_skymap.png'.format(result.outdir, result.label))
def build_roq_weights(data, basis, deltaF):
"""
For a data array and reduced basis, compute the ROQ weights.
basis: (reduced basis element)*invV (the inverse Vandermonde matrix)
data: data set (the caller divides by the PSD before passing, as in `_set_weights`)
deltaF: integration element df
"""
weights = np.dot(data, np.conjugate(basis)) * deltaF * 4.
return weights
def blockwise_dot_product(matrix_a, matrix_b, max_elements=int(2 ** 27),
out=None):
"""
Memory-efficient, block-wise dot product of two matrices.
Only blocks of `matrix_a` with a maximum size of `max_elements` will be
processed simultaneously.
Parameters
----------
matrix_a, matrix_b: array-like
Matrices to be multiplied; matrix_b is complex conjugated.
max_elements: int
Maximum number of elements to consider simultaneously, should be memory
limited.
out: array-like
Output array
Returns
-------
out: array-like
The dot product array
"""
def block_slices(dim_size, block_size):
"""Generator that yields slice objects for indexing into
sequential blocks of an array along a particular axis
Useful for blockwise dot
"""
count = 0
while True:
yield slice(count, count + block_size, 1)
count += block_size
if count > dim_size:
return
matrix_b = np.conjugate(matrix_b)
m, n = matrix_a.shape
n1, o = matrix_b.shape
if n1 != n:
raise ValueError(
'Matrices are not aligned, matrix a has shape ' +
'{}, matrix b has shape {}.'.format(matrix_a.shape, matrix_b.shape))
if matrix_a.flags.f_contiguous:
# prioritize processing as many columns of matrix_a as possible
max_cols = max(1, max_elements // m)
max_rows = max_elements // max_cols
else:
# prioritize processing as many rows of matrix_a as possible
max_rows = max(1, max_elements // n)
max_cols = max_elements // max_rows
if out is None:
out = np.empty((m, o), dtype=np.result_type(matrix_a, matrix_b))
elif out.shape != (m, o):
raise ValueError('Output array has incorrect dimensions.')
for mm in block_slices(m, max_rows):
out[mm, :] = 0
for nn in block_slices(n, max_cols):
a_block = matrix_a[mm, nn].copy() # copy to force a read
out[mm, :] += np.dot(a_block, matrix_b[nn, :])
del a_block
return out
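A quick self-check of the blocking logic against a direct product (note that `matrix_b` is conjugated internally, and a small `max_elements` forces several blocks):

```python
import numpy as np

a = np.random.randn(100, 50) + 1j * np.random.randn(100, 50)
b = np.random.randn(50, 20) + 1j * np.random.randn(50, 20)
assert np.allclose(blockwise_dot_product(a, b, max_elements=512),
                   np.dot(a, np.conjugate(b)))
```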
def convert_args_list_to_float(*args_list):
""" Converts inputs to floats, returns a list in the same order as the input"""
try:
......
FROM ubuntu:18.04
LABEL name="bilby Base Enterprise Linux 7" \
maintainer="Gregory Ashton <gregory.ashton@ligo.org>" \
date="20190104"
ENV PATH /opt/conda/bin:$PATH
# Install backend
RUN apt-get update --fix-missing \
&& apt-get install -y libglib2.0-0 libxext6 libsm6 libxrender1 libgl1-mesa-glx \
dh-autoreconf build-essential libarchive-dev wget curl git libhdf5-serial-dev
# Install python2.7
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda2-4.5.11-Linux-x86_64.sh -O ~/miniconda.sh && \
/bin/bash ~/miniconda.sh -b -p /opt/conda && \
rm ~/miniconda.sh && \
ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
echo "conda activate base" >> ~/.bashrc
# Install conda-installable programs
RUN conda install -y numpy scipy matplotlib pandas
RUN conda install -c conda-forge deepdish
# Install requirements
RUN pip install --upgrade pip \
&& pip install --upgrade setuptools \
&& pip install future \
pycondor>=0.5 \
configargparse \
flake8 \
mock \
pipenv \
coverage \
pytest-cov \
coverage-badge
# Install additional bilby requirements
RUN pip install corner lalsuite astropy gwpy theano
# Install samplers
RUN pip install cpnest dynesty emcee nestle ptemcee pymc3 pymultinest
# Install pymultinest requirements
RUN apt-get install -y libblas3 libblas-dev liblapack3 liblapack-dev \
libatlas3-base libatlas-base-dev cmake build-essential gfortran
# Install pymultinest
RUN git clone https://github.com/farhanferoz/MultiNest.git \
&& (cd MultiNest/MultiNest_v3.11_CMake/multinest && mkdir build && cd build && cmake .. && make)
ENV LD_LIBRARY_PATH $HOME/MultiNest/MultiNest_v3.11_CMake/multinest/lib:
# Install Polychord
RUN git clone https://github.com/PolyChord/PolyChordLite.git \
&& (cd PolyChordLite && make pypolychord MPI= && python setup.py install)
# Install PTMCMCSampler
RUN git clone https://github.com/jellis18/PTMCMCSampler.git \
&& (cd PTMCMCSampler && python setup.py install)
# Add the ROQ data to the image
RUN mkdir roq_basis \
&& cd roq_basis \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \