Compare revisions

Changes are shown as if the source revision was being merged into the target revision.
Commits on Source (34)
Showing with 205 additions and 156 deletions
 [run]
 omit =
-    test/core/example_test.py
-    test/gw/example_test.py
-    test/gw/noise_realisation_test.py
-    test/gw/other_test.py
+    test/integration/example_test.py
+    test/integration/noise_realisation_test.py
+    test/integration/other_test.py
@@ -91,8 +91,8 @@ python-3.7-samplers:
   script:
     - python -m pip install .
-    - pytest test/core/sampler/sampler_run_test.py --durations 10
-    - pytest test/gw/sample_from_the_prior_test.py
+    - pytest test/integration/sampler_run_test.py --durations 10
+    - pytest test/integration/sample_from_the_prior_test.py

 # test samplers on python 3.6
 python-3.6-samplers:
@@ -101,7 +101,18 @@ python-3.6-samplers:
   script:
     - python -m pip install .
-    - pytest test/core/sampler/sampler_run_test.py
+    - pytest test/integration/sampler_run_test.py
+# Test containers are up to date
+containers:
+  stage: test
+  image: bilbydev/v2-dockerfile-test-suite-python37
+  script:
+    - cd containers
+    - python write_dockerfiles.py
+    # Fail if differences exist. If this fails, you may need to run
+    # write_dockerfiles.py and commit the changes.
+    - git diff --exit-code
 # Tests run at a fixed schedule rather than on push
 scheduled-python-3.7:
@@ -113,9 +124,8 @@ scheduled-python-3.7:
     - python -m pip install .
     # Run tests which are only done on schedule
-    - pytest test/core/example_test.py
-    - pytest test/gw/example_test.py
-    - pytest test/gw/sample_from_the_prior_test.py
+    - pytest test/integration/example_test.py
+    - pytest test/integration/sample_from_the_prior_test.py

 plotting:
   stage: test
......
 # All notable changes will be documented in this file

+## [1.0.2] 2020-09-14
+Version 1.0.2 release of bilby
+
+### Added
+- Template for the docker files (!783)
+- New delta_phase parameter (!850)
+- Normalization factor to time-domain waveform plot (!867)
+- JSON encoding for int and float types (!866)
+- Various minor formatting additions (!870)
+
+### Changes
+- Switched to the conda-forge version of multinest and ultranest (!783)
+- Updates KAGRA - K1 interferometer information (!861)
+- Restructures to tests to be uniform across project (!834)
+- Fix to distance and phase marginalization method (!875)
+- Fixed roundoff of in-plane spins samples with vectorisation (!864)
+- Fix to reference distance and interpolant behavior (!858)
+- Fix to constraint prior sampling method (!863)
+- Clean up of code (!854)
+- Various minor bug, test and plotting fixes (!859, !874, !872, !865)
+
 ## [1.0.1] 2020-08-29
 Version 1.0.1 release of bilby
......
@@ -444,9 +444,6 @@ class Constraint(Prior):
     def prob(self, val):
         return (val > self.minimum) & (val < self.maximum)

-    def ln_prob(self, val):
-        return np.log((val > self.minimum) & (val < self.maximum))

 class PriorException(Exception):
     """ General base class for all prior exceptions """
@@ -365,23 +365,21 @@ class PriorDict(dict):
             return sample
         else:
             needed = np.prod(size)
-            constraint_keys = list()
-            for ii, key in enumerate(keys[-1::-1]):
+            for key in keys.copy():
                 if isinstance(self[key], Constraint):
-                    constraint_keys.append(-ii - 1)
-            for ii in constraint_keys[-1::-1]:
-                del keys[ii]
+                    del keys[keys.index(key)]
             all_samples = {key: np.array([]) for key in keys}
             _first_key = list(all_samples.keys())[0]
             while len(all_samples[_first_key]) < needed:
                 samples = self.sample_subset(keys=keys, size=needed)
                 keep = np.array(self.evaluate_constraints(samples), dtype=bool)
-                for key in samples:
-                    all_samples[key] = np.hstack(
-                        [all_samples[key], samples[key][keep].flatten()])
-            all_samples = {key: np.reshape(all_samples[key][:needed], size)
-                           for key in all_samples
-                           if not isinstance(self[key], Constraint)}
+                for key in keys:
+                    all_samples[key] = np.hstack([
+                        all_samples[key], samples[key][keep].flatten()
+                    ])
+            all_samples = {
+                key: np.reshape(all_samples[key][:needed], size) for key in keys
+            }
             return all_samples

     def normalize_constraint_factor(self, keys):
......
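The rewritten loop above is a rejection sampler: constraint keys are dropped from the sampled set, batches of `needed` draws are proposed, and only draws passing `evaluate_constraints` are kept, repeating until enough survive. A standalone sketch of the same pattern, with hypothetical names and plain NumPy in place of bilby's `PriorDict`:

```python
import numpy as np


def sample_with_constraint(sample_fn, constraint_fn, needed, rng=None):
    """Rejection-sample until `needed` draws satisfy `constraint_fn`."""
    rng = rng or np.random.default_rng(0)
    kept = np.array([])
    while len(kept) < needed:
        draws = sample_fn(needed, rng)   # propose a full batch each pass
        keep = constraint_fn(draws)      # boolean mask of survivors
        kept = np.hstack([kept, draws[keep]])
    return kept[:needed]


# Example: uniform draws on [0, 2] constrained to x < 1
samples = sample_with_constraint(
    lambda n, rng: rng.uniform(0, 2, n),
    lambda x: x < 1,
    needed=100,
)
```

As in the diff, each pass proposes a full batch rather than one draw at a time, trading some wasted samples for vectorised evaluation.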
@@ -705,16 +705,20 @@ class Result(object):
         """
         latex_labels = []
-        for k in keys:
-            if k in self.search_parameter_keys:
-                idx = self.search_parameter_keys.index(k)
-                latex_labels.append(self.parameter_labels_with_unit[idx])
-            elif k in self.parameter_labels:
-                latex_labels.append(k)
+        for key in keys:
+            if key in self.search_parameter_keys:
+                idx = self.search_parameter_keys.index(key)
+                label = self.parameter_labels_with_unit[idx]
+            elif key in self.parameter_labels:
+                label = key
             else:
+                label = None
                 logger.debug(
-                    'key {} not a parameter label or latex label'.format(k))
-                latex_labels.append(' '.join(k.split('_')))
+                    'key {} not a parameter label or latex label'.format(key)
+                )
+            if label is None:
+                label = key.replace("_", " ")
+            latex_labels.append(label)
         return latex_labels

     @property
@@ -3,6 +3,7 @@ import sys
 import datetime
 from collections import OrderedDict

+import bilby
 from ..utils import command_line_args, logger
 from ..prior import PriorDict, DeltaFunction
@@ -107,7 +108,7 @@ def run_sampler(likelihood, priors=None, label='label', outdir='outdir',
     Returns
     -------
-    result
+    result: bilby.core.result.Result
         An object containing the results
     """
......
-from __future__ import absolute_import
 import datetime
 import dill
 import os
@@ -326,10 +324,6 @@ class Dynesty(NestedSampler):
     def run_sampler(self):
         import dynesty
         logger.info("Using dynesty version {}".format(dynesty.__version__))
-        if self.kwargs['live_points'] is None:
-            self.kwargs['live_points'] = (
-                self.get_initial_points_from_prior(
-                    self.kwargs['nlive']))

         if self.kwargs.get("sample", "rwalk") == "rwalk":
             logger.info(
@@ -351,10 +345,21 @@ class Dynesty(NestedSampler):
         self._setup_pool()

-        self.sampler = dynesty.NestedSampler(
-            loglikelihood=_log_likelihood_wrapper,
-            prior_transform=_prior_transform_wrapper,
-            ndim=self.ndim, **self.sampler_init_kwargs)
+        if self.resume:
+            self.resume = self.read_saved_state(continuing=True)
+
+        if self.resume:
+            logger.info('Resume file successfully loaded.')
+        else:
+            if self.kwargs['live_points'] is None:
+                self.kwargs['live_points'] = (
+                    self.get_initial_points_from_prior(self.kwargs['nlive'])
+                )
+            self.sampler = dynesty.NestedSampler(
+                loglikelihood=_log_likelihood_wrapper,
+                prior_transform=_prior_transform_wrapper,
+                ndim=self.ndim, **self.sampler_init_kwargs
+            )

         if self.check_point:
             out = self._run_external_sampler_with_checkpointing()
@@ -424,10 +429,6 @@ class Dynesty(NestedSampler):
     def _run_external_sampler_with_checkpointing(self):
         logger.debug("Running sampler with checkpointing")

-        if self.resume:
-            resume_file_loaded = self.read_saved_state(continuing=True)
-            if resume_file_loaded:
-                logger.info('Resume file successfully loaded.')

         old_ncall = self.sampler.ncall
         sampler_kwargs = self.sampler_function_kwargs.copy()
......
@@ -977,6 +977,10 @@ class BilbyJsonEncoder(json.JSONEncoder):
     def default(self, obj):
         from .prior import MultivariateGaussianDist, Prior, PriorDict
         from ..gw.prior import HealPixMapPriorDist
+        if isinstance(obj, np.integer):
+            return int(obj)
+        if isinstance(obj, np.floating):
+            return float(obj)
         if isinstance(obj, PriorDict):
             return {'__prior_dict__': True, 'content': obj._get_json_dict()}
         if isinstance(obj, (MultivariateGaussianDist, HealPixMapPriorDist, Prior)):
......
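The two added `isinstance` checks let the encoder serialise NumPy scalar types, which the stock `json` module rejects with a `TypeError`. A self-contained illustration of the same idea (the class name here is a made-up stand-in for bilby's `BilbyJsonEncoder`):

```python
import json
import numpy as np


class NumpyScalarEncoder(json.JSONEncoder):
    """Convert NumPy integer/float scalars to native Python types."""

    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        if isinstance(obj, np.floating):
            return float(obj)
        # Fall back to the base class, which raises TypeError for unknowns
        return super().default(obj)


encoded = json.dumps({"n": np.int64(3), "x": np.float32(0.5)},
                     cls=NumpyScalarEncoder)
decoded = json.loads(encoded)
```

`np.integer` and `np.floating` are the abstract bases of all NumPy scalar widths, so one check covers `int32`, `int64`, `float32`, and so on.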
@@ -248,6 +248,14 @@ def convert_to_lal_binary_black_hole_parameters(parameters):
             converted_parameters[angle] =\
                 np.arccos(converted_parameters[cos_angle])

+    if "delta_phase" in original_keys:
+        converted_parameters["phase"] = np.mod(
+            converted_parameters["delta_phase"]
+            - np.sign(np.cos(converted_parameters["theta_jn"]))
+            * converted_parameters["psi"],
+            2 * np.pi
+        )
+
     added_keys = [key for key in converted_parameters.keys()
                   if key not in original_keys]
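The new `delta_phase` parameter is mapped back to `phase` as phase = (delta_phase - sign(cos θ_JN) ψ) mod 2π, i.e. assuming, as the conversion implies, that delta_phase = phase + sign(cos θ_JN) ψ (mod 2π). A scalar check of that relation with plain NumPy (the function name is illustrative, not bilby's API):

```python
import numpy as np


def delta_phase_to_phase(delta_phase, theta_jn, psi):
    """Invert delta_phase = phase + sign(cos(theta_jn)) * psi (mod 2*pi)."""
    return np.mod(delta_phase - np.sign(np.cos(theta_jn)) * psi, 2 * np.pi)


# theta_jn = 0.5 gives cos > 0, so the sign factor is +1:
# phase = (0.3 - 1.0) mod 2*pi = 2*pi - 0.7
phase = delta_phase_to_phase(delta_phase=0.3, theta_jn=0.5, psi=1.0)
```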
@@ -1009,18 +1017,19 @@ def generate_component_spins(sample):
         ['theta_jn', 'phi_jl', 'tilt_1', 'tilt_2', 'phi_12', 'a_1', 'a_2',
          'mass_1', 'mass_2', 'reference_frequency', 'phase']
     if all(key in output_sample.keys() for key in spin_conversion_parameters):
-        output_sample['iota'], output_sample['spin_1x'],\
-            output_sample['spin_1y'], output_sample['spin_1z'],\
-            output_sample['spin_2x'], output_sample['spin_2y'],\
-            output_sample['spin_2z'] =\
-            transform_precessing_spins(
-                output_sample['theta_jn'], output_sample['phi_jl'],
-                output_sample['tilt_1'], output_sample['tilt_2'],
-                output_sample['phi_12'], output_sample['a_1'],
-                output_sample['a_2'],
-                output_sample['mass_1'] * solar_mass,
-                output_sample['mass_2'] * solar_mass,
-                output_sample['reference_frequency'], output_sample['phase'])
+        (
+            output_sample['iota'], output_sample['spin_1x'],
+            output_sample['spin_1y'], output_sample['spin_1z'],
+            output_sample['spin_2x'], output_sample['spin_2y'],
+            output_sample['spin_2z']
+        ) = np.vectorize(bilby_to_lalsimulation_spins)(
+            output_sample['theta_jn'], output_sample['phi_jl'],
+            output_sample['tilt_1'], output_sample['tilt_2'],
+            output_sample['phi_12'], output_sample['a_1'], output_sample['a_2'],
+            output_sample['mass_1'] * solar_mass,
+            output_sample['mass_2'] * solar_mass,
+            output_sample['reference_frequency'], output_sample['phase']
+        )
     output_sample['phi_1'] =\
         np.fmod(2 * np.pi + np.arctan2(
 # KAGRA at design sensitvity.
-# https://arxiv.org/pdf/1102.5421.pdf
-# WARNING: I think there's a typo in that reference for the orientation.
 # https://dcc.ligo.org/LIGO-P1200087-v42/public
+#Kagra detector location: https://gwdoc.icrr.u-tokyo.ac.jp/cgi-bin/DocDB/ShowDocument?docid=3824
+#The detector physical location needs to be converted into detector constants according to the BILBY convention.
+#check the detector constant LAL value (careful about the convention) (https://lscsoft.docs.ligo.org/lalsuite/lal/group___detector_constants.html)
 name = 'K1'
 power_spectral_density = PowerSpectralDensity(psd_file='KAGRA_design_psd.txt')
 length = 3
 minimum_frequency = 20
 maximum_frequency = 2048
-latitude = 36 + 15. / 60 + 00. / 3600
-longitude = 137 + 10. / 60 + 48. / 3600
-elevation = 414.0
-xarm_azimuth = 25.0
-yarm_azimuth = 115.0
+latitude = 36 + 24 / 60 + 42.69722 / 3600
+longitude = 137 + 18 / 60 + 21.44171 / 3600
+elevation = 414.181
+xarm_azimuth = 90 - 60.39623489157727
+yarm_azimuth = 90 + 29.60357629670688
@@ -38,6 +38,9 @@ class Interferometer(object):
     vertex = PropertyAccessor('geometry', 'vertex')
     detector_tensor = PropertyAccessor('geometry', 'detector_tensor')

+    duration = PropertyAccessor('strain_data', 'duration')
+    sampling_frequency = PropertyAccessor('strain_data', 'sampling_frequency')
+    start_time = PropertyAccessor('strain_data', 'start_time')
     frequency_array = PropertyAccessor('strain_data', 'frequency_array')
     time_array = PropertyAccessor('strain_data', 'time_array')
     minimum_frequency = PropertyAccessor('strain_data', 'minimum_frequency')
......
@@ -398,7 +398,7 @@ def load_interferometer(filename):
     with open(filename, 'r') as parameter_file:
         lines = parameter_file.readlines()
         for line in lines:
-            if line[0] == '#':
+            if line[0] == '#' or line[0] == '\n':
                 continue
             split_line = line.split('=')
             key = split_line[0].strip()
......
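The fix above also skips blank lines (whose first character is `'\n'`) instead of handing them to `split('=')`, which would otherwise produce a one-element list and break the key/value unpacking. A minimal sketch of the parsing loop on hypothetical file contents:

```python
# Stand-in for parameter_file.readlines(): comments, a blank line, and settings
lines = ["# comment\n", "\n", "length = 3\n", "name = 'K1'\n"]

parameters = {}
for line in lines:
    if line[0] == '#' or line[0] == '\n':   # skip comments and blank lines
        continue
    key, value = (part.strip() for part in line.split('='))
    parameters[key] = value
```

Without the `'\n'` check, the blank line would yield `split_line == ['\n']` and any code expecting two parts would fail.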
@@ -29,7 +29,7 @@ path_to_eos_tables = os.path.join(os.path.dirname(__file__), 'eos_tables')
 list_of_eos_tables = os.listdir(path_to_eos_tables)
 valid_eos_files = [i for i in list_of_eos_tables if 'LAL' in i]
 valid_eos_file_paths = [os.path.join(path_to_eos_tables, filename) for filename in valid_eos_files]
-valid_eos_names = [i.split('_')[-1].strip('.dat') for i in valid_eos_files]
+valid_eos_names = [i.split('_', maxsplit=1)[-1].strip('.dat') for i in valid_eos_files]
 valid_eos_dict = dict(zip(valid_eos_names, valid_eos_file_paths))
......
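With `maxsplit=1` only the leading prefix is split off, so EOS names that themselves contain underscores survive intact, whereas `split('_')[-1]` kept only the final fragment. A quick comparison on a hypothetical table name (not necessarily one of bilby's files):

```python
filename = "LAL_ALF2_smooth.dat"   # hypothetical EOS table name with an underscore

old_name = filename.split('_')[-1].strip('.dat')              # loses 'ALF2_'
new_name = filename.split('_', maxsplit=1)[-1].strip('.dat')  # keeps full name

# Caveat worth knowing when adapting this pattern: strip('.dat') removes
# *characters* from the set {., d, a, t} at either end, not the literal
# suffix, so a lowercase name starting or ending with those letters is
# truncated:
truncated = "apr4.dat".strip('.dat')
```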
@@ -167,6 +167,9 @@ class GravitationalWaveTransient(Likelihood):
             self.distance_prior_array = np.array(
                 [self.priors['luminosity_distance'].prob(distance)
                  for distance in self._distance_array])
+            if self.phase_marginalization:
+                max_bound = np.ceil(10 + np.log10(self._dist_multiplier))
+                self._setup_phase_marginalization(max_bound=max_bound)
             self._setup_distance_marginalization(
                 distance_marginalization_lookup_table)
             for key in ['redshift', 'comoving_distance']:
@@ -597,8 +600,13 @@ class GravitationalWaveTransient(Likelihood):
     @property
     def _ref_dist(self):
-        """ Smallest distance contained in priors """
-        return self._distance_array[0]
+        """ Median distance in priors """
+        return self.priors['luminosity_distance'].rescale(0.5)
+
+    @property
+    def _dist_multiplier(self):
+        ''' Maximum value of ref_dist/dist_array '''
+        return self._ref_dist / self._distance_array[0]

     @property
     def _optimal_snr_squared_ref_array(self):
@@ -632,7 +640,7 @@ class GravitationalWaveTransient(Likelihood):
             self._create_lookup_table()
         self._interp_dist_margd_loglikelihood = UnsortedInterp2d(
             self._d_inner_h_ref_array, self._optimal_snr_squared_ref_array,
-            self._dist_margd_loglikelihood_array, kind='cubic')
+            self._dist_margd_loglikelihood_array, kind='cubic', fill_value=-np.inf)
@@ -714,10 +722,10 @@ class GravitationalWaveTransient(Likelihood):
         self._dist_margd_loglikelihood_array -= log_norm
         self.cache_lookup_table()

-    def _setup_phase_marginalization(self):
+    def _setup_phase_marginalization(self, min_bound=-5, max_bound=10):
         self._bessel_function_interped = interp1d(
-            np.logspace(-5, 10, int(1e6)), np.logspace(-5, 10, int(1e6)) +
-            np.log([i0e(snr) for snr in np.logspace(-5, 10, int(1e6))]),
+            np.logspace(-5, max_bound, int(1e6)), np.logspace(-5, max_bound, int(1e6)) +
+            np.log([i0e(snr) for snr in np.logspace(-5, max_bound, int(1e6))]),
             bounds_error=False, fill_value=(0, np.nan))

     def _setup_time_marginalization(self):
......
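The phase-marginalization lookup table stores log I₀(x) = x + log I₀ₑ(x) on a log-spaced grid, using the exponentially scaled Bessel function `i0e` so the values stay finite at large argument; the change above widens the grid's upper bound when the distance rescaling can push the argument past 10¹⁰. A coarser standalone sketch of the same table, assuming SciPy (grid size reduced from bilby's 10⁶ points for speed):

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.special import i0e

max_bound = 10
x_grid = np.logspace(-5, max_bound, int(1e4))

# log I0(x) = x + log(i0e(x)); i0e = exp(-x) * I0(x) avoids overflow
log_i0_interp = interp1d(
    x_grid, x_grid + np.log(i0e(x_grid)),
    bounds_error=False, fill_value=(0, np.nan))

# Compare the interpolant against a direct evaluation at a mid-range point
x = 123.4
direct = x + np.log(i0e(x))
approx = float(log_i0_interp(x))
```

Since log I₀(x) is nearly linear in x for large argument, even a modest grid interpolates it accurately.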
@@ -436,7 +436,8 @@ class CompactBinaryCoalescenceResult(CoreResult):
                 go.Scatter(
                     x=plot_times,
                     y=infft(
-                        interferometer.whitened_frequency_domain_strain,
+                        interferometer.whitened_frequency_domain_strain *
+                        np.sqrt(2. / interferometer.sampling_frequency),
                         sampling_frequency=interferometer.strain_data.sampling_frequency)[time_idxs],
                     fill=None,
                     mode='lines', line_color=DATA_COLOR,
@@ -461,7 +462,8 @@ class CompactBinaryCoalescenceResult(CoreResult):
                 color=DATA_COLOR, label='ASD')
             axs[1].plot(
                 plot_times, infft(
-                    interferometer.whitened_frequency_domain_strain,
+                    interferometer.whitened_frequency_domain_strain *
+                    np.sqrt(2. / interferometer.sampling_frequency),
                     sampling_frequency=interferometer.strain_data.sampling_frequency)[time_idxs],
                 color=DATA_COLOR, alpha=0.3)
             logger.debug('Plotted interferometer data.')
@@ -474,7 +476,8 @@ class CompactBinaryCoalescenceResult(CoreResult):
             fd_waveform = interferometer.get_detector_response(wf_pols, params)
             fd_waveforms.append(fd_waveform[frequency_idxs])
             td_waveform = infft(
-                fd_waveform / interferometer.amplitude_spectral_density_array,
+                fd_waveform * np.sqrt(2. / interferometer.sampling_frequency) /
+                interferometer.amplitude_spectral_density_array,
                 self.sampling_frequency)[time_idxs]
             td_waveforms.append(td_waveform)
         fd_waveforms = asd_from_freq_series(
@@ -601,7 +604,7 @@ class CompactBinaryCoalescenceResult(CoreResult):
             hf_inj_det = interferometer.get_detector_response(
                 hf_inj, self.injection_parameters)
             ht_inj_det = infft(
-                hf_inj_det /
+                hf_inj_det * np.sqrt(2. / interferometer.sampling_frequency) /
                 interferometer.amplitude_spectral_density_array,
                 self.sampling_frequency)[time_idxs]
         if format == "html":
......
@@ -279,6 +279,16 @@ def optimal_snr_squared(signal, power_spectral_density, duration):
     return noise_weighted_inner_product(signal, signal, power_spectral_density, duration)

+def overlap(signal_a, signal_b, power_spectral_density=None, delta_frequency=None,
+            lower_cut_off=None, upper_cut_off=None, norm_a=None, norm_b=None):
+    low_index = int(lower_cut_off / delta_frequency)
+    up_index = int(upper_cut_off / delta_frequency)
+    integrand = np.conj(signal_a) * signal_b
+    integrand = integrand[low_index:up_index] / power_spectral_density[low_index:up_index]
+    integral = (4 * delta_frequency * integrand) / norm_a / norm_b
+    return sum(integral).real
+
 __cached_euler_matrix = None
 __cached_delta_x = None
......
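The new `overlap` function computes the noise-weighted match 4Δf Σ Re[ã*·b̃]/S(f), restricted to a frequency band and divided by the two normalisations. A quick sanity check, with the function re-stated from the diff so the example runs standalone: a signal's overlap with itself, normalised by its own noise-weighted norm, is 1 (toy white-noise signal, flat unit PSD):

```python
import numpy as np


# Re-stated from the diff above (bilby.gw.utils.overlap) for a standalone run
def overlap(signal_a, signal_b, power_spectral_density=None, delta_frequency=None,
            lower_cut_off=None, upper_cut_off=None, norm_a=None, norm_b=None):
    low_index = int(lower_cut_off / delta_frequency)
    up_index = int(upper_cut_off / delta_frequency)
    integrand = np.conj(signal_a) * signal_b
    integrand = integrand[low_index:up_index] / power_spectral_density[low_index:up_index]
    integral = (4 * delta_frequency * integrand) / norm_a / norm_b
    return sum(integral).real


rng = np.random.default_rng(0)
df = 0.25
frequencies = np.arange(0, 2048, df)
signal = rng.normal(size=frequencies.size) + 1j * rng.normal(size=frequencies.size)
psd = np.ones_like(frequencies)

low, high = 20.0, 1024.0
band = slice(int(low / df), int(high / df))
# Noise-weighted norm over the same band: sqrt(4 df sum |a|^2 / S)
norm = np.sqrt(4 * df * np.sum(np.abs(signal[band]) ** 2 / psd[band]))

m = overlap(signal, signal, psd, df, low, high, norm, norm)
```

Note the band edges are converted to indices by integer division of frequency by Δf, which assumes the frequency array starts at 0 Hz.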
FROM bilbydev/bilby-test-suite-python37
LABEL name="bilby" \
maintainer="Gregory Ashton <gregory.ashton@ligo.org>" \
date="20190130"
RUN pip install bilby
FROM ubuntu:18.04
LABEL name="bilby Base Enterprise Linux 7" \
maintainer="Gregory Ashton <gregory.ashton@ligo.org>" \
date="20190104"
ENV PATH /opt/conda/bin:$PATH
# Install backend
RUN apt-get update --fix-missing \
&& apt-get install -y libglib2.0-0 libxext6 libsm6 libxrender1 libgl1-mesa-glx \
dh-autoreconf build-essential libarchive-dev wget curl git libhdf5-serial-dev
# Install python2.7
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda2-4.5.11-Linux-x86_64.sh -O ~/miniconda.sh && \
/bin/bash ~/miniconda.sh -b -p /opt/conda && \
rm ~/miniconda.sh && \
ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
echo "conda activate base" >> ~/.bashrc
# Install conda-installable programs
RUN conda install -y numpy scipy matplotlib pandas==0.23
RUN conda install -c conda-forge deepdish arviz
# Install requirements
RUN pip install --upgrade pip \
&& pip install --upgrade setuptools \
&& pip install future \
pycondor>=0.5 \
configargparse \
flake8 \
mock \
pipenv \
coverage \
pytest-cov \
coverage-badge
# Install additional bilby requirements
RUN pip install corner lalsuite astropy gwpy theano tables
# Install samplers
RUN pip install cpnest dynesty emcee nestle ptemcee pymc3 pymultinest kombine ultranest dnest4
# Install pymultinest requirements
RUN apt-get install -y libblas3 libblas-dev liblapack3 liblapack-dev \
libatlas3-base libatlas-base-dev cmake build-essential gfortran
# Install pymultinest
RUN git clone https://github.com/farhanferoz/MultiNest.git \
&& (cd MultiNest/MultiNest_v3.11_CMake/multinest && mkdir build && cd build && cmake .. && make)
ENV LD_LIBRARY_PATH $HOME/MultiNest/MultiNest_v3.11_CMake/multinest/lib:
# Install Polychord
RUN git clone https://github.com/PolyChord/PolyChordLite.git \
&& (cd PolyChordLite && python setup.py --no-mpi install)
# Install PTMCMCSampler
RUN git clone https://github.com/jellis18/PTMCMCSampler.git \
&& (cd PTMCMCSampler && python setup.py install)
# Add the ROQ data to the image
RUN mkdir roq_basis \
&& cd roq_basis \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \
-FROM ubuntu:18.04
+FROM continuumio/miniconda3
+LABEL name="bilby Base miniconda3" \
+maintainer="Gregory Ashton <gregory.ashton@ligo.org>"

-LABEL name="bilby Base Enterprise Linux 7" \
-maintainer="Gregory Ashton <gregory.ashton@ligo.org>" \
-date="20190104"
+RUN conda update -n base -c defaults conda

-ENV PATH /opt/conda/bin:$PATH
+ENV conda_env python{python_major_version}{python_minor_version}

-# Install backend
-RUN apt-get update --fix-missing \
-&& apt-get install -y libglib2.0-0 libxext6 libsm6 libxrender1 libgl1-mesa-glx \
-dh-autoreconf build-essential libarchive-dev wget curl git libhdf5-serial-dev
+RUN conda create -n ${{conda_env}} python={python_major_version}.{python_minor_version}
+RUN echo "source activate ${{conda_env}}" > ~/.bashrc
+ENV PATH /opt/conda/envs/${{conda_env}}/bin:$PATH
+RUN /bin/bash -c "source activate ${{conda_env}}"
+RUN conda info
+RUN python --version

-# Install python3.7
-RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda3-4.5.11-Linux-x86_64.sh -O ~/miniconda.sh && \
-/bin/bash ~/miniconda.sh -b -p /opt/conda && \
-rm ~/miniconda.sh && \
-/opt/conda/bin/conda clean -tipsy && \
-ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
-echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
-echo "conda activate base" >> ~/.bashrc
+# Install pymultinest requirements
+RUN apt-get update
+RUN apt-get install -y libblas3 libblas-dev liblapack3 liblapack-dev \
+libatlas3-base libatlas-base-dev cmake build-essential gfortran

-# Update pip and setuptools
-RUN pip install --upgrade pip \
-&& LC_ALL=C pip install --upgrade setuptools

 # Install conda-installable programs
-RUN conda install -y numpy scipy matplotlib pandas
+RUN conda install -n ${{conda_env}} -y matplotlib numpy scipy pandas astropy flake8 mock
+RUN conda install -n ${{conda_env}} -c anaconda coverage configargparse future

 # Install conda-forge-installable programs
-RUN conda install -c conda-forge deepdish arviz
+RUN conda install -n ${{conda_env}} -c conda-forge black ligo-gracedb gwpy lalsuite ligo.skymap pytest-cov deepdish arviz

-# Install requirements
-RUN pip install future \
-pycondor>=0.5 \
-configargparse \
-flake8 \
-mock \
-pipenv \
-coverage \
-pytest-cov \
-coverage-badge
+# Install pip-requirements
+RUN pip install --upgrade pip
+RUN pip install --upgrade setuptools coverage-badge

 # Install documentation requirements
-RUN pip install sphinx numpydoc nbsphinx sphinx_rtd_theme sphinx-tabs
+RUN pip install sphinx numpydoc nbsphinx sphinx_rtd_theme sphinx-tabs autodoc

-# Install additional bilby requirements
-RUN pip install corner lalsuite astropy gwpy theano healpy tables
-# Install samplers
-RUN pip install cpnest dynesty emcee nestle ptemcee pymc3 pymultinest kombine ultranest dnest4
-# Install pymultinest requirements
-RUN apt-get install -y libblas3 libblas-dev liblapack3 liblapack-dev \
-libatlas3-base libatlas-base-dev cmake build-essential gfortran
-# Install pymultinest
-RUN git clone https://github.com/farhanferoz/MultiNest.git \
-&& (cd MultiNest/MultiNest_v3.11_CMake/multinest && mkdir build && cd build && cmake .. && make)
-ENV LD_LIBRARY_PATH $HOME/MultiNest/MultiNest_v3.11_CMake/multinest/lib:
+# Install dependencies and samplers
+RUN pip install corner lalsuite theano healpy cython tables
+RUN pip install cpnest dynesty emcee nestle ptemcee pymc3 kombine dnest4
+RUN conda install -n ${{conda_env}} -c conda-forge pymultinest ultranest

 # Install Polychord
 RUN git clone https://github.com/PolyChord/PolyChordLite.git \
@@ -75,4 +52,4 @@ RUN mkdir roq_basis \
 && wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/B_quadratic.npy \
 && wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_linear.npy \
 && wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/fnodes_quadratic.npy \
-&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat \
+&& wget https://git.ligo.org/lscsoft/ROQ_data/raw/master/IMRPhenomPv2/4s/params.dat