Commit 608c9103 authored by Rachel Gray's avatar Rachel Gray

Merge branch 'pending_review' into 'master'

Merge of Pending review in master

See merge request cbc-cosmo/gwcosmo!28
parents b9c1bdbc 33c07244
*.dat filter=lfs diff=lfs merge=lfs -text
*.txt filter=lfs diff=lfs merge=lfs -text
gwcosmo/data/*.dat filter=lfs diff=lfs merge=lfs -text
gwcosmo/data/*.txt filter=lfs diff=lfs merge=lfs -text
*.ipynb
*~
*.egg-info
*.pyc
......
---
image: ligo/publisher:latest
publish:
  before_script:
    - echo $CI_COMMIT_SHA | cut -b1-8 > docs/latex/gitID.txt
  script:
    - cd docs/latex
    - make gwcosmo_method.pdf
    - cd ..
  after_script:
    - rm docs/latex/gitID.txt
  cache:
    key: "$CI_PROJECT_NAMESPACE-$CI_PROJECT_NAME-$CI_JOB_NAME"
    untracked: true
  artifacts:
    expire_in: 4w
    paths:
      - docs/latex/gwcosmo_method.pdf
# Contributing
Contributors may familiarize themselves with GWCosmo itself by going through the
[First Steps with GWCosmo](https://www.lsc-group.phys.uwm.edu/ligovirgo/cbcnote/Cosmology/gwcosmo/outline) tutorial.
# Working with large files
The recommended way to work with large files in git is to use Git LFS (Large File Storage).
To get started with Git LFS:
Download and install the Git LFS command-line extension. You only have to set up Git LFS once, using Homebrew or MacPorts:
$ brew install git-lfs
$ port install git-lfs
Then cd into the gwcosmo directory and do:
$ git lfs install
Select the file types that you want Git LFS to manage (these go into the .gitattributes file):
$ git lfs track "*.gwf"
Make sure .gitattributes is tracked:
$ git add .gitattributes
There is no step three. Just commit and push as you normally would.
$ git add ./data/catalog_data/DES.dat
$ git commit -m "Added DES catalogue file"
$ git push origin master
\ No newline at end of file
# Quick start
## To install
GWCosmo requires Python >= 3.5.
The easiest way to install it is with `virtualenv` and `pip`:
$ virtualenv --system-site-packages ~/gwcosmo
$ source ~/gwcosmo/bin/activate
$ pip install git+https://git.ligo.org/cbc-cosmo/gwcosmo
* **Note:** GWCosmo requires a fairly new version of `setuptools`. If you get
an error message that looks like this:
pkg_resources.VersionConflict: (setuptools 0.9.8 (gwcelery/lib/python2.7/site-packages), Requirement.parse('setuptools>=30.3.0'))
then run `pip install --upgrade setuptools` and try again.
## To test
With `setup.py`:
$ python setup.py test
# GWcosmo
A package to estimate cosmological parameters using gravitational-wave observations. If you use GWcosmo in a scientific publication, please cite
```
@article{Gray:2019ksv,
author = "Gray, Rachel and others",
title = "{Cosmological inference using gravitational wave standard sirens: A mock data analysis}",
eprint = "1908.06050",
archivePrefix = "arXiv",
primaryClass = "gr-qc",
reportNumber = "LIGO-P1900017",
doi = "10.1103/PhysRevD.101.122001",
journal = "Phys. Rev. D",
volume = "101",
number = "12",
pages = "122001",
year = "2020"
}
```
## How-to install
You will need an [Anaconda distribution](https://www.anaconda.com/). The conda distribution is correctly initialized if, when you open a terminal, you see the name of the active Python environment; the default name is `(base)`.
Once the conda distribution is installed and activated on your machine, please follow these steps:
* Clone the gwcosmo repository with
```
git clone <repository>
```
The repository URL can be copied from the GitLab interface (top-right button). If you do not have an SSH key registered, use the `https` protocol.
* Enter the cloned directory
* Create a conda virtual environment to host gwcosmo. Use
```
conda create -n gwcosmo python=3.6
```
* When the virtual environment is ready, activate it with (your python distribution will change to `gwcosmo`)
```
conda activate gwcosmo
```
* Run the following line to install all the python packages required by `gwcosmo`
```
pip install -r requirements.txt
```
* Install `gwcosmo` by running
```
python setup.py install
```
* You are ready to use `gwcosmo`. Note that, if you modify the code, you can easily reinstall it by using
```
python setup.py install --force
```
gwcosmo
=======
A package to estimate cosmological parameters using gravitational-wave observations.
- `Installation instructions <https://ldas-jobs.ligo.caltech.edu/~ignacio.magana/_build/html/installation.html>`__
- `Contributing <https://git.ligo.org/cbc-cosmo/gwcosmo/blob/master/CONTRIBUTING.md>`__
- `Documentation <https://ldas-jobs.ligo-la.caltech.edu/~ignacio.magana/_build/html/index.html>`__
- `Issue tracker <https://git.ligo.org/cbc-cosmo/gwcosmo/issues>`__
Auto-built PDF of Technical Document
------------------------------------
PDFs are automatically generated when the content changes. The following link is updated automatically.
[Latest PDF](https://git.ligo.org/cbc-cosmo/gwcosmo/-/jobs/artifacts/master/browse?job=publish) (automatically built when commits are added to the repository).
#!/usr/bin/env python
#!/usr/bin/env python3
"""
This script combines individual H0 posteriors.
Ignacio Magana
......@@ -9,7 +9,7 @@ import sys
from optparse import Option, OptionParser
#Global Imports
import matplotlib
matplotlib.use('agg')
import matplotlib.pyplot as plt
matplotlib.rcParams['font.family']= 'Times New Roman'
......@@ -62,7 +62,7 @@ for path, subdirs, files in os.walk(dir):
filepath = os.path.join(path, name)
if filepath[-4:] == '.npz':
dir_list.append(filepath)
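The directory walk above collects every `.npz` posterior file under a root directory. A self-contained sketch of the same pattern, using a throwaway temporary tree for illustration (names here are hypothetical, not from the script):

```python
import os
import tempfile

def find_by_extension(root, ext):
    """Recursively collect paths under root that end with ext."""
    matches = []
    for path, subdirs, files in os.walk(root):
        for name in files:
            filepath = os.path.join(path, name)
            if filepath.endswith(ext):  # safer than slicing filepath[-4:]
                matches.append(filepath)
    return matches

# Illustration with a temporary directory tree
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "run1"))
    for fname in ("a.npz", "b.txt", os.path.join("run1", "c.npz")):
        open(os.path.join(root, fname), "w").close()
    found = find_by_extension(root, ".npz")
    print(sorted(os.path.basename(p) for p in found))  # ['a.npz', 'c.npz']
```

Using `str.endswith` rather than a fixed-width slice also matches extensions of any length.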
outputfile = str(opts.outputfile)
......@@ -143,4 +143,4 @@ plt.legend(loc='upper right',fontsize=10)
plt.tight_layout()
plt.savefig('./'+outputfile+'.png',dpi=800)
np.savez('./'+outputfile+'.npz',[H0,likelihood_comb,posterior_uniform_norm,posterior_log_norm])
\ No newline at end of file
np.savez('./'+outputfile+'.npz',[H0,likelihood_comb,posterior_uniform_norm,posterior_log_norm])
#!/usr/bin/env python
#!/usr/bin/env python3
"""
This script calculates pdets.
Ignacio Magana, Rachel Gray
......@@ -32,15 +32,29 @@ parser = OptionParser(
usage = "%prog [options]",
option_list = [
Option("--mass_distribution", default=None,
help="Choose between BNS, NSBH or BBH-powerlaw mass distributions for default Pdet calculations."),
Option("--psd", default=None,
help="Select between 'O1' and 'O2' and 'O3' PSDs, for default Pdet calculations. By default we use aLIGO at design sensitivity."),
help="Choose between BNS or NSBH/BBH-powerlaw, NSBH/BBH-powerlaw-gaussian, NSBH/BBH-broken-powerlaw mass distributions for default Pdet calculations."),
Option("--psd", default=None, type=str,
help="Select between 'O1' and 'O2' and 'O3' PSDs, for default Pdet calculations. By default we use aLIGO at design sensitivity."),
Option("--powerlaw_slope", default='1.6', type=float,
help="Set powerlaw slope for BBH powerlaw mass distribution."),
Option("--powerlaw_slope_2", default='0.0', type=float,
help="Set second powerlaw slope for BBH with broken powerlaw mass distribution."),
Option("--beta", default='0.0', type=float,
help="Set powerlaw slope for the second black hole."),
Option("--minimum_mass", default='5.0', type=float,
help="Set minimum mass in the source fram for BBH powerlaw mass distribution (default is 5)."),
help="Set minimum mass in the source frame for BBH (default is 5)."),
Option("--maximum_mass", default='100.0', type=float,
help="Set maximum mass in the source fram for BBH powerlaw mass distribution (default is 100)."),
help="Set maximum mass in the source frame for BBH mass distribution (default is 100)."),
Option("--mu_g", default='35.0', type=float,
help="Set the mu of the gaussian peak in case of BBH-powerlaw-gaussian mass distribution."),
Option("--lambda_peak", default='0.2', type=float,
help="Set the lambda of the gaussian peak in case of BBH-powerlaw-gaussian mass distribution."),
Option("--sigma_g", default='5.0', type=float,
help="Set the sigma of the gaussian peak in case of BBH-powerlaw-gaussian mass distribution."),
Option("--delta_m", default='0.', type=float,
help="Set the smoothing parameter in case of BBH-powerlaw-gaussian or BBH-broken-powerlaw mass distributions."),
Option("--b", default='0.5', type=float,
help="Set the fraction at which the powerlaw breaks in case of BBH-broken-powerlaw mass distribution."),
Option("--linear_cosmology", default='False',
help="Assume a linear cosmology."),
Option("--basic_pdet", default='False',
......@@ -63,8 +77,18 @@ parser = OptionParser(
help="Directory of constant_H0 Pdets to combine into single Pdet pickle."),
Option("--outputfile", default=None,
help="Name of output pdet file."),
Option("--Omega_m", default='0.308', type=float,
help="Omega of matter."),
Option("--snr", default='12.0', type=float,
help="Network SNR threshold.")
help="Network SNR threshold."),
Option("--detected_masses", default='False',
help="Set to True if you want to keep track of the detected masses."),
Option("--detectors", default='HLV',
help="Set the detectors to use for the pickle (default=HLV)."),
Option("--det_combination", default='True',
help="Set whether or not to consider all possible detector combinations (default=True)."),
Option("--seed", default='1000', type=int,
help="Set the random seed.")
])
opts, args = parser.parse_args()
print(opts)
......@@ -76,18 +100,26 @@ for option in parser.option_list:
missing.extend(option._long_opts)
if len(missing) > 0:
parser.error('Missing required options: {0}'.format(str(missing)))
mass_distribution = str(opts.mass_distribution)
psd = str(opts.psd)
psd = opts.psd
seed = int(opts.seed)
alpha = float(opts.powerlaw_slope)
Mmin = float(opts.minimum_mass)
Mmax = float(opts.maximum_mass)
alpha_2 = float(opts.powerlaw_slope_2)
beta = float(opts.beta)
mu_g = float(opts.mu_g)
lambda_peak = float(opts.lambda_peak)
sigma_g = float(opts.sigma_g)
delta_m = float(opts.delta_m)
b = float(opts.b)
min_H0 = float(opts.min_H0)
max_H0 = float(opts.max_H0)
bins_H0 = float(opts.bins_H0)
bins_H0 = int(opts.bins_H0)
det_combination = str2bool(opts.det_combination)
linear = str2bool(opts.linear_cosmology)
basic = str2bool(opts.basic_pdet)
full_waveform = str2bool(opts.full_waveform)
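`str2bool` is called repeatedly above but its definition is not part of this diff; a minimal implementation consistent with how it is used might look like the following (a hypothetical helper, not the repository's actual code):

```python
def str2bool(v):
    """Interpret common string spellings of a boolean command-line flag."""
    if isinstance(v, bool):
        return v
    if v.lower() in ("yes", "true", "t", "1"):
        return True
    if v.lower() in ("no", "false", "f", "0"):
        return False
    raise ValueError("Boolean value expected, got %r" % v)

print(str2bool("True"), str2bool("false"))  # True False
```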
......@@ -95,6 +127,17 @@ Nsamps = int(opts.Nsamps)
network_snr_threshold = float(opts.snr)
constant_H0 = str2bool(opts.constant_H0)
pdet_path = str(opts.combine)
Omega_m=float(opts.Omega_m)
detected_masses = str2bool(opts.detected_masses)
detector = str(opts.detectors)
dets = []
if 'H' in detector:
dets.append('H1')
if 'L' in detector:
dets.append('L1')
if 'V' in detector:
dets.append('V1')
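The block above expands a compact detector string such as `'HLV'` into interferometer names; the same logic as a small standalone function (a sketch, not part of the script):

```python
def parse_detectors(detector):
    """Map a compact string like 'HLV' to interferometer names."""
    mapping = {"H": "H1", "L": "L1", "V": "V1"}
    return [name for key, name in mapping.items() if key in detector]

print(parse_detectors("HLV"))  # ['H1', 'L1', 'V1']
print(parse_detectors("HL"))   # ['H1', 'L1']
```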
if constant_H0 is True:
H0 = float(opts.H0)
......@@ -107,113 +150,121 @@ else:
kind = 'inspiral'
if opts.combine is None:
if mass_distribution == 'BBH-powerlaw':
print("Calculating Pdet with a " + mass_distribution + " mass distribution with alpha = " + str(alpha)
+ " at "+ psd + " sensitivity using the " + kind)
else:
print("Calculating Pdet with a " + mass_distribution + " mass distribution at " + psd + " sensitivity using the " + kind)
pdet = gwcosmo.detection_probability.DetectionProbability(mass_distribution=mass_distribution, asd=psd, basic=basic,
linear=linear, alpha=alpha, Mmin=Mmin, Mmax=Mmax,
full_waveform=full_waveform, Nsamps=Nsamps,
constant_H0=constant_H0, H0=H0, network_snr_threshold=network_snr_threshold)
if opts.outputfile is None:
if mass_distribution == 'BBH-powerlaw':
pdet_path = '{}PSD_{}_alpha_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}.p'.format(psd, mass_distribution, alpha, Mmin, Mmax, Nsamps, kind)
if psd != None:
if mass_distribution == 'BNS':
pdet_path = '{}PSD_{}_Nsamps{}_{}.p'.format(psd, mass_distribution, Nsamps, kind)
else:
pdet_path = '{}PSD_{}_alpha_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}.p'.format(psd, mass_distribution, alpha, Mmin, Mmax, Nsamps, kind)
else:
pdet_path = '{}PSD_{}_Nsamps{}_{}.p'.format(psd, mass_distribution, Nsamps, kind)
if mass_distribution == 'BNS':
pdet_path = '{}_Nsamps{}_{}.p'.format(mass_distribution, Nsamps, kind)
else:
pdet_path = '{}_alpha_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}.p'.format(mass_distribution, alpha, Mmin, Mmax, Nsamps, kind)
else:
pdet_path = str(opts.outputfile)
pdet = gwcosmo.detection_probability.DetectionProbability(mass_distribution=mass_distribution, asd=psd, detected_masses=detected_masses, basic=basic, detectors=dets,
linear=linear, alpha=alpha, Mmin=Mmin, Mmax=Mmax, Omega_m=Omega_m, alpha_2=alpha_2, mu_g=mu_g, sigma_g=sigma_g,
lambda_peak=lambda_peak, beta=beta, full_waveform=full_waveform, Nsamps=Nsamps, det_combination = det_combination,
b=b, delta_m=delta_m, constant_H0=constant_H0, H0=H0, network_snr_threshold=network_snr_threshold, path=pdet_path, seed=seed)
pickle.dump( pdet, open( pdet_path, "wb" ) )
else:
probs = {}
detected = {}
for file in os.listdir(pdet_path):
if file.endswith(".p"):
pdets = pickle.load(open(os.path.join(pdet_path,str(file)), 'rb'))
psd = pdets.asd
alpha = pdets.alpha
Mmin = pdets.Mmin
Mmax = pdets.Mmax
mass_distribution = pdets.mass_distribution
detected_masses = pdets.detected_masses
break
for h0 in H0:
try:
pdets = pickle.load(open(pdet_path+'/pdet_'+psd+'_'+str(alpha)+'_'+str(int(h0))+'.p', 'rb'))
probs[h0] = pdets.prob
except:
print("Could not load "+'pdet_'+psd+'_'+str(alpha)+'_'+str(int(h0))+'.p')
for file in os.listdir(pdet_path):
if file.endswith(".p"):
pdets = pickle.load(open(os.path.join(pdet_path,str(file)), 'rb'))
h0 = pdets.H0vec
probs[h0] = pdets.prob
if detected_masses==True:
detected[h0] = pdets.detected
H0vec = np.array(list(probs.keys()))
prob = np.array(list(probs.values()))
H0vec = sorted(H0vec)
values = []
for h0 in H0vec:
values.append(probs[h0])
prob = np.array(values)
print('Total number of H0 bins: '+str(len(H0vec)))
print(H0vec)
print(prob)
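The lines above sort the H0 bin centres and reassemble the probability array in matching order so the two stay aligned. The same idea in a pure-Python sketch with toy stand-in values:

```python
# Toy stand-in for the probabilities loaded from the per-H0 pickles
probs = {70.0: 0.3, 20.0: 0.9, 120.0: 0.1}

H0vec = sorted(probs)                    # sorted H0 bin centres
prob = [probs[h0] for h0 in H0vec]       # values in the same sorted order
print(H0vec, prob)  # [20.0, 70.0, 120.0] [0.9, 0.3, 0.1]
```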
pdet = gwcosmo.likelihood.detection_probability.DetectionProbability(
mass_distribution=mass_distribution, alpha=alpha,
asd=psd, detectors=['H1', 'L1'], Nsamps=2,
network_snr_threshold=12.0, Omega_m=0.308,
linear=False, basic=False, M1=50., M2=50.,
constant_H0=False, H0=H0vec, full_waveform=True)
Nsamps = pdets.Nsamps
RAs = pdets.RAs
Decs = pdets.Decs
incs = pdets.incs
psis = pdets.psis
phis = pdets.phis
mass_distribution = pdets.mass_distribution
dl_array = pdets.dl_array
m1 = pdets.m1/1.988e30
m2 = pdets.m2/1.988e30
alpha = pdets.alpha
Mmin = pdets.Mmin
Mmax = pdets.Mmax
psd = pdets.asd
full_waveform = pdets.full_waveform
network_snr_threshold = pdets.snr_threshold
Omega_m = pdets.Omega_m
linear = pdets.linear
seed = pdets.seed
detectors = pdets.detectors
if full_waveform is True:
mass_distribution=pdets.mass_distribution, alpha=pdets.alpha, Mmin=pdets.Mmin, Mmax=pdets.Mmax,
asd=pdets.asd, detectors=pdets.detectors, detected_masses=detected_masses, alpha_2=pdets.alpha_2,
Nsamps=2, network_snr_threshold=pdets.snr_threshold,seed=pdets.seed, beta=pdets.beta, mu_g=pdets.mu_g,
Omega_m=pdets.Omega_m,linear=pdets.linear, basic=False, M1=pdets.M1, M2=pdets.M2, lambda_peak=pdets.lambda_peak,
sigma_g=pdets.sigma_g, delta_m=pdets.delta_m, b=pdets.b, constant_H0=False, H0=H0vec, full_waveform=pdets.full_waveform)
if pdets.full_waveform is True:
kind = 'full_waveform'
else:
kind = 'inspiral'
pdet.Nsamps = pdets.Nsamps
pdet.H0vec = H0vec
pdet.Nsamps = Nsamps
pdet.prob = prob
pdet.RAs = RAs
pdet.Decs = Decs
pdet.incs = incs
pdet.psis = psis
pdet.phis = phis
pdet.mass_distribution = mass_distribution
pdet.dl_array = dl_array
pdet.m1 = m1
pdet.m2 = m2
pdet.Omega_m = Omega_m
pdet.linear = linear
pdet.seed = seed
pdet.detectors = detectors
pdet.network_snr_threshold = network_snr_threshold
pdet.RAs = pdets.RAs
pdet.Decs = pdets.Decs
pdet.incs = pdets.incs
pdet.psis = pdets.psis
pdet.phis = pdets.phis
pdet.dl_array = pdets.dl_array
pdet.m1 = pdets.m1/1.9884754153381438e30
pdet.m2 = pdets.m2/1.9884754153381438e30
pdet.dets = pdets.dets
pdet.psds = pdets.psds
pdet.z_array = pdets.z_array
logit_prob=logit(prob)
for i in range (len(logit_prob)):
logit_prob[i]=np.where(logit_prob[i]==float('+inf'), 100, logit_prob[i])
logit_prob[i]=np.where(logit_prob[i]==float('+inf'), 100, logit_prob[i])
logit_prob[i]=np.where(logit_prob[i]==float('-inf'), -33, logit_prob[i])
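The loop above replaces the ±inf values that `logit` produces at p = 0 or p = 1 with large finite sentinels (100 and −33) so that the cubic 2-D interpolation that follows stays finite. A self-contained illustration of the same clipping, with a plain-Python `logit` standing in for the array version:

```python
import math

def logit(p):
    """log(p / (1 - p)); -inf at p = 0, +inf at p = 1."""
    if p == 0.0:
        return float("-inf")
    if p == 1.0:
        return float("+inf")
    return math.log(p / (1.0 - p))

def clip_logit(p, hi=100.0, lo=-33.0):
    """Replace infinite logit values with finite sentinels, as in the script."""
    x = logit(p)
    if x == float("+inf"):
        return hi
    if x == float("-inf"):
        return lo
    return x

print([clip_logit(p) for p in (0.0, 0.5, 1.0)])  # [-33.0, 0.0, 100.0]
```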
interp_average = interp2d(pdet.z_array, pdet.H0vec, logit_prob, kind='cubic')
pdet.interp_average = interp_average
if mass_distribution == 'BBH-powerlaw':
pdet_path = '{}PSD_{}_alpha_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}_snr_{}.p'.format(psd, mass_distribution,
str(alpha), str(Mmin), str(Mmax),
str(Nsamps), kind,network_snr_threshold)
if pdets.asd != None:
if pdets.mass_distribution == 'BBH-powerlaw' or pdets.mass_distribution == 'NSBH-powerlaw':
pdet_path = '{}PSD_{}_alpha_{}_beta_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}_snr_{}'.format(pdets.asd, pdets.mass_distribution,
str(pdets.alpha), str(pdets.beta), str(pdets.Mmin), str(pdets.Mmax),
str(pdets.Nsamps), kind, str(pdets.snr_threshold))
elif pdets.mass_distribution == 'BBH-powerlaw-gaussian' or pdets.mass_distribution == 'NSBH-powerlaw-gaussian':
pdet_path = '{}PSD_{}_alpha_{}_beta_{}_Mmin_{}_Mmax_{}_mu_{}_lambda_{}_sigma_{}_delta_{}_Nsamps{}_{}_snr_{}'.format(pdets.asd,
pdets.mass_distribution, str(pdets.alpha), str(pdets.beta), str(pdets.Mmin),
str(pdets.Mmax), str(pdets.mu_g), str(pdets.lambda_peak), str(pdets.sigma_g), str(pdets.delta_m),str(pdets.Nsamps), kind, str(pdets.snr_threshold))
elif pdets.mass_distribution == 'BBH-broken-powerlaw' or pdets.mass_distribution == 'NSBH-broken-powerlaw':
pdet_path = '{}PSD_{}_alpha1_{}_alpha2_{}_beta_{}_Mmin_{}_Mmax_{}_delta_{}_Nsamps{}_{}_snr_{}'.format(pdets.asd, pdets.mass_distribution,
str(pdets.alpha), str(pdets.alpha_2), str(pdets.beta), str(pdets.Mmin),
str(pdets.Mmax), str(pdets.delta_m),str(pdets.Nsamps), kind, str(pdets.snr_threshold))
else:
pdet_path = '{}PSD_{}_Nsamps{}_{}_snr_{}'.format(pdets.asd, pdets.mass_distribution, str(pdets.Nsamps), kind,str(pdets.snr_threshold))
else:
pdet_path = '{}PSD_{}_Nsamps{}_{}_snr_{}.p'.format(psd, mass_distribution, str(Nsamps), kind,network_snr_threshold)
pickle.dump( pdet, open( pdet_path, "wb" ) )
if pdets.mass_distribution == 'BBH-powerlaw' or pdets.mass_distribution == 'NSBH-powerlaw':
pdet_path = '{}_alpha_{}_beta_{}_Mmin_{}_Mmax_{}_Nsamps{}_{}_snr_{}'.format(pdets.mass_distribution,
str(pdets.alpha), str(pdets.beta), str(pdets.Mmin), str(pdets.Mmax),
str(pdets.Nsamps), kind, str(pdets.snr_threshold))
elif pdets.mass_distribution == 'BBH-powerlaw-gaussian' or pdets.mass_distribution == 'NSBH-powerlaw-gaussian':
pdet_path = '{}_alpha_{}_beta_{}_Mmin_{}_Mmax_{}_mu_{}_lambda_{}_sigma_{}_delta_{}_Nsamps{}_{}_snr_{}'.format(pdets.mass_distribution,
str(pdets.alpha), str(pdets.beta), str(pdets.Mmin),
str(pdets.Mmax), str(pdets.mu_g), str(pdets.lambda_peak), str(pdets.sigma_g), str(pdets.delta_m),str(pdets.Nsamps), kind, str(pdets.snr_threshold))
elif pdets.mass_distribution == 'BBH-broken-powerlaw' or pdets.mass_distribution == 'NSBH-broken-powerlaw':
pdet_path = '{}_alpha1_{}_alpha2_{}_beta_{}_Mmin_{}_Mmax_{}_delta_{}_Nsamps{}_{}_snr_{}'.format(pdets.mass_distribution,
str(pdets.alpha), str(pdets.alpha_2), str(pdets.beta), str(pdets.Mmin),
str(pdets.Mmax), str(pdets.delta_m),str(pdets.Nsamps), kind, str(pdets.snr_threshold))
else:
pdet_path = '{}_Nsamps{}_{}_snr_{}'.format(pdets.mass_distribution, str(pdets.Nsamps), kind,str(pdets.snr_threshold))
if detected_masses==True:
np.savez(pdet_path+'_detected.npz',[detected,pdet.m1,pdet.m2])
pickle.dump( pdet, open( pdet_path+'.p', "wb" ) )
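`pickle.dump(pdet, open(pdet_path + '.p', "wb"))` works, but it never explicitly closes the file handle; a `with` block is the idiomatic pattern. A sketch of the write-and-reload round trip with a stand-in dictionary in place of the `pdet` object:

```python
import os
import pickle
import tempfile

# Stand-in for the pdet object; the real script pickles a DetectionProbability
payload = {"H0vec": [20.0, 110.0, 200.0], "prob": [0.9, 0.5, 0.1]}

# Context managers guarantee the handles are closed even on error
path = os.path.join(tempfile.mkdtemp(), "pdet_demo.p")
with open(path, "wb") as f:
    pickle.dump(payload, f)
with open(path, "rb") as f:
    restored = pickle.load(f)
print(restored == payload)  # True
```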
#!/usr/bin/env python3
# code to generate a .dag file for submitting jobs to the cluster
import os
import numpy as np
import htcondor
from htcondor import dags
from optparse import Option, OptionParser, OptionGroup
import gwcosmo
path = os.path.abspath(os.path.dirname(__file__))
dag=dags.DAG()
parser = OptionParser(
description = __doc__,
usage = "%prog [options]",
option_list = [
Option("--min_H0", default='20.0', type=float,
help="Minimum value of H0"),
Option("--max_H0", default='200.0', type=float,
help="Maximum value of H0"),
Option("--bins_H0", default='200', type=int,
help="Number of H0 bins"),
Option("--posterior_samples", default=None,
help="Path to LALinference posterior samples file in format (.dat or hdf5)"),
Option("--posterior_samples_field", default=None,
help="Internal field of the posterior samples file, e.g. h5 or json field"),
Option("--skymap", default=None,
help="Path to LALinference 3D skymap file in format (.fits or fits.gz)"),
Option("--Pdet", default=None,
help="Path to precomputed probability of detection pickle"),
Option("--galaxy_weighting", default='True',
help="Weight potential host galaxies by luminosity? (Default=True)"),
Option("--assume_complete_catalog", default='False',
help="Assume a complete catalog? (Default=False)"),
Option("--redshift_uncertainty", default='True',
help="Marginalise over redshift uncertainties (default=True)"),
Option("--redshift_evolution", default='None',
help="Allow GW host probability to evolve with redshift. Select between None, PowerLaw or Madau (Default=None)"),
Option("--Lambda", default='3.0', type=float,
help="Set rate evolution parameter Lambda for redshift evolution (For Madau model this is equal to alpha)"),
Option("--Madau_beta", default='3.0', type=float,
help="Set Beta for Madau model. (Not used if redshift_evolution=None or PowerLaw)"),
Option("--Madau_zp", default='0.0', type=float,
help="Set zp for Madau model. (Not used if redshift_evolution=None or PowerLaw)"),
Option("--Kcorrections", default='False',
help="Apply K-corrections."),
Option("--reweight_posterior_samples", default='False',
help="Reweight posterior samples with the same priors used to calculate the selection effects."),
Option("--zmax", default='10.0', type=float,
help="Upper redshift limit for integrals (default=10)"),
Option("--zcut", default=None,
help="Hard redshift cut to apply to the galaxy catalogue (default=None)"),
Option("--mth", default=None,
help="Override the apparent magnitude threshold of the catalogue, if provided (default=None)"),
Option("--schech_alpha", default=None,
help="Override the default value for slope of schechter function for given band, if provided (default=None)"),
Option("--schech_Mstar", default=None,
help="Override the default value for Mstar of schechter function for given band, if provided (default=None)"),
Option("--schech_Mmin", default=None,
help="Override the default value for Mmin of schechter function for given band, if provided (default=None)"),
Option("--schech_Mmax", default=None,
help="Override the default value for Mmax of schechter function for given band, if provided (default=None)"),
Option("--nside", default='32', type=int,
help="skymap nside choice for reading in galaxies from the overlap of catalogue and skymap (default=32)"),
Option("--sky_area", default='0.999', type=float,
help="contour boundary for galaxy catalogue method (default=0.999)"),
Option("--min_pixels", default=30, type=int,
help="minimum number of pixels desired to cover sky area of event (for use with pixel method only)"),
Option("--outputfile", default='Posterior',
help="Name of output file"),
Option("--cpus", default=1, type=int,
help="Number of cpus asked for each run (default=1)"),
Option("--ram", default=1000, type=int,
help="RAM asked for each run (default=1 GB)"),
Option("--search_tag", default='ligo.prod.o1.cbc.hubble.gwcosmo', type=str,
help="Search tag for the runs -- used in LIGO clusters (default=ligo.prod.o1.cbc.hubble.gwcosmo)"),
Option("--run_on_ligo_cluster", default='True', type=str,
help="Set to true if running on a LIGO cluster (default=True)"),
Option("--seed", default=None, type=int, help="Random seed"),
Option("--numerical", default='True', type=str,
help="If set to true numerical integration will be used for the calculation of integrals")
])
catalog_option_group = OptionGroup(parser, "Galaxy Catalog Options","""
Use these options to control the galaxy catalog input""")
# Add the catalog options --catalog --catalog_band
for opt in gwcosmo.prior.catalog.catalog_options:
catalog_option_group.add_option(opt)