|
|
|
|
|
## identified bugs and associated fixes
|
|
|
|
|
|
|
|
The previous version of the code contained two issues: an incorrect Jacobian when transforming the default PE prior from detector-frame to source-frame component masses, and possibly mishandled population weights. Each is described below.
|
|
**bad jacobian from detector-frame to source-frame component masses**
|
|
|
|
|
|
|
|
In the previous version of the code, the default PE prior was computed on-the-fly instead of being read from the input CSV. In particular, the default prior was flat in detector-frame component masses, and the corresponding distribution over source-frame component masses was computed [here](https://git.ligo.org/reed.essick/mmax-model-selection/-/blob/c8b0d297d11b0a350c23ee44832d24b899729d8f/mmax_model_selection/utils.py#L285). However, the correct transformation is
|
|
|
|
|
|
|
|
```math
p(m_\mathrm{src}) = p(m_\mathrm{det}) \left| \frac{dm_\mathrm{det}}{dm_\mathrm{src}}\right| = p(m_\mathrm{det}) \, (1+z)
```
|
|
|
|
|
|
for each component mass, meaning that the total contribution to the weights should have been `(1+z)**+2` instead of the `(1+z)**-2` that was actually implemented. However, because the redshift is small for the NSBH sources considered, this factor does not significantly affect our conclusions.
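As a minimal sketch of the corrected factor (the function and argument names below are hypothetical and not part of the repository), the prior weight over source-frame component masses implied by a prior that is flat in detector-frame component masses is:

```python
import numpy as np

# For a prior flat in detector-frame component masses, m_det = (1 + z) * m_src
# implies p(m_src) = p(m_det) * |dm_det/dm_src| = p(m_det) * (1 + z) per
# component mass, so two component masses contribute (1 + z)**2 in total.

def source_frame_prior_weight(z, p_det_m1=1.0, p_det_m2=1.0):
    """Prior density over source-frame component masses implied by a prior
    that is flat (constant) in detector-frame component masses."""
    z = np.asarray(z, dtype=float)
    return p_det_m1 * p_det_m2 * (1.0 + z) ** 2
```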
|
|
|
|
|
|
|
|
The new version of the code relies on the user to specify the draw probability, so this issue is rendered moot. Within the tests described below, the draw probability was implemented correctly within [this script](https://git.ligo.org/reed.essick/mmax-model-selection/-/blob/gw-distributions/test/o3-nsbh/src/convert-event-csv#L48). **Update link once merge request is accepted**
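For illustration only, here is a hedged sketch of how a user-specified draw probability could enter the reweighting. The CSV columns and values below are hypothetical placeholders, not the actual format expected by `convert-event-csv`:

```python
import io
import csv
import numpy as np

# Hypothetical per-sample draw probability supplied by the user alongside the
# PE samples; values are placeholders for illustration.
csv_text = """m1_source,m2_source,z,draw_prob
8.1,1.5,0.06,0.0021
7.4,1.7,0.05,0.0019
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))
draw_prob = np.array([float(r["draw_prob"]) for r in rows])

# Importance weights from the user-specified draw distribution to some target
# population density evaluated at the same samples (placeholder here).
p_target = np.ones_like(draw_prob)
weights = p_target / draw_prob
weights /= weights.sum()
```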
|
|
|
|
|
|
|
|
**bad population reweighting**
|
|
|
|
|
|
|
|
It is not yet clear whether the population weights were computed correctly in the old code; in particular, the normalizations of those weights may have been mishandled. We note that there is excellent agreement when the population is fixed (so that all of the uncertainty comes from the EoS), but poor agreement when there is uncertainty in both the population and the EoS. This suggests that the issue is associated with the marginalization over the population.
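To make the suspicion concrete, here is a hedged sketch of one way the per-event marginalization over population draws can be written. The names are hypothetical and do not correspond to the repository's code; the sketch simply keeps the per-population Monte-Carlo averages explicit so that any missing normalization would be easy to spot:

```python
import numpy as np

# p_pop[j, i] : population density for hyperparameter draw j evaluated at PE
#               sample i (assumed properly normalized for each j)
# p_draw[i]   : density from which the PE samples were drawn

def event_evidence_per_population(p_pop, p_draw):
    """Monte-Carlo estimate of the event evidence for each population draw."""
    return np.mean(p_pop / p_draw[None, :], axis=1)

def event_evidence_marginalized(p_pop, p_draw):
    """Average the per-population evidences over the population draws."""
    return np.mean(event_evidence_per_population(p_pop, p_draw))
```

When all population draws are identical, this reduces to the fixed-population estimate, which is consistent with the good agreement seen in the fixed-population tests.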
|
|
**Add links to where this was/is defined**
|