Update Additional Review (marginalization over joint population+EoS uncertainty), authored Aug 15, 2023 by Reed Essick
This implies that the total sum would be
```math
S_\mathrm{old} = \sum_i w_i \Theta_i = \sum_i \left(\frac{\sum_p p(\theta_i|\Lambda_p)}{p(\theta_i|\mathrm{default\ PE})}\right) \Theta_i = \sum_p \sum_i \frac{p(\theta_i|\Lambda_p)}{p(\theta_i|\mathrm{default\ PE})} \Theta_i
```
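As a concrete sketch of the reordering above (all arrays here are hypothetical toy stand-ins, not the real PE samples or population draws): weighting each sample by the population-summed prior ratio gives the same total as summing population-by-population.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy stand-ins: N posterior samples theta_i from the default PE
# run, and P population draws Lambda_p, each inducing a prior over theta.
N, P = 1000, 8
theta = rng.normal(size=N)

def gauss(x, mu, sig):
    """Normal density, used for both the default prior and p(theta|Lambda_p)."""
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

p_default = gauss(theta, 0.0, 1.0)                # p(theta_i | default PE)
mus = rng.uniform(-0.5, 0.5, size=P)              # per-population prior means
p_pop = gauss(theta[None, :], mus[:, None], 1.0)  # p(theta_i | Lambda_p), shape (P, N)

Theta = theta ** 2                                # any statistic Theta(theta_i)

# S_old as a single sum over samples with population-summed weights w_i ...
w = p_pop.sum(axis=0) / p_default
S_samples_first = np.sum(w * Theta)

# ... equals the same total with the summation order exchanged (populations outermost)
S_pops_first = sum(np.sum(p_pop[p] / p_default * Theta) for p in range(P))

print(np.allclose(S_samples_first, S_pops_first))  # True
```

The exchange of sums is exact, which is what lets each population draw be treated as its own importance-sampled term below.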
It is straightforward to show that
```math
\sum_i \frac{p(\theta_i|\Lambda)}{p(\theta_i|\mathrm{default\ PE})} \Theta_i \approx \left(\frac{p(\mathrm{data}|\Lambda)}{p(\mathrm{data}|\mathrm{default\ PE})}\right) \int d\theta \, p(\theta|\mathrm{data},\Lambda) \, \Theta(\theta)
```
which implies the total sum would approximate
...
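The per-population identity above, that prior-ratio-weighted averages over default-PE posterior samples recover the evidence ratio times the posterior expectation under $\Lambda$, can be checked numerically on a 1-D Gaussian toy model where both sides are available in closed form. All numbers below are invented for illustration, and the left-hand side carries the usual $1/N$ Monte Carlo normalization of the sample sum.

```python
import numpy as np

rng = np.random.default_rng(42)

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

# Hypothetical 1-D Gaussian toy model (all numbers invented for illustration):
d, sig_L = 1.0, 0.5       # "data" and likelihood width: p(data|theta) = N(theta; d, sig_L)
mu0, sig0 = 0.0, 2.0      # default PE prior
mu1, sig1 = 0.5, 1.0      # population-conditioned prior p(theta|Lambda)

# Conjugate Gaussian update gives the default-PE posterior in closed form
var_def = 1.0 / (1.0 / sig_L**2 + 1.0 / sig0**2)
mean_def = var_def * (d / sig_L**2 + mu0 / sig0**2)
theta = rng.normal(mean_def, np.sqrt(var_def), size=200_000)  # "default PE" samples

# Left-hand side: prior-ratio weights applied to Theta(theta) = theta,
# averaged over the default-PE posterior samples
w = gauss(theta, mu1, sig1) / gauss(theta, mu0, sig0)
lhs = np.mean(w * theta)

# Right-hand side: evidence ratio times the posterior expectation under Lambda.
# For Gaussians, each evidence is itself a Gaussian density in d.
Z_def = gauss(d, mu0, np.sqrt(sig_L**2 + sig0**2))
Z_lam = gauss(d, mu1, np.sqrt(sig_L**2 + sig1**2))
var_lam = 1.0 / (1.0 / sig_L**2 + 1.0 / sig1**2)
mean_lam = var_lam * (d / sig_L**2 + mu1 / sig1**2)
rhs = (Z_lam / Z_def) * mean_lam

print(lhs, rhs)  # agree to Monte Carlo precision
```

Because the prior ratio is bounded in this toy setup, the Monte Carlo variance is finite and the two sides agree well at this sample size; in practice the same check is only as good as the overlap between the default prior's support and that of $p(\theta|\Lambda)$.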