
Reconstruct marginalised parameters

Merged Colm Talbot requested to merge reconstruct_marginalised_parameters into master

Closes #324 (closed)

This extends the existing function for reconstructing the distance posterior when using a marginalised likelihood, so that all of the marginalised parameters (distance, phase, and time) can be reconstructed.

Distance and phase work very nicely.

The time reconstruction is a little problematic. I think this is because the times at which the likelihood is sampled are too coarse, so for each sample there is only one time with any support. There is also some buggy behaviour when reconstructing distance and time together; this may be because the time-marginalised likelihood does not incorporate the change in the detector response function between the beginning of the segment and the sample time.
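
The recipe for reconstructing any of the marginalised parameters is the same: for each posterior sample, evaluate the likelihood on a grid of the marginalised parameter with all other parameters held fixed, weight by the prior, and draw one value from the resulting conditional distribution. A minimal, generic sketch of that recipe (not the implementation in this branch; the function name and arguments are illustrative):

    import numpy as np
    from scipy.integrate import cumtrapz

    def draw_marginalised_parameter(grid, conditional_log_likelihood, prior_values, rng=np.random):
        """Draw one value of a marginalised parameter from its conditional posterior.

        grid                       -- parameter values (e.g. luminosity distance) the likelihood is evaluated on
        conditional_log_likelihood -- log L on the grid, with all other parameters fixed to one posterior sample
        prior_values               -- prior density evaluated on the grid
        """
        # unnormalised conditional posterior; subtract the maximum for numerical stability
        post = prior_values * np.exp(conditional_log_likelihood - np.max(conditional_log_likelihood))
        # cumulative distribution, normalised to end at one
        cdf = cumtrapz(post, grid, initial=0)
        cdf /= cdf[-1]
        # inverse-CDF sampling
        return np.interp(rng.uniform(), cdf, grid)

    # one draw per posterior sample gives the reconstructed marginal posterior, e.g.
    # distance_samples = [draw_marginalised_parameter(d_grid, log_l_of_distance(s), d_prior) for s in samples]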

Edited by Colm Talbot

Merge request reports


Activity

  • Colm Talbot added 1 commit


  • Here's skeleton code for the time_marginalization recovery. We should try to condense this into two functions: one that generates a single posterior value of time from the marginalized likelihood, and one that loops over all the samples to give the full posterior, just as in the distance marginalization case (a possible shape for this is sketched after the code below). Let me know if you need me to be clearer on any step; I could also write up all the formulas to make it more readable.

    # Read in result
    # (assumes outdir, label, prior, likelihood and duration are defined earlier in the script)
    import numpy as np
    import bilby
    from bilby.core import result
    from scipy import integrate
    from scipy.special import logsumexp

    Result = result.read_in_result(outdir=outdir, label=label)
    # total number of posterior samples
    num_samp = len(Result.posterior)
    # fill in fixed parameters and delete the log_likelihood column from the data frame
    All_param = bilby.gw.conversion.fill_from_fixed_priors(sample=Result.posterior, priors=prior)
    for key in ['log_likelihood']:
        del All_param[key]
    # draw a random coalescence time for each sample; these define the discretised time grid
    All_param['geocent_time'] = likelihood.priors['geocent_time'].sample(num_samp)
    ##################################################################################################################

    def get_log_like(sample, likelihood, log_noise_evid):  # takes ~3 ms
        """
        For a single set of posterior samples, k, calculate the time-marginalised
        log likelihood, using sample['geocent_time'] as the reference time (see C4).
        """
        signal_polarizations = likelihood.waveform_generator.frequency_domain_strain(sample)
        rho_opt_sq = 0
        rho_mf_sq_array = np.zeros(likelihood.interferometers.frequency_array[0:-1].shape, dtype=np.complex128)
        # no need to worry about rho_opt_sq as it does not change with a new reference time
        for ifo in likelihood.interferometers:
            signal = ifo.get_detector_response(signal_polarizations, sample)
            rho_opt_sq += ifo.optimal_snr_squared(signal=signal)
            rho_mf_sq_array += (4 / likelihood.waveform_generator.duration) * np.fft.fft(
                signal[0:-1] * ifo.frequency_domain_strain.conjugate()[0:-1]
                / ifo.power_spectral_density_array[0:-1])
        # log likelihood from C4
        log_like = log_noise_evid + logsumexp(rho_mf_sq_array.real, b=likelihood.time_prior_array) - (rho_opt_sq / 2)
        return log_like

    ##################################################################################################################
    # Since the prior on the coalescence time is uniform over the duration, prior_tc is easy to calculate.
    # Note that t_max may not equal the duration because we must remove edge effects, so:
    t_max = duration - 0.2
    t_min = 0.2
    prior_tc = 1 / (t_max - t_min)
    # sort the time samples; these are the discretised times the likelihood is evaluated at
    t_sorted = np.sort(All_param['geocent_time'])
    # array to store the recovered time samples in
    time_recov = np.zeros(num_samp)
    # the noise log evidence and the (hard-coded) log Bayes factor for this run;
    # both constants cancel when the posterior is normalised below
    log_noise_evid = likelihood.noise_log_likelihood()
    log_bayes_fact = 26.691
    ##################################################################################################################
    for i in range(num_samp):
        # log likelihood as a function of the discretised time samples
        log_l_array = np.zeros(num_samp)
        # for each posterior sample, theta_k, calculate the likelihood at every discretised time
        for j in range(num_samp):
            # dict holding posterior sample i for all parameters except time
            samp = dict(geocent_time=t_sorted[j],
                        ra=All_param['ra'][i],
                        dec=All_param['dec'][i],
                        psi=All_param['psi'][i],
                        rise_time=All_param['rise_time'][i],
                        ampl=All_param['ampl'][i])
            # fill up the log likelihood array for each value in t_sorted, see C29
            log_l_array[j] = get_log_like(samp, likelihood, log_noise_evid)

        # once you have the log likelihood array over t_sorted for posterior sample k,
        # you can calculate the conditional posterior, see C30
        post_k = prior_tc * np.exp(log_l_array - np.max(log_l_array) - log_bayes_fact)

        # integrate the posterior along the time samples to get the cumulative posterior distribution, see C31
        CPD_k = integrate.cumtrapz(post_k, t_sorted, initial=0)
        # normalise
        A = np.trapz(post_k, t_sorted)
        CPD_k = CPD_k / A
        # draw a uniform random number and invert the CDF, see C32
        u = np.random.uniform(0, 1)
        index = np.searchsorted(CPD_k, u)

        # the corresponding time sample is the recovered time sample for posterior sample k
        time_recov[i] = t_sorted[index]
    
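    A possible shape for the two-function version suggested above (a sketch only; the names are placeholders, and it reuses the imports, get_log_like, and the grid/normalisation variables from the skeleton code):

    def draw_single_time(sample, likelihood, t_grid, prior_tc, log_noise_evid, log_bayes_fact):
        """Draw one geocent_time value from the conditional time posterior of one sample (a dict)."""
        log_l = np.array([get_log_like(dict(sample, geocent_time=t), likelihood, log_noise_evid)
                          for t in t_grid])
        post = prior_tc * np.exp(log_l - np.max(log_l) - log_bayes_fact)
        cdf = integrate.cumtrapz(post, t_grid, initial=0)
        cdf /= cdf[-1]
        # inverse-CDF sampling of a single time
        return np.interp(np.random.uniform(), cdf, t_grid)

    def reconstruct_time_posterior(samples, likelihood, t_grid, prior_tc, log_noise_evid, log_bayes_fact):
        """Loop over all posterior samples (a list of dicts) and return the full recovered time posterior."""
        return np.array([draw_single_time(s, likelihood, t_grid, prior_tc, log_noise_evid, log_bayes_fact)
                         for s in samples])

    # usage (illustrative):
    # samples = All_param.drop(columns=['geocent_time']).to_dict(orient='records')
    # time_recov = reconstruct_time_posterior(samples, likelihood, t_sorted, prior_tc, log_noise_evid, log_bayes_fact)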
    
  • Gregory Ashton changed milestone to %0.4.0


  • Gregory Ashton removed milestone


  • mentioned in issue #272 (closed)

  • Colm Talbot changed milestone to %0.3.6


  • Gregory Ashton changed milestone to %0.3.7


  • Gregory Ashton changed milestone to %0.4.0


  • Gregory Ashton changed milestone to %Future


  • mentioned in issue #324 (closed)

  • Shanika Galaudage changed the description


  • Ethan Payne mentioned in merge request !377 (closed)


  • @colm.talbot do you want to kill this MR, in favour of the updated merge request I opened at !377 (closed)?

  • Colm Talbot added 237 commits


    • 52458138...937b6349 - 233 commits from branch master
    • 83560bca - Merge branch 'master' into reconstruct_marginalised_parameters
    • d7717e5e - Merge branch 'master' into reconstruct_marginalised_parameters
    • 5c577de7 - Merge branch 'master' into reconstruct_marginalised_parameters
    • ce8a510f - make time spacing finer in recovery


  • Colm Talbot unmarked as a Work In Progress


  • Colm Talbot (Author, Maintainer)

    This now works! The attached result is poorly sampled (particularly in phase), but the recovered posterior agrees well with the posterior that was sampled directly.

    The solution to the time posterior recovery was to increase the time resolution of the FFT output, which just involved a bunch of zero-padding (a rough sketch of the idea is below). It may break down for super loud signals.

    [attached image: recovered posterior compared with the directly sampled posterior]
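
    Roughly, the zero-padding works like this (a sketch, not the exact change in this branch; the function name, `integrand`, and `upsample` are illustrative): padding the frequency-domain integrand with zeros before the FFT makes the time grid on which the matched-filter sum is evaluated finer by the padding factor.

    import numpy as np

    # sketch: refine the time grid of the matched-filter sum by zero-padding before the FFT.
    # `integrand` stands for the per-frequency term d(f) * h*(f) / S(f) used in get_log_like above;
    # the frequency spacing is 1 / duration, so the FFT returns times spanning [0, duration).
    def matched_filter_vs_time(integrand, duration, upsample=8):
        n = len(integrand)
        # append zeros so the FFT is evaluated on an `upsample`-times finer time grid
        padded = np.concatenate([integrand, np.zeros((upsample - 1) * n, dtype=complex)])
        rho_mf = (4 / duration) * np.fft.fft(padded)
        dt = duration / (upsample * n)   # new, finer time spacing
        times = np.arange(upsample * n) * dt
        return times, rho_mf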

  • mentioned in issue #323 (closed)

  • mentioned in issue #331 (closed)
