lscsoft / bilby_pipe · Issue #219 · Closed
Created Jun 04, 2021 by Gregory Ashton (@gregory.ashton), Maintainer · 0 of 2 tasks completed

Add standardised and down-sampled result file

Currently, the output of bilby_pipe can be confusing to users. If n-parallel=1, they get a single result file, but if n-parallel>1 they get n-parallel result files plus a merged file (which is the "final" data product).

In addition, the result files can be very large (several hundred MB) when combining multiple runs.

To resolve both issues we could

  • Add a file {outdir}/output/{label}_final_result.json where {label} would be the full label (including gpstime etc)
  • Add an option max-samples=30000 which would thin the original sample set (preserving the order) until the number of samples is at most max-samples
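A minimal sketch of the thinning behaviour described in the second bullet, assuming the samples arrive as an ordered sequence (the function name and signature here are hypothetical, not part of bilby_pipe):

```python
import math

def thin_samples(samples, max_samples=30000):
    """Thin an ordered sequence of samples to at most max_samples
    entries by keeping every k-th sample, preserving the original
    order. Returns the input unchanged if it is already small enough."""
    n = len(samples)
    if n <= max_samples:
        return samples
    # Smallest integer stride that brings the count under the cap.
    step = math.ceil(n / max_samples)
    return samples[::step]
```

Taking every k-th sample (rather than a random subset) keeps the output deterministic and order-preserving, which matches the proposal's "preserving the order" requirement.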

Thoughts?
