Add in "level A" jobs
This changes the way multiple jobs are handled. Previously, they simply used the Condor `Queue` command. Instead, this now creates a list of "level A" jobs (I'm not 100% sure about using that term); each level-A job has one data-generation job and at least one data-analysis job, which are then chained parent-child.
The result is that if one level-A job fails, it only takes down that chain, not the whole DAG.
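For concreteness, here is a minimal sketch of the intended layout, assuming the pycondor API (`Dagman`, `Job`, `add_child`); the executable paths, file names, and job count are illustrative placeholders, not bilby_pipe's actual build code.

```python
# A minimal sketch, assuming pycondor's Dagman/Job interface;
# paths, names, and the number of level-A jobs are placeholders.
from pycondor import Dagman, Job

dagman = Dagman(name="bilby_pipe_dag", submit="submit")

for idx in range(3):  # e.g. one level-A job per injection
    # One data-generation job per level-A job ...
    generation = Job(
        name=f"generation_{idx}",
        executable="/path/to/bilby_pipe_generation",
        submit="submit",
        dag=dagman,
    )
    # ... with at least one data-analysis job chained as its child.
    analysis = Job(
        name=f"analysis_{idx}",
        executable="/path/to/bilby_pipe_analysis",
        submit="submit",
        dag=dagman,
    )
    generation.add_child(analysis)

dagman.build()
```

Because each generation-analysis chain is its own branch of the DAG, DAGMan only stops the descendants of a failed node; the other level-A jobs continue unaffected.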
In the course of this change, I also made several other fixes:
- `bilby_pipe_generation` now takes an argument `--injection-file` (previously it tried to infer it, which was buggy and wrong)
- `generation` and `analysis` now take an argument `--idx`, which passes around which "job" they are in the level-A job list (see the sketch after this list)
- `create-output` is made optional (running a large number of jobs can cause race conditions as they all try to read the rcparams file)
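On the job side, the new arguments could be consumed with a standard argparse setup along these lines; this is a hedged sketch, only the `--injection-file` and `--idx` names come from this change, and the real bilby_pipe parsers have many more options.

```python
# A minimal argparse sketch of the two arguments added here;
# the surrounding parser setup is an assumption, not bilby_pipe's code.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--injection-file", type=str,
    help="Injection file, now passed explicitly instead of being inferred",
)
parser.add_argument(
    "--idx", type=int,
    help="Index of this job within the level-A job list",
)
args = parser.parse_args()
print(args.injection_file, args.idx)
```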