
Summary pages

Introduction

The summary pages are a website showing plots of useful channels, updated every ~30 minutes. They can be found here:

https://nodus.ligo.caltech.edu:30889/detcharsummary/

Different kinds of plots can be produced: time series, spectra, spectrograms, Rayleigh gaussianity statistics, etc. All of this is generated using the GWsumm software (https://github.com/gwpy/gwsumm) developed for the big detectors (https://ldas-jobs.ligo.caltech.edu/~detchar/summary/).

Configuration

The content of the pages is controlled by configuration files found in:

nodus:/cvs/cds/caltech/chans/GWsummaries/

These are synced to the LDAS computer cluster, where the data are processed. The filenames must begin with the characters c1 and have the .ini extension. A special case is the defaults.ini file, which contains HTML and other general information. Although this file is always loaded, its settings can be overridden by custom files (e.g. if the same property is defined in both defaults.ini and c1-lsc.ini, the latter takes precedence).
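The precedence rule can be sketched with Python's standard configparser, which reads INI files of this kind (the section and option names below are invented for illustration, not the actual gwsumm schema):

```python
from configparser import ConfigParser

# defaults.ini is always loaded first; any file loaded after it overrides
# options of the same name. Section/option names here are hypothetical.
DEFAULTS_INI = """
[html]
title = 40m defaults
refresh = 30
"""

CUSTOM_INI = """
[html]
title = LSC summary
"""

config = ConfigParser()
config.read_string(DEFAULTS_INI)
config.read_string(CUSTOM_INI)  # later file wins, like c1-lsc.ini over defaults.ini

print(config["html"]["title"])    # overridden value: LSC summary
print(config["html"]["refresh"])  # inherited from the defaults: 30
```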

The remote LDAS folder mirrors the local one on nodus, where the files are version controlled (there is no longer version control on the LDAS side, as it was redundant).

For information on the INI format itself, see: https://ldas-jobs.ligo.caltech.edu/~duncan.macleod/gwsumm/latest/ or http://www.ligo.caltech.edu/~misi/iniguide.pdf

Technical info

Workflow

The central part of the process is a set of cron-like gwsumm jobs executed on the cluster every 30 minutes. This is the chain of events:

1. A cron-type Condor (http://research.cs.wisc.edu/htcondor/manual/) job wakes up and executes the gw_daily_summary bash script, which:

  • Sets up the Python environment;
  • Rsyncs nodus and LDAS directories containing config files;

  • Lists the config files present;
  • Executes a gw_summary_pipe job with proper options and waits for it to finish.

2. Files are processed in parallel in the cluster:

  • gw_summary_pipe spawns multiple gw_summary jobs (one per config file);

  • Each gw_summary job corresponds to a node in the Condor DAG;

  • Condor jobs are processed in the local universe, at most two at a time (the default).

3. Output is synced back to nodus:

  • Handled by a regular cron tab running every 15 minutes;
  • Only HTML from local time "yesterday", "today" and "tomorrow" (if it exists) is rsynced.
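For reference, a cron-type HTCondor job like the one in step 1 could be described by a submit file along these lines (a hypothetical sketch, not the actual 40m submit file):

```
# Hypothetical sketch of a cron-style HTCondor submit file
universe       = local
executable     = gw_daily_summary
cron_minute    = 0,30          # wake up on the hour and half-hour
cron_window    = 300           # allow up to 5 min of scheduling slack
on_exit_remove = false         # stay queued so the job re-runs on schedule
queue
```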

Note that this whole process depends on the 40m frames being available in the cluster; the processes responsible for that are handled by Dan Kozak. Also, a similar process takes place once a day to re-process the plots from the previous UTC day (gw_daily_summary_catchup script).

Software

Several independent pieces come into play:

  1. GWpy (https://gwpy.github.io/): python module to handle LIGO data developed by Duncan Macleod;

  2. GWsumm (https://github.com/gwpy/gwsumm and https://ldas-jobs.ligo.caltech.edu/~duncan.macleod/gwsumm/latest/index.html): python module and associated executables (gw_summary and gw_summary_pipe) that produce detector summary pages, based on GWpy;

  3. Configuration files: stored locally in nodus and VCS version controlled;

  4. 40m-specific scripts: git repository containing bash scripts and condor submit files that make the necessary preparations to run GWsumm jobs for the 40m, and also take care of syncing HTML, plots and config files between nodus and LDAS;

  5. Data transfer: cron jobs that sync frames from nodus to LDAS, managed by Dan Kozak.

Notes

  • On the cluster side, everything (except the frame transfer) is executed from the 40m shared LDAS account.

  • To add features to gwsumm: fork the repository, install as user and modify bash script to look at the local python path (rather than detchar's).
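A hedged sketch of what that path change might look like in the bash script (all paths below are hypothetical examples, not the actual install locations):

```shell
# Prepend a user-local gwsumm checkout to the search paths so the summary
# scripts pick it up instead of detchar's shared install.
# All paths below are hypothetical examples.
export PYTHONPATH="${HOME}/opt/gwsumm/lib:${PYTHONPATH}"
export PATH="${HOME}/opt/gwsumm/bin:${PATH}"
```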

Running your own

If you would like to test a configuration file, you can run the code manually.

LDAS cluster

From any CIT headnode, you can use the default detchar installation; just source the following script:

source /home/detchar/opt/gwpysoft/etc/gwpy-user-env.sh

You will also need to obtain Kerberos credentials before proceeding:

kinit albert.einstein

Then, to use a given configuration file to create HTML and plots for today's data, cd into the desired destination directory and run:

gw_summary day --ifo c1 --config-file path/to/configfile

You can add as many --config-file options as desired. In general, you will want to point to the file containing the default options:

gw_summary day --ifo c1 --config-file path/to/defaults.ini --config-file path/to/configfile

If you want to use data from a day other than today, just add the date in YYYYMMDD format after day, e.g.:

gw_summary day 20150721 --ifo c1 --config-file path/to/configfile

For testing purposes, it is usually quicker to process a shorter amount of time, say 10 minutes. You can do this by providing specific GPS start and end times:

gw_summary gps 1121558360 1121558960 --ifo c1 --config-file path/to/configfile
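GPS times count seconds since 1980-01-06 00:00:00 UTC and run ahead of UTC by the accumulated leap seconds (18 since the end of 2016). A minimal sketch for converting a UTC date to a GPS time, valid only for dates after the most recent leap second (for anything serious, use a proper library such as astropy or lalsuite):

```python
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)
LEAP_SECONDS = 18  # GPS-UTC offset since 2017; update if a new leap second lands

def utc_to_gps(dt):
    """Convert an aware UTC datetime to an integer GPS time."""
    return int((dt - GPS_EPOCH).total_seconds()) + LEAP_SECONDS

# e.g. midnight UTC on 2017-01-01
print(utc_to_gps(datetime(2017, 1, 1, tzinfo=timezone.utc)))  # 1167264018
```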

Finally, use the --verbose option to see a more detailed output.

DailySummaryHelp (last edited 2022-06-23 20:03:35 by TegaedoATligoDOTorg)