Introduction to PyCBC Singularity Images

wrapperband edited this page May 13, 2019 · 3 revisions

Those not familiar with container technology should first read the overview of containerization.

Building Singularity PyCBC Images

PyCBC provides a base EL7 Docker image that contains a PyCBC development environment and the files needed to set up the Singularity image. Currently, this image is built and published manually to the pycbc-base-el7 repository on DockerHub.

The PyCBC Docker image inherits from this base DockerHub image; the pycbc-el7 Docker container is built by Travis and then pushed to DockerHub. Images are tagged with either a release version, or latest for the current master.
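To experiment with the Docker image directly (outside of CVMFS), it can be pulled from DockerHub. This is a sketch: the repository name follows the text above, and the tag shown is only an example.

```shell
# Pull the PyCBC EL7 image from DockerHub. The tag is illustrative:
# use a release version such as v1.13.0, or latest for current master.
docker pull pycbc/pycbc-el7:latest

# Start an interactive shell in the development environment.
docker run -it pycbc/pycbc-el7:latest /bin/bash
```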

The script cvmfs-singularity-sync is run by the OSG team and automatically turns new PyCBC Docker images into Singularity images that are published in /cvmfs/singularity.opensciencegrid.org/pycbc/.

For example:

[dbrown@sugwg-osg ~]$ ls /cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7\:v1.13.0/
bin  cvmfs  dev  etc  home  lib  lib64  media  mnt  opt  proc  pycbc  root  run  sbin  singularity  srv  sys  tmp  usr  var

These images can be run interactively with the command:

[dbrown@sugwg-osg ~]$ singularity shell --home  /home/dbrown:/srv --pwd /srv --bind /cvmfs --contain --ipc --pid  /cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7\:v1.13.0/
Singularity: Invoking an interactive shell within container...

PyCBC Singularity 93f31c70608b9e35668b70f5798a532b7280e11377371c960e1446d338c877:~> pycbc_inspiral --version | head
--- PyCBC Version --------------------------
Version: 1.13.0
Branch: None
Tag: v1.13.0
Id: 8be203575c720bfc7f1b7a936c90251dede05679
Builder: Unknown User <>
Build date: 2018-11-07 13:15:25 +0000
Repository status is CLEAN: All modifications committed

Imported from: /opt/pycbc/pycbc-software/lib/python2.7/site-packages/pycbc/__init__.pyc

Using Singularity in a Workflow

There are two ways to use Singularity in a workflow:

  1. Configure Pegasus to manage the containers. In this case, you provide the relevant sections in an ini file so that PyCBC passes the right arguments to Pegasus. Pegasus writes a job wrapper that contains two scripts: the first stages in the container and launches it; the second runs inside the container, performing data transfers and executing the PyCBC payload.
  2. Configure OSG glide-ins to manage the containers. In this case, the use of Singularity is transparent to Pegasus. You put a ClassAd in the jobs that tells the OSG glide-in to run the user's jobs inside the specified container, and your jobs then run inside the container environment.

Managing Containers Using Pegasus

This requires Pegasus 4.9.1 and is how I run on OrangeGrid, which runs Ubuntu. To configure this mode, pycbc_make_coinc_search_workflow needs the --config-overrides

    pegasus_profile-inspiral:pycbc|site:orangegrid \
    pegasus_profile-inspiral:hints|execution.site:orangegrid \

that tell Pegasus to run pycbc_inspiral on the OrangeGrid site, and the --config-overrides

    pegasus_profile-inspiral:container|image:file://localhost/cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7:${PYCBC_TAG} \
    pegasus_profile-inspiral:container|image_site:orangegrid \
    pegasus_profile-inspiral:container|mount:/cvmfs:/cvmfs:ro \
    executables:inspiral:/opt/pycbc/pycbc-software/bin/pycbc_inspiral \

that tell Pegasus to run the job in a container on OrangeGrid, mount /cvmfs inside the container, and use the pycbc_inspiral executable at the given path inside the container.
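Putting these overrides together, a full invocation might be sketched as follows. Only the container-related overrides come from this page; the workflow name, config file, and PYCBC_TAG value are placeholders. Note the quoting: the "|" in each override must be protected from the shell.

```shell
# Illustrative sketch only; --workflow-name and --config-files values
# are hypothetical placeholders, not taken from this page.
PYCBC_TAG=v1.13.0

pycbc_make_coinc_search_workflow \
  --workflow-name my_search \
  --config-files analysis.ini \
  --config-overrides \
    'pegasus_profile-inspiral:pycbc|site:orangegrid' \
    'pegasus_profile-inspiral:hints|execution.site:orangegrid' \
    "pegasus_profile-inspiral:container|image:file://localhost/cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7:${PYCBC_TAG}" \
    'pegasus_profile-inspiral:container|image_site:orangegrid' \
    'pegasus_profile-inspiral:container|mount:/cvmfs:/cvmfs:ro' \
    'executables:inspiral:/opt/pycbc/pycbc-software/bin/pycbc_inspiral'
```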

Managing Containers using OSG

OSG has a convenient way of automatically starting jobs inside Singularity containers so that Pegasus does not need to manage this directly. The Pegasus job wrapper script is run inside a pre-launched Singularity image that is specified by Condor ClassAds in the job. This is how I run jobs on OSGConnect. There are ClassAds in this site template that tell Condor to request an OSG node with Singularity installed. The image version is then specified with an --append-site-profile argument to pycbc_submit_dax:

      --append-site-profile "osgconnect:condor|+SingularityImage:\"/cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7:${PYCBC_TAG}\"" \
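In full, the submit call might be sketched as below; the dax filename and the PYCBC_TAG value are placeholders, not taken from this page.

```shell
# Illustrative sketch: submit the workflow with the Singularity image
# pinned via a Condor ClassAd on the osgconnect site.
PYCBC_TAG=v1.13.0

pycbc_submit_dax \
  --dax my_search.dax \
  --append-site-profile "osgconnect:condor|+SingularityImage:\"/cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7:${PYCBC_TAG}\""
```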

Putting it together

Here is an example script that can generate a workflow for all three environments: Pegasus-managed Singularity, OSG-managed Singularity, and a regular LDG-style workflow.
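The example script itself is not reproduced on this page. As a rough, hypothetical sketch of how one script might target all three environments by switching the container-related overrides (the variable names and case branches are assumptions, not taken from this page):

```shell
#!/bin/bash
# Hypothetical sketch only: select per-environment overrides before
# calling pycbc_make_coinc_search_workflow. Names are illustrative.
ENVIRONMENT=${1:-ldg}
PYCBC_TAG=v1.13.0

case "$ENVIRONMENT" in
  orangegrid)  # Pegasus-managed Singularity
    OVERRIDES=(
      'pegasus_profile-inspiral:pycbc|site:orangegrid'
      "pegasus_profile-inspiral:container|image:file://localhost/cvmfs/singularity.opensciencegrid.org/pycbc/pycbc-el7:${PYCBC_TAG}"
    )
    ;;
  osgconnect)  # OSG-managed Singularity: image is set at submit time
    OVERRIDES=()
    ;;
  ldg)         # regular LDG-style workflow, no container
    OVERRIDES=()
    ;;
esac

echo "Building workflow for ${ENVIRONMENT} with ${#OVERRIDES[@]} container override(s)"
```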
