DisplacedHiggs/LLDJstandalones


LLDJstandalones

ntuple-based analysis package for long-lived displaced jet analyses

Download

# Fermilab machines default to tcsh even though bash is available!
# This framework is written in bash; if your login shell is already bash
# you may not need this, but tcsh users be warned
bash --login

# Set up the area
export SCRAM_ARCH=slc6_amd64_gcc530;
scram pro -n LLDJ_slc6_530_CMSSW_8_0_26_patch1 CMSSW CMSSW_8_0_26_patch1;
cd LLDJ_slc6_530_CMSSW_8_0_26_patch1/src;
cmsenv;


## CMSSW imports and customizations
git cms-merge-topic ikrav:egm_id_80X_v3_photons

scramv1 build -j 10;

## LLDJstandalones Framework checkout

# first fork the repository to make your own workspace
git clone https://github.com/<mygithubusername>/LLDJstandalones.git;
pushd LLDJstandalones;

 # If you want to check out a specific branch
 # git fetch origin
 # git branch -v -a # list branches available, find yours
 # git checkout -b NAMEOFBRANCH origin/NAMEOFBRANCH 

 # add DisplacedHiggs as upstream
 git remote add upstream https://github.com/DisplacedHiggs/LLDJstandalones.git
# return to ${CMSSW_BASE}/src
popd;

# compile a clean area
scramv1 build -j 10;

## Every time you log in
# set up some environment variables (bash)
source LLDJstandalones/setup.sh

How to use

set up

Make sure to run source setup.sh from the LLDJstandalones directory first to set up the environment variables used in the scripts. In particular, this sets $nversion and $aversion.
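Conceptually, setup.sh just exports the version tags that the downstream scripts read. A minimal sketch of the idea, with purely illustrative values (the real definitions live in setup.sh):

```shell
# Sketch of a setup.sh-style script: export the ntuple and analyzer
# version tags read by later scripts. Values here are examples only.
export nversion="ntuples_v1"   # ntuple production version
export aversion="analysis_v1"  # analyzer output version
echo "nversion=${nversion} aversion=${aversion}"
```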

ntuplizer

cd ${CMSSW_BASE}/src/LLDJstandalones/ntuples/config

To run local jobs, do cmsRun run_mc_80XAOD.py and cmsRun run_data_80XAOD.py. Then submit CRAB jobs using bash submitcrab.sh, which uses crab_template.py. CRAB directories are in ..config/gitignore/$nversion. Finished jobs appear at FNAL in /store/group/lpchbb/LLDJntuples/$nversion
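The CRAB submission step can be pictured as a loop that stamps each sample name into the template and writes a per-sample config. This is a hedged sketch only: the placeholder token, sample names, and stand-in template below are illustrative, not the repository's actual submitcrab.sh or crab_template.py.

```shell
# Stand-in for crab_template.py with a placeholder token to replace
printf 'config.General.requestName = "SAMPLENAME"\n' > crab_template.py

for sample in ZH_MS40 DYJetsToLL; do
  # Stamp the sample name into a per-sample CRAB config
  sed "s|SAMPLENAME|${sample}|g" crab_template.py > "crab_${sample}.py"
  # A real script would now run: crab submit -c crab_${sample}.py
  echo "prepared crab_${sample}.py"
done
```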

analyzer

analyzer prep

To run the analyzer, we first need to run a few other tools from ..LLDJntuples/commontools

  • Lists
  1. bash makemasterlist.sh makes master lists of ntuples from which other lists are derived
  2. bash makelists.sh makes lists of files and puts them in lists folder, split by sample
  3. bash countevents.sh calls countevents.cxx and makes .info files in lists folder
  4. bash findTTavgweights.sh runs over the TTbar samples and calculates the average TTbar weight
  • Scale factors
  1. EGamma SFs in commontools/elesf - scale factors are provided by the POG as a TH2F histogram, from https://twiki.cern.ch/twiki/bin/view/CMS/EgammaIDRecipesRun2
  2. Pileup reweighting in commontools/pileup, see README
  3. bash collectjsons.sh - get JSONs from finished CRAB jobs, compare to golden JSON
  4. bash makeinputhistos.sh - make input SF histograms using database (on lxplus)
  5. bash runMakePUweights.sh - make weight histograms
  6. Make sure SF files are copied into LLDJntuples/analyzer directory
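The list-making steps above can be sketched as follows. Everything here is illustrative: the master-list contents, sample keys, and the file-count stand-in for countevents.sh are assumptions, not the repository's actual makelists.sh logic (only the /store/group/lpchbb/LLDJntuples prefix comes from this README).

```shell
mkdir -p lists
# Stand-in master list of ntuple paths (normally made by makemasterlist.sh)
cat > lists/masterlist.txt <<'EOF'
/store/group/lpchbb/LLDJntuples/v1/ZH_MS40/ntuple_1.root
/store/group/lpchbb/LLDJntuples/v1/ZH_MS40/ntuple_2.root
/store/group/lpchbb/LLDJntuples/v1/DYJetsToLL/ntuple_1.root
EOF

# makelists.sh-style split: one file list per sample, keyed on directory name
for sample in ZH_MS40 DYJetsToLL; do
  grep "/${sample}/" lists/masterlist.txt > "lists/${sample}.txt"
  # A countevents.sh-style step would write a .info file with event counts;
  # here we record the number of files as a stand-in
  wc -l < "lists/${sample}.txt" | tr -d ' ' > "lists/${sample}.info"
done
```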

run analyzer

Run local jobs from LLDJntuples/analyzers folder

  1. make compiles main.C into the executable runanalyzer.exe
  2. ./runanalyzer.exe --<flags> calls the analyzer executable by hand (you must specify flags)
  3. or edit and run bash runAnalyzers.sh which loops through different options for calling runanalyzer.exe
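A runAnalyzers.sh-style wrapper is essentially a loop over samples that calls the executable with explicit flags. The sketch below is hypothetical: the flag names, sample list, and the stub executable are assumptions (the real flags are defined in main.C), used here only so the loop is runnable.

```shell
# Stub standing in for runanalyzer.exe (the real one is built by `make`)
cat > runanalyzer_stub.sh <<'EOF'
#!/bin/sh
echo "analyzing with flags: $@"
EOF
chmod +x runanalyzer_stub.sh

for sample in ZH_MS40 DYJetsToLL; do
  # Illustrative flag names only -- check main.C for the real options
  ./runanalyzer_stub.sh --sample "${sample}" --isMC true
done
```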

submit condor job

From submitters folder

  1. in submitjobs.sh, set doSubmit=false to be safe while testing
  2. bash submitjobs.sh creates the submit area in gitignore. The job that actually runs on the condor nodes is runsubmitter.sh
  3. voms-proxy-init --voms cms --valid 100:00 sets up your proxy
  4. set doSubmit=true and run bash submitjobs.sh - then optionally add info about the submit to the autogenerated txt file
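The doSubmit guard described above is a common pattern: build the submit area either way, but only submit when explicitly enabled. A minimal sketch of the idea (not the actual submitjobs.sh; the directory and JDL contents are illustrative):

```shell
doSubmit=false   # flip to true once a dry run looks right

# Build the submit area either way (stand-in for the real setup work)
mkdir -p gitignore/submit_test
printf 'executable = runsubmitter.sh\nqueue\n' > gitignore/submit_test/job.jdl

if [ "${doSubmit}" = "true" ]; then
  # A real script would run: condor_submit gitignore/submit_test/job.jdl
  echo "submitting job"
else
  echo "dry run only -- set doSubmit=true to submit"
fi
```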

While jobs are running / finished

  1. bash checkjobs.sh checks whether the condor jobs are done
  2. bash haddjobs.sh merges the analyzed output locally into LLDJstandalones/roots/$aversion
  3. bash cpeos.sh copies the hadded analyzer jobs to EOS
  4. delete the files that were copied over (there is no script for this)
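A checkjobs.sh-style completeness check can be pictured as comparing the expected outputs against the files actually present. This sketch is an assumption about the idea, not the repository's script; plain files stand in for condor job outputs.

```shell
mkdir -p outputs
touch outputs/job_1.root outputs/job_2.root   # pretend two of three jobs finished

expected=3
found=$(ls outputs/job_*.root 2>/dev/null | wc -l | tr -d ' ')
if [ "${found}" -eq "${expected}" ]; then
  echo "all ${expected} jobs done -- ready to hadd"
else
  echo "${found}/${expected} jobs done -- still waiting"
fi
```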

make plots

The first thing to do is merge the histograms from the analyzer; do this with bash runPlotterStackedRegion, then take it from there.

tagging variable uncertainty calculation

  1. run bash runPlotterStackedRegion over the unshifted analyzer output
  2. run bash runPlotterTagvarUnc.sh to get plots starting with tvu and, based on the integral, values for the shifted cuts
  3. put the new shifted cut values in analyzers/analyzer_config.C (variables like tag_shiftXXXX) and rerun the analyzer
