diff --git a/index.html b/index.html
index fde6a554..e8b7fe5c 100644
--- a/index.html
+++ b/index.html
@@ -467,10 +467,12 @@

 FLAF

 Task workflow management is done via LAW (Luigi Analysis Framework).

 How to install

-1. Setup ssh keys:
-2. On GitHub settings/keys
-3. On CERN GitLab profile/keys
-4. Clone the repository:
+1. Setup ssh keys:
+   - On GitHub settings/keys
+   - On CERN GitLab profile/keys
+2. Clone the repository:
diff --git a/search/search_index.json b/search/search_index.json
index 63d8fa3e..1743fa1d 100644
--- a/search/search_index.json
+++ b/search/search_index.json
@@ -1 +1 @@
+{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"FLAF","text":"

    FLAF - Flexible LAW-based Analysis Framework. Task workflow management is done via LAW (Luigi Analysis Framework).

    "},{"location":"#how-to-install","title":"How to install","text":"
    1. Setup ssh keys (a key-generation sketch follows this list):

      • On GitHub settings/keys
      • On CERN GitLab profile/keys
    2. Clone the repository:

      git clone --recursive git@github.com:cms-flaf/Framework.git FLAF\n
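
      If you do not already have a key pair for step 1, here is a minimal sketch, assuming an ed25519 key in the default location (the e-mail address is a placeholder):

      ssh-keygen -t ed25519 -C "your.name@cern.ch" # generate a new key pair\n
      cat ~/.ssh/id_ed25519.pub # print the public key; paste it into GitHub settings/keys and CERN GitLab profile/keys\n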

    "},{"location":"#how-to-load-environment","title":"How to load environment","text":"

    The following command activates the framework environment:

    source env.sh\n
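
    For example, a minimal sketch assuming the repository was cloned into a directory named FLAF (as in the clone command above) and that env.sh sits at its top level:

    cd FLAF\n
    source env.sh\n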

    "},{"location":"#how-to-run-limits","title":"How to run limits","text":"
    1. As a temporary workaround, if you want to run multiple commands and avoid the delay of loading the environment each time, run:

      cmbEnv /bin/zsh # or /bin/bash\n
      Alternatively, add cmbEnv in front of each command, e.g.
      cmbEnv python3 -c 'print(\"hello\")'\n

    2. Create datacards (a filled-in example follows this list).

      python3 StatInference/dc_make/create_datacards.py --input PATH_TO_SHAPES  --output PATH_TO_CARDS --config PATH_TO_CONFIG\n
      Available configurations:

      • For X->HH->bbtautau Run 2: StatInference/config/x_hh_bbtautau_run2.yaml
      • For X->HH->bbWW Run 3: StatInference/config/x_hh_bbww_run3.yaml
    3. Run limits.

      law run PlotResonantLimits --version dev --datacards 'PATH_TO_CARDS/*.txt' --xsec fb --y-log\n
      Hints:

      • use --workflow htcondor to submit on HTCondor (by default it runs locally)
      • add --remove-output 4,a,y to remove previous output files
      • add --print-status 0 to get the status of the workflow (where 0 is the depth). Useful for getting the output file name.
      • for more details, see the cms-hh inference documentation
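
      For step 2 above, a filled-in datacard command for the X->HH->bbtautau Run 2 case, keeping PATH_TO_SHAPES and PATH_TO_CARDS as placeholders, would look like:

      python3 StatInference/dc_make/create_datacards.py --input PATH_TO_SHAPES --output PATH_TO_CARDS --config StatInference/config/x_hh_bbtautau_run2.yaml\n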
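
      Similarly, a sketch of step 3 that combines the base command with the hints above:

      law run PlotResonantLimits --version dev --datacards 'PATH_TO_CARDS/*.txt' --xsec fb --y-log --workflow htcondor # submit to HTCondor instead of running locally\n
      law run PlotResonantLimits --version dev --datacards 'PATH_TO_CARDS/*.txt' --xsec fb --y-log --print-status 0 # print workflow status at depth 0 to find the output file name\n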
    "},{"location":"#how-to-run-nanoaod-nanoaod-skims-production","title":"How to run nanoAOD->nanoAOD skims production","text":"
    law run CreateNanoSkims --version prod_v1 --periods 2016,2016APV,2017,2018 --ignore-missing-samples True\n
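
    A sketch restricting production to a single period, assuming --periods accepts any subset of the list above:

    law run CreateNanoSkims --version prod_v1 --periods 2018 --ignore-missing-samples True\n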
    "},{"location":"#how-to-run-hhbtag-training-skim-ntuple-production","title":"How to run HHbtag training skim ntuple production","text":"
    python Studies/HHBTag/CreateTrainingSkim.py --inFile $CENTRAL_STORAGE/prod_v1/nanoAOD/2018/GluGluToBulkGravitonToHHTo2B2Tau_M-350.root --outFile output/skim.root --mass 350 --sample GluGluToBulkGraviton --year 2018 >& EventInfo.txt\npython Common/SaveHisto.txt --inFile $CENTRAL_STORAGE/prod_v1/nanoAOD/2018/GluGluToBulkGravitonToHHTo2B2Tau_M-350.root --outFile output/skim.root\n
    "},{"location":"#how-to-run-histogram-production","title":"How to run Histogram production","text":"

    Please see the file all_commands.txt (to be updated).

    "}]} \ No newline at end of file