Start by creating and activating a conda environment from the provided environment.yml configuration file:
```bash
conda env create -f environment.yml
conda activate NSBI
```
Now, install the custom `nsbi-common-utils` library into the NSBI conda environment:
```bash
python -m pip install .
```
Next, create a Jupyter kernel for the environment so that JupyterLab instances can find it:
```bash
conda install ipykernel
python -m ipykernel install --user --name NSBI --display-name "NSBI"
```
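Optionally, confirm that the new kernel is registered:

```bash
# The "NSBI" kernel should appear in this list
jupyter kernelspec list
```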
The notebooks presented here serve as a scaled-down, simplified tutorial for building the workflow of the NSBI analysis recently published by ATLAS:
- An implementation of neural simulation-based inference for parameter estimation in ATLAS (https://arxiv.org/pdf/2412.01600)
- Measurement of off-shell Higgs boson production in the H* → ZZ → 4ℓ decay channel using a neural simulation-based inference technique in 13 TeV pp collisions with the ATLAS detector (https://arxiv.org/pdf/2412.01548)
The aim is to demonstrate the workflow associated with a full-scale LHC analysis; the physics results presented in the notebooks serve only as examples of the workflow. The code in this tutorial is partially derived from the original ATLAS analysis code written by Jay Sandesara [git], R. D. Schaffer [git], and Arnaud Maury [git].
Note: The tutorial is a scaled-down version of the original workflow.
The workflow currently uses the ttbar open data released by CMS and the Higgs to tau tau dataset from the FAIR Universe challenge. More open datasets will be added in the future.
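For orientation, the central idea behind NSBI as used in the papers above (the standard likelihood-ratio trick from the simulation-based-inference literature; the notebooks' exact formulation may differ in detail) is that a classifier trained to separate events simulated under two hypotheses approximates the per-event likelihood ratio:

$$
\hat{r}(x) \;=\; \frac{p(x \mid H_1)}{p(x \mid H_0)} \;\approx\; \frac{s(x)}{1 - s(x)},
$$

where $s(x)$ is the output of a classifier trained on balanced samples labeled $1$ for $H_1$ and $0$ for $H_0$.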
To use the `nsbi_common_utils` library developed here outside of this tutorial, run:
```bash
python -m pip install --upgrade 'nsbi-common-utils @ git+https://github.com/iris-hep/NSBI-workflow-tutorial.git'
```
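As a quick sanity check that the installation succeeded (the import name `nsbi_common_utils` is taken from the text above; the library's API itself is not covered here):

```bash
# Confirm the package metadata is visible to pip
python -m pip show nsbi-common-utils

# Confirm the module imports cleanly in the active environment
python -c "import nsbi_common_utils"
```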
Workflow blueprint (tentative):
This work is being supported by the U.S. National Science Foundation (NSF) cooperative agreements OAC-1836650 and PHY-2323298 (IRIS-HEP).
`nsbi-common-utils` is distributed under the terms of the MIT license.