@@ -14,8 +14,9 @@ Core deep learning and I/O functions and classes are designed to be problem gene
 That is, they can be used without any specific strict workflow and can handle near arbitrary
 inputs as suitable (parquet files, ROOT files ...).
 
- Many high energy physics applications such as the signal-from-background discrimination problem
- fit under certain "quasi-templated YAML-python-workflow" as manifested from the implemented applications.
+ Many high energy physics applications such as the signal-from-background discrimination or
+ reweighting (or morphing) problems fit under a certain "quasi-templated YAML-python workflow",
+ as is manifest in the implemented applications.
 
 
 YAML-configuration files
@@ -45,13 +46,14 @@ specific I/O functions.
 -icedqcd DQCD analysis application [large scale new physics analysis, domain adaptation]
 -icefit Core fitting and statistics [tag & probe ++]
 -icehgcal HGCAL detector application [graph neural networks]
- -icehnl HNL analysis application [neural mutual information with BDT and MLP]
+ -icehnl HNL analysis application [neural mutual information with ICEBOOST and MLP]
 -iceid Electron ID application
+ -icemc Simple MC tools
 -icenet Core deep learning & I/O functions
 -iceplot Core plotting tools
 -iceqsub SGE submission steering functions
 -icetrg HLT trigger application
- -icezee High-dimensional reweighting application [advanced MLP models and regularization]
+ -icezee High-dimensional reweighting application
 -tests Tests, continuous integration (CI) and bash-launch scripts
 -output HDF5, pickle outputs
 -dev Development code
@@ -65,7 +67,7 @@ AI-algorithms and models
 Various ML and AI-models are implemented and supported. From fixed-dimensional input models
 such as boosted decision trees (BDT) via XGBoost enhanced with a custom torch autograd driven loss function,
 aka ``ICEBOOST``, to more complex "Geometric Deep Learning" with graph neural networks using torch-geometric
- as a low-level backend.
+ as a low-level backend, and normalizing flows.
 
 The library is ultimately agnostic regarding the underlying models, i.e.
 new torch models or loss functions can be easily added, and other computational libraries such as JAX can be used.
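The ICEBOOST idea mentioned above, an XGBoost BDT whose loss is driven by torch autograd, can be sketched roughly as follows. This is a hedged minimal illustration, not the library's actual implementation; the function name and the choice of a binary logistic loss are assumptions:

```python
import torch

def autograd_grad_hess(preds, labels):
    """Gradient and (diagonal) Hessian of a binary logistic loss,
    computed with torch autograd -- the pair an XGBoost custom
    objective function is expected to return."""
    z = torch.as_tensor(preds, dtype=torch.float64).clone().requires_grad_(True)
    y = torch.as_tensor(labels, dtype=torch.float64)
    loss = torch.nn.functional.binary_cross_entropy_with_logits(
        z, y, reduction='sum')
    # First derivative w.r.t. the raw scores
    (grad,) = torch.autograd.grad(loss, z, create_graph=True)
    # The elementwise loss has a diagonal Hessian, so differentiating
    # the summed gradient once more recovers exactly that diagonal.
    (hess,) = torch.autograd.grad(grad.sum(), z)
    return grad.detach().numpy(), hess.detach().numpy()
```

Passed as the custom objective to ``xgboost.train``, such a function lets any differentiable torch loss steer the boosting iterations, which is the essence of combining XGBoost with a torch autograd driven loss.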
@@ -80,7 +82,7 @@ Readily available models such as
 2. Kolmogorov-Arnold representation theorem networks [pytorch]
 3. Lipschitz continuous MLPs [pytorch]
 4. Graph Neural Nets (graph-, node-, edge-level inference) [pytorch-geometric]
- 5. Deep Normalizing Flow (BNAF) based pdfs & likelihood ratios [pytorch]
+ 5. Deep Normalizing Flow (BNAF) and Spline Flow based pdfs & likelihood ratios [pytorch]
 6. Neural mutual information estimator (MINE) and non-linear distance correlation (DCORR) [pytorch]
 7. MaxOUT multilayer feedforward network [pytorch]
 8. Permutation Equivariant Networks (DeepSets) [pytorch]
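As an illustration of item 6, a MINE-style estimator maximizes the Donsker-Varadhan lower bound on mutual information with a small neural network. The sketch below is an assumption-laden toy, not the library's model; the class name, layer sizes and permutation-based marginal sampling are illustrative choices:

```python
import math
import torch
import torch.nn as nn

class MINESketch(nn.Module):
    """Donsker-Varadhan lower bound on I(X;Y):
    E_p(x,y)[T(x,y)] - log E_p(x)p(y)[exp(T(x,y))],
    with the statistic T parametrized by a small MLP."""
    def __init__(self, dim_x, dim_y, hidden=64):
        super().__init__()
        self.T = nn.Sequential(
            nn.Linear(dim_x + dim_y, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))

    def forward(self, x, y):
        # Statistic evaluated on genuine (x, y) pairs
        t_joint = self.T(torch.cat([x, y], dim=1)).squeeze(-1)
        # Shuffling y breaks the pairing, sampling the product of marginals
        y_perm = y[torch.randperm(y.size(0))]
        t_marg = self.T(torch.cat([x, y_perm], dim=1)).squeeze(-1)
        return t_joint.mean() - (
            torch.logsumexp(t_marg, 0) - math.log(len(t_marg)))
```

Maximizing this scalar with any torch optimizer trains T, and the bound tightens toward the true mutual information of the sampled pair.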