# Software

## FastGPs
```
pip install fastgps
```

Gaussian process regression (GPR) models typically require $\mathcal{O}(n^2)$ storage and $\mathcal{O}(n^3)$ computations. [FastGPs](https://alegresor.github.io/fastgps) implements GPR requiring only $\mathcal{O}(n)$ storage and $\mathcal{O}(n \log n)$ computations by pairing certain quasi-random sampling locations with matching kernels to yield structured Gram matrices (a sketch of this structure follows the list below). We support
- GPU scaling,
- batched inference,
- robust hyperparameter optimization, and
- multi-task GPR.
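
The core trick can be sketched independently of the library. The following is a minimal NumPy illustration, not the FastGPs API: on a one-dimensional rank-1 lattice, a shift-invariant kernel makes the Gram matrix circulant, so the GPR linear solve reduces to FFTs in $\mathcal{O}(n \log n)$. The kernel and nugget below are illustrative assumptions.

```
import numpy as np

# Illustration of the structure FastGPs exploits (not its API): on the
# 1D lattice x_i = i/n, a shift-invariant periodic kernel yields a
# circulant Gram matrix, which the FFT diagonalizes.
n = 2**10
x = np.arange(n) / n                          # rank-1 lattice in one dimension
rng = np.random.default_rng(7)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

def kernel(t):
    # periodic kernel 1 + 2*pi^2*B2({t}), B2 the degree-2 Bernoulli polynomial
    t = t % 1.0
    return 1.0 + 2.0 * np.pi**2 * (t**2 - t + 1.0 / 6.0)

nugget = 1e-2                                 # assumed noise variance
c = kernel(x)                                 # first column of the Gram matrix K
c[0] += nugget                                # K + nugget*I is still circulant

# O(n log n) solve of (K + nugget*I) alpha = y via the FFT
alpha = np.fft.ifft(np.fft.fft(y) / np.fft.fft(c)).real

# agreement with the dense O(n^3) solve
K = kernel(x[:, None] - x[None, :]) + nugget * np.eye(n)
assert np.allclose(alpha, np.linalg.solve(K, y))
```

In higher dimensions the same idea is applied with product kernels over rank-1 lattices, or digital nets paired with digitally shift-invariant kernels; that is the pairing the paragraph above refers to.
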
## QMCPy

```
pip install qmcpy
```

[QMCPy](https://qmcsoftware.github.io/QMCSoftware/) is a Python package for Quasi-Monte Carlo (QMC); a short sampling sketch follows the list below. It contains
- quasi-random (low discrepancy) sequence generators and randomization routines, including
  - *lattices* with
    - extensible constructions
    - random shifts
  - *digital nets* (e.g. Sobol' points) with
    - extensible constructions
    - random digital shifts
    - linear matrix scrambling
- a suite of diverse use cases, and
- automatic variable transforms.
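
As a concrete starting point, here is a minimal sampling sketch; it assumes the current `qmcpy` interface for the `Lattice` and `DigitalNetB2` generators, so check the documentation for the version you install.

```
import qmcpy as qp  # pip install qmcpy

# randomly shifted rank-1 lattice points in [0, 1)^3
lattice = qp.Lattice(dimension=3, seed=7)
x_lattice = lattice.gen_samples(2**10)    # array of shape (1024, 3)

# randomized digital net (Sobol') points in [0, 1)^3
net = qp.DigitalNetB2(dimension=3, seed=7)
x_net = net.gen_samples(2**10)
```
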
## QMCGenerators.jl
```
] add QMCGenerators
```
[QMCGenerators.jl](https://alegresor.github.io/QMCGenerators.jl/) is a Julia package with routines to generate and randomize the quasi-random sequences used in Quasi-Monte Carlo. It supports the suite of low discrepancy sequence generators and randomization routines available in [QMCPy](https://qmcsoftware.github.io/QMCSoftware/); see the description above. This package is a translation and enhancement of Dirk Nuyens' [Magic Point Shop](https://people.cs.kuleuven.be/~dirk.nuyens/qmc-generators/).

\newentry{\normalfont{2017 - 2021}}{\textbf{B.S. in Applied Math, Minor in Computer Science.} IIT. Summa Cum Laude. GPA $3.94 / 4$.}
\subsection{Experience}
\newentry{\normalfont{Jan - Dec 2025}}{\textbf{DOE SCGSR Fellow in Applied Mathematics} at \textbf{Sandia National Laboratories} in Livermore, CA. I am researching Gaussian process based scientific ML models for machine precision PDE solutions. I am also developing fast, scalable multi-task Gaussian processes for multi-fidelity modeling. We are preparing publications and open-source software with HPC support, such as \texttt{FastGPs} below.}
\newentry{\normalfont{Summer 2024}}{\textbf{Scientific Machine Learning Researcher} at \textbf{FM (Factory Mutual Insurance Company).} I built scientific ML models, including Physics Informed Neural Networks (PINNs) and Deep Operator Networks (DeepONets), for solving Radiative Transport Equations (RTEs) used to speed up CFD fire dynamics simulations. Resulted in publication of \citetitle{sorokin.RTE_DeepONet}.}
\newentry{\normalfont{Summer 2023}}{\textbf{Graduate Intern} at \textbf{Los Alamos National Laboratory.} I modeled the solution processes of PDEs with random coefficients using efficient and error-aware Gaussian processes. Resulted in publication of \citetitle{sorokin.gp4darcy}.}
\newentry{\normalfont{Summer 2022}}{\textbf{Givens Associate Intern} at \textbf{Argonne National Laboratory}. I researched methods to efficiently estimate failure probability using Monte Carlo with non-parametric importance sampling. Resulted in publication of \citetitle{sorokin.adaptive_prob_failure_GP}.}
\newentry{\normalfont{Summer 2021}}{\textbf{ML Engineer Intern} at \textbf{SigOpt, an Intel Company}. I developed novel meta-learning techniques for model-aware hyperparameter tuning via Bayesian optimization. On a six-person ML engineering team, I contributed production code and learned key elements of the AWS stack. Resulted in publication of \citetitle{sorokin.sigopt_mulch}.}