Commit ea6b83f

FM poster + workshop paper

1 parent a328112

4 files changed: +8 -4 lines


index.md

Lines changed: 6 additions & 2 deletions
@@ -43,11 +43,15 @@ pip install qmcpy
 
 ## Fast Gaussian Process Regression for Smooth Functions
 
-2024 Illinois Institute of Technology Menger Day
+[2024 NeurIPS Workshop on Data-driven and Differentiable Simulations, Surrogates, and Solvers](https://neurips.cc/virtual/2024/workshop/84720)
 
+<embed src="./posters/2024_RTEDeepONet_NeurIPSD3S3.pdf" type="application/pdf" width="1000" height="1500"/>
 
-<embed src="./posters/2024_FastGP_MengerIIT.pdf" type="application/pdf" width="1000" height="750"/>
+## Fast Gaussian Process Regression for Smooth Functions
 
+2024 Illinois Institute of Technology Menger Day
+
+<embed src="./posters/2024_FastGP_MengerIIT.pdf" type="application/pdf" width="1000" height="750"/>
 
 ## Probabilistic Models for PDEs with Random Coefficients
 
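The hunk context above shows `pip install qmcpy`, the install line for the QMCPy quasi-Monte Carlo library this site documents. As a dependency-light sketch of the kind of low-discrepancy sequence such a library generates (my own illustration, not QMCPy's API), the base-2 van der Corput sequence mirrors the binary digits of the index about the radix point:

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the base-`base` van der Corput sequence,
    formed by reversing the digits of i about the radix point."""
    points = np.empty(n)
    for i in range(n):
        x, f, k = 0.0, 1.0 / base, i
        while k > 0:
            x += f * (k % base)  # place the next digit after the radix point
            k //= base
            f /= base
        points[i] = x
    return points

pts = van_der_corput(8)
# pts == [0, 1/2, 1/4, 3/4, 1/8, 5/8, 3/8, 7/8]

# quasi-Monte Carlo estimate of the integral of x^2 over [0, 1] (exactly 1/3)
err = abs(np.mean(van_der_corput(1024) ** 2) - 1.0 / 3.0)
assert err < 1e-3
```

The first 2^m points exactly stratify [0, 1) into 2^m equal cells, which is why the integration error above decays much faster than the O(n^{-1/2}) rate of i.i.d. Monte Carlo sampling.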

5.19 MB
Binary file not shown.

resume/sorokin_resume.pdf

1.09 KB
Binary file not shown.

resume/sorokin_resume.tex

Lines changed: 2 additions & 2 deletions
@@ -89,7 +89,7 @@ \section{Education}
 \cvitem{2017 - 2021}{\textbf{B.S. in Applied Math, Minor in Computer Science.} IIT. Summa cum laude. GPA $3.94 / 4$.}
 
 \section{Experiences}
-\cvitem{Summer 2024}{\textbf{Scientific Machine Learning Researcher} at \textbf{FM (Factory Mutual Insurance Company).} I built SciML models including Physics Informed Neural Networks (PINNs) and Deep Operator Networks (DeepONets) for solving Radiative Transport Equations (RTEs). These deep learning models were trained on large scale GPUs and used to speed up CFD fire dynamics simulations.}
+\cvitem{Summer 2024}{\textbf{Scientific Machine Learning Researcher} at \textbf{FM (Factory Mutual Insurance Company).} I built SciML models including Physics Informed Neural Networks (PINNs) and Deep Operator Networks (DeepONets) for solving Radiative Transport Equations (RTEs). These deep learning models were trained on large scale GPUs and used to speed up CFD fire dynamics simulations. Resulted in publication of \citetitle{sorokin.RTE_DeepONet}.}
 \cvitem{Summer 2023}{\textbf{Graduate Intern} at \textbf{Los Alamos National Laboratory.} I modeled the solution processes of PDEs with random coefficients using efficient and error aware Gaussian processes. Resulted in publication of \citetitle{sorokin.gp4darcy}.}
 \cvitem{Summer 2022}{\textbf{Givens Associate Intern} at \textbf{Argonne National Laboratory}. I researched methods to efficiently estimate failure probability using Monte Carlo with non-parametric importance sampling. Resulted in publication of \citetitle{sorokin.adaptive_prob_failure_GP}.}
 \cvitem{Summer 2021}{\textbf{ML Engineer Intern} at \textbf{SigOpt, an Intel Company}. I developed novel meta-learning techniques for model-aware hyperparameter tuning via Bayesian optimization. In a six person ML engineering team, I contributed production code and learned key elements of the AWS stack. Resulted in publication of \citetitle{sorokin.sigopt_mulch}.}
@@ -100,7 +100,7 @@ \section{Experiences}
 
 \section{Projects}
 \cvitem{\textbf{Fast Gaussian Processes with Derivatives for Solving PDEs}}{The cost of Gaussian process regression can be reduced from $\mathcal{O}(n^3)$ to $\mathcal{O}(n \log n)$ when one has control over the design of experiments. This is achieved by pairing quasi-random sampling with matching kernels to induce structure in the kernel matrix. My PhD research studies generalizations for quickly incorporating gradient information into the ML model and using these efficient strategies to solve PDEs with either random or deterministic coefficients.}
-\cvitem{\textbf{QMCPy Software}}{I lead development of the open source project QMCPy, a Quasi-Monte Carlo Python Library. This package provides high quality quasi-random sequence generators, automatic variable transformations, adaptive stopping criteria algorithms, and diverse use cases. Over the past five years, this project has grown to dozens of collaborators and resulted in numerous conference presentations and publications \cite{choi.QMC_software, sorokin.MC_vector_functions_integrals,choi.challenges_great_qmc_software,sorokin.QMC_IS_QMCPy}. See \itlink{qmcpy.org}{https://qmcpy.org} for more information.}
+\cvitem{\textbf{QMCPy Software}}{I lead development of the open source project QMCPy, a Quasi-Monte Carlo Python Library. This package provides high quality quasi-random sequence generators, automatic variable transformations, adaptive stopping criteria algorithms, and diverse use cases. Over the past five years, this project has grown to dozens of collaborators and \cite{choi.challenges_great_qmc_software,choi.QMC_software,sorokin.MC_vector_functions_integrals,sorokin.QMC_IS_QMCPy}. See \itlink{qmcpy.org}{https://qmcpy.org} for more information.}
 \cvitem{\textbf{Argonne: AI on Supercomputers}}{I studied \emph{AI Driven Science on Supercomputers} during my time at \emph{Argonne National Laboratory}. Key topics included handling large scale data pipelines and parallel training for neural networks. %Coursework at \itlink{github.com/alegresor/ai-science-training-series}{https://github.com/alegresor/ai-science-training-series}.
 }
 
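The resume's Fast Gaussian Processes project states that GP regression drops from O(n^3) to O(n log n) when quasi-random designs are paired with matching kernels that induce structure in the kernel matrix. A minimal one-dimensional sketch of that mechanism (my own illustration with an assumed Bernoulli-polynomial kernel, not the project's code): on an equispaced periodic design, a shift-invariant kernel yields a circulant kernel matrix, so the linear solve diagonalizes under the FFT:

```python
import numpy as np

def kernel(delta):
    # Assumed illustrative kernel k(x, y) = 1 + B_2({x - y}), a
    # shift-invariant kernel of the kind matched to lattice point sets,
    # where B_2(u) = u^2 - u + 1/6 is the second Bernoulli polynomial.
    u = delta % 1.0
    return 1.0 + u**2 - u + 1.0 / 6.0

n = 64
x = np.arange(n) / n                  # equispaced design in [0, 1)
y = np.sin(2 * np.pi * x)             # toy observations

# The Gram matrix K[i, j] = kernel(x[i] - x[j]) is circulant, so its
# eigenvalues are the FFT of its first column ...
first_col = kernel(x - x[0])
eigvals = np.fft.fft(first_col)

# ... and K w = y solves in O(n log n) with two more FFTs:
w_fast = np.real(np.fft.ifft(np.fft.fft(y) / eigvals))

# O(n^3) dense reference solve for comparison
K = kernel(x[:, None] - x[None, :])
w_dense = np.linalg.solve(K, y)

assert np.allclose(w_fast, w_dense)
```

In higher dimensions the same pattern, quasi-random point sets paired with matched kernels, keeps the kernel matrix structured and fast to factor, which is the pairing the project description refers to.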