Commit bf54f69

reorder some experiences

1 parent 48a55b8

2 files changed: +3 −3 lines changed

resume/sorokin_resume.pdf (binary, −2 bytes; not shown)

resume/sorokin_resume.tex

Lines changed: 3 additions & 3 deletions
@@ -66,11 +66,11 @@ \subsection{Experiences}
 \newentry{\normalfont{2023.05 - 2023.08}}{\textbf{Graduate Intern} at \textbf{Los Alamos National Laboratory.} I modeled multi-fidelity solutions to PDE with random coefficients using efficient and error aware Gaussian processes regression models \cite{sorokin.gp4darcy}.}
 \newentry{\normalfont{2022.05 - 2022.08}}{\textbf{Givens Associate Intern} at \textbf{Argonne National Laboratory}. I derived error bounds and proposed a sequential sampling method for efficiently estimating failure probabilities with probabilistic models \cite{sorokin.adaptive_prob_failure_GP}.}
 \newentry{\normalfont{2021.05 - 2021.08}}{\textbf{ML Engineer Intern} at \textbf{SigOpt, an Intel Company}. In a six-person ML team, I contributed production code for meta-learning model-aware hyperparameter tuning via Bayesian optimization \cite{sorokin.sigopt_mulch}.}
-\newentry{\normalfont{2021.08 - 2025.01}}{\textbf{Teaching Assistant} at \textbf{IIT}. I led reviews for PhD qualifying exams in analysis and computational math.}
-\newentry{\normalfont{2018.05 - 2019.08}}{\textbf{Instructor} for the \textbf{STARS Computing Corps' Computer Discover Program.} I taught and developed curriculum for middle school and high school girls to learn programmatic thinking in Python.}
 \newentry{\normalfont{2022.09 - 2022.11}}{\textbf{Participant} in \textbf{Argonne National Laboratory's Course on AI Driven Science on Supercomputers}. Key topics included handling large scale data pipelines and parallel training for neural networks.} %\itlink{github.com/alegresor/ai-science-training-series}{https://github.com/alegresor/ai-science-training-series}.
-\subsection{Open-Source Software}
+\newentry{\normalfont{2018.05 - 2019.08}}{\textbf{Instructor} for the \textbf{STARS Computing Corps' Computer Discover Program.} I taught and developed curriculum for middle school and high school girls to learn programmatic thinking in Python.}
+\newentry{\normalfont{2021.08 - 2025.01}}{\textbf{Teaching Assistant} at \textbf{IIT}. I led reviews for PhD qualifying exams in analysis and computational math.}
 
+\subsection{Open-Source Software}
 \newentry{\texttt{QMCPy}}{\textbf{Quasi-Monte Carlo Python Software} (\href{https://qmcsoftware.github.io/QMCSoftware}{qmcsoftware.github.io/QMCSoftware}). I led dozens of collaborators across academia and industry to develop QMC sequence generators, automatic variable transformations, adaptive error estimation algorithms, and diverse use cases \cite{sorokin.thesis,sorokin.2025.ld_randomizations_ho_nets_fast_kernel_mats,choi.challenges_great_qmc_software,choi.QMC_software,sorokin.MC_vector_functions_integrals,sorokin.QMC_IS_QMCPy,hickernell.qmc_what_why_how,jain.bernstein_betting_confidence_intervals}.}
 \newentry{\texttt{FastGPs}}{\textbf{Scalable Gaussian Process Regression in Python} (\href{https://alegresor.github.io/fastgps}{alegresor.github.io/fastgps}). This supports GPU scaling, batched inference, robust hyperparameter optimization, multi-fidelity GPs, and efficient Bayesian cubature. \texttt{FastGPs} is the first package to implement GPs which require only $\mathcal{O}(n)$ storage and $\mathcal{O}(n \log n)$ computations compared to the typical $\mathcal{O}(n^2)$ storage and $\mathcal{O}(n^3)$ computations requirements.}
 \newentry{\scalebox{.9}{\texttt{QMCGenerators.jl}}}{\textbf{Randomized Quasi-Monte Carlo Sequences in Julia} (\href{https://alegresor.github.io/QMCGenerators.jl}{alegresor.github.io/QMCGenerators.jl}).}
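The FastGPs entry above claims $\mathcal{O}(n)$ storage and $\mathcal{O}(n \log n)$ computation. A minimal sketch of the kind of structure such methods exploit — not FastGPs' actual code; the function name and setup here are our own — is that for shift-invariant kernels evaluated on suitable lattice points the $n \times n$ Gram matrix is circulant, so it is fully determined by its first column ($\mathcal{O}(n)$ storage) and a matrix-vector product reduces to a circular convolution computed by the FFT in $\mathcal{O}(n \log n)$ operations:

```python
import numpy as np

# Hypothetical illustration (not FastGPs' implementation): multiply a
# circulant matrix, given only its first column, by a vector via the FFT.
# Circular convolution theorem: C @ v = ifft(fft(c) * fft(v)) where c is
# the first column of C and C[i, j] = c[(i - j) mod n].
def circulant_matvec(first_col, v):
    return np.real(np.fft.ifft(np.fft.fft(first_col) * np.fft.fft(v)))

rng = np.random.default_rng(0)
n = 8
first_col = rng.random(n)
v = rng.random(n)

# Dense reference matrix, built explicitly only to check the fast product.
K = np.array([[first_col[(i - j) % n] for j in range(n)] for i in range(n)])
assert np.allclose(K @ v, circulant_matvec(first_col, v))
```

Linear solves enjoy the same cost in this setting, since the FFT diagonalizes circulant matrices, which is how the quoted complexities become plausible for GP inference.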

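The QMCPy and QMCGenerators.jl entries both describe randomized QMC sequence generators. A self-contained sketch of one classical construction, a randomly shifted rank-1 lattice rule — the function name and generating vector below are our own illustration, not either package's API:

```python
import numpy as np

# Hypothetical sketch of a shifted rank-1 lattice rule: point i in [0,1)^d
# is frac(i * g / n), optionally perturbed by a random shift modulo 1,
# which is one standard way to randomize a QMC point set.
def rank1_lattice(n, g, shift=None):
    i = np.arange(n)[:, None]                  # sample indices 0..n-1
    pts = (i * np.asarray(g)[None, :] / n) % 1.0
    if shift is not None:                      # randomized QMC shift
        pts = (pts + np.asarray(shift)[None, :]) % 1.0
    return pts

pts = rank1_lattice(8, g=[1, 3])               # 8 points in 2 dimensions
assert pts.shape == (8, 2)
assert np.all((pts >= 0) & (pts < 1))
```

Production generators such as those in QMCPy add carefully chosen generating vectors, alternative sequences (digital nets, Halton), and more sophisticated randomizations on top of this basic pattern.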