1 change: 1 addition & 0 deletions .github/actions/spelling/allow/terms.txt
@@ -1,5 +1,6 @@
 AARCH
 AIML
+Backpropagation
 BGZF
 CINT
 CMSSW
4 changes: 4 additions & 0 deletions _data/standing_meetings.yml
@@ -3,6 +3,10 @@
   time_cest: "17:00"
   connect: "[Link to zoom](https://princeton.zoom.us/j/97915651167?pwd=MXJ1T2lhc3Z5QWlYbUFnMTZYQlNRdz09)"
   agenda:
+    - title: "Enhancing LLM Training Efficiency with Clad for Backpropagation"
+      date: 2025-06-05 15:00:00 +0200
+      speaker: "Rohan Timmaraju"
+      link: "[Slides](/assets/presentations/CaaS_Weekly_05_06_2025_Rohan_Timmaraju_LLM_Training.pdf)"
     - title: "Improve automatic differentiation of object-oriented paradigms using Clad"
       date: 2025-06-05 16:00:00 +0200
       speaker: "Petro Zarytskyi"
1 change: 0 additions & 1 deletion _posts/2025-05-21-enhancing-llm-training.md
@@ -43,7 +43,6 @@ By successfully integrating Clad into a C++ LLM training pipeline, we aim to:
 * **Offer a C++ Alternative:** Provide a foundation for more efficient, compiler-driven LLM training within the C++ ecosystem.
 * **Learn and Share:** Gain insights into the practicalities of applying compiler-based AD to complex ML problems and share these learnings with the community.

-I believe this project has the potential to make a valuable contribution to both the compiler research field and the ongoing efforts to make powerful AI models more accessible and efficient to train.

 ### Related Links
Binary file added: assets/presentations/CaaS_Weekly_05_06_2025_Rohan_Timmaraju_LLM_Training.pdf (binary file not shown)