
Eliminates the reverse_forw pass generation when the custom pullback exists #1810

Merged
vgvassilev merged 1 commit into vgvassilev:master from Vedant2005goyal:issue-1808 on Apr 4, 2026
Conversation

@Vedant2005goyal
Contributor

This PR prevents the generation of the reverse_mode_forward_pass function when a custom pullback is defined, which eliminates unwanted tape overhead.

Fixes #1808

…it removes unwanted tape overhead

Fixes issue-1808
@codecov

codecov bot commented Apr 4, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


@github-actions
Contributor

github-actions bot commented Apr 4, 2026

clang-tidy review says "All clean, LGTM! 👍"

@vgvassilev vgvassilev requested a review from guitargeek April 4, 2026 12:38
Collaborator

@guitargeek guitargeek left a comment


Thank you very much, excellent! This is an important optimization and gives me exactly the results I was expecting: fast gradients without tracker/tape operations when the neural-net operators have custom pullbacks.

@vgvassilev vgvassilev merged commit b8933b9 into vgvassilev:master Apr 4, 2026
39 checks passed


Development

Successfully merging this pull request may close these issues.

Can't implement ReLU without incurring tape operations - checkpoint loop crashes

3 participants