
Conversation

leomilano (Contributor) commented Sep 27, 2025

Fixes #3596

Description

This PR uses exp(x) instead of sin(x) as the target variable y. The reason for this is faster and clearer convergence: exp(x) is well approximated by a Taylor expansion around the origin, and since every derivative of exp at 0 is exp(0) = 1, the polynomial coefficients are simply 1/n! for the n-th term.
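
For reference, since every derivative of exp is exp and exp(0) = 1, the Taylor series that the fitted polynomial truncates is

$$e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots$$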

I am also improving the progress information printed to the user inside the optimization loop.

Finally, I am making the top docstring a raw string to avoid a Python 3 warning about it.
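
For context, here is a minimal, self-contained sketch of the kind of fit this example performs with exp(x) as the target. It is an illustration rather than the tutorial's exact code; the interval, learning rate, and iteration count are placeholder values.

```python
import torch

# Fit a third-order polynomial to y = exp(x) with autograd.
# Interval, learning rate, and iteration count are illustrative placeholders.
x = torch.linspace(-1.0, 1.0, 2000)
y = torch.exp(x)

# Random initial coefficients for y_pred = a + b*x + c*x^2 + d*x^3.
a, b, c, d = (torch.randn((), requires_grad=True) for _ in range(4))

learning_rate = 0.1
for t in range(5000):
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    loss = (y_pred - y).pow(2).mean()
    if t % 500 == 499:
        # Progress output inside the optimization loop.
        print(t, loss.item())
    loss.backward()
    with torch.no_grad():
        for p in (a, b, c, d):
            p -= learning_rate * p.grad
            p.grad = None

# On [-1, 1] the least-squares fit lands near the Taylor coefficients
# 1, 1, 1/2! = 0.5, 1/3! ≈ 0.167 (not exactly: higher-order terms of exp
# bias the best cubic fit slightly away from the truncated Taylor series).
print(f"y ≈ {a.item():.3f} + {b.item():.3f} x "
      f"+ {c.item():.3f} x^2 + {d.item():.3f} x^3")
```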

Checklist

  • The issue that is being fixed is referenced in the description (see above "Fixes #ISSUE_NUMBER")
  • Only one issue is addressed in this pull request
  • Labels from the issue that this PR is fixing are added to this pull request
  • No unnecessary issues are included in this pull request

cc @albanD @jbschlosser


pytorch-bot bot commented Sep 27, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3597

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 44aac34 with merge base d7a5681:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.


meta-cla bot commented Sep 27, 2025

Hi @leomilano!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!


meta-cla bot commented Sep 27, 2025

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

meta-cla bot added the cla signed label Sep 27, 2025
svekars added the core (Tutorials of any level of difficulty related to the core pytorch functionality) label Sep 29, 2025
albanD (Contributor) left a comment


Thanks!

svekars (Contributor) commented Sep 29, 2025

@leomilano can you please fix lintrunner? Happy to merge after that.

leomilano (Contributor, Author) commented

@svekars Will do shortly.

leomilano (Contributor, Author) commented

> @leomilano can you please fix lintrunner? Happy to merge after that.

Hi @svekars, I just did; the latest commit passes `lintrunner -m` locally. Thanks much!

svekars merged commit de1ca9f into pytorch:main Sep 30, 2025
21 checks passed
jafraustro pushed a commit to jafraustro/tutorials that referenced this pull request Oct 1, 2025
…ograd (pytorch#3597)

Fixes pytorch#3596

This PR branch uses exp(x) instead of sin(x) as the target variable y.
The reason for this is faster and clearer convergence, since exp(x) is
well approximated by a Taylor expansion around the origin. Also, every
derivative of exp at 0 is exp(0) = 1, so the polynomial coefficients
are simply 1/n! for the n-th term.

I am also improving the progress information printed to the user inside
the optimization loop.

Finally, I am making the top docstring a raw string to avoid a Python 3
warning about it.


cc @albanD @jbschlosser

---------

Co-authored-by: Svetlana Karslioglu <[email protected]>
konjac commented Oct 6, 2025

@leomilano @svekars @albanD

I have some doubts about this change. I think the updated file actually introduces an inconsistency.

  1. Comment lines 5 and 6 are still about fitting sin(x).
  2. This file is included in https://github.com/pytorch/tutorials/blame/3469d47af6e14742990210ec35933ebe73fe380c/beginner_source/pytorch_with_examples.rst#L113 . The whole tutorial page is about fitting sin(x), moving incrementally from numpy and a manual backward pass to autograd. Shouldn't we keep them aligned?

Thus this example source should be reverted to sin(x) unless the whole tutorial page gets a comprehensive update.

leomilano (Contributor, Author) commented

> @leomilano @svekars @albanD
>
> I have some doubts about this change. I think the updated file actually introduces an inconsistency.
>
>   1. Comment lines 5 and 6 are still about fitting sin(x).
>   2. This file is included in https://github.com/pytorch/tutorials/blame/3469d47af6e14742990210ec35933ebe73fe380c/beginner_source/pytorch_with_examples.rst#L113 . The whole tutorial page is about fitting sin(x), moving incrementally from numpy and a manual backward pass to autograd. Shouldn't we keep them aligned?
>
> Thus this example source should be reverted to sin(x) unless the whole tutorial page gets a comprehensive update.

Good points, @konjac. Sorry about point 1; I didn't notice it. I could address that quickly, but point 2 is harder: I hadn't noticed the disconnect with the rest of the tutorial.

The other changes in the PR are minor improvements and I see no reason to revert those. How about this proposal for a new PR:

(a) move back to sin(x)
(b) add a comment above that line encouraging the developer taking the tutorial to try other functions with faster convergence, such as exp(x) or a polynomial in x (see the sketch below)
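
To make (b) concrete, here is a hypothetical sketch of what that comment could look like; the wording and values are only a suggestion, with x defined as in the tutorial:

```python
import math
import torch

x = torch.linspace(-math.pi, math.pi, 2000)

# If convergence is hard to see with sin(x), try a target that a
# low-order polynomial fits quickly, e.g. torch.exp(x) or a polynomial
# in x, and compare how the loss evolves.
y = torch.sin(x)
```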

My motivation for the change was that it wasn't clear to me that the example was converging, so I made the local change I ended up submitting.

Anyway, if this sounds like a good idea, please let me know whether I should open a new issue or refer back to this (closed) one (would we need to reopen it?).

Thanks so much! - Leo

Labels: cla signed, core (Tutorials of any level of difficulty related to the core pytorch functionality)
Projects: none yet
Development

Successfully merging this pull request may close these issues.

Proposal: replace sin(x) with exp(x) in polynomial_autograd tutorial
4 participants