Commit a18131c

corrected description of optimizers (#3802)
Fixes #3778

Co-authored-by: abbycross <[email protected]>
1 parent fc964e9

File tree

1 file changed: +3 -2 lines changed

learning/courses/quantum-chem-with-vqe/classical-optimizers.ipynb

Lines changed: 3 additions & 2 deletions
@@ -82,8 +82,9 @@
     "- [slsqp](https://docs.scipy.org/doc/scipy/reference/optimize.minimize-slsqp.html#optimize-minimize-slsqp): Sequential Least Squares Programming (SLSQP).\n",
     "- [nelder-mead](https://docs.scipy.org/doc/scipy/reference/optimize.minimize-neldermead.html#optimize-minimize-neldermead) Nelder-Mead algorithm.\n",
     "\n",
-    "These, and most available classical optimization algorithms, are local minimizers, in that they use gradients to find local minima.\n",
-    "These algorithms have several options in common, but with subtle differences. For example, all have the option to specify a maximum number of iterations using the `'maxiter': 200` notation from above. All have some option specifying a different stopping criterion based on function or variable values, though these criteria are slightly different for different algorithms. Cobyla, for example, allows you to specify a tolerance (for example, `'tol': 0.0001`) that is the lower bound on a \"trust region\", determined by using gradients. In comparison, SLSQP lets you specify a goal in the precision of the function used in the stopping criterion ('ftol'). Nelder-Mead lets you specify a tolerance in the difference between successive parameter ($x$) guesses (xatol) or a tolerance in the difference between successive values obtained for the cost function $f(x)$ (fatol) (or both).\n",
+    "Most available classical optimization algorithms are local minimizers, in that they use various methods to find local minima, but they are not guaranteed to find global minima. Some classical optimizers explicitly estimate gradients and use those to find local minima. Others may use successive linear or quadratic approximations of the objective function to find minima.\n",
+    "\n",
+    "These algorithms have several options in common, but with subtle differences. For example, all have the option to specify a maximum number of iterations using the `'maxiter': 200` notation from above. All have some option specifying a different stopping criterion based on function or variable values, though these criteria are slightly different for different algorithms. COBYLA, for example, allows you to specify a tolerance (for example, `'tol': 0.0001`) that is the lower bound on a \"trust region\". In comparison, SLSQP lets you specify a goal in the precision of the function used in the stopping criterion ('ftol'). Nelder-Mead lets you specify a tolerance in the difference between successive parameter ($x$) guesses (xatol) or a tolerance in the difference between successive values obtained for the cost function $f(x)$ (fatol) (or both).\n",
     "For a complete list of available algorithms and options, visit [SciPy's minimize documentation](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html)."
   ]
 }
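
For context, here is a minimal sketch (not part of this commit) of how the per-optimizer stopping options described in the revised text are passed to `scipy.optimize.minimize`. The quadratic `cost` function, the starting point `x0`, and the specific tolerance values are illustrative assumptions, not the notebook's actual VQE cost function or settings.

```python
# Sketch: mapping the stopping options discussed above onto
# scipy.optimize.minimize. The toy quadratic objective is an
# assumption for illustration, not the notebook's VQE cost.
import numpy as np
from scipy.optimize import minimize

def cost(x):
    """Toy objective with a known minimum at x = 0.5."""
    return np.sum((x - 0.5) ** 2)

x0 = np.zeros(3)  # initial parameter guess (hypothetical)

# COBYLA: 'tol' is the lower bound on the size of the trust region.
res_cobyla = minimize(cost, x0, method="COBYLA",
                      options={"maxiter": 200, "tol": 0.0001})

# SLSQP: 'ftol' sets the precision goal for the objective value
# used in the stopping criterion.
res_slsqp = minimize(cost, x0, method="SLSQP",
                     options={"maxiter": 200, "ftol": 1e-6})

# Nelder-Mead: 'xatol' bounds the difference between successive
# parameter guesses; 'fatol' bounds the difference between
# successive cost-function values.
res_nm = minimize(cost, x0, method="Nelder-Mead",
                  options={"maxiter": 200, "xatol": 1e-4, "fatol": 1e-4})

for res in (res_cobyla, res_slsqp, res_nm):
    print(res.x, res.fun)
```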
