
Commit edb44f5

fix typos
1 parent ee869e2 commit edb44f5

1 file changed: +32 −32 lines changed


lectures/newton_method.md

@@ -37,7 +37,7 @@ on a `GPU` is [available here](https://jax.quantecon.org/newtons_method.html)
 
 Many economic problems involve finding [fixed
 points](https://en.wikipedia.org/wiki/Fixed_point_(mathematics)) or
-[zeros](https://en.wikipedia.org/wiki/Zero_of_a_function) (sometimes called
+[zeros](https://en.wikipedia.org/wiki/Zero_of_a_function) (also called
 "roots") of functions.
 
 For example, in a simple supply and demand model, an equilibrium price is one
@@ -55,7 +55,7 @@ Newton's method does not always work but, in situations where it does,
 convergence is often fast when compared to other methods.
 
 The lecture will apply Newton's method in one-dimensional and
-multi-dimensional settings to solve fixed-point and zero-finding problems.
+multidimensional settings to solve fixed-point and zero-finding problems.
 
 * When finding the fixed point of a function $f$, Newton's method updates
   an existing guess of the fixed point by solving for the fixed point of a
@@ -69,10 +69,10 @@ To build intuition, we first consider an easy, one-dimensional fixed point
 problem where we know the solution and solve it using both successive
 approximation and Newton's method.
 
-Then we apply Newton's method to multi-dimensional settings to solve
+Then we apply Newton's method to multidimensional settings to solve
 market for equilibria with multiple goods.
 
-At the end of the lecture we leverage the power of automatic
+At the end of the lecture, we leverage the power of automatic
 differentiation in [`autograd`](https://github.com/HIPS/autograd) to solve a very high-dimensional equilibrium problem
 
 ```{code-cell} ipython3
@@ -117,21 +117,21 @@ zero population growth, the law of motion for capital is
 Here
 
 - $k_t$ is capital stock per worker,
-- $A, \alpha>0$ are production parameters, $\alpha<1$
+- $A, \alpha>0$ are production parameters, $\alpha < 1$
 - $s>0$ is a savings rate, and
 - $\delta \in(0,1)$ is a rate of depreciation
 
 In this example, we wish to calculate the unique strictly positive fixed point
 of $g$, the law of motion for capital.
 
-In other words, we seek a $k^* > 0$ such that $g(k^*)=k^*$.
+In other words, we seek a $k^* > 0$ such that $g(k^*) = k^*$.
 
 * such a $k^*$ is called a [steady state](https://en.wikipedia.org/wiki/Steady_state),
   since $k_t = k^*$ implies $k_{t+1} = k^*$.
 
-Using pencil and paper to solve $g(k)=k$, you will be able to confirm that
+Using pencil and paper to solve $g(k) = k$, you will be able to confirm that
 
-$$ k^* = \left(\frac{s A}{δ}\right)^{1/(1 - α)} $$
+$$ k^* = \left(\frac{s A}{\delta}\right)^{1/(1 - \alpha)} $$
 
 ### Implementation
 
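As a quick numerical check on the closed form above, one can evaluate it directly. This is a minimal sketch; the parameter values are illustrative assumptions, not values shown in this diff:

```python
# Illustrative parameters (assumed, not taken from the lecture)
A, s, α, δ = 2.0, 0.3, 0.3, 0.4

def g(k):
    "Law of motion for capital: g(k) = s A k**α + (1 - δ) k."
    return s * A * k**α + (1 - δ) * k

# Closed-form steady state k* = (s A / δ)**(1 / (1 - α))
k_star = (s * A / δ)**(1 / (1 - α))
print(k_star, g(k_star))  # g(k_star) should reproduce k_star
```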
@@ -283,13 +283,13 @@ the function
 ```{math}
 :label: motivation
 
-\hat g(x) \approx g(x_0)+g'(x_0)(x-x_0)
+\hat g(x) \approx g(x_0) + g'(x_0)(x - x_0)
 ```
 
 We solve for the fixed point of $\hat g$ by calculating the $x_1$ that solves
 
 $$
-x_1=\frac{g(x_0)-g'(x_0) x_0}{1-g'(x_0)}
+x_1 = \frac{g(x_0) - g'(x_0) x_0}{1 - g'(x_0)}
 $$
 
 Generalising the process above, Newton's fixed point method iterates on
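The solved-for $x_1$ translates directly into code. A one-step sketch (the function names are ours, not the lecture's):

```python
def newton_fp_step(g, g_prime, x_0):
    "One Newton fixed-point update: the fixed point of the linearization of g at x_0."
    return (g(x_0) - g_prime(x_0) * x_0) / (1 - g_prime(x_0))
```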
@@ -307,7 +307,7 @@ To implement Newton's method we observe that the derivative of the law of motion
 ```{math}
 :label: newton_method2
 
-g'(k) = \alpha s A k^{\alpha-1} + (1-\delta)
+g'(k) = \alpha s A k^{\alpha - 1} + (1 - \delta)
 
 ```
 
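Combining this derivative with the update rule above gives a rough sketch of the one-dimensional solver. This is our reconstruction, not the committed code; `g` and the parameter values are the illustrative ones from the earlier sketch:

```python
def g_prime(k, A=2.0, s=0.3, α=0.3, δ=0.4):
    "Derivative of the law of motion: g'(k) = α s A k**(α - 1) + (1 - δ)."
    return α * s * A * k**(α - 1) + (1 - δ)

def compute_fixed_point_newton(g, g_prime, x_0, tol=1e-10, max_iter=100):
    "Iterate x <- (g(x) - g'(x) x) / (1 - g'(x)) until successive guesses converge."
    x = x_0
    for _ in range(max_iter):
        x_new = (g(x) - g_prime(x) * x) / (1 - g_prime(x))
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise ValueError("no convergence within max_iter iterations")

compute_fixed_point_newton(g, g_prime, x_0=0.8)  # ≈ (s A / δ)**(1 / (1 - α))
```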
@@ -385,15 +385,15 @@ the problem of finding fixed points.
 
 ### Newton's method for zeros
 
-Let's suppose we want to find an $x$ such that $f(x)=0$ for some smooth
+Let's suppose we want to find an $x$ such that $f(x) = 0$ for some smooth
 function $f$ mapping real numbers to real numbers.
 
 Suppose we have a guess $x_0$ and we want to update it to a new point $x_1$.
 
 As a first step, we take the first-order approximation of $f$ around $x_0$:
 
 $$
-\hat f(x) \approx f\left(x_0\right)+f^{\prime}\left(x_0\right)\left(x-x_0\right)
+\hat f(x) \approx f\left(x_0\right) + f^{\prime}\left(x_0\right)\left(x - x_0\right)
 $$
 
 Now we solve for the zero of $\hat f$.
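Setting $\hat f(x_1) = 0$ and rearranging gives the familiar update $x_1 = x_0 - f(x_0)/f'(x_0)$. A generic sketch of the resulting iteration (our code, with an arbitrary test function):

```python
def newton_zero(f, f_prime, x_0, tol=1e-10, max_iter=100):
    "Find a zero of f by iterating x <- x - f(x) / f'(x)."
    x = x_0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x = x - step
        if abs(step) < tol:
            return x
    raise ValueError("no convergence within max_iter iterations")

newton_zero(lambda x: x**2 - 2, lambda x: 2 * x, x_0=1.0)  # ≈ 1.414213...
```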
@@ -451,7 +451,7 @@ to implement Newton's method ourselves.)
 Now consider again the Solow fixed-point calculation, where we solve for $k$
 satisfying $g(k) = k$.
 
-We can convert to this to a zero-finding problem by setting $f(x) := g(x)-x$.
+We can convert to this to a zero-finding problem by setting $f(x) := g(x) - x$.
 
 Any zero of $f$ is clearly a fixed point of $g$.
 
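One way to exploit this is with SciPy's built-in `newton` zero finder. This is our illustration (the diff does not show which zero finder the lecture calls at this point), reusing the illustrative `g` sketched earlier:

```python
from scipy.optimize import newton

# Apply a library zero finder to f(k) = g(k) - k; any zero is a fixed point of g
k_star_scipy = newton(lambda k: g(k) - k, x0=0.8)
```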
@@ -468,11 +468,11 @@ k_star_approx_newton = newton(
 k_star_approx_newton
 ```
 
-The result confirms the descent we saw in the graphs above: a very accurate result is reached with only 5 iterations.
+The result confirms the convergence we saw in the graphs above: a very accurate result is reached with only 5 iterations.
 
 
 
-## Multivariate Newtons method
+## Multivariate Newton's method
 
 In this section, we introduce a two-good problem, present a
 visualization of the problem, and solve for the equilibrium of the two-good market
@@ -481,11 +481,11 @@ using both a zero finder in `SciPy` and Newton's method.
 We then expand the idea to a larger market with 5,000 goods and compare the
 performance of the two methods again.
 
-We will see a significant performance gain when using Netwon's method.
+We will see a significant performance gain when using Newton's method.
 
 
 (two_goods_market)=
-### A Two Goods market equilibrium
+### A two-goods market equilibrium
 
 Let's start by computing the market equilibrium of a two-good problem.
 
@@ -639,9 +639,9 @@ plot_excess_demand(ax, good=1)
 plt.show()
 ```
 
-We see the black contour line of zero, which tells us when $e_i(p)=0$.
+We see the black contour line of zero, which tells us when $e_i(p) = 0$.
 
-For a price vector $p$ such that $e_i(p)=0$ we know that good $i$ is in equilibrium (demand equals supply).
+For a price vector $p$ such that $e_i(p) = 0$ we know that good $i$ is in equilibrium (demand equals supply).
 
 If these two contour lines cross at some price vector $p^*$, then $p^*$ is an equilibrium price vector.
 
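To make the equilibrium condition concrete: a toy two-good excess demand (entirely our invention, not the lecture's specification) can be handed to a SciPy zero finder like so:

```python
import numpy as np
from scipy.optimize import root

def e(p):
    "Toy excess demand: exponential demand minus linear supply, good by good."
    return np.exp(-p) - 0.5 * p

sol = root(e, x0=np.ones(2), method='hybr')
print(sol.x, e(sol.x))  # a price vector at which both excess demands vanish
```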
@@ -693,7 +693,7 @@ This is indeed a very small error.
 
 In many cases, for zero-finding algorithms applied to smooth functions, supplying the [Jacobian](https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant) of the function leads to better convergence properties.
 
-Here we manually calculate the elements of the Jacobian
+Here, we manually calculate the elements of the Jacobian
 
 $$
 J(p) =
@@ -747,9 +747,9 @@ This is a multivariate version of [](oneD-newton)
 
 The iteration starts from some initial guess of the price vector $p_0$.
 
-Here, instead of coding Jacobian by hand, We use the `jacobian()` function in the `autograd` library to auto-differentiate and calculate the Jacobian.
+Here, instead of coding Jacobian by hand, we use the `jacobian()` function in the `autograd` library to auto-differentiate and calculate the Jacobian.
 
-With only slight modification, we can generalize [our previous attempt](first_newton_attempt) to multi-dimensional problems
+With only slight modification, we can generalize [our previous attempt](first_newton_attempt) to multidimensional problems
 
 ```{code-cell} ipython3
 def newton(f, x_0, tol=1e-5, max_iter=10):
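The hunk cuts the function off after its signature. A sketch of how the body plausibly continues, pairing `autograd`'s `jacobian()` with the multivariate update $x \leftarrow x - J(x)^{-1} f(x)$ (a reconstruction under our assumptions, not the committed code):

```python
import autograd.numpy as np
from autograd import jacobian

def newton(f, x_0, tol=1e-5, max_iter=10):
    "Multivariate Newton: iterate x <- x - J(x)^{-1} f(x) until the step is small."
    x = x_0
    f_jac = jacobian(f)  # auto-differentiated Jacobian of f
    for _ in range(max_iter):
        x_new = x - np.linalg.solve(f_jac(x), f(x))  # solve J(x) d = f(x)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    raise Exception("no convergence within max_iter iterations")
```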
@@ -865,7 +865,7 @@ A = \begin{bmatrix}
 1 & 5 & 1 \\
 \end{bmatrix},
 \quad
-s = 0.2, \quad α = 0.5, \quad δ = 0.8
+s = 0.2, \quad \alpha = 0.5, \quad \delta = 0.8
 $$
 
 As before the law of motion is
@@ -875,7 +875,7 @@ As before the law of motion is
 g(k) := sAk^\alpha + (1-\delta) k
 ```
 
-However $k_t$ is now a $3 \times 1$ vector.
+However, $k_t$ is now a $3 \times 1$ vector.
 
 Solve for the fixed point using Newton's method with the following initial values:
 
@@ -890,7 +890,7 @@ $$
 ````{hint}
 :class: dropdown
 
-- The computation of the fixed point is equivalent to computing $k^*$ such that $f(k^*) - k^* = 0$.
+- The computation of the fixed point is equivalent to computing $k^*$ such that $g(k^*) - k^* = 0$.
 
 - If you are unsure about your solution, you can start with the solved example:
 
@@ -902,7 +902,7 @@ A = \begin{bmatrix}
 \end{bmatrix}
 ```
 
-with $s = 0.3$, $α = 0.3$, and $δ = 0.4$ and starting value:
+with $s = 0.3$, $\alpha = 0.3$, and $\delta = 0.4$ and starting value:
 
 
 ```{math}
@@ -932,7 +932,7 @@ s = 0.2
 initLs = [np.ones(3), np.array([3.0, 5.0, 5.0]), np.repeat(50.0, 3)]
 ```
 
-Then define the multivariate version of the formula for the [law of motion of captial](motion_law)
+Then define the multivariate version of the formula for the [law of motion of capital](motion_law)
 
 ```{code-cell} ipython3
 def multivariate_solow(k, A=A, s=s, α=α, δ=δ):
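The hunk truncates the function after its signature. Given the vector law of motion $g(k) = sAk^\alpha + (1-\delta)k$, a plausible body (our sketch, relying on the `A`, `s`, `α`, `δ` defined in the surrounding cells) is:

```python
def multivariate_solow(k, A=A, s=s, α=α, δ=δ):
    "Vector law of motion: g(k) = s A k**α + (1 - δ) k, with k**α taken elementwise."
    return s * A @ k**α + (1 - δ) * k
```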
@@ -955,7 +955,7 @@ We find that the results are invariant to the starting values given the well-def
 
 But the number of iterations it takes to converge is dependent on the starting values.
 
-Let substitute the output back to the formulate to check our last result
+Let's substitute the output back into the formula to check our last result
 
 ```{code-cell} ipython3
 multivariate_solow(k) - k
@@ -1033,7 +1033,7 @@ $$
 \end{aligned}
 $$
 
-Set the tolerance to $0.0$ for more accurate output.
+Set the tolerance to $1e-15$ for more accurate output.
 
 ```{exercise-end}
 ```
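Concretely, this just means passing a tighter `tol` to the `newton` routine sketched above, e.g. (our sketch; `multivariate_solow` and the starting vector come from the cells above):

```python
# Tighter tolerance, as the exercise asks; the default max_iter=10 may need
# raising for the iteration to get within 1e-15
k_star = newton(lambda k: multivariate_solow(k) - k, np.ones(3), tol=1e-15)
```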
@@ -1053,7 +1053,7 @@ c = np.array([1.0, 1.0, 1.0])
 initLs = [np.repeat(5.0, 3), np.ones(3), np.array([4.5, 0.1, 4.0])]
 ```
 
-Lets run through each initial guess and check the output
+Let's run through each initial guess and check the output
 
 ```{code-cell} ipython3
 :tags: [raises-exception]
@@ -1069,7 +1069,7 @@ for init in initLs:
     attempt += 1
 ```
 
-We can find that Newton's method may fail for some starting values.
+We can see that Newton's method may fail for some starting values.
 
 Sometimes it may take a few initial guesses to achieve convergence.
 
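The loop in the hunk above presumably guards each attempt with try/except so that one failed start does not end the run. A sketch of the pattern (our reconstruction; the excess demand function `e` and its arguments are not shown in this diff, so its signature is assumed):

```python
attempt = 1
for init in initLs:
    try:
        print(f"Attempt {attempt}: initial guess {init}")
        p = newton(lambda p: e(p, A, b, c), init)  # signature of e assumed
        print(f"Converged to {p}")
    except Exception as err:
        # a bad guess can overshoot or produce a (near-)singular Jacobian
        print(f"Failed: {err}")
    attempt += 1
```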