We see the black contour line of zero, which tells us when $e_i(p) = 0$.

For a price vector $p$ such that $e_i(p) = 0$ we know that good $i$ is in equilibrium (demand equals supply).

If these two contour lines cross at some price vector $p^*$, then $p^*$ is an equilibrium price vector.
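To make the crossing idea concrete, here is a minimal sketch that locates such a $p^*$ numerically. The two-good excess demand function `e` below is a hypothetical stand-in (the lecture's actual demand system lies outside this excerpt), and `scipy.optimize.root` is used only to illustrate that the point where the two zero contours cross is a root of the system.

```{code-cell} ipython3
import numpy as np
from scipy.optimize import root

def e(p):
    # Hypothetical two-good excess demand: demand b * sqrt(p) minus supply p
    b = np.array([1.0, 2.0])
    return b * np.sqrt(p) - p

# The point where the two zero contours cross is a root of e
sol = root(e, np.array([1.0, 3.0]))
p_star = sol.x
print(p_star, e(p_star))  # e(p_star) should be (close to) zero
```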
This is indeed a very small error.

In many cases, for zero-finding algorithms applied to smooth functions, supplying the [Jacobian](https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant) of the function leads to better convergence properties.

Here, we manually calculate the elements of the Jacobian

$$
J(p) =
\begin{bmatrix}
    \frac{\partial e_0}{\partial p_0}(p) & \frac{\partial e_0}{\partial p_1}(p) \\
    \frac{\partial e_1}{\partial p_0}(p) & \frac{\partial e_1}{\partial p_1}(p)
\end{bmatrix}
$$
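Since the lecture's concrete demand system is not part of this excerpt, the sketch below reuses the hypothetical `e` from above: it hand-codes the corresponding Jacobian (diagonal here, because each good's excess demand depends only on its own price) and checks it against a finite-difference approximation before handing it to the solver.

```{code-cell} ipython3
def e_jac(p):
    # Hand-coded Jacobian of the hypothetical e: diagonal entries b_i / (2 sqrt(p_i)) - 1
    b = np.array([1.0, 2.0])
    return np.diag(b / (2 * np.sqrt(p)) - 1)

# Finite-difference check: column j approximates the derivative of e in p_j
p, h = np.array([1.0, 3.0]), 1e-6
fd = np.column_stack([(e(p + h * np.eye(2)[j]) - e(p)) / h for j in range(2)])
print(np.max(np.abs(fd - e_jac(p))))  # should be tiny

# Supplying the Jacobian to the solver
sol = root(e, np.array([1.0, 3.0]), jac=e_jac)
print(sol.x)
```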
This is a multivariate version of [](oneD-newton)

The iteration starts from some initial guess of the price vector $p_0$.

Here, instead of coding the Jacobian by hand, we use the `jacobian()` function in the `autograd` library to auto-differentiate and calculate the Jacobian.

With only slight modification, we can generalize [our previous attempt](first_newton_attempt) to multidimensional problems

```{code-cell} ipython3
import autograd.numpy as np
from autograd import jacobian

def newton(f, x_0, tol=1e-5, max_iter=10):
    # Multivariate Newton iteration x <- x - J(x)^{-1} f(x), with J
    # computed by autograd's jacobian() (a sketch of the elided body)
    x = x_0
    f_jac = jacobian(f)
    error = tol + 1
    n = 0
    while error > tol:
        n += 1
        if n > max_iter:
            raise Exception('Max iteration reached without convergence')
        # Solve J(x) d = f(x) instead of forming the inverse explicitly
        y = x - np.linalg.solve(f_jac(x), f(x))
        error = np.linalg.norm(x - y)
        x = y
        print(f'iteration {n}, error = {error:.5f}')
    return x
```
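As a quick check, we can apply `newton` to the hypothetical two-good excess demand used earlier, rewritten with `autograd.numpy` so that `jacobian()` can trace it (the demand system and starting point are illustrative assumptions, not the lecture's):

```{code-cell} ipython3
b = np.array([1.0, 2.0])

def e_ag(p):
    # The hypothetical excess demand from above, in autograd-friendly form
    return b * np.sqrt(p) - p

p_star = newton(e_ag, np.array([1.0, 3.0]))
print(p_star, e_ag(p_star))  # for this toy system the root is (1, 4)
```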
$$
A = \begin{bmatrix}
    2 & 3 & 3 \\
    2 & 4 & 2 \\
    1 & 5 & 1 \\
\end{bmatrix},
\quad
s = 0.2, \quad \alpha = 0.5, \quad \delta = 0.8
$$

As before, the law of motion is
```{math}
g(k) := sAk^\alpha + (1-\delta) k
```

However, $k_t$ is now a $3 \times 1$ vector.

Solve for the fixed point using Newton's method with the following initial values:
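The exercise's specific initial values fall outside this excerpt, so the sketch below sets up the fixed-point problem with a single hypothetical starting vector. It reuses the `newton()` routine from above and solves $g(k) - k = 0$ for the parameters just given:

```{code-cell} ipython3
A = np.array([[2., 3., 3.],
              [2., 4., 2.],
              [1., 5., 1.]])
s, α, δ = 0.2, 0.5, 0.8

def solow(k):
    # Law of motion g(k) = s A k^α + (1 - δ) k, with k a 3-vector
    return s * np.dot(A, k**α) + (1 - δ) * k

# A fixed point of g is a zero of k -> g(k) - k; k_0 below is hypothetical
k_star = newton(lambda k: solow(k) - k, np.array([5., 5., 5.]))
print(k_star)
```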
````{hint}
:class: dropdown

- The computation of the fixed point is equivalent to computing $k^*$ such that $g(k^*) - k^* = 0$.

- If you are unsure about your solution, you can start with the solved example:
```{math}
A = \begin{bmatrix}
    2 & 0 & 0 \\
    0 & 2 & 0 \\
    0 & 0 & 2 \\
\end{bmatrix}
```

with $s = 0.3$, $\alpha = 0.3$, and $\delta = 0.4$ and starting value: