
Commit 7779b07

Fix typos in normalizing_flows_in_pytensor notebook (#1611)
1 parent 46f8227 · commit 7779b07


doc/gallery/applications/normalizing_flows_in_pytensor.ipynb

Lines changed: 7 additions & 7 deletions
@@ -34,7 +34,7 @@
 "- $f(\\cdot)$ if the (known!) PDF of the variable $X$\n",
 "- $G(\\cdot)$ is a function with nice properties.\n",
 "\n",
-"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivaties are also bijective. \n",
+"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivatives are also bijective. \n",
 "\n",
 "A simpler requirement is that $G(x)$ is continuous, bijective, and monotonic. That will get us 99% of the way there. Hey, $\\exp$ is continuous, bijective, and monotonic -- what a coincidence!\n"
 ]
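For context, the change-of-variables identity that these requirements make valid (a standard result, not part of this commit's diff, stated in the notebook's notation with $Z = G(X)$ and $f$ the PDF of $X$):

$$p_Z(x) = f\left(G^{-1}(x)\right)\,\left|\frac{\partial}{\partial x} G^{-1}(x)\right|$$

Both $G^{-1}$ and its derivative appear in the density, which is why $G$ needs to be bijective and differentiable.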
@@ -412,7 +412,7 @@
 ],
 "source": [
 "z_values = pt.dvector(\"z_values\")\n",
-"# The funtion `pm.logp` does the magic!\n",
+"# The function `pm.logp` does the magic!\n",
 "z_logp = pm.logp(z, z_values, jacobian=True)\n",
 "# We do this rewrite to make the computation more stable.\n",
 "rewrite_graph(z_logp).dprint()"
@@ -668,7 +668,7 @@
 "id": "5f9a7a50",
 "metadata": {},
 "source": [
-"Theese distribution are essentially the same."
+"These distribution are essentially the same."
 ]
 },
 {
@@ -715,7 +715,7 @@
 "\n",
 "So, the inverse of their composition is $G^{-1} \\equiv (J^{-1} \\circ H^{-1}) = J^{-1}(H^{-1}(x)) = J^{-1}(\\ln(x)) = \\frac{\\ln(x) - a}{b}$\n",
 "\n",
-"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolutel value of the gradient:\n",
+"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolute value of the gradient:\n",
 "\n",
 "$$\\left | \\frac{\\partial}{\\partial x}G^{-1} \\right | = \\left | \\frac{\\partial}{\\partial x} \\frac{\\ln(x) - a}{b} \\right | = \\left | \\frac{1}{b} \\cdot \\frac{1}{x} \\right | $$\n",
 "\n",
@@ -733,7 +733,7 @@
 "source": [
 "### Solution by hand\n",
 "\n",
-"We now implement theis analytic procesure in PyTensor:"
+"We now implement this analytic procedure in PyTensor:"
 ]
 },
 {
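One way that by-hand implementation could look, as a minimal sketch using only the formulas quoted in the hunk above ($G^{-1}(x) = \frac{\ln(x) - a}{b}$ and the correction $\left|\frac{1}{b}\cdot\frac{1}{x}\right|$); the base density and the constants `a`, `b` are assumptions, since the actual notebook cells are not shown in this diff:

```python
import numpy as np
import pytensor.tensor as pt

# Assumed setup (not taken from the notebook cells): Z = exp(a + b * X), X ~ Normal(0, 1)
a, b = 1.0, 0.5
x_values = pt.dvector("x_values")

# log f(G^{-1}(x)) with f the standard normal PDF and G^{-1}(x) = (ln(x) - a) / b
g_inv = (pt.log(x_values) - a) / b
base_logp = -0.5 * g_inv**2 - 0.5 * np.log(2.0 * np.pi)

# log |d G^{-1}/dx| = log |1 / (b * x)|
log_jacobian = -np.log(b) - pt.log(x_values)

x_logp = base_logp + log_jacobian
print(x_logp.eval({x_values: np.array([0.5, 1.0, 2.0])}))
```

If the setup matches the notebook's, these values should agree with the `pm.logp` graph evaluated at the same points, up to floating-point error.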
@@ -803,7 +803,7 @@
 "id": "bcd081d3",
 "metadata": {},
 "source": [
-"We can verify these values are exaclty what we are expecting:"
+"We can verify these values are exactly what we are expecting:"
 ]
 },
 {
@@ -859,7 +859,7 @@
 "id": "46834a6f",
 "metadata": {},
 "source": [
-"As above let's verify taht the results are consistent and correct:"
+"As above let's verify that the results are consistent and correct:"
 ]
 },
 {
