|
34 | 34 | "- $f(\\cdot)$ if the (known!) PDF of the variable $X$\n",
|
35 | 35 | "- $G(\\cdot)$ is a function with nice properties.\n",
|
36 | 36 | "\n",
|
37 |
| - "The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivaties are also bijective. \n", |
| 37 | + "The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivatives are also bijective. \n", |
38 | 38 | "\n",
|
39 | 39 | "A simpler requirement is that $G(x)$ is continuous, bijective, and monotonic. That will get us 99% of the way there. Hey, $\\exp$ is continuous, bijective, and monotonic -- what a coincidence!\n"
|
40 | 40 | ]
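To make the change-of-variables formula concrete, here is a minimal sketch (not a cell from the notebook) that checks $f_Z(z) = f_X(G^{-1}(z)) \left| \frac{d}{dz} G^{-1}(z) \right|$ for $G = \exp$ with $X \sim \mathcal{N}(0, 1)$, using `scipy.stats` as the reference:

```python
# Minimal sketch: check the change-of-variables formula for G = exp,
# i.e. Z = exp(X) with X ~ Normal(0, 1), against scipy's lognormal.
import numpy as np
from scipy import stats

z = np.array([0.5, 1.0, 2.0])

# f_Z(z) = f_X(G^{-1}(z)) * |d/dz G^{-1}(z)| = f_X(ln z) * (1 / z)
manual = stats.norm.pdf(np.log(z)) / z
reference = stats.lognorm.pdf(z, s=1.0)  # standard lognormal

np.testing.assert_allclose(manual, reference)
```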
|
|
412 | 412 | ],
|
413 | 413 | "source": [
|
414 | 414 | "z_values = pt.dvector(\"z_values\")\n",
|
415 |
| - "# The funtion `pm.logp` does the magic!\n", |
| 415 | + "# The function `pm.logp` does the magic!\n", |
416 | 416 | "z_logp = pm.logp(z, z_values, jacobian=True)\n",
|
417 | 417 | "# We do this rewrite to make the computation more stable.\n",
|
418 | 418 | "rewrite_graph(z_logp).dprint()"
|
|
668 | 668 | "id": "5f9a7a50",
|
669 | 669 | "metadata": {},
|
670 | 670 | "source": [
|
671 |
| - "Theese distribution are essentially the same." |
| 671 | + "These distribution are essentially the same." |
672 | 672 | ]
|
673 | 673 | },
|
674 | 674 | {
|
|
715 | 715 | "\n",
|
716 | 716 | "So, the inverse of their composition is $G^{-1} \\equiv (J^{-1} \\circ H^{-1}) = J^{-1}(H^{-1}(x)) = J^{-1}(\\ln(x)) = \\frac{\\ln(x) - a}{b}$\n",
|
717 | 717 | "\n",
|
718 |
| - "For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolutel value of the gradient:\n", |
| 718 | + "For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolute value of the gradient:\n", |
719 | 719 | "\n",
|
720 | 720 | "$$\\left | \\frac{\\partial}{\\partial x}G^{-1} \\right | = \\left | \\frac{\\partial}{\\partial x} \\frac{\\ln(x) - a}{b} \\right | = \\left | \\frac{1}{b} \\cdot \\frac{1}{x} \\right | $$\n",
|
721 | 721 | "\n",
|
|
733 | 733 | "source": [
|
734 | 734 | "### Solution by hand\n",
|
735 | 735 | "\n",
|
736 |
| - "We now implement theis analytic procesure in PyTensor:" |
| 736 | + "We now implement this analytic procedure in PyTensor:" |
737 | 737 | ]
|
738 | 738 | },
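For reference, a minimal sketch of what such an implementation could look like (illustrative only: the variable names and the values $a = 1$, $b = 2$ are hypothetical, and the notebook's actual cell may differ):

```python
# Sketch: the analytic logp of Z = exp(a + b*X), X ~ Normal(0, 1),
# built directly in PyTensor. Names and parameter values are illustrative.
import numpy as np
import pytensor
import pytensor.tensor as pt

a, b = 1.0, 2.0
z_vals = pt.dvector("z_vals")

x_back = (pt.log(z_vals) - a) / b                     # G^{-1}(z)
log_jac = -pt.log(np.abs(b) * z_vals)                 # log |d/dz G^{-1}(z)|
logp_x = -0.5 * x_back**2 - 0.5 * np.log(2 * np.pi)   # standard normal logpdf
logp_z = logp_x + log_jac

logp_fn = pytensor.function([z_vals], logp_z)
print(logp_fn(np.array([0.5, 1.5, 3.0])))
```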
|
739 | 739 | {
|
|
803 | 803 | "id": "bcd081d3",
|
804 | 804 | "metadata": {},
|
805 | 805 | "source": [
|
806 |
| - "We can verify these values are exaclty what we are expecting:" |
| 806 | + "We can verify these values are exactly what we are expecting:" |
807 | 807 | ]
|
808 | 808 | },
|
809 | 809 | {
|
|
859 | 859 | "id": "46834a6f",
|
860 | 860 | "metadata": {},
|
861 | 861 | "source": [
|
862 |
| - "As above let's verify taht the results are consistent and correct:" |
| 862 | + "As above let's verify that the results are consistent and correct:" |
863 | 863 | ]
|
864 | 864 | },
|
865 | 865 | {
|
|