
Commit f20ed05

Merge pull request #2790 from darcymeyer/master
Fixing some minor typos in the docs
2 parents 6bc6756 + 8fecc73 commit f20ed05

5 files changed: +10 -10 lines changed


docs/source/advanced_theano.rst

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@ Using shared variables
 
 Shared variables allow us to use values in theano functions that are
 not considered an input to the function, but can still be changed
-later. They are very similar to global variables in may ways.::
+later. They are very similar to global variables in may ways::
 
     a = tt.scalar('a')
     # Create a new shared variable with initial value of 0.1
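A minimal sketch of the pattern this hunk's text describes, assuming nothing beyond Theano itself; the values and variable names below are illustrative, not taken from the docs:

    import theano
    import theano.tensor as tt

    a = tt.scalar('a')
    # A shared variable behaves like a global: the compiled function can read
    # it without receiving it as an explicit input.
    b = theano.shared(0.1)
    func = theano.function([a], a * b)

    print(func(2.0))    # uses b = 0.1
    b.set_value(5.0)    # the value can still be changed later
    print(func(2.0))    # now uses b = 5.0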
@@ -23,7 +23,7 @@ their shape as long as the number of dimensions stays the same.
 
 We can use shared variables in PyMC3 to fit the same model to several
 datasets without the need to recreate the model each time (which can
-be time consuming if the number of datasets is large).::
+be time consuming if the number of datasets is large)::
 
     # We generate 10 datasets
     true_mu = [np.random.randn() for _ in range(10)]
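A hedged sketch of how that refitting loop might look; the observation model and dataset sizes are assumptions made for illustration:

    import numpy as np
    import theano
    import pymc3 as pm

    # We generate 10 datasets
    true_mu = [np.random.randn() for _ in range(10)]
    observed_data = [mu + np.random.randn(20) for mu in true_mu]

    data = theano.shared(observed_data[0])
    with pm.Model() as model:
        mu = pm.Normal('mu', 0, 10)
        pm.Normal('y', mu=mu, sd=1, observed=data)

    traces = []
    for data_vals in observed_data:
        # Switch out the observed data; the model graph is reused as-is
        data.set_value(data_vals)
        with model:
            traces.append(pm.sample())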

docs/source/gp.rst

Lines changed: 2 additions & 2 deletions
@@ -243,8 +243,8 @@ also need to include the additional arguments, :code:`X`, :code:`y`, and
 
 This second block produces the conditional distributions. Notice that extra
 arguments are required for conditionals of :math:`f1` and :math:`f2`, but not
-:math:`f`. This is because those arguments are cached when calling
-:code:`.marginal_likelihood` was called on :code:`gp`.
+:math:`f`. This is because those arguments are cached when
+:code:`.marginal_likelihood` is called on :code:`gp`.
 
 .. note::
     When constructing conditionals, the additional arguments :code:`X`, :code:`y`,
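A rough sketch of the caching this sentence refers to, using a single :code:`gp.Marginal` object; the kernel, data, and noise level are assumed for illustration:

    import numpy as np
    import pymc3 as pm

    X = np.linspace(0, 10, 50)[:, None]
    y = np.sin(X).ravel() + 0.1 * np.random.randn(50)

    with pm.Model() as model:
        cov = pm.gp.cov.ExpQuad(1, ls=1.0)
        gp = pm.gp.Marginal(cov_func=cov)
        # X, y and the noise level are cached on `gp` here ...
        f = gp.marginal_likelihood('f', X=X, y=y, noise=0.1)
        # ... so the conditional over new inputs does not need them again
        Xnew = np.linspace(0, 15, 100)[:, None]
        f_star = gp.conditional('f_star', Xnew)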

docs/source/intro.rst

Lines changed: 1 addition & 1 deletion
@@ -69,7 +69,7 @@ PyMC began development in 2003, as an effort to generalize the process of
 building Metropolis-Hastings samplers, with an aim to making Markov chain Monte
 Carlo (MCMC) more accessible to applied scientists.
 The choice to develop PyMC as a python module, rather than a standalone
-application, allowed the use MCMC methods in a larger modeling framework. By
+application, allowed the use of MCMC methods in a larger modeling framework. By
 2005, PyMC was reliable enough for version 1.0 to be released to the public. A
 small group of regular users, most associated with the University of Georgia,
 provided much of the feedback necessary for the refinement of PyMC to a usable

docs/source/prob_dists.rst

Lines changed: 2 additions & 2 deletions
@@ -22,7 +22,7 @@ A variable requires at least a ``name`` argument, and zero or more model paramet
 
     p = pm.Beta('p', 1, 1, shape=(3, 3))
 
-Probability distributions are all subclasses of ``Distribution``, which in turn has two major subclasses: ``Discrete`` and ``Continuous``. In terms of data types, a ``Continuous`` random variable is given whichever floating point type defined by ``theano.config.floatX``, while ``Discrete`` variables are given ``int16`` types when ``theano.config.floatX`` is ``float32``, and ``int64`` otherwise.
+Probability distributions are all subclasses of ``Distribution``, which in turn has two major subclasses: ``Discrete`` and ``Continuous``. In terms of data types, a ``Continuous`` random variable is given whichever floating point type is defined by ``theano.config.floatX``, while ``Discrete`` variables are given ``int16`` types when ``theano.config.floatX`` is ``float32``, and ``int64`` otherwise.
 
 All distributions in ``pm.distributions`` will have two important methods: ``random()`` and ``logp()`` with the following signatures:
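A small sketch of those two methods in use, via ``Normal.dist`` outside of a model context; the specific numbers are only for illustration:

    import pymc3 as pm

    d = pm.Normal.dist(mu=0, sd=1)
    print(d.dtype)               # follows theano.config.floatX, e.g. 'float64'
    print(d.random(size=3))      # three draws from the distribution
    print(d.logp(0.5).eval())    # log-density at 0.5 (a theano expression)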

@@ -128,4 +128,4 @@ The original variable is simply treated as a deterministic variable, since the v
     >>> model.deterministics
     [g]
 
-By default, auto-transformed variables are ignored when summarizing and plotting model output.
+By default, auto-transformed variables are ignored when summarizing and plotting model output.
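As a hedged illustration of the auto-transformation being discussed (the choice of a Gamma prior here is an assumption, not from the docs):

    import pymc3 as pm

    with pm.Model() as model:
        # A positive-only prior is automatically log-transformed for sampling
        g = pm.Gamma('g', alpha=1, beta=1)

    print(model.deterministics)   # [g] -- the original, untransformed variable
    print(model.free_RVs)         # [g_log__] -- the variable actually sampled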

docs/source/theano.rst

Lines changed: 3 additions & 3 deletions
@@ -38,7 +38,7 @@ is similar to eg SymPy's `Symbol`)::
     y = tt.ivector('y')
 
 Next, we use those variables to build up a symbolic representation
-of the output of our function. Note, that no computation is actually
+of the output of our function. Note that no computation is actually
 being done at this point. We only record what operations we need to
 do to compute the output::
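For context, a sketch of what such a symbolic graph might look like; the declarations of `a` and `x` and the expression itself are assumptions, only `y = tt.ivector('y')` appears in the hunk above:

    import theano.tensor as tt

    a = tt.scalar('a')
    x = tt.vector('x')
    y = tt.ivector('y')

    # Only a symbolic expression graph is recorded here; no numbers are
    # computed until the graph is compiled and called.
    out = a * x ** 2 + y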

@@ -57,7 +57,7 @@ do to compute the output::
 we are working with symbolic input instead of plain arrays.
 
 Now we can tell Theano to build a function that does this computation.
-With a typical configuration Theano generates C code, compiles it,
+With a typical configuration, Theano generates C code, compiles it,
 and creates a python function which wraps the C function::
 
     func = theano.function([a, x, y], [out])
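Continuing the assumed `out = a * x ** 2 + y` graph from the sketch above, the compiled wrapper is then called with plain NumPy values (under Theano's default float64 configuration):

    import numpy as np

    result, = func(2.0, np.array([1.0, 2.0]), np.array([3, 4], dtype='int32'))
    print(result)   # array([  5.,  12.])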
@@ -79,7 +79,7 @@ in `theano.sparse`. For a detailed overview of available operations,
 see `the theano api docs <http://deeplearning.net/software/theano/library/tensor/index.html>`_.
 
 A notable exception where theano variables do *not* behave like
-NumPy arrays are operations involving conditional execution:
+NumPy arrays are operations involving conditional execution.
 
 Code like this won't work as expected::
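A sketch of the kind of conditional being pointed at and its symbolic replacement; the exact example in the docs may differ:

    import theano.tensor as tt

    x = tt.scalar('x')

    # A plain Python `if x > 0:` needs a concrete truth value, which a
    # symbolic variable cannot provide, so the branch cannot depend on the
    # runtime value of x. Build the conditional into the graph instead:
    y = tt.switch(x > 0, x ** 2, -x)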
