Installation
------------

Installation should be straightforward from `PyPI <https://pypi.org/>`_:

.. code-block:: bash

    $ pip install chaospy

Example Usage
-------------

``chaospy`` is created to work well inside the numerical Python ecosystem. You
therefore typically need to import `NumPy <https://numpy.org/>`_ alongside
``chaospy``:

.. code-block:: python

    >>> import numpy
    >>> import chaospy

``chaospy`` is problem agnostic, so you can use your own code in any way you
see fit. The only requirement is that the output is compatible with the
``numpy.ndarray`` format:

.. code-block:: python

    >>> coordinates = numpy.linspace(0, 10, 100)
    >>> def forward_solver(coordinates, parameters):
    ...     """Function to do uncertainty quantification on."""
    ...     param_init, param_rate = parameters
    ...     return param_init*numpy.e**(-param_rate*coordinates)

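The solver above is an ordinary vectorized NumPy function with no ``chaospy``
dependency. As a quick sanity check (a hypothetical snippet, not part of the
library), you can evaluate it for one fixed parameter pair and confirm it
returns one value per coordinate:

```python
import numpy

coordinates = numpy.linspace(0, 10, 100)

def forward_solver(coordinates, parameters):
    """Exponential model evaluated on all coordinates at once."""
    param_init, param_rate = parameters
    return param_init*numpy.e**(-param_rate*coordinates)

# Evaluate with a fixed (initial value, rate) pair.
out = forward_solver(coordinates, (1.5, 0.15))

print(out.shape)         # (100,): one value per coordinate
print(round(out[0], 8))  # 1.5: at t=0 the model returns param_init
```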
Here we assume that ``parameters`` contains aleatory variability with known
probability. In ``chaospy``, we formalize this probability as a joint
probability distribution. For example:

.. code-block:: python

    >>> distribution = chaospy.J(
    ...     chaospy.Uniform(1, 2), chaospy.Normal(0, 2))
    >>> print(distribution)
    J(Uniform(lower=1, upper=2), Normal(mu=0, sigma=2))

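Since ``chaospy.J`` joins independent components here, the joint density is
simply the product of the two marginal densities. A hand-rolled numerical
illustration of that fact, using only NumPy (the helper functions are our own,
not part of ``chaospy``):

```python
import numpy

def uniform_pdf(x, lower=1.0, upper=2.0):
    """Density of Uniform(1, 2): constant inside the support, zero outside."""
    return numpy.where((x >= lower) & (x <= upper), 1.0/(upper - lower), 0.0)

def normal_pdf(x, mu=0.0, sigma=2.0):
    """Density of Normal(0, 2), written out explicitly."""
    return numpy.exp(-0.5*((x - mu)/sigma)**2)/(sigma*numpy.sqrt(2*numpy.pi))

# Independence: the joint density factors into the product of the marginals.
joint = float(uniform_pdf(1.5)*normal_pdf(0.0))
print(round(joint, 8))  # 0.19947114
```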
Most probability distributions have an associated expansion of orthogonal
polynomials, which can be constructed automatically:

.. code-block:: python

    >>> expansion = chaospy.generate_expansion(8, distribution)
    >>> print(expansion[:5].round(8))
    [1.0 q1 q0-1.5 q0*q1-1.5*q1 q0**2-3.0*q0+2.16666667]

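The expansion terms are orthogonal with respect to the distribution. For
instance, the univariate term ``q0**2-3.0*q0+2.16666667`` should integrate to
(approximately) zero against the Uniform(1, 2) density. A trapezoid-rule check
with plain NumPy, assuming only that coefficient readout:

```python
import numpy

# Grid over the support of Uniform(1, 2); the density there is constant 1.
q0 = numpy.linspace(1, 2, 100001)
values = q0**2 - 3.0*q0 + 2.16666667

# Trapezoid rule for E[P(q0)] under Uniform(1, 2); orthogonality to the
# constant polynomial 1 means this expectation should vanish.
expectation = float(((values[:-1] + values[1:])/2).mean())
print(round(expectation, 6))  # 0.0
```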
Here the polynomials are defined positionally, such that ``q0`` and ``q1``
refer to the uniform and the normal distribution respectively.

The distribution can also be used to create (pseudo-)random samples and
low-discrepancy sequences. For example, to create Sobol sequence samples:

.. code-block:: python

    >>> samples = distribution.sample(1000, rule="sobol")
    >>> print(samples[:, :4].round(8))
    [[ 1.5         1.75        1.25        1.375     ]
     [ 0.         -1.3489795   1.3489795  -0.63727873]]

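For comparison, plain pseudo-random Monte Carlo samples from the same joint
distribution can be drawn with NumPy alone, using the same rows-as-variables
layout; low-discrepancy rules such as Sobol usually cover the sample space more
evenly than this (the seed and names here are illustrative):

```python
import numpy

rng = numpy.random.default_rng(1234)

# One row per variable, matching the layout of distribution.sample(...):
# row 0 ~ Uniform(1, 2), row 1 ~ Normal(0, 2).
mc_samples = numpy.vstack([
    rng.uniform(1, 2, size=1000),
    rng.normal(0, 2, size=1000),
])

print(mc_samples.shape)  # (2, 1000)
```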
We can evaluate the forward solver using these samples:

.. code-block:: python

    >>> evaluations = numpy.array([
    ...     forward_solver(coordinates, sample) for sample in samples.T])
    >>> print(evaluations[:3, :5].round(8))
    [[1.5        1.5        1.5        1.5        1.5       ]
     [1.75       2.00546578 2.29822457 2.63372042 3.0181921 ]
     [1.25       1.09076905 0.95182169 0.83057411 0.72477163]]

Having all these components in place, we can perform point collocation. In
other words, we can create a polynomial approximation of ``forward_solver``:

.. code-block:: python

    >>> approx_solver = chaospy.fit_regression(
    ...     expansion, samples, evaluations)
    >>> print(approx_solver[:2].round(4))
    [q0 -0.0002*q0*q1**3+0.0051*q0*q1**2-0.101*q0*q1+q0]

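In its simplest form, point collocation is a linear least-squares problem:
evaluate each expansion term at every sample and regress the model evaluations
onto that design matrix. A stripped-down one-variable sketch of the idea using
a plain monomial basis (an illustration of the principle, not ``chaospy``'s
actual implementation):

```python
import numpy

# Toy model: y = 2*x + 1 evaluated at 20 sample points.
samples = numpy.linspace(-1, 1, 20)
evaluations = 2*samples + 1

# Design matrix with basis polynomials 1, x, x**2 as columns.
design = numpy.vander(samples, N=3, increasing=True)

# Least-squares fit yields the expansion coefficients.
coeffs, *_ = numpy.linalg.lstsq(design, evaluations, rcond=None)
print(coeffs.round(8))  # recovers approximately [1, 2, 0]
```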
Since the model approximations are polynomials, we can do inference on them
directly. For example:

.. code-block:: python

    >>> expected = chaospy.E(approx_solver, distribution)
    >>> print(expected[:5].round(8))
    [1.5        1.53092356 1.62757217 1.80240142 2.07915608]
    >>> deviation = chaospy.Std(approx_solver, distribution)
    >>> print(deviation[:5].round(8))
    [0.28867513 0.43364958 0.76501802 1.27106355 2.07110879]

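These closed-form statistics can be cross-checked by brute force: draw many
parameter samples, run the solver, and take empirical moments. A Monte Carlo
sketch with NumPy alone (the solver is repeated here so the snippet is
self-contained; the seed is arbitrary):

```python
import numpy

rng = numpy.random.default_rng(42)
coordinates = numpy.linspace(0, 10, 100)

def forward_solver(coordinates, parameters):
    """Same exponential model as above."""
    param_init, param_rate = parameters
    return param_init*numpy.e**(-param_rate*coordinates)

# Draw 10000 (Uniform(1, 2), Normal(0, 2)) parameter pairs and evaluate.
params = numpy.vstack([rng.uniform(1, 2, 10000), rng.normal(0, 2, 10000)])
evaluations = numpy.array([
    forward_solver(coordinates, sample) for sample in params.T])

# Empirical mean and standard deviation per coordinate.
expected_mc = evaluations.mean(axis=0)
deviation_mc = evaluations.std(axis=0)
print(round(float(expected_mc[0]), 1))  # 1.5: at t=0 the mean is E[param_init]
```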
For more extensive guides on this approach and others, see the `tutorial
collection`_.

.. _tutorial collection: https://chaospy.readthedocs.io/en/master/tutorials