Commit a2b24b6

Merge pull request #47 from probcomp/jls-back-in-sync
bring the `.ipynb`s and the Jupytext `.jl`s back into sync
2 parents 93d5942 + b980a35 commit a2b24b6

18 files changed: +80 −50 lines

tutorials/A Bottom-Up Introduction to Gen.ipynb

Lines changed: 2 additions & 2 deletions
@@ -1151,9 +1151,9 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Julia 1.0.2",
+"display_name": "Julia 1.1.1",
 "language": "julia",
-"name": "julia-1.0"
+"name": "julia-1.1"
 }
 },
 "nbformat": 4,
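The same kernelspec update recurs in every notebook below. Since an `.ipynb` file is plain JSON, the edit can be sketched as a small metadata rewrite. (The `update_kernelspec` helper below is hypothetical, for illustration only; it is not part of this commit, which synced the files via Jupytext.)

```python
import json

def update_kernelspec(nb: dict, display_name: str, name: str) -> dict:
    """Rewrite the kernelspec recorded in a notebook's metadata dict."""
    ks = nb.setdefault("metadata", {}).setdefault("kernelspec", {})
    ks["display_name"] = display_name
    ks["name"] = name
    return nb

# Minimal notebook skeleton carrying the old kernelspec, as in the diff above.
nb = {
    "metadata": {
        "kernelspec": {
            "display_name": "Julia 1.0.2",
            "language": "julia",
            "name": "julia-1.0",
        }
    },
    "nbformat": 4,
    "nbformat_minor": 2,
}

nb = update_kernelspec(nb, "Julia 1.1.1", "julia-1.1")
print(json.dumps(nb["metadata"]["kernelspec"], sort_keys=True))
```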

tutorials/A Bottom-Up Introduction to Gen.jl

Lines changed: 3 additions & 3 deletions
@@ -7,9 +7,9 @@
 # format_version: '1.5'
 # jupytext_version: 1.3.3
 # kernelspec:
-# display_name: Julia 1.0.2
+# display_name: Julia 1.1.1
 # language: julia
-# name: julia-1.0
+# name: julia-1.1
 # ---

 # # A Bottom-Up Introduction to Gen
@@ -48,7 +48,7 @@ end;
 # n = uniform_discrete(1, 10)
 # ```
 #
-# Then, with probability `p`, it multiplies by `n` by two:
+# Then, with probability `p`, it multiplies `n` by two:
 # ```julia
 # if bernoulli(p)
 # n *= 2

tutorials/Data-Driven Proposals in Gen.ipynb

Lines changed: 2 additions & 2 deletions
@@ -1492,9 +1492,9 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Julia 1.0.2",
+"display_name": "Julia 1.1.1",
 "language": "julia",
-"name": "julia-1.0"
+"name": "julia-1.1"
 }
 },
 "nbformat": 4,

tutorials/Data-Driven Proposals in Gen.jl

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@
 # format_version: '1.5'
 # jupytext_version: 1.3.3
 # kernelspec:
-# display_name: Julia 1.0.2
+# display_name: Julia 1.1.1
 # language: julia
-# name: julia-1.0
+# name: julia-1.1
 # ---

 # # Tutorial: Data-Driven Proposals in Gen

tutorials/Introduction to Modeling in Gen.ipynb

Lines changed: 8 additions & 2 deletions
@@ -1828,9 +1828,15 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Julia 1.0.2",
+"display_name": "Julia 1.1.1",
 "language": "julia",
-"name": "julia-1.0"
+"name": "julia-1.1"
+},
+"language_info": {
+"file_extension": ".jl",
+"mimetype": "application/julia",
+"name": "julia",
+"version": "1.1.1"
 }
 },
 "nbformat": 4,

tutorials/Introduction to Modeling in Gen.jl

Lines changed: 13 additions & 13 deletions
@@ -7,9 +7,9 @@
 # format_version: '1.5'
 # jupytext_version: 1.3.3
 # kernelspec:
-# display_name: Julia 1.0.2
+# display_name: Julia 1.1.1
 # language: julia
-# name: julia-1.0
+# name: julia-1.1
 # ---

 # # Tutorial: Introduction to Modeling in Gen
@@ -125,7 +125,7 @@ println(trace)

 Gen.get_args(trace)

-# The trace also contains the value of the random choices, stored in map from address to value called a *choice map*. This map is available through the API method [`get_choices`]():
+# The trace also contains the value of the random choices, stored in a map from address to value called a *choice map*. This map is available through the API method [`get_choices`]():

 println(Gen.get_choices(trace))

@@ -347,10 +347,10 @@ function predict_new_data(model, trace, new_xs::Vector{Float64}, param_addrs)
 end

 # Run the model with new x coordinates, and with parameters
-# fixed to be the inferred values
+# fixed to be the inferred values.
 (new_trace, _) = Gen.generate(model, (new_xs,), constraints)

-# Pull out the y-values and return them
+# Pull out the y-values and return them.
 ys = [new_trace[(:y, i)] for i=1:length(new_xs)]
 return ys
 end;
@@ -392,7 +392,7 @@ plot_predictions(xs, ys, new_xs, pred_ys)

 # The results look reasonable, both within the interval of observed data and in the extrapolated predictions on the right.

-# Now consider the same experiment run with following data set, which has significantly more noise.
+# Now consider the same experiment run with the following data set, which has significantly more noise.

 ys_noisy = [5.092, 4.781, 2.46815, 1.23047, 0.903318, 1.11819, 2.10808, 1.09198, 0.0203789, -2.05068, 2.66031];

@@ -415,7 +415,7 @@ plot_predictions(xs, ys_noisy, new_xs, pred_ys)
 return nothing
 end;

-# Then, we compare the predictions using inference the unmodified and modified model on the `ys` data set:
+# Then, we compare the predictions using inference of the unmodified and modified models on the `ys` data set:

 # +
 figure(figsize=(6,3))
@@ -433,7 +433,7 @@ plot_predictions(xs, ys, new_xs, pred_ys)

 # Notice that there is more uncertainty in the predictions made using the modified model.
 #
-# We also compare the predictions using inference the unmodified and modified model on the `ys_noisy` data set:
+# We also compare the predictions using inference of the unmodified and modified models on the `ys_noisy` data set:

 # +
 figure(figsize=(6,3))
@@ -454,7 +454,7 @@ plot_predictions(xs, ys_noisy, new_xs, pred_ys)
 # -------------------------
 # ### Exercise
 #
-# Write a modified version the sine model that makes noise into a random choice. Compare the predicted data with the observed data `infer_and_predict` and `plot_predictions` for the unmodified and modified model, and for the `ys_sine` and `ys_noisy` datasets. Discuss the results. Experiment with the amount of inference computation used. The amount of inference computation will need to be higher for the model with the noise random choice.
+# Write a modified version of the sine model that makes noise into a random choice. Compare the predicted data with the observed data using `infer_and_predict` and `plot_predictions` for the unmodified and modified models, and for the `ys_sine` and `ys_noisy` data sets. Discuss the results. Experiment with the amount of inference computation used. The amount of inference computation will need to be higher for the model with the noise as a random choice.
 #
 # We have provided you with starter code:

@@ -667,7 +667,7 @@ overlay(render_combined_refactored, traces)

 # ## 6. Modeling with an unbounded number of parameters <a name="infinite-space"></a>

-# Gen's built-in modeling language can be used to express models that use an unbounded number of parameters. This section walks you through development of a model of data that does not a-priori specify an upper bound on the complexity of the model, but instead infers the complexity of the model as well as the parameters. This is a simple example of a *Bayesian nonparametric* model.
+# Gen's built-in modeling language can be used to express models that use an unbounded number of parameters. This section walks you through development of a model of data that does not a priori specify an upper bound on the complexity of the model, but instead infers the complexity of the model as well as the parameters. This is a simple example of a *Bayesian nonparametric* model.

 # We will consider two data sets:

@@ -689,7 +689,7 @@ scatter(xs_dense, ys_complex, color="black", s=10)
 gca().set_ylim((-1, 3))
 # -

-# The data set on the left appears to be best explained as a contant function with some noise. The data set on the right appears to include two changepoints, with a constant function in between the changepoints. We want a model that does not a-priori choose the number of changepoints in the data. To do this, we will recursively partition the interval into regions. We define a Julia data structure that represents a binary tree of intervals; each leaf node represents a region in which the function is constant.
+# The data set on the left appears to be best explained as a contant function with some noise. The data set on the right appears to include two changepoints, with a constant function in between the changepoints. We want a model that does not a priori choose the number of changepoints in the data. To do this, we will recursively partition the interval into regions. We define a Julia data structure that represents a binary tree of intervals; each leaf node represents a region in which the function is constant.

 struct Interval
 l::Float64
@@ -755,7 +755,7 @@ grid(render_segments_trace, traces)

 # Because we only sub-divide an interval with 30% probability, most of these sampled traces have only one segment.

-# Now that we have generative function that generates a random piecewise-constant function, we write a model that adds noise to the resulting constant functions to generate a data set of y-coordinates. The noise level will be a random choice.
+# Now that we have a generative function that generates a random piecewise-constant function, we write a model that adds noise to the resulting constant functions to generate a data set of y-coordinates. The noise level will be a random choice.

 # +
 # get_value_at searches a binary tree for
@@ -774,7 +774,7 @@ function get_value_at(x::Float64, node::InternalNode)
 end
 end

-# Out full model
+# Our full model
 @gen function changepoint_model(xs::Vector{Float64})
 node = @trace(generate_segments(minimum(xs), maximum(xs)), :tree)
 noise = @trace(gamma(1, 1), :noise)

tutorials/Iterative inference in Gen.ipynb

Lines changed: 2 additions & 2 deletions
@@ -1201,9 +1201,9 @@
 "encoding": "# -*- coding: utf-8 -*-"
 },
 "kernelspec": {
-"display_name": "Julia 1.0.2",
+"display_name": "Julia 1.1.1",
 "language": "julia",
-"name": "julia-1.0"
+"name": "julia-1.1"
 }
 },
 "nbformat": 4,

tutorials/Iterative inference in Gen.jl

Lines changed: 2 additions & 2 deletions
@@ -8,9 +8,9 @@
 # format_version: '1.5'
 # jupytext_version: 1.3.3
 # kernelspec:
-# display_name: Julia 1.0.2
+# display_name: Julia 1.1.1
 # language: julia
-# name: julia-1.0
+# name: julia-1.1
 # ---

 # # Tutorial: Basics of Iterative Inference Programming in Gen

tutorials/Modeling with Black-Box Julia Code.ipynb

Lines changed: 8 additions & 2 deletions
@@ -680,9 +680,15 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Julia 1.0.2",
+"display_name": "Cora Julia 1.1.1",
 "language": "julia",
-"name": "julia-1.0"
+"name": "cora-julia-1.1"
+},
+"language_info": {
+"file_extension": ".jl",
+"mimetype": "application/julia",
+"name": "julia",
+"version": "1.1.1"
 }
 },
 "nbformat": 4,

tutorials/Modeling with Black-Box Julia Code.jl

Lines changed: 4 additions & 4 deletions
@@ -7,9 +7,9 @@
 # format_version: '1.5'
 # jupytext_version: 1.3.3
 # kernelspec:
-# display_name: Julia 1.0.2
+# display_name: Cora Julia 1.1.1
 # language: julia
-# name: julia-1.0
+# name: cora-julia-1.1
 # ---

 # # Modeling with black-box Julia code
@@ -88,7 +88,7 @@ info = Dict("scene" => scene)
 viz = Viz(viz_server, joinpath(@__DIR__, "../inverse-planning/overlay-viz/dist"), info)
 displayInNotebook(viz)

-# Next, we load a file that defines a `Path` data type (a sequence of `Points`), and a `plan_path` method, which uses a path planning algorithm based on rapidly exploring random tree (RRT, [1]) to find a sequence of `Point`s beginning with `start` and ending in `dest` such that the line segment between each consecutive pair of points does nt intersect any obstacles in the scene. The planning algorithm may fail to find a valid path, in which case it will return a value of type `Nothing`.
+# Next, we load a file that defines a `Path` data type (a sequence of `Points`), and a `plan_path` method, which uses a path planning algorithm based on rapidly exploring random tree (RRT, [1]) to find a sequence of `Point`s beginning with `start` and ending in `dest` such that the line segment between each consecutive pair of points does not intersect any obstacles in the scene. The planning algorithm may fail to find a valid path, in which case it will return a value of type `Nothing`.
 #
 # `path::Union{Path,Nothing} = plan_path(start::Point, dest::Point, scene::Scene, planner_params::PlannerParams)`
 #
@@ -217,7 +217,7 @@ end
 displayInNotebook(viz)
 # -

-# In this visualization, the start location is represented by a blue dot, and the destination is represented by a red dot. The measured coordinates at each time point are represented by black dots. The path, if path planning was succesfull, is shown as a gray line fro the start point to the destination point. Notice that the speed of the agent is different in each case.
+# In this visualization, the start location is represented by a blue dot, and the destination is represented by a red dot. The measured coordinates at each time point are represented by black dots. The path, if path planning was succesfull, is shown as a gray line from the start point to the destination point. Notice that the speed of the agent is different in each case.

 # ### Exercise
 #
