Commit 957ef5a

committed
2 parents 797641f + 982660e commit 957ef5a

6 files changed: +113 additions, -20 deletions

docs/make.jl

Lines changed: 1 addition & 0 deletions
@@ -115,6 +115,7 @@ makedocs(;
     pages = [
         "Home" => "index.md",
         "Installation" => "installation.md",
+        "Projects" => "projects.md",
         "1: Introduction" => lecture_01,
         "2: The power of Type System & multiple dispatch" => lecture_02,
         "3: Design patterns" => lecture_03,

docs/src/index.md

Lines changed: 1 addition & 3 deletions
@@ -59,9 +59,7 @@ Hint: The **first few homeworks are easier**. Use them to fill up your points.
 The final project will be individually agreed on for each student. Ideally you
 can use this project to solve a problem you have e.g. in your thesis, but don't
 worry - if you cannot come up with an own project idea, we will suggest one to
-you.
-
-The evaluation criteria of the final project are ...
+you. More info and project suggestions can be found [here](@ref projects).
 
 
 ## Grading

docs/src/lecture_06/hw.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ x + 2*y*z - c*x
 ```
 return an array of *unique alphabetically sorted symbols* representing variables in an expression.
 ```julia
-[:x, :y, :z, :c]
+[:c, :x, :y, :z]
 ```
 Implement this in a function called `find_variables`. Note that there may be some edge cases that you may have to handle in a special way, such as
 - variable assignments `r = x*x` should return the variable on the left as well (`r` in this case)
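The task in this hunk can be sketched as a small recursive walk over the `Expr` tree. This is a hedged sketch, not the course's reference solution; only the name `find_variables` comes from the homework text:

```julia
# Sketch: collect every Symbol in the AST, skipping function names in calls,
# then return them sorted and deduplicated.
function find_variables(ex)
    vars = Symbol[]
    walk!(e) = begin
        if e isa Symbol
            push!(vars, e)
        elseif e isa Expr
            # skip the function name in calls (e.g. `sin` in `sin(z)`)
            args = e.head == :call ? e.args[2:end] : e.args
            foreach(walk!, args)
        end
    end
    walk!(ex)
    sort(unique(vars))
end

find_variables(:(x + 2*y*z - c*x))  # [:c, :x, :y, :z]
```

Because assignments are walked like any other `Expr`, the left-hand side of `r = x*x` is collected as well, which covers the edge case mentioned above.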

docs/src/lecture_06/lab.md

Lines changed: 36 additions & 10 deletions
@@ -59,6 +59,7 @@ function implicit_len()
 end
 nothing #hide
 ```
+For now do not try to understand the details, but focus on the overall differences such as length of the code.
 
 !!! info "Redirecting `stdout`"
     If the output of the method introspection tools is too long you can use a general way of redirecting standard output `stdout` to a file
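The redirection mentioned in that admonition can be done roughly like this (a sketch; the file name `dump.txt` and the printed text are made up, and in the lab the body would be a verbose macro such as `@code_llvm`):

```julia
# Redirect stdout into a file while running something verbose,
# then read the captured output back.
open("dump.txt", "w") do io
    redirect_stdout(io) do
        println("imagine pages of introspection output here")
    end
end

print(read("dump.txt", String))
```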
@@ -179,9 +180,36 @@ function polynomial(a, x)
 end
 ```
 
+!!! info "Splatting/slurping operator `...`"
+    The operator `...` serves two purposes inside function calls [^3][^4]:
+    - combines multiple arguments into one
+    ```@repl lab06_splat
+    function printargs(args...)
+        println(typeof(args))
+        for (i, arg) in enumerate(args)
+            println("Arg #$i = $arg")
+        end
+    end
+    printargs(1, 2, 3)
+    ```
+    - splits one argument into many different arguments
+    ```@repl lab06_splat
+    function threeargs(a, b, c)
+        println("a = $a::$(typeof(a))")
+        println("b = $b::$(typeof(b))")
+        println("c = $c::$(typeof(c))")
+    end
+    threeargs([1,2,3]...) # or with a variable threeargs(x...)
+    ```
+
+[^3]: [https://docs.julialang.org/en/v1/manual/faq/#What-does-the-...-operator-do?](https://docs.julialang.org/en/v1/manual/faq/#What-does-the-...-operator-do?)
+[^4]: [https://docs.julialang.org/en/v1/manual/functions/#Varargs-Functions](https://docs.julialang.org/en/v1/manual/functions/#Varargs-Functions)
+
 **HINTS**:
-- define two methods `_polynomial(x, a...)` and `_polynomial(x, a)`
-- recall that these kind of optimization are run just after type inference
+- define two methods `_polynomial!(ac, x, a...)` and `_polynomial!(ac, x, a)` for the case of ≥2 coefficients and the last coefficient
+- use splatting together with range indexing `a[1:end-1]...`
+- the correctness can be checked using the built-in `evalpoly`
+- recall that these kinds of optimizations are possible just around the type inference stage
 - use container of known length to store the coefficients
 
 ```@raw html
@@ -200,8 +228,8 @@ a = Tuple(ones(Int, 21)) # everything less than 22 gets inlined
 x = 2
 polynomial(a,x) == evalpoly(x,a) # compare with built-in function
 
+# @code_llvm polynomial(a,x) # seen here too, but code_typed is a better option
 @code_lowered polynomial(a,x) # cannot be seen here as optimizations are not applied
 nothing #hide
-@code_llvm polynomial(a,x) # seen here too, but code_typed is a better option
 ```
 
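The recursion described by the new HINTS could be realized along these lines. This is a hedged sketch under the hint's naming (`_polynomial!` with an accumulator `ac`); the lab's reference solution may differ:

```julia
# Horner-style evaluation via recursive splatting:
# `ac` accumulates the partial result; each step peels off the last coefficient.
_polynomial!(ac, x, a) = muladd(x, ac, a)  # base case: single coefficient left
_polynomial!(ac, x, a...) = _polynomial!(muladd(x, ac, a[end]), x, a[1:end-1]...)
polynomial(a, x) = _polynomial!(a[end], x, a[1:end-1]...)

a = (1, 2, 3)  # represents 1 + 2x + 3x^2
x = 2
polynomial(a, x) == evalpoly(x, a)  # true, both give 17
```

With three or more arguments the varargs method recurses; with exactly one trailing coefficient the more specific two-coefficient method terminates the recursion, which is the dispatch split the hint asks for.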
@@ -216,7 +244,7 @@ nothing #hide
 ## AST manipulation: The first steps to metaprogramming
 Julia is so called homoiconic language, as it allows the language to reason about its code. This capability is inspired by years of development in other languages such as Lisp, Clojure or Prolog.
 
-There are two easy ways to extract/construct the code structure [^3]
+There are two easy ways to extract/construct the code structure [^5]
 - parsing code stored in string with internal `Meta.parse`
 ```@repl lab06_meta
 code_parse = Meta.parse("x = 2") # for single line expressions (additional spaces are ignored)
@@ -317,15 +345,14 @@ Given a function `replace_i`, which replaces variables `i` for `k` in an expression
 ex = :(i + i*i + y*i - sin(z))
 @test replace_i(ex) == :(k + k*k + y*k - sin(z))
 ```
-write function `sreplace_i(s)` which does the same thing but instead of a parsed expression (AST) it manipulates a string, such as
+write a different function `sreplace_i(s)`, which does the same thing but instead of a parsed expression (AST) it manipulates a string, such as
 ```@repl lab06_meta
 s = string(ex)
 ```
-Think of some corner cases, that the method may not handle properly.
-
 **HINTS**:
 - Use `Meta.parse` in combination with `replace_i` **ONLY** for checking of correctness.
-- You can use the `replace` function.
+- You can use the `replace` function in combination with regular expressions.
+- Think of some corner cases that the method may not handle properly.
 
 ```@raw html
 </div></div>
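The regex hint added in this hunk could be used roughly as follows. This is a hypothetical one-liner, not the exercise's reference solution; the word boundary `\b` is what keeps the `i` inside identifiers like `sin` intact:

```julia
# Replace standalone `i` with `k` at the string level; `\b` marks word
# boundaries, so identifiers merely containing `i` (like `sin`) are untouched.
sreplace_i(s) = replace(s, r"\bi\b" => "k")

s = string(:(i + i*i + y*i - sin(z)))
sreplace_i(s)  # "k + k * k + y * k - sin(z)"
```

A corner case such a version still misses is `i` appearing inside a string literal within `s`, which a purely textual replacement cannot distinguish from a variable.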
@@ -394,9 +421,8 @@ eval(ex)
 
 This kind of manipulation is at the core of some pkgs, such as aforementioned [`IntervalArithmetics.jl`](https://github.com/JuliaIntervals/IntervalArithmetic.jl) where every number is replaced with a narrow interval in order to find some bounds on the result of a computation.
 
-
-[^3]: Once you understand the recursive structure of expressions, the AST can be constructed manually like any other type.
 ---
+[^5]: Once you understand the recursive structure of expressions, the AST can be constructed manually like any other type.
 
 ## Resources
 - Julia's manual on [metaprogramming](https://docs.julialang.org/en/v1/manual/metaprogramming/)

docs/src/lecture_06/lecture.md

Lines changed: 5 additions & 6 deletions
@@ -153,7 +153,8 @@ We can see that
 - The expression `a + b` has been also replaced with its implementation in terms of the `add_int` intrinsic and its result type annotated as Int64.
 - And the return type of the entire function body has been annotated as `Int64`.
 - The phi-instruction `%2 = φ (#1 => 1, #3 => %6)` is a **selector function**, which returns the value depending on from which branch do you come from. In this case, variable `%2` will have value 1, if the control was transfered from block `#1` and it will have value copied from variable `%6` if the control was transferreed from block `3` [see also](https://llvm.org/docs/LangRef.html#phi-instruction). The `φ` stands from *phony* variable.
-When we have called `@code_lower`, the role of types of the argument was in selecting the approapriate function body, they are needed for multiple dispatch. Contrary in `@code_typed`, the types of parameters determine the choice if inner methods that needs to be called (again the multiple dispatch), which can trigger other optimization, such as inlining, which seen in `One(n)`.
+
+When we have called `@code_lower`, the role of types of arguments was in selecting - via multiple dispatch - the appropriate function body among different methods. In `@code_typed`, by contrast, the types of parameters determine the choice of inner methods that need to be called (again with multiple dispatch). This process can trigger other optimizations, such as inlining, as seen in the case of `one(n)` being replaced with `1` directly, though here this replacement is hidden in the `φ` function.
 
 Note that the same view of the code is offered by the `@code_warntype` macro, which we have seen in the previous [lecture](@ref perf_lecture). The main difference from `@code_typed` is that it highlights type instabilities with red color and shows only unoptimized view of the code. You can view the unoptimized code with a keyword argument `optimize=false`:
 ```julia
@@ -247,7 +248,7 @@ and the output is used mainly for debugging / inspection.
 Language introspection is very convenient for investigating, how things are implemented and how they are optimized / compiled to the native code.
 
 !!! note "Reminder `@which`"
-    Though we have already used it quite a few times, recall the very useful macro `@which`, which identifies the concrete function called in the function call. For example `@which mapreduce(sin, +, [1,2,3,4])`. Note again that the macro here is a convenience macro to obtain types of arguments from the expression. Under the hood, it calls `InteractiveUtils.which(function_name, (Base.typesof)(args...))`. Funnily enough, you can call `@which InteractiveUtils.which(+, (Base.typesof)(1,1))` to inspect, where `which` is defined.
+    Though we have already used it quite a few times, recall the very useful macro `@which`, which identifies the concrete function called in a function call. For example `@which mapreduce(sin, +, [1,2,3,4])`. Note again that the macro here is a convenience macro to obtain types of arguments from the expression. Under the hood, it calls `InteractiveUtils.which(function_name, (Base.typesof)(args...))`. Funnily enough, you can call `@which InteractiveUtils.which(+, (Base.typesof)(1,1))` to inspect, where `which` is defined.
 
 ### Broadcasting
 Broadcasting is not a unique concept in programming languages (Python/Numpy, MATLAB), however its implementation in Julia allows to easily fuse operations. For example
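The fusion that context line refers to can be illustrated with a small sketch (an illustrative example, not part of the commit):

```julia
x = [0.0, 1.0, 2.0]

# The `@.` macro turns every call into its dotted form, so the whole
# right-hand side fuses into a single loop with no intermediate arrays.
y = @. 2x + sin(x)

y ≈ 2 .* x .+ sin.(x)  # true; the fused and manually dotted forms agree
```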
@@ -349,7 +350,7 @@ The type returned by the quotation depends on what is quoted. Observe the return
 ```julia
 :(1) |> typeof
 :(:x) |> typeof
-:(1 + x) |> typeof 
+:(1 + x) |> typeof
 quote
     1 + x
     x + 1
@@ -428,9 +429,7 @@ The parsed code `p` is of type `Expr`, which according to Julia's help[^2] is *a
 
 [^3]: An example provided by Stefan Karpinski [https://stackoverflow.com/questions/23480722/what-is-a-symbol-in-julia](https://stackoverflow.com/questions/23480722/what-is-a-symbol-in-julia)
 
-!!! info
-### Expressions
-
+!!! info "`Expr`essions"
     From Julia's help[^2]:
 
     `Expr(head::Symbol, args...)`

docs/src/projects.md

Lines changed: 69 additions & 0 deletions
@@ -0,0 +1,69 @@
+# [Projects](@id projects)
+
+We want you to use Julia for something that is actually useful for you.
+Therefore you can choose your final, graded project very freely.
+We will discuss your ideas with you individually and come up with a sufficiently
+extensive project together.
+
+For your inspiration of what such a project could look like we have four
+suggestions for you (which you can of course choose to work on as well).
+
+## The Equation Learner And Its Symbolic Representation
+
+In many scientific and engineering applications one searches for interpretable (i.e.
+human-understandable) models instead of the black-box function approximators
+that neural networks provide.
+The [*equation learner*](http://proceedings.mlr.press/v80/sahoo18a.html) (EQL)
+is one approach that can identify concise equations that describe a given
+dataset.
+
+The EQL is essentially a neural network with different unary or binary
+activation functions at each individual unit. The network weights are
+regularized during training to obtain a sparse model, which hopefully results in
+a model that represents a simple equation.
+
+The goal of this project is to implement the EQL, and if there is enough time
+the [*improved equation learner*](https://arxiv.org/abs/2105.06331) (iEQL).
+The equation learners should be tested on a few toy problems (possibly inspired
+by the tasks in the papers). Finally, you will implement functionality that
+can transform the learned model into a symbolic, human readable, and executable
+Julia expression.
+
+## An Evolutionary Algorithm Applied To Julia's AST
+
+Most of the approaches to equation learning have to be differentiable by default
+in order to use the traditional machinery of stochastic gradient descent with
+backpropagation. This often leads to equations with too many terms, requiring
+special techniques for enforcing sparsity for terms with low weights.
+
+In Julia we can however use a different learning paradigm of evolutionary
+algorithms, which can work on a discrete set of expressions. The goal is to
+write mutation and recombination - the basic operators of a genetic algorithm -
+but applied on top of the Julia AST.
+
+Data: AI Feynman [database](https://space.mit.edu/home/tegmark/aifeynman.html) on symbolic regression (from [article](https://arxiv.org/pdf/1905.11481.pdf)/[code](https://github.com/SJ001/AI-Feynman))
+Inspiration:
+- Logic Guided Genetic Algorithms [article](https://arxiv.org/pdf/2010.11328.pdf)/[code](https://github.com/DhananjayAshok/LGGA)
+- AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity [article](https://arxiv.org/pdf/2006.10782)
+- Genetic Programming for Julia: fast performance and parallel island model implementation [report](http://courses.csail.mit.edu/18.337/2015/projects/MorganFrank/projectReport.pdf)
+
+## Distributed Optimization Package
+
+One-click distributed optimization is at the heart of other machine learning
+and optimization libraries such as pytorch, however some equivalents are
+missing in Julia's Flux ecosystem. The goal of this project is to explore,
+implement and compare at least two state-of-the-art methods of distributed
+gradient descent on data that will be provided for you.
+
+Some of the work has already been done in this area by one of our former students,
+see [link](https://dspace.cvut.cz/handle/10467/97057).
+
+## A Rule Learning Algorithm
+
+[Rule-based models](https://christophm.github.io/interpretable-ml-book/rules.html)
+are simple and very interpretable models that have been around for a long time
+and are gaining popularity again.
+The goal of this project is to implement a
+[sequential covering](https://christophm.github.io/interpretable-ml-book/rules.html#sequential-covering)
+algorithm called [`RIPPER`](http://www.cs.utsa.edu/~bylander/cs6243/cohen95ripper.pdf)
+and evaluate it on a number of datasets.
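To make the genetic-operator idea of the evolutionary-algorithm project concrete, a toy mutation over a Julia `Expr` could look like this. The operator (swapping `+` for `*`) and the probability `p` are made up for illustration; a real project would define its own operator set:

```julia
# Toy mutation: walk the AST and, with probability `p`, swap a `+` call for `*`.
function mutate(ex; p = 0.5)
    ex isa Expr || return ex            # leaves (symbols, literals) are kept
    args = map(a -> mutate(a; p = p), ex.args)
    if ex.head == :call && args[1] == :+ && rand() < p
        args = [:*; args[2:end]]        # hypothetical operator swap
    end
    Expr(ex.head, args...)
end

ex = :(x + y * z)
mutate(ex; p = 1.0)  # == :(x * (y * z))
```

Recombination would work analogously, exchanging whole subtrees between two parent expressions.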
