- define two methods `_polynomial!(ac, x, a...)` and `_polynomial!(ac, x, a)` for the case of ≥ 2 coefficients and for the last coefficient (see the sketch after these hints)
- use splatting together with range indexing `a[1:end-1]...`
- the correctness can be checked using the built-in `evalpoly`
- recall that this kind of optimization is possible just around the type inference stage
- use a container of known length to store the coefficients
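A minimal sketch of the recursive approach suggested by the hints is shown below. The names and the Horner-style evaluation order are illustrative assumptions rather than the reference solution, and a tuple with a single coefficient would need one extra method, which is omitted here.

```julia
# recursion over a tuple of coefficients (a[1] is the constant term, as in `evalpoly`)
_polynomial!(ac, x, a...) = _polynomial!(x * ac + a[end], x, a[1:end-1]...)  # ≥ 2 coefficients left
_polynomial!(ac, x, a) = x * ac + a                                          # last coefficient
polynomial(a, x) = _polynomial!(a[end], x, a[1:end-1]...)
```

The block below then checks such an implementation against the built-in `evalpoly`.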
```@repl lab06_meta
a = Tuple(ones(Int, 21)) # everything less than 22 gets inlined
x = 2
polynomial(a,x) == evalpoly(x,a) # compare with built-in function
# @code_llvm polynomial(a,x) # seen here too, but code_typed is a better option
@code_lowered polynomial(a,x) # cannot be seen here as optimizations are not applied
nothing #hide
```
## AST manipulation: The first steps to metaprogramming
Julia is a so-called homoiconic language: it allows the language to reason about its own code. This capability is inspired by years of development in other languages such as Lisp, Clojure or Prolog.
There are two easy ways to extract/construct the code structure [^5]
- parsing code stored in a string with the internal `Meta.parse`
```@repl lab06_meta
222
250
code_parse = Meta.parse("x = 2") # for single line expressions (additional spaces are ignored)
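# the same expression can also be constructed manually, since `Expr` is an ordinary type
# (see footnote [^5]); `code_manual` is an illustrative name, not one used in the lab
code_manual = Expr(:(=), :x, 2)
code_manual == code_parse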
```

Given a function `replace_i`, which replaces variables `i` for `k` in an expression, such that

```julia
ex = :(i + i*i + y*i - sin(z))
@test replace_i(ex) == :(k + k*k + y*k - sin(z))
```
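One possible implementation of such a function, shown here only to make the task concrete (the names and the exact recursion are our own sketch, not the lab's reference code), walks the AST recursively:

```julia
replace_i(s::Symbol) = s == :i ? :k : s                        # rename the variable itself
replace_i(e::Expr) = Expr(e.head, map(replace_i, e.args)...)   # recurse into sub-expressions
replace_i(x) = x                                               # literals and everything else stay unchanged
```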
Write a different function `sreplace_i(s)`, which does the same thing, but instead of a parsed expression (AST) it manipulates a string, such as
```@repl lab06_meta
s = string(ex)
```
**HINTS**:
- Use `Meta.parse` in combination with `replace_i` **ONLY** for checking correctness.
- You can use the `replace` function in combination with regular expressions (see the sketch below).
- Think of some corner cases that the method may not handle properly.
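As a starting point, here is a minimal sketch that only handles the simple case of `i` appearing as a stand-alone identifier (word boundaries); corner cases such as `i` inside string literals are deliberately ignored:

```julia
# `sreplace_i` works purely on the string representation of the expression
sreplace_i(s) = replace(s, r"\bi\b" => "k")

s = string(:(i + i*i + y*i - sin(z)))
sreplace_i(s)   # "k + k * k + y * k - sin(z)"
```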
```@raw html
</div></div>
```
This kind of manipulation is at the core of some packages, such as the aforementioned [`IntervalArithmetic.jl`](https://github.com/JuliaIntervals/IntervalArithmetic.jl), where every number is replaced with a narrow interval in order to find bounds on the result of a computation.
---
[^5]: Once you understand the recursive structure of expressions, the AST can be constructed manually like any other type.
## Resources
- Julia's manual on [metaprogramming](https://docs.julialang.org/en/v1/manual/metaprogramming/)
---

docs/src/lecture_06/lecture.md
We can see that
- The expression `a + b` has also been replaced with its implementation in terms of the `add_int` intrinsic and its result type annotated as `Int64`.
- And the return type of the entire function body has been annotated as `Int64`.
- The phi-instruction `%2 = φ (#1 => 1, #3 => %6)` is a **selector function**, which returns a value depending on the branch from which control arrived. In this case, variable `%2` will have the value 1 if control was transferred from block `#1`, and it will have the value copied from variable `%6` if control was transferred from block `#3` [see also](https://llvm.org/docs/LangRef.html#phi-instruction). The `φ` stands for a *phony* variable.
When we called `@code_lowered`, the role of the types of the arguments was to select, via multiple dispatch, the appropriate function body among the different methods. In contrast, in `@code_typed` the types of the parameters determine the choice of the inner methods that need to be called (again through multiple dispatch). This process can trigger further optimizations, such as inlining, as seen in the case of `one(n)` being replaced with `1` directly, though here this replacement is hidden in the `φ` function.
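For a smaller illustration of the same effect, consider the following function (our own example, not the one from the lecture):

```julia
g(n) = one(n) + n   # `one(n)` can be resolved at compile time for a concrete `n::Int`
@code_typed g(1)    # the call to `one` is gone; only an integer addition with the constant 1 remains
```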
Note that the same view of the code is offered by the `@code_warntype` macro, which we have seen in the previous [lecture](@ref perf_lecture). The main difference from `@code_typed` is that it highlights type instabilities in red and shows only the unoptimized view of the code. You can view the unoptimized code by passing the keyword argument `optimize=false` (e.g. to `@code_typed`):
```julia
# sketch: `f` and `x` stand for any function and argument defined earlier
@code_typed optimize=false f(x)
```
Language introspection is very convenient for investigating how things are implemented and how they are optimized and compiled to native code.
!!! note "Reminder `@which`"
    Though we have already used it quite a few times, recall the very useful macro `@which`, which identifies the concrete method called in a function call, for example `@which mapreduce(sin, +, [1,2,3,4])`. Note again that the macro here is a convenience macro that obtains the types of the arguments from the expression. Under the hood, it calls `InteractiveUtils.which(function_name, (Base.typesof)(args...))`. Funnily enough, you can call `@which InteractiveUtils.which(+, (Base.typesof)(1,1))` to inspect where `which` itself is defined.
### Broadcasting
Broadcasting is not a concept unique to Julia (it exists in Python/NumPy and MATLAB as well); however, its implementation in Julia makes it easy to fuse operations. For example:
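A minimal sketch of such fusion (assuming only a vector `x`; the names are ours):

```julia
x = rand(10^6)
# the dotted operations below fuse into a single loop over `x`,
# producing one output array and no intermediate temporaries
y = @. 3x^2 + 2x + 1
```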
The type returned by the quotation depends on what is quoted. Observe the return types below.
```julia
:(1) |> typeof
:(:x) |> typeof
:(1 + x) |> typeof
quote
    1 + x
    x + 1
end |> typeof
```
[^3]: An example provided by Stefan Karpinski [https://stackoverflow.com/questions/23480722/what-is-a-symbol-in-julia](https://stackoverflow.com/questions/23480722/what-is-a-symbol-in-julia)
---

We want you to use Julia for something that is actually useful for you.
Therefore you can choose your final, graded project very freely.
We will discuss your ideas with you individually and come up with a sufficiently
extensive project together.

For your inspiration of what such a project could look like, we have four
suggestions for you (which you can of course choose to work on as well).

## The Equation Learner And Its Symbolic Representation

In many scientific and engineering disciplines one searches for interpretable (i.e.
human-understandable) models instead of the black-box function approximators
that neural networks provide.
The [*equation learner*](http://proceedings.mlr.press/v80/sahoo18a.html) (EQL)
is one approach that can identify concise equations that describe a given
dataset.

The EQL is essentially a neural network with different unary or binary
activation functions at each individual unit. The network weights are
regularized during training to obtain a sparse model, which hopefully results in
a model that represents a simple equation.
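
To make the idea a bit more concrete, here is a rough, framework-free sketch of a single EQL-style layer; the particular unit functions and sizes are our own illustrative choices and not those used in the papers:

```julia
# one EQL-style layer: a linear map followed by fixed unary units and one product unit
function eql_layer(W, b, x)
    z = W * x .+ b                          # four pre-activations in this toy setup
    unary   = [z[1], sin(z[2]), cos(z[3])]  # identity, sine and cosine units
    product = z[4] * z[1]                   # a binary (multiplication) unit
    return vcat(unary, product)
end

W, b, x = randn(4, 2), zeros(4), randn(2)
eql_layer(W, b, x)   # 4-element output of one layer
```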
The goal of this project is to implement the EQL and, if there is enough time,
the [*improved equation learner*](https://arxiv.org/abs/2105.06331) (iEQL).
The equation learners should be tested on a few toy problems (possibly inspired
by the tasks in the papers). Finally, you will implement functionality that
can transform the learned model into a symbolic, human-readable, and executable
Julia expression.

## An Evolutionary Algorithm Applied To Julia's AST

Most of the approaches to equation learning have to be differentiable by default
in order to use the traditional machinery of stochastic gradient descent with
backpropagation. This often leads to equations with too many terms, requiring
special techniques for enforcing sparsity for terms with low weights.

In Julia we can, however, use a different learning paradigm: evolutionary
algorithms, which can work on a discrete set of expressions. The goal is to
write mutation and recombination, the basic operators of a genetic algorithm,
but applied on top of Julia's AST.

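A toy illustration of what a mutation operator on the AST could look like (entirely our own sketch; a real project would also need recombination, fitness evaluation and type-aware operators):

```julia
# randomly swap one arithmetic operator somewhere in an expression;
# it may also return the expression unchanged if a leaf is picked
function mutate(e::Expr)
    ops = [:+, :-, :*, :/]
    e = Expr(e.head, e.args...)             # shallow copy so the input is not modified
    i = rand(eachindex(e.args))
    if e.args[i] isa Expr
        e.args[i] = mutate(e.args[i])       # descend into a random sub-expression
    elseif e.head == :call && i == 1 && e.args[1] in ops
        e.args[1] = rand(ops)               # replace the operator of this call
    end
    return e
end

mutate(:(x + y * sin(z)))   # possible results include :(x - y * sin(z)) or :(x + y / sin(z))
```
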
Data: AI Feynman [database](https://space.mit.edu/home/tegmark/aifeynman.html) on symbolic regression (from [article](https://arxiv.org/pdf/1905.11481.pdf)/[code](https://github.com/SJ001/AI-Feynman))
- Genetic Programming for Julia: fast performance and parallel island model implementation [report](http://courses.csail.mit.edu/18.337/2015/projects/MorganFrank/projectReport.pdf)
## Distributed Optimization Package

One-click distributed optimization is at the heart of other machine learning
and optimization libraries such as PyTorch; however, some equivalents are
missing in Julia's Flux ecosystem. The goal of this project is to explore,
implement and compare at least two state-of-the-art methods of distributed
gradient descent on data that will be provided for you.

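As a rough indication of what distributed gradient descent can mean in Julia, here is a small synchronous data-parallel sketch built only on the standard `Distributed` library; the model, loss and data are made up for illustration:

```julia
using Distributed
addprocs(2)   # two local workers; a real setup would use a cluster manager

@everywhere grad(w, X, y) = 2 .* X' * (X * w .- y) ./ length(y)  # gradient of a squared loss

X, y = randn(100, 3), randn(100)
shards = [(X[i:2:end, :], y[i:2:end]) for i in 1:2]   # one data shard per worker

function train(shards; iters = 100, η = 0.1)
    w = zeros(3)
    for _ in 1:iters
        grads = pmap(s -> grad(w, s...), shards)   # shard gradients computed in parallel
        w -= η * sum(grads) / length(grads)        # synchronous averaging step
    end
    return w
end

train(shards)
```
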
Some of the work has already been done in this area by one of our former students,
see [link](https://dspace.cvut.cz/handle/10467/97057).