
Commit 5a4f294

Merge branch 'main' into safe-names-dev

2 parents: 4066399 + f1ce5fa

File tree

8 files changed: +791 −20 lines

.github/workflows/docs.yaml

Lines changed: 7 additions & 0 deletions
@@ -32,6 +32,13 @@ jobs:
           ${{ matrix.install_deps }}
           echo "${{ matrix.path_extension }}" >> $GITHUB_PATH
+      - name: Get example files
+        run: |
+          wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz
+          wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz
+          gunzip t10k-images-idx3-ubyte.gz t10k-labels-idx1-ubyte.gz
+          mv t10k-images-idx3-ubyte t10k-labels-idx1-ubyte $GITHUB_WORKSPACE/examples/
+
       - name: Cache
         uses: actions/cache@v2
         with:
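The files fetched by the new workflow step are in the IDX binary format used by MNIST-style datasets: a big-endian header (two zero bytes, a type code, a dimension count, then one 32-bit size per dimension) followed by the raw payload. As a minimal sketch of how such a file could be parsed (the `read_idx` helper is hypothetical, not part of this repository):

```python
import struct

def read_idx(data: bytes):
    """Parse an IDX-format byte string (the format of the
    t10k-* Fashion-MNIST files fetched in the workflow step)."""
    # Header: two zero bytes, element-type code (0x08 = unsigned
    # byte), then the number of dimensions.
    zeros, dtype, ndims = struct.unpack_from(">HBB", data, 0)
    assert zeros == 0 and dtype == 0x08
    # One big-endian uint32 per dimension, then the raw payload.
    dims = struct.unpack_from(">" + "I" * ndims, data, 4)
    payload = data[4 + 4 * ndims:]
    return dims, payload

# Tiny fabricated "images" blob: two 2x2 images of unsigned bytes.
fake = struct.pack(">HBBIII", 0, 0x08, 3, 2, 2, 2) + bytes(range(8))
dims, pixels = read_idx(fake)
```

In a real run one would pass the contents of `examples/t10k-images-idx3-ubyte` instead of the fabricated blob.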

CITATION.cff

Lines changed: 62 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,62 @@
1+
cff-version: 1.2.0
2+
message: "If you use this software, please cite it as below."
3+
authors:
4+
- given-names: "Dougal"
5+
family-names: "Maclaurin"
6+
- given-names: "Adam"
7+
family-names: "Paszke"
8+
title: "Dex: typed and functional array processing"
9+
version: 0.0.0
10+
date-released: 2018-09-29
11+
url: "https://github.com/google-research/dex-lang"
12+
preferred-citation:
13+
type: article
14+
authors:
15+
- given-names: "Adam"
16+
family-names: "Paszke"
17+
- given-names: "Daniel D."
18+
family-names: "Johnson"
19+
- given-names: "David"
20+
family-names: "Duvenaud"
21+
- given-names: "Dimitrios"
22+
family-names: "Vytiniotis"
23+
- given-names: "Alexey"
24+
family-names: "Radul"
25+
- given-names: "Matthew J."
26+
family-names: "Johnson"
27+
- given-names: "Jonathan"
28+
family-names: "Ragan-Kelley"
29+
- given-names: "Dougal"
30+
family-names: "Maclaurin"
31+
doi: "10.1145/3473593"
32+
title: "Getting to the Point: Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming"
33+
volume: 5
34+
number: "ICFP"
35+
year: 2021
36+
month: 8
37+
issue-date: "August 2021"
38+
journal: "Proceedings of the ACM on Programming Languages"
39+
pages: 29
40+
abstract: >
41+
We present a novel programming language design that attempts to combine the clarity
42+
and safety of high-level functional languages with the efficiency and parallelism
43+
of low-level numerical languages. We treat arrays as eagerly-memoized functions on
44+
typed index sets, allowing abstract function manipulations, such as currying, to work
45+
on arrays. In contrast to composing primitive bulk-array operations, we argue for
46+
an explicit nested indexing style that mirrors application of functions to arguments.
47+
We also introduce a fine-grained typed effects system which affords concise and automatically-parallelized
48+
in-place updates. Specifically, an associative accumulation effect allows reverse-mode
49+
automatic differentiation of in-place updates in a way that preserves parallelism.
50+
Empirically, we benchmark against the Futhark array programming language, and demonstrate
51+
that aggressive inlining and type-driven compilation allows array programs to be written
52+
in an expressive, "pointful" style with little performance penalty.
53+
keywords:
54+
- "array programming"
55+
- "automatic differentiation"
56+
- "parallel computing"
57+
url: "https://doi.org/10.1145/3473593"
58+
publisher:
59+
name: "Association for Computing Machinery"
60+
city: "New York"
61+
region: "NY"
62+
country: "USA"

dex.cabal

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ maintainer: [email protected]
 license-file:       LICENSE
 build-type:         Simple

-data-files:         lib/*.dx
+data-files:         lib/*.dx, static/*.html, static/*.js, static/*.css
 extra-source-files: lib/*.dx,
                     src/lib/dexrt.bc,
                     static/index.js, static/style.css

examples/kernelregression.dx

Lines changed: 40 additions & 2 deletions
@@ -1,7 +1,8 @@
+import linalg
 import plot

 -- Conjugate gradients solver
-def solve {m} (mat:m=>m=>Float) (b:m=>Float) : m=>Float =
+def solve' {m} (mat:m=>m=>Float) (b:m=>Float) : m=>Float =
   x0 = for i:m. 0.0
   ax = mat **. x0
   r0 = b - ax
@@ -16,6 +17,11 @@ def solve {m} (mat:m=>m=>Float) (b:m=>Float) : m=>Float =
     (x', r', p')
   xOut

+def chol_solve (l:LowerTriMat m Float) (b:m=>Float) : m=>Float =
+  b' = forward_substitute l b
+  u = transposeLowerToUpper l
+  backward_substitute u b'
+
 ' # Kernel ridge regression

 ' To learn a function $f_{true}: \mathcal{X} \to \mathbb R$
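The `chol_solve` helper added in this hunk solves a system whose matrix is given by its lower-triangular Cholesky factor L: forward substitution with L, then back substitution with Lᵀ. A pure-Python sketch of that two-pass scheme (the Python `chol_solve` below is an illustration, not the Dex implementation):

```python
def chol_solve(l, b):
    """Solve (L L^T) x = b given the lower-triangular factor L,
    as lists of lists / list: forward- then back-substitution."""
    n = len(b)
    # Forward substitution: L y = b.
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(l[i][k] * y[k] for k in range(i))) / l[i][i]
    # Back substitution: L^T x = y (index L transposed in place).
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(l[k][i] * x[k] for k in range(i + 1, n))) / l[i][i]
    return x

# L L^T = [[4, 2], [2, 2]], and b = A @ [1, 1] = [6, 4],
# so the solution should be [1, 1].
L = [[2.0, 0.0], [1.0, 1.0]]
x = chol_solve(L, [6.0, 4.0])
```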
@@ -40,7 +46,7 @@ ys : Nx=>Float = for i. trueFun xs.i + noise * randn (ixkey k2 i)
 -- Kernel ridge regression
 def regress {a} (kernel: a -> a -> Float) (xs: Nx=>a) (ys: Nx=>Float) : a -> Float =
   gram = for i j. kernel xs.i xs.j + select (i==j) 0.0001 0.0
-  alpha = solve gram ys
+  alpha = solve' gram ys
   predict = \x. sum for i. alpha.i * kernel xs.i x
   predict
@@ -59,3 +65,35 @@ preds = map predict xtest

 :html showPlot $ xyPlot xtest preds
 > <html output>
+
+' # Gaussian process regression
+
+' GP regression (kriging) works in a similar way. Compared with kernel ridge
+  regression, GP regression assumes a Gaussian prior over functions; combined
+  with Bayes' rule, this also yields the variance of each prediction.
+
+' In this implementation, the conjugate gradient solver is replaced with the
+  Cholesky solver from `lib/linalg.dx` for efficiency.
+
+def gp_regress (kernel: a -> a -> Float) (xs: n=>a) (ys: n=>Float)
+    : (a -> (Float&Float)) =
+  noise_var = 0.0001
+  gram = for i j. kernel xs.i xs.j
+  c = chol (gram + eye *. noise_var)
+  alpha = chol_solve c ys
+  predict = \x.
+    k' = for i. kernel xs.i x
+    mu = sum for i. alpha.i * k'.i
+    alpha' = chol_solve c k'
+    var = kernel x x + noise_var - sum for i. k'.i * alpha'.i
+    (mu, var)
+  predict
+
+gp_predict = gp_regress (rbf 0.2) xs ys
+
+(gp_preds, vars) = unzip (map gp_predict xtest)
+
+:html showPlot $ xycPlot xtest gp_preds (map sqrt vars)
+> <html output>
+
+:html showPlot $ xyPlot xtest vars
+> <html output>
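The new `gp_regress` computes the standard GP posterior: with Cholesky factor C of the noisy Gram matrix, the mean is k'ᵀ alpha with alpha solving (C Cᵀ) alpha = ys, and the variance is k(x,x) + noise minus k'ᵀ (C Cᵀ)⁻¹ k'. A self-contained Python sketch of the same math, with hypothetical pure-Python stand-ins for `chol` and `chol_solve` from `lib/linalg.dx`:

```python
import math

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric
    positive-definite matrix (stand-in for `chol`)."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = a[i][j] - sum(l[i][k] * l[j][k] for k in range(j))
            l[i][j] = math.sqrt(s) if i == j else s / l[j][j]
    return l

def chol_solve(l, b):
    """Solve (L L^T) x = b: forward- then back-substitution."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(l[i][k] * y[k] for k in range(i))) / l[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(l[k][i] * x[k] for k in range(i + 1, n))) / l[i][i]
    return x

def gp_regress(kernel, xs, ys, noise_var=1e-4):
    gram = [[kernel(xi, xj) + (noise_var if i == j else 0.0)
             for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    c = cholesky(gram)
    alpha = chol_solve(c, ys)
    def predict(x):
        ks = [kernel(xi, x) for xi in xs]
        mu = sum(a * k for a, k in zip(alpha, ks))
        # Posterior variance: prior variance minus the explained part.
        alpha2 = chol_solve(c, ks)
        var = kernel(x, x) + noise_var - sum(k * a for k, a in zip(ks, alpha2))
        return mu, var
    return predict

xs = [0.0, 1.0]
ys = [1.0, 2.0]
predict = gp_regress(lambda u, v: math.exp(-((u - v) ** 2) / 0.5), xs, ys)
mu, var = predict(0.0)
```

At a training point the posterior mean is close to the observed target and the variance shrinks toward the noise level, which is exactly what the `xycPlot` error bars in the example visualize.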
