relatively small results should be memoized; ie check if the arguments were
seen before, and if they were, fetch the result, and if not, compute the result
and cache it for future lookups
- eg query steps against a read-only db
- eg algorithms whose implementation lends naturally to top-down dynamic programming
- the process of memoizing a function should be agnostic to the function,
  undoable, configurable, and easy
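the check-fetch-compute-store loop is easy to sketch in plain q. this is only
an illustration of the idea, not this library's implementation; =c= and =mfib=
are made-up names:

#+begin_example
q)/ hand-rolled memoized fibonacci: a global dictionary keyed by argument
q)c:(`long$())!`long$()
q)mfib:{$[x in key c;c x;[r:$[x<2;x;mfib[x-1]+mfib x-2];@[`c;x;:;r];r]]}
q)mfib 30
832040
q)/ every subproblem is now cached for future lookups
q)c 29 10
514229 55
#+end_example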
| .memo.C | cache capacity. defaults to =100h=. can be any short                        |
| .memo.P | cache table parent structure name. defaults to =`cache=. can be any symbol |
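for example, both can be overridden before creating any caches (the session
below assumes, per the descriptions above, that the values are read at init
time):

#+begin_example
q)/ smaller default capacity, different parent structure name
q).memo.C:50h
q).memo.P:`memo
q)/ caches made from now on default to 50 entries and live under `memo
#+end_example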
** functions
| name          | description                                                                  |
|---------------+------------------------------------------------------------------------------|
| .memo.init    | initializes a cache table given a namespace and capacity                     |
| .memo.initns  | initializes a cache table given a namespace                                  |
| .memo.initcap | initializes a cache table given a capacity                                   |
| .memo.mk      | makes a memoized copy of a function given a source, target, and cache table  |
| .memo.mkmut   | mutates a given function into its memoized copy given a source               |
| .memo.mknew   | makes a memoized copy of a function given a source and target                |
| .memo.rm      | removes a memoized function and clears its cached data given a source        |
| .memo.mv      | moves a memoized function to a new cache table given a source and cache name |
| .memo.cap     | returns the capacity of a given cache table                                  |
| .memo.rs      | resizes a cache table given a cache name and capacity                        |
* examples
#+begin_example
q)/ .memo.init takes a [namespace;<short>]
q)cache
 | ::
0| (+`f`a!(,`;,::))!+(,`r)!,,::

q)/ .memo.initns is shorthand for .memo.init, using .memo.C as the cache capacity
q).memo.initns[]
`..cache.1

q)/ .memo.initcap is shorthand for .memo.init, using \d as the namespace
q)\d .a
q).memo.initcap 2h
`.a.cache.0

q)/ (::) or ` may be used to reference \d
q)g 4 5 6
q)/ since 1 2 3 and 4 5 6 are new args, `g computes them and caches the results
q)/ in `.a.cache.0
q)cache.0
f a        | r
-----------| --
     ::    | ::
.a.g ,1 2 3| 6
.a.g ,4 5 6| 15
q)/ subsequent invocations with recognized arguments will fetch from
q)/ here--instead of executing the underlying implementation
q)g 4 5 6
q)cache.0
f a        | r
-----------| --
     ::    | ::
.a.g ,1 2 3| 6
.a.g ,4 5 6| 15

q)/ .memo.mkmut is shorthand for .memo.mk, mutating the function in place and

q)/ any removal of a memoized function clears its cached data
q)cache.0
f a | r
----| --
  ::| ::
#+end_example

#+begin_example
q)/ .memo.mv takes a [<symbol>;<symbol>] as its function and cache table,
q)/ respectively
q)j1 2 3 4
q).memo.mv[`j1;`..cache.0]
`.a.j1
q)/ its cached data has moved
q)cache.1
f a | r
----| --
  ::| ::
q)\d .
q)cache.0
f a         | r
------------| --
      ::    | ::
.a.j1 ,2 3 4| 24
q)/ and it now points to this new cache, too
q).a.j1 5 6 7
210
q)cache.0
f a         | r
------------| ---
      ::    | ::
.a.j1 ,2 3 4| 24
.a.j1 ,5 6 7| 210
#+end_example

#+begin_example
q)/ .memo.cap takes a [cache]
q).memo.cap`cache.0
2h
q)/ in practice, you probably wouldn't want to make a cache table
q)/ this small

q)/ .memo.rs takes a [cache;<short>]
q).memo.rs[`cache.0;10h]
`..cache.0
q).a.j1 1 2 3;.a.j1 2 3 4;cache.0
f a         | r
------------| ---
      ::    | ::
.a.j1 ,2 3 4| 24
.a.j1 ,5 6 7| 210
.a.j1 ,1 2 3| 6

q)/ resizing a cache to below its capacity trims it
q).memo.rs[`cache.0;1h]
`..cache.0
q)cache.0
f a         | r
------------| --
      ::    | ::
.a.j1 ,1 2 3| 6
q)/ notice the lru semantics, here
#+end_example
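classic lru bookkeeping can be sketched with a list that records access order;
this is a generic illustration of the policy only (=o=, =touch=, and =trim= are
made-up names, not the library's internals, and whether reads refresh recency
is an implementation detail):

#+begin_example
q)/ keys in access order, oldest first
q)o:`long$()
q)touch:{`o set(o except x),x}
q)trim:{[c]`o set neg[c]#o}   / keep only the c most recently used keys
q)touch each 1 2 3 1;
q)o
2 3 1
q)trim 2;
q)o
3 1
#+end_example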

#+begin_example
q)dumb:{$[1=x;0;x<3;1;dumb[x-1]+dumb x-2]}
q)dumb 10
34
q)\t dumb 30
497
q)/ we want to first expand the cache capacity beyond its paltry 1h
q).memo.rs[`cache.1;100h];.memo.mkmut`dumb
`..dumb
q)\t dumb 30
0
q).memo.rm[`dumb][27]~dumb 27
1b
q)/ the stupid implementation is magically hundreds of times faster
#+end_example

#+begin_example
q)/ if passed as a literal, its new name can't be null
q).memo.mknew[sums;`]
'null name

q)/ don't play with the parent structure, eg
q)/ cache:10
q)/ the library will cease to function properly

q)/ don't memoize stateful functions
q).memo.mknew[rand;`why]
`..why
q)count distinct why each 10#10
1
q)/ randomness is lost

q)/ be wary of .z.s
q)megadumb:{$[1=x;0;x<3;1;.z.s[x-1]+.z.s x-2]}
q)\t megadumb 30
436
q)\t .memo.mkmut[`megadumb]30
664
q)/ recall that .z.s produces a function literal as defined at parse-time,
q)/ so cache lookups are not used anywhere except the top of the stack
q)\t megadumb 30
0
q)\t megadumb 31
766
q)/ recursively memoize by using names
q)dumb:{$[1=x;0;x<3;1;dumb[x-1]+dumb x-2]}
q)\t .memo.mkmut[`dumb]31
0
q)megadumb[32]~dumb 32
1b

q)/ as the memo tables are global data, memoized functions that need to
q)/ write to the cache, ie take new values, cannot run in parallel.
q)/ if the function can be made logically concurrent, measure to be
q)/ sure the trade is worth it. if, however, you know that invocations
q)/ will only read from the cache, feel free to do this in parallel
q)0<system"s"
1b
q)dumb peach 10 30
34 514229
q)dumb peach 40 50
'noupdate

q)/ don't make the cache too small. the additional overhead incurred by
q)/ both cache misses and constant evictions will drastically impact
q)/ performance. this is particularly relevant in memoizing recursive
q)/ calls with a large number of leaves
q).memo.rs[.memo.initns[];1h]
`..cache.2
q).memo.mknew[.memo.rm[`dumb];`ultradumb]
`..ultradumb
q)\t ultradumb 30
550

q)/ don't memoize a function multiple times. there is absolutely no
q)/ value in doing so, and its cache behavior will get complicated
q).memo.mkmut .memo.mknew[sum;`foo]
`..foo
q)count string foo
142
#+end_example