Heads up: design changes for version 0.7 -- e.g. printing a tensor will require a context #391
Replies: 3 comments
This refactoring will also minimize the user-facing API, with a new module:

```ocaml
module Context : sig
  type t
  type routine

  val cuda : ?device:int -> unit -> t
  val compile : t -> Assignments.t -> t * routine
  val run : t -> routine -> unit
  val print : t -> Tnode.t -> unit
end
```

```ocaml
(* Usage: [compile] returns an updated context, which is threaded through. *)
let ctx = Context.cuda () in
let ctx, routine = Context.compile ctx assignments in
Context.run ctx routine
```
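To make the state-threading design concrete, here is a toy, self-contained mock in plain OCaml. The types and bodies are placeholders (not OCANNL's implementation); the point is that `compile` returns an updated context along with the routine, so callers thread the context through each call rather than mutating global backend state.

```ocaml
(* Toy mock of the proposed interface; placeholder types and bodies,
   not OCANNL's actual implementation. *)
module Context : sig
  type t
  type routine
  val cuda : ?device:int -> unit -> t
  val compile : t -> string -> t * routine
  val run : t -> routine -> unit
end = struct
  type t = { device : int; compiled : string list }
  type routine = string

  let cuda ?(device = 0) () = { device; compiled = [] }

  (* Compiling extends the context, so the updated context is returned. *)
  let compile ctx src = ({ ctx with compiled = src :: ctx.compiled }, src)

  let run ctx routine =
    Printf.printf "device %d: running %s\n" ctx.device routine
end

let () =
  let ctx = Context.cuda () in
  let ctx, routine = Context.compile ctx "assignments" in
  Context.run ctx routine
```

Shadowing `ctx` at each step keeps stale contexts out of scope, which is the usual OCaml idiom for this kind of linear state.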
Since I plan for
This is already in progress. We lost the stream parallelism and data-parallel functionality, but the old code is not refactored yet; just a reduced part of it is exposed in the new API.

Summary by Claude Opus:

**Summary**

Today we completed a significant refactoring of OCANNL's training and backend APIs.

**Background and Motivation**

The conversation began with a critical assessment of OCANNL's design. The key pain point identified was the cumbersome Backend module.

**Key Changes Made**

- Introduced a new `Context` module that serves as a simplified backend interface.
- Completely transformed the training API to use `Context.t` instead of the old Backend API.
- Updated 30+ test files from the old Backend API to the new Context API.
- After completing the migration, removed the `With_context` module.

The summary also covered the technical challenges addressed and the benefits of the new design.

**Impact**

This refactoring touches nearly every test and example in the codebase. The work demonstrates OCANNL's continuing evolution.
The core fat-cutting refactoring: getting rid of the notion of hosted tensors, and getting rid of streams -- while preserving the current fine-grained contexts. I will also rewrite the backend scaffolding.
Conceptually (and maybe even API-wise?) printing or saving a tensor (node) will be something that contexts do.
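Under the `Context` signature sketched in the first comment, context-mediated printing could look like the following. This is a hypothetical sketch, not settled API: `assignments` and `loss` are placeholder values, and the exact names may change.

```ocaml
(* Hypothetical usage of the proposed API from the first comment;
   [assignments] and [loss] stand in for user-provided values. *)
let () =
  let ctx = Context.cuda ~device:0 () in
  let ctx, routine = Context.compile ctx assignments in
  Context.run ctx routine;
  (* Printing goes through the context, which knows where the
     tensor node's data currently lives (host or device). *)
  Context.print ctx loss
```

The design rationale, as the comment suggests, is that without hosted tensors there is no context-free copy of the data to print, so the context is the natural owner of the operation.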