Pass to automatically reduce the number of new sparse tensors allocated #232
paul-tqh-nguyen started this conversation in Ideas
Replies: 2 comments
-
I might have been overzealous when I initially proposed this idea. The solution might be much simpler: we just want a pass that minimizes the number of sparse tensors created.
-
Here's an idea I had. I'm not sure if it's completely thought through, so I'm posting it here for a sanity check:
Most of our ops create the output tensor as a new tensor via the `empty_like` function, the `dup_tensor` function, etc. We can also create new tensors via the `sparse_tensor.init` op.
In some complicated use cases (e.g. GraphSAGE), I imagine there are many places where we allocate a bunch of sparse tensors and then delete them.
It would make sense to reuse tensors where possible: once we're done with one and need another later, we'd just resize the old one instead of deleting it and creating a new one. This should improve memory efficiency, which might also improve runtime speed, and it would reduce developer burden.
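To make the reuse idea concrete, here is a minimal runtime-style sketch, not the actual pass: freed tensors go into a pool keyed by `(rank, dtype)`, and a later request for the same kind of tensor resizes a pooled one instead of allocating fresh. The `SparseTensor` and `TensorPool` names are made up for illustration; a compiler pass would rewrite allocation/free ops to this effect rather than run at runtime.

```python
# Hypothetical sketch: reuse dead sparse tensors instead of reallocating.
from collections import defaultdict

class SparseTensor:
    """Stand-in for a real sparse tensor type (not an mlir-graphblas class)."""
    def __init__(self, shape, dtype):
        self.shape = tuple(shape)
        self.dtype = dtype

    def resize(self, shape):
        # A real implementation would keep existing buffers and only
        # adjust metadata / grow storage as needed.
        self.shape = tuple(shape)

class TensorPool:
    def __init__(self):
        self._free = defaultdict(list)  # (rank, dtype) -> unused tensors

    def acquire(self, shape, dtype):
        key = (len(shape), dtype)
        if self._free[key]:
            tensor = self._free[key].pop()  # reuse: resize instead of reallocating
            tensor.resize(shape)
            return tensor
        return SparseTensor(shape, dtype)   # no candidate available: allocate

    def release(self, tensor):
        # Called once the tensor is known to be dead.
        self._free[(len(tensor.shape), tensor.dtype)].append(tensor)

pool = TensorPool()
a = pool.acquire((100, 100), "f64")
pool.release(a)                      # a is no longer needed...
b = pool.acquire((50, 200), "f64")   # ...so b reuses a's allocation
assert b is a
```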
I think there are a lot of parallel concepts from register allocation that we could adapt to this problem. For example, if a program only used 2D `f64` sparse tensors, we could treat this as a machine with a limited number of 2D `f64` sparse tensor "registers" and use register-allocation-inspired algorithms to minimize the number of new 2D `f64` sparse tensors we'd have to allocate. We could do this separately for each `(rank, dtype)` sparse tensor type used in our use case.
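As a rough illustration of the register-allocation analogy, and assuming the pass can compute live ranges for the tensor values of one `(rank, dtype)` class, a linear-scan-style assignment maps each value to a reusable "slot"; the number of slots is the number of sparse tensors that actually need to be allocated. This is only a sketch with made-up names, not part of any existing pass.

```python
# Hypothetical sketch: linear-scan-style slot assignment for one (rank, dtype) class.
def assign_slots(live_intervals):
    """live_intervals: dict value_name -> (first_use, last_use) in program order.
    Returns (assignment, slot_count)."""
    assignment = {}
    free_slots = []   # slots whose previous occupant is already dead
    active = []       # (last_use, slot) for values currently live
    next_slot = 0

    for name, (start, end) in sorted(live_intervals.items(), key=lambda kv: kv[1][0]):
        # Expire values whose live range ended before this one starts.
        still_active = []
        for last_use, slot in active:
            if last_use < start:
                free_slots.append(slot)
            else:
                still_active.append((last_use, slot))
        active = still_active

        # Reuse a freed slot if one exists; otherwise open a new one.
        slot = free_slots.pop() if free_slots else next_slot
        if slot == next_slot:
            next_slot += 1
        assignment[name] = slot
        active.append((end, slot))

    return assignment, next_slot

# Five 2D f64 temporaries, but at most two are live at the same time,
# so only two physical allocations are needed.
intervals = {"t0": (0, 2), "t1": (1, 3), "t2": (4, 5), "t3": (5, 7), "t4": (8, 9)}
assignment, n_allocs = assign_slots(intervals)
print(assignment, n_allocs)  # 5 temporaries -> 2 allocations
```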
Caveats:
I don't think this idea is limited to sparse tensors. An analogous idea could be used for applications that primarily use dense tensors. Would this be useful for Numba in any way?