Slow but unbounded growth of memory cost over time? #235

@niusen

Description

Hi TensorKit developers,

Thank you for this nice package; I use it a lot in my research. I now have a problem with growing memory usage that is probably related to TensorKit.jl. Here is a brief description of my algorithm: I construct a finite PEPS from U(1)-symmetric tensors, then generate Monte Carlo samples and contract the network to obtain the wavefunction amplitude. I would expect the memory cost to saturate quickly; instead, the memory occupation grows slowly and never stops.
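For context, the kind of tensors and contractions involved look roughly like this (a minimal, self-contained sketch rather than the actual code from the repository linked below; the spaces, dimensions and index structure are illustrative, and the exact constructor syntax may depend on the TensorKit.jl version):

```julia
using TensorKit   # @tensor is re-exported from TensorOperations.jl

# illustrative U(1)-graded spaces (charge => degeneracy)
V = U1Space(0 => 2, 1 => 1, -1 => 1)   # virtual space
P = U1Space(0 => 1, 1 => 1)            # physical space

# two toy U(1)-symmetric "PEPS-like" tensors
A = TensorMap(randn, ComplexF64, P ⊗ V, V ⊗ V)
B = TensorMap(randn, ComplexF64, V ⊗ V, P ⊗ V)

# the Monte Carlo loop repeatedly performs contractions of this kind;
# I would expect memory use to saturate after the first few iterations
for step in 1:10_000
    @tensor C[p; q] := A[p a; b c] * B[b c; q a]
end
```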

Here is a simplified version of my code that reproduces the memory increase:
https://github.com/niusen/PEPS_memory_test
Running "my_test.jl", then one can observe the change of occupied memory either from "top" linux command or from reading the output file. The slow memory increase is not visible within short time, but become evident after half day. For example, after 8 hours, the used memory increase from around 1.5Gb to around 8Gb.

I tried tuning some of the cache sizes (e.g. "TensorKit.treepermutercache") as well as calling "GC.gc(true)", but the problem persists. Is there a way to track down the origin of the increase?
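Concretely, what I tried looks roughly like the following (a sketch on my side; whether "TensorKit.treepermutercache" behaves like a standard collection that supports "empty!" and "Base.summarysize" depends on TensorKit.jl internals and version, so those calls are assumptions rather than documented API):

```julia
using TensorKit

GC.gc(true)   # force a full garbage collection

# inspect and clear the fusion-tree permutation cache
# (assuming it is an LRU-like collection; not confirmed API)
@show Base.summarysize(TensorKit.treepermutercache)   # bytes retained by the cache itself
empty!(TensorKit.treepermutercache)

GC.gc(true)
@show Base.gc_live_bytes() / 2^30   # does the live heap actually shrink afterwards?
```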

Thank you very much.
