How to estimate SARvey memory requirements per core/dataset? #101

@epehlivanli

Description

It is currently unclear how to anticipate SARvey's memory requirements. Is there a way to estimate the memory needed based on:

  • Number of pixels / points (e.g., Grove dataset: 13,344)

  • Number of arcs (e.g., ~392,246)

  • Number of nearest neighbors (e.g., 50)

  • Number of cores?

From testing:

  • Grove dataset: 13k points, 392k arcs, 137 dates -> ~120 GB RAM needed for Step 1/2 with 24–48 cores.
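
A rough back-of-envelope estimator lands in this range under the assumption that a dense arcs × points design matrix (plus temporaries) dominates the network-inversion steps. The function name, the overhead factor, and the dense-matrix assumption below are all hypothetical, not SARvey internals:

```python
def estimate_memory_gib(n_points, n_arcs, n_dates,
                        bytes_per_value=8, overhead_factor=3.0):
    """Hypothetical upper-bound memory estimate in GiB.

    Assumes a dense arcs x points design matrix dominates
    (SARvey may well use sparse matrices, so this is only a
    sketch). bytes_per_value=8 covers float64/complex64 values;
    overhead_factor leaves headroom for temporaries and copies.
    """
    design_matrix = n_arcs * n_points * bytes_per_value  # arcs x points
    arc_phase = n_arcs * n_dates * bytes_per_value       # arcs x dates
    return (design_matrix + arc_phase) * overhead_factor / 2**30

# Grove dataset from above: 13,344 points, ~392,246 arcs, 137 dates
print(f"{estimate_memory_gib(13_344, 392_246, 137):.0f} GiB")  # → 118 GiB
```

Under these assumptions the estimate (~118 GiB) is close to the observed ~120 GB, and it does not depend on the core count, which would be consistent with the 24-core and 48-core runs below using essentially the same amount of memory.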

Grove dataset with 24 Cores:

Memory Usage on Step 1

               total        used        free      shared  buff/cache   available
Mem:           187Gi       123Gi        60Gi       851Mi       6.1Gi        63Gi
Swap:             0B          0B          0B


Memory Usage on Step 2

               total        used        free      shared  buff/cache   available
Mem:           187Gi       120Gi        63Gi       851Mi       6.1Gi        66Gi
Swap:             0B          0B          0B

Grove dataset with 48 Cores:

Memory Usage on Step 1

               total        used        free      shared  buff/cache   available
Mem:           187Gi       123Gi        60Gi       851Mi       6.1Gi        63Gi
Swap:             0B          0B          0B


Memory Usage on Step 2

               total        used        free      shared  buff/cache   available
Mem:           187Gi       120Gi        63Gi       851Mi       6.1Gi        66Gi
Swap:             0B          0B          0B

Grove dataset:

wget http://149.165.154.65:/data/HDF5EOS/Emirhan/Grove_subset/Grove_newcon.tar.gz
tar -xzf Grove_newcon.tar.gz
cd Grove_newcon
sarvey -f config.json 0 2
