
Memory leakage upon repeated training #230

@CasBex

Description

Hi, I've been building some random forest regressors lately and noticed high memory usage during hyperparameter tuning. It turns out there is a memory leak in the package: for some reason, Julia does not free the trees once they become unreachable.
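
One way to probe this directly is a WeakRef check (a rough, best-effort sketch; it assumes nothing else roots the forest): a WeakRef does not keep its referent alive, so after a full collection wr.value should be nothing if the forest was really freed.

using DecisionTree

wr = let
    X = rand(1_000, 10)           # small throwaway data; names are illustrative
    y = vec(sum(X, dims=2))
    forest = build_forest(y, X)
    WeakRef(forest)               # weak reference only; does not root `forest`
end                               # last strong reference to `forest` ends here

GC.gc(true)                       # force a full collection
@show wr.value === nothing        # true => forest was collected; false => still retained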

Below is an MWE: after run_forests returns, the memory it allocated should become reclaimable, but it never is, and memory usage keeps increasing. When running the second loop, however, memory usage stays constant.

using DecisionTree
function run_forests(features, labels)
    forest = build_forest(labels, features)    # train a fresh forest each call
    labels .+= apply_forest(forest, features)
    labels ./= 2                               # `forest` is unreachable once this function returns
end

function run_something_else(features, labels)
    C = repeat(features, inner=(2, 2))         # comparably large temporary allocation, no DecisionTree involved
    labels ./= vec(sum(C, dims=2))[1:length(labels)]
end

const features = rand(10_000, 10)
const labels = sum(features, dims=2) |> vec

# notice memory consumption increases every couple of iterations
for i = 1:1_000
    run_forests(features, labels)
    @info "Iteration $i current memory used" Sys.maxrss()
end

# notice memory consumption stays constant this time
for i = 1:1_000
    run_something_else(features, labels)
    @info "Iteration $i current memory used" Sys.maxrss()
end
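
One caveat with the measurement itself: Sys.maxrss() reports the process's high-water resident set size, so it can plateau but never decrease, even when memory is reclaimed. A rough follow-up check (reusing run_forests from the MWE) is to force a full collection every iteration; if maxrss still climbs, the trees are genuinely being retained somewhere (or the allocator isn't returning pages to the OS) rather than just not yet collected:

# same loop as the first one, but with an explicit full GC each iteration
for i = 1:1_000
    run_forests(features, labels)
    GC.gc(true)   # full collection; unreachable trees should be freed here
    @info "Iteration $i memory high-water mark" Sys.maxrss()
end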

Any idea what might cause this?
