Hi, I've been building some random forest regressors lately and noticed high memory usage during hyperparameter tuning. It looks like there is a memory leak in the package: for some reason Julia does not free the trees once they become unreachable.
Here is a MWE. After `run_forests` returns, its allocations should eventually be reclaimed, but they aren't, and memory usage keeps growing. When running the second loop, however, memory usage stays constant.
```julia
using DecisionTree

function run_forests(features, labels)
    forest = build_forest(labels, features)
    labels .+= apply_forest(forest, features)
    labels ./= 2
end

function run_something_else(features, labels)
    C = repeat(features, inner=(2, 2))
    labels ./= vec(sum(C, dims=2))[1:length(labels)]
end

const features = rand(10_000, 10)
const labels = sum(features, dims=2) |> vec

# notice memory consumption increases every couple of iterations
for i = 1:1_000
    run_forests(features, labels)
    @info "Iteration $i current memory used" Sys.maxrss()
end

# notice memory consumption does not increase every couple of iterations
for i = 1:1_000
    run_something_else(features, labels)
    @info "Iteration $i current memory used" Sys.maxrss()
end
```

Any idea what might cause this?
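Not part of the original report, but one way to narrow this down might be to force a full collection each iteration and compare the live Julia heap (`Base.gc_live_bytes`) against `Sys.maxrss()`, which is a process-lifetime high-water mark and never decreases. A rough sketch along those lines (the iteration count is arbitrary):

```julia
using DecisionTree

# Same setup as the MWE above.
features = rand(10_000, 10)
labels = sum(features, dims=2) |> vec

for i = 1:100
    forest = build_forest(labels, features)
    labels .+= apply_forest(forest, features)
    labels ./= 2
    GC.gc(true)  # force a full collection, including old generations
    # gc_live_bytes: bytes still reachable on the Julia heap after the sweep;
    # Sys.maxrss: peak resident set size over the whole process lifetime.
    @info "Iteration $i" Base.gc_live_bytes() Sys.maxrss()
end
```

If `gc_live_bytes` keeps growing even after forced collections, something is still holding references to the trees; if it stays flat while `maxrss` grows, the pages are being retained by the allocator/OS rather than by reachable Julia objects.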