
Conversation

@daniil-lyakhov (Collaborator)

Changes

The variable holding the model outputs during statistic collection is now explicitly freed after each iteration.

Reason for changes

The quantization cell in the SD v3 TorchFX notebook was failing on my machine with 125 GB of RAM.
After checking with the PyTorch profiler that the PyTorch code itself is not leaking

[image: PyTorch profiler output]

I confirmed that the problem is not directly related to statistic collection. I then forced garbage collection with a del statement: memory did not exceed a healthy 50 GB during the run, and SQ was successfully applied to the model (in contrast with previous runs).
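
For reference, a minimal sketch of this kind of per-operator memory check with the PyTorch profiler; the model and input below are placeholders, not the actual SD v3 TorchFX pipeline used in the notebook:

import torch
from torch.profiler import ProfilerActivity, profile

# Placeholder model and input; the notebook profiles the SD v3 TorchFX model instead.
model = torch.nn.Linear(1024, 1024)
example_input = torch.randn(64, 1024)

# Track per-operator CPU memory to verify that no op accumulates memory across calls.
with profile(activities=[ProfilerActivity.CPU], profile_memory=True) as prof:
    for _ in range(10):
        model(example_input)

print(prof.key_averages().table(sort_by="self_cpu_memory_usage", row_limit=10))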

Related tickets

Tests

I'm not sure which test I should add for this change.

daniil-lyakhov requested a review from a team as a code owner on August 28, 2025.
outputs = engine.infer(input_data)
processed_outputs = self._process_outputs(outputs)
self._register_statistics(processed_outputs, merged_statistics)
# Free the raw model outputs as soon as they are consumed, so two sets of
# outputs are never kept alive at once during statistics collection.
del outputs
Collaborator

Please add a comment explaining why the del is needed here.

Collaborator Author

Done

AlexanderDokuchaev merged commit 04a4439 into openvinotoolkit:develop on Sep 16, 2025.
20 checks passed