
Conversation


@sfiisf sfiisf commented Jan 12, 2026

Based on the refine_offload branch. This PR makes a few changes to the model_unload function in comfy/model_management.py.

First, the current implementation can hit a None / int error: when memory_to_free is None, the debug log divides None by an int. The check that resolves memory_to_free should be moved before that log line:

    def model_unload(self, memory_to_free=None, unpatch_weights=True):
        ...
        logging.debug(f"memory_to_free: {memory_to_free/(1024*1024*1024)} GB") # can evaluate None / (1024 ** 3) here
        ...
        if memory_to_free is None:
            # free the full model
            memory_to_free = self.model.loaded_size()
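
A minimal sketch of the reordered check, using a hypothetical stand-in for the loaded model (only loaded_size is assumed; the real class lives in comfy/model_management.py):

```python
import logging

class LoadedModelStub:
    """Hypothetical stand-in for ComfyUI's LoadedModel; only loaded_size is assumed."""
    def __init__(self, loaded_bytes):
        self._loaded_bytes = loaded_bytes

    def loaded_size(self):
        return self._loaded_bytes

def model_unload(model, memory_to_free=None):
    # Resolve the None default *before* any arithmetic uses memory_to_free,
    # so the debug log below can no longer raise on None / (1024 ** 3).
    if memory_to_free is None:
        # free the full model
        memory_to_free = model.loaded_size()
    logging.debug(f"memory_to_free: {memory_to_free / (1024 ** 3):.2f} GB")
    return memory_to_free
```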

Second, I think the partially_unload target is not set precisely enough. For example, when an OOM occurs, memory_to_free is set to an extremely large value, so every loaded model gets offloaded through the partially_unload path. Clamping the target to min(memory_to_free, the model's currently loaded size) is more precise.
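
The clamping described above can be sketched as (hypothetical helper name, not a ComfyUI API):

```python
def clamp_free_target(memory_to_free, loaded_size):
    # An OOM handler may pass an effectively unbounded memory_to_free;
    # clamping it to this model's loaded size keeps the per-model free
    # target precise instead of forcing partial unload everywhere.
    return min(memory_to_free, loaded_size)
```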

Finally, the current implementation returns based directly on whether the partially_unload path was taken, but partial unloading can in fact end up unloading the model completely. Checking whether the model was fully unloaded before returning matches the original function's behavior better.
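
A sketch of that return-value check, with a hypothetical model stub whose partially_unload frees up to the requested number of bytes (the real ComfyUI signature may differ):

```python
class PartialModelStub:
    """Hypothetical stand-in: tracks loaded bytes and frees up to a request."""
    def __init__(self, loaded_bytes):
        self._loaded_bytes = loaded_bytes

    def loaded_size(self):
        return self._loaded_bytes

    def partially_unload(self, bytes_to_free):
        freed = min(bytes_to_free, self._loaded_bytes)
        self._loaded_bytes -= freed
        return freed

def unload_and_report(model, memory_to_free):
    model.partially_unload(memory_to_free)
    # Report a full unload whenever nothing remains loaded, even though the
    # partial-unload path was taken, matching the original function's behavior.
    return model.loaded_size() == 0
```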

@sfiisf sfiisf force-pushed the refine_offload_fix_fengguosheng branch from b8f6afb to 0b5833e on January 12, 2026 02:52
@sfiisf sfiisf merged commit 68ceb5a into refine_offload Jan 12, 2026
3 of 5 checks passed
@sfiisf sfiisf deleted the refine_offload_fix_fengguosheng branch January 12, 2026 09:06
