Bug Report: Pretrained checkpoint and current DonutModel mismatch #147
Hello,
Describe the bug
I cloned the latest master (commit 7df02e1, Sept 30, 2025) and downloaded the pretrained model from Google Drive (dolphin_model.bin and dolphin_tokenizer.json).
When I load the pretrained weights into the current DonutModel, load_state_dict fails with size-mismatch errors in the vision encoder layers.
It looks like the vision tower (the vpm.* layers) of the current DonutModel has different hidden dimensions than the released checkpoint: every mismatched checkpoint shape in the traceback below is exactly twice the shape the current model expects, which suggests the encoder's embedding dimension was changed by a factor of two between the released checkpoint and the current code.
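To rule out a corrupted download, I also dumped the shapes of the downsample tensors stored in the checkpoint file. This is just a quick sketch; it assumes the .bin file is a plain state_dict, which is how load_state_dict consumes it:

```python
import torch

# Inspect the downsample-layer shapes in the released checkpoint.
# "dolphin_model.bin" is the file downloaded from Google Drive.
state_dict = torch.load("dolphin_model.bin", map_location="cpu")

for name, tensor in state_dict.items():
    if "downsample" in name:
        print(f"{name}: {tuple(tensor.shape)}")
```

The printed shapes match the checkpoint side of the error messages below, so the file itself looks intact.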
Questions
- Are the Google Drive weights outdated relative to the current code?
- Could you release an updated checkpoint matching the latest master?
Thanks a lot for your work on this project!
Error traceback
RuntimeError: Error(s) in loading state_dict for DonutModel:
size mismatch for vpm.model.layers.1.downsample.norm.weight: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for vpm.model.layers.1.downsample.norm.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for vpm.model.layers.1.downsample.reduction.weight: copying a param with shape torch.Size([512, 1024]) from checkpoint, the shape in current model is torch.Size([256, 512]).
size mismatch for vpm.model.layers.2.downsample.norm.weight: copying a param with shape torch.Size([2048]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for vpm.model.layers.2.downsample.norm.bias: copying a param with shape torch.Size([2048]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for vpm.model.layers.2.downsample.reduction.weight: copying a param with shape torch.Size([1024, 2048]) from checkpoint, the shape in current model is torch.Size([512, 1024]).
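As a diagnostic (not a fix), I can load everything except the mismatched tensors with a small helper like the one below. load_matching_weights is my own hypothetical helper, not something in this repo, and the skipped vision-tower layers stay randomly initialized, so the resulting model is only useful for confirming where the architectures diverge:

```python
import torch
from torch import nn

def load_matching_weights(model: nn.Module, ckpt_path: str) -> list[str]:
    """Load only the checkpoint tensors whose shape matches the current model.

    Returns the names of the skipped tensors so the extent of the
    architecture drift is visible. My own helper, not part of this repo.
    """
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    model_sd = model.state_dict()

    filtered = {
        name: tensor
        for name, tensor in checkpoint.items()
        if name in model_sd and model_sd[name].shape == tensor.shape
    }
    skipped = sorted(set(checkpoint) - set(filtered))

    # strict=False because the filtered dict is intentionally incomplete.
    model.load_state_dict(filtered, strict=False)
    return skipped
```

If the traceback above is complete, the returned skipped list should contain exactly the six vpm.model.layers.*.downsample tensors it reports.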