
PerceptionLM Document Issue #1869

@wonderwind271

Description


Bug description
In the official documentation here, the way to load the model is given as

model = PerceptionLMForConditionalGeneration.from_pretrained("perception_lm-hf/perception_lm-1.5-7b-hf")

However, the repo perception_lm-hf/perception_lm-1.5-7b-hf does not exist on the Hub. Attempting to load it raises the following error:

OSError: perception_lm-hf/perception_lm-1.5-7b-hf is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `hf
auth login` or by passing `token=<your_token>`

I think the correct repo name should be facebook/Perception-LM-1B, which is this repo on HF. The official doc also mentions this repo in its "PerceptionLMConfig" section, so I am fairly certain this is what was intended.

Describe the expected behaviour
After changing the line above to

model = PerceptionLMForConditionalGeneration.from_pretrained("facebook/Perception-LM-1B")

the model loads as expected. The example code in the official HF doc should therefore be updated to use this repo name.
