This repository was archived by the owner on Jan 24, 2024. It is now read-only.

Tried multiple different models but get "The model weights are not tied..." error every time.. #266

@jontstaz


Hi,

I'm running Basaran via Docker and have tried several different models at this point, but every time, after it has downloaded and loaded everything, I hit this error: The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function
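
From what I can tell, the message seems to come from accelerate's device-map logic. Here is a minimal sketch of the pattern it appears to refer to (assuming Basaran loads models through transformers and accelerate; the model name below is just an example):

```python
from accelerate import init_empty_weights, infer_auto_device_map
from transformers import AutoConfig, AutoModelForCausalLM

# Example model name only; substitute whichever model is being loaded.
config = AutoConfig.from_pretrained("meta-llama/Llama-2-13b-hf")

# Build the model skeleton without allocating real weights.
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

# The warning asks for this call before the device map is computed,
# so that shared (tied) weights are only counted once.
model.tie_weights()

device_map = infer_auto_device_map(model)
print(device_map)
```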

Am I missing something? I've tried multiple GPTQ models from TheBloke and even the official Llama2-13b model, but this error is thrown every single time regardless of the model, and it prevents me from using Basaran at all.

Any help would be appreciated. Thanks in advance.

Labels: question (Further information is requested)
