
auth_token error on model download #34

@jTerracina

Description


Hi, I'm getting an error about an unexpected keyword argument 'use_auth_token':

File "~/*/test_tweetnlp/.venv/lib/python3.13/site-packages/transformers/modeling_utils.py", line 4072, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: RobertaForSequenceClassification.__init__() got an unexpected keyword argument 'use_auth_token'

I've distilled this down to a minimal project to reproduce the issue:

import tweetnlp  

def main():
    model = tweetnlp.load_model("sentiment")
    result = model.sentiment("I'm afraid that tweetnlp might not work properly.")
    print(result)

if __name__ == "__main__":
    main()

Tweetnlp is the only dependency installed for the project.

I did a bit of googling and it seems that the Hugging Face transformers library, which tweetnlp depends on, has deprecated 'use_auth_token' in favor of 'token': https://discuss.huggingface.co/t/the-use-auth-token-argument-is-deprecated-and-will-be-removed-in-v5-of-transformers/53943/5

Presumably either updating the relevant argument in tweetnlp or pinning it to an older version of transformers would fix the issue. I don't have time to investigate further, but if there is interest I would be willing to attempt a fix in a PR sometime later...
