This repository was archived by the owner on Dec 16, 2022. It is now read-only.
- Hello! Could anyone give me an idea of how to make this work?
- If it still matters to you now (almost a year later): what we did was un-tar-gz the coref-spanbert-large-2021.03.10.tar.gz model archive and manually edit the config.json inside it. The change is to replace every occurrence of "spanbert-large-cased" with the local path where the spanbert-large-cased model is stored. Re-compressing the other files together with the modified config.json into a new tar.gz, and loading that with Predictor.from_path, works.
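The steps above can be sketched as a small script. This is a minimal illustration of the workaround, not AllenNLP functionality: the function name and all paths are hypothetical, and it assumes the archive contains a config.json at its top level, as the coref model archives do.

```python
import tarfile
import tempfile
from pathlib import Path

def localize_model_archive(archive: str, local_spanbert: str, output: str) -> None:
    """Unpack a model archive, point config.json at a local copy of
    spanbert-large-cased, and repack it (sketch of the workaround above)."""
    with tempfile.TemporaryDirectory() as tmp:
        # 1. Unpack the original archive.
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(tmp)

        # 2. Replace every occurrence of the hub model name with the local path.
        config = Path(tmp) / "config.json"
        config.write_text(
            config.read_text().replace("spanbert-large-cased", local_spanbert)
        )

        # 3. Repack everything, including the edited config.json.
        with tarfile.open(output, "w:gz") as tar:
            for item in Path(tmp).iterdir():
                tar.add(item, arcname=item.name)
```

The rewritten archive can then be passed to `Predictor.from_path` in place of the original one.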
- I want to resolve coreferences offline (without Internet access) using AllenNLP and the coref-spanbert-large model. I am trying to do it the way described here: https://demo.allennlp.org/coreference-resolution

  As I understand it, I first need to download the transformer model spanbert-large-cased. I did that, saved it to a local folder, and passed that path as `transformer_model_name` in my code. Here is my code:

  Despite the path I pass via `local_config_path`, it seems to try to load files from another folder, `C:\Users\aap/.cache\huggingface\transformers`. Here is the traceback:

  So what am I doing wrong? How do I make it work correctly?
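For context, the traceback points at the default transformers cache, which the library consults unless told otherwise. A minimal sketch of forcing fully offline behavior, assuming the standard Hugging Face environment variables (the variables are real; the paths and the model archive location are hypothetical placeholders):

```python
import os

# Hugging Face offline switches -- set these before importing AllenNLP
# or transformers so they take effect.
os.environ["TRANSFORMERS_OFFLINE"] = "1"       # forbid network lookups
os.environ["HF_HOME"] = r"C:\models\hf-cache"  # redirect the cache away from
                                               # C:\Users\<user>\.cache\huggingface

# With the environment set, the predictor loads as usual:
#   from allennlp.predictors.predictor import Predictor
#   predictor = Predictor.from_path(
#       r"C:\models\coref-spanbert-large-2021.03.10.tar.gz"
#   )
```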