Hi, thanks for posting this. The context enhancement mechanism works on tokens. It could be that spaCy tokenizes the Chinese text differently than expected, so the context words never line up with the tokens.
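As a quick check of that hypothesis, here is a minimal diagnostic sketch that prints how spaCy's Chinese pipeline tokenizes and lemmatizes a sentence, so you can see whether a context word actually appears among the token texts or lemmas. The model name, sample sentence, and context word are illustrative assumptions, not taken from the thread.

```python
# Sketch: inspect spaCy's Chinese tokenization and lemmatization.
# Assumes `pip install spacy` and `python -m spacy download zh_core_web_sm`;
# the sample text and context word below are illustrative only.
import spacy

nlp = spacy.load("zh_core_web_sm")

text = "他的电话号码是123456"   # "His phone number is 123456"
context_word = "电话"          # "phone", the context word we expect to match

doc = nlp(text)
for token in doc:
    print(f"text={token.text!r:12} lemma={token.lemma_!r}")

# If the context word never shows up as a token text or lemma,
# a token-based context enhancement has nothing to match against.
token_strings = {t.text for t in doc} | {t.lemma_ for t in doc}
print("context word present:", context_word in token_strings)
```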
OK, I think I found the answer. spaCy just doesn't support lemmatization for the Chinese language. I have updated the code to use Stanza, and it is doing lemmatization properly.
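For reference, a minimal sketch of Chinese lemmatization with Stanza, along the lines described above (the sample sentence is illustrative, not from the thread):

```python
# Sketch: Chinese lemmatization with Stanza.
# Assumes `pip install stanza`; the download fetches the Chinese models once.
import stanza

stanza.download("zh")
nlp = stanza.Pipeline(lang="zh", processors="tokenize,pos,lemma")

doc = nlp("他的电话号码是123456")
for sentence in doc.sentences:
    for word in sentence.words:
        print(f"text={word.text!r:12} lemma={word.lemma!r}")
```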
Hi everyone,
I'm trying to get context words in foreign languages to work. My code, which uses spaCy's zh model, is not picking up the context words. Is there a way to use Chinese context words to enhance the score?
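The thread does not name the library, but the mention of context words enhancing a score suggests a context-aware score enhancer such as the one in Microsoft Presidio. Assuming that, here is a minimal sketch of the kind of setup in question; the entity name, regex pattern, and Chinese context word are all hypothetical and not taken from the original post.

```python
# Sketch only: assumes this concerns Microsoft Presidio's context-aware
# score enhancement; the entity, regex, and context word are hypothetical.
from presidio_analyzer import AnalyzerEngine, Pattern, PatternRecognizer
from presidio_analyzer.nlp_engine import NlpEngineProvider

# Use spaCy's Chinese model for NLP (requires zh_core_web_sm to be installed).
nlp_engine = NlpEngineProvider(nlp_configuration={
    "nlp_engine_name": "spacy",
    "models": [{"lang_code": "zh", "model_name": "zh_core_web_sm"}],
}).create_engine()

# Hypothetical recognizer: a 6-digit number whose score should be boosted
# when the context word "电话" ("phone") appears nearby.
recognizer = PatternRecognizer(
    supported_entity="PHONE_LIKE_NUMBER",
    supported_language="zh",
    patterns=[Pattern(name="six_digits", regex=r"\d{6}", score=0.3)],
    context=["电话"],
)

analyzer = AnalyzerEngine(nlp_engine=nlp_engine, supported_languages=["zh"])
analyzer.registry.add_recognizer(recognizer)

for result in analyzer.analyze(text="他的电话号码是123456", language="zh"):
    # If the context word is matched, the score should exceed the base 0.3.
    print(result.entity_type, result.score)
```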