Cast transformer to fp16
#13552
Unanswered
znadrich-qf
asked this question in Help: Coding & Implementations
Replies: 0
If I am using `en_core_web_trf` for inference, is there a way to cast the underlying transformer model to fp16 for better throughput?
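A possible starting point (a sketch, not an answer from this thread): spaCy's `transformer` pipe wraps a PyTorch module, and PyTorch modules can be cast to half precision with `Module.half()`. The snippet below demonstrates only that casting step on a toy module; whether and how to reach the underlying module inside `en_core_web_trf` (e.g. via spacy-transformers internals) is version-dependent and not shown here.

```python
import torch
import torch.nn as nn

# Toy stand-in for the underlying PyTorch transformer module
# (the real module would come from the spaCy "transformer" pipe).
model = nn.Linear(8, 4)

# .half() casts all parameters and buffers to fp16 in place.
# This mainly helps on GPUs with fast half-precision support;
# inputs fed to the model must then also be fp16.
model.half()

print(next(model.parameters()).dtype)  # torch.float16
```

Note that after the cast, any tensors the pipeline feeds into the model would also need to be fp16, so this may require adapting the surrounding forward pass rather than casting the weights alone.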