[Inference Provider]: Add TextCLF as an inference provider #2022
textclf-api wants to merge 6 commits into huggingface:main from
Conversation
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Hey there! Thank you for your interest in becoming an Inference Provider and for the excellent work you've put into this integration! However, we're currently in a consolidation phase focusing on growing usage of Inference Providers via new features and integrations rather than expanding to new partners. This means we've temporarily paused onboarding new providers while we work on these improvements. We're not able to provide a specific timeline for when we'll resume new provider onboarding, but we'd love to revisit this integration in the future. In the meantime:
Thanks again for your contribution and understanding!
Adding TextCLF as a new Inference Provider for the @huggingface/inference library.
Note
Medium Risk
Introduces a new external-provider integration with custom request/response shaping, so failures will primarily show up as runtime API incompatibilities or malformed response handling rather than compile-time issues.
Overview
Adds TextCLF as a new third-party provider in `@huggingface/inference`, wiring it into provider selection/typing and documenting it in the README.

Implements `TextCLFConversationalTask` and a custom `TextCLFTextGenerationTask` that routes `text-generation` through TextCLF's chat-completions endpoint (including parameter mapping like `max_new_tokens` → `max_tokens`) and validates/parses responses, plus adds provider-level integration tests for chat completion and streaming.

Written by Cursor Bugbot for commit 53ec4c7. This will update automatically on new commits.
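The routing described above can be sketched as follows. This is a hypothetical illustration, not the PR's actual implementation: the type names and helper (`toChatCompletionBody`) are invented here to show the shape of a `text-generation` → chat-completions request mapping, including the `max_new_tokens` → `max_tokens` rename.

```typescript
// Hypothetical sketch of mapping a text-generation request onto a
// chat-completions body; the real TextCLFTextGenerationTask may differ.
interface TextGenerationArgs {
  inputs: string;
  parameters?: { max_new_tokens?: number; temperature?: number };
}

interface ChatCompletionBody {
  messages: { role: string; content: string }[];
  max_tokens?: number;
  temperature?: number;
}

function toChatCompletionBody(args: TextGenerationArgs): ChatCompletionBody {
  const { max_new_tokens, ...rest } = args.parameters ?? {};
  return {
    // A bare text-generation prompt becomes a single user message.
    messages: [{ role: "user", content: args.inputs }],
    // Rename to the OpenAI-style parameter the chat endpoint expects.
    ...(max_new_tokens !== undefined ? { max_tokens: max_new_tokens } : {}),
    // Remaining parameters (e.g. temperature) pass through unchanged.
    ...rest,
  };
}
```

The response side would do the inverse: validate the chat-completion payload and surface the first choice's message content as the generated text.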