[inference provider] Add charity-engine as an inference provider #1958

Open

tristanolive wants to merge 1 commit into huggingface:main from tristanolive:add-provider-charityengine

Conversation

@tristanolive

[CharityEngine](https://charityengine.com) is "the crowdsourced cloud": a high-throughput compute service running on a global network of volunteered devices.

Our distributed platform provides an ecosystem of compute, storage, and web-crawling capabilities, and this integration makes it available for running batch inference on HF.
@SBrandeis
Contributor

Hey there!
Thank you for your interest in becoming an Inference Provider and for the excellent work you've put into this integration!
We really appreciate the effort.

However, we're currently in a consolidation phase focusing on growing usage of Inference Providers via new features and integrations rather than expanding to new partners. This means we've temporarily paused onboarding new providers while we work on these improvements.

We're not able to provide a specific timeline for when we'll resume new provider onboarding, but we'd love to revisit this integration in the future.

Thanks again for your contribution and understanding!

@SBrandeis SBrandeis added the inference-providers integration of a new or existing Inference Provider label Feb 10, 2026
@GRMatt

GRMatt commented Feb 10, 2026

@SBrandeis , two thoughts/suggestions --

  1. If you're not onboarding new partners, could you kindly note this at "How to be registered as an inference provider on the Hub", or in another obvious place? (It's a non-trivial amount of work to set up these integrations.)

  2. FWIW, there are a decent number of currently active providers, but considerable swaths of the universe of models are not covered by any of them; at Charity Engine we'd planned on offering models not provided by others. Also, our community-powered (distributed) model will enable us to provide quite low-cost services. So I hope you'll keep us in mind when you get around to re-opening the program.
