extensions/firestore-huggingface-inference-api/POSTINSTALL.md
You can test out this extension right away!

Depending on the task you want to run, add a new Firestore document to `${param:collectionPath}`.

For example, if you want to run a text classification task using the model [`distilbert-base-uncased-finetuned-sst-2-english`](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english), add a new document to `${param:collectionPath}` with the following fields:
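As a minimal sketch of what such a document might contain: the Hugging Face inference API takes its text under an `inputs` key, so the document presumably carries the same shape. The field name here is an assumption, not confirmed by the extension's docs.

```python
# Hypothetical shape of the Firestore document that requests an inference.
# "inputs" mirrors the Hugging Face API payload key; the extension may use
# different field names -- check its documentation.
doc = {
    "inputs": "I love this movie!",  # text to classify
}

# With firebase-admin you would write it with something like:
#   db.collection("inferences").add(doc)
# (omitted here so the sketch runs without credentials)
print(doc["inputs"])
```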
# Trigger Hugging Face inference API from Firestore

**Author**: undefined

**Description**: This extension triggers the Hugging Face inference API when a new document is created in a Firestore collection.

**Details**: Use this extension to run inferences in a Firestore collection using the [Hugging Face inference API](https://huggingface.co/docs/api-inference).
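For context, the request the extension presumably sends to the inference API looks like the following. This is an illustrative sketch of the documented API shape (POST to `https://api-inference.huggingface.co/models/<model-id>` with a bearer token), not the extension's actual code; the token and input values are placeholders, and no request is sent.

```python
import json

# Placeholder values -- never hard-code a real access token.
MODEL_ID = "distilbert-base-uncased-finetuned-sst-2-english"
HF_ACCESS_TOKEN = "hf_xxxxx"

# Per the Hugging Face inference API docs: model-scoped URL, bearer auth,
# and a JSON body whose "inputs" field carries the text to run through
# the model.
url = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
headers = {"Authorization": f"Bearer {HF_ACCESS_TOKEN}"}
payload = json.dumps({"inputs": "I love this movie!"})

print(url)
```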
This extension uses other Firebase or Google Cloud Platform services which may have associated charges.

When you use Firebase Extensions, you're only charged for the underlying resources that you use. A paid-tier billing plan is only required if the extension uses a service that requires a paid-tier plan, for example calling to a Google Cloud Platform API or making outbound network requests to non-Google services. All Firebase services offer a free tier of usage. [Learn more about Firebase billing.](https://firebase.google.com/pricing)
**Configuration Parameters:**

- Hugging Face Access Token: You can find your API token on your [Hugging Face account page](https://huggingface.co/settings/token). From the Hugging Face docs: you should see a token `hf_xxxxx` (old tokens are `api_XXXXXXXX` or `api_org_XXXXXXX`). If you do not submit your API token when sending requests to the API, you will not be able to run inference on your private models.

- Model ID: The Model ID from the [Hugging Face Model Hub](https://huggingface.co/models). Check the [recommended models for each ML task available](https://api-inference.huggingface.co/docs/python/html/detailed_parameters.html#detailed-parameters), or the [Tasks](https://huggingface.co/tasks) overview.

- Inference Collection Path: New inferences using the Hugging Face Inference API can be made by adding a new document to this collection path.

- The task to run the inference on: [Check more in the Hugging Face docs](https://huggingface.co/docs/api-inference/detailed_parameters), or the [Tasks](https://huggingface.co/tasks) overview.

- Custom Inference Endpoint: If you want to use a custom model hosted on your own server, you can specify the endpoint here.

- Cloud Functions location: Where do you want to deploy the functions created for this extension? For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations).
**Cloud Functions:**

- **triggerInference:** Firestore onCreate-triggered function that runs an inference on ${param:MODEL_ID} when a new document is created in the collection ${param:COLLECTION_PATH}
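The `triggerInference` flow described above can be sketched as a pure function: on document create, read the input, call the inference API, and write the response back onto the document. Function and field names here are assumptions for illustration, not the extension's actual implementation, and the API call is stubbed so the sketch runs offline.

```python
def trigger_inference(doc: dict, run_inference) -> dict:
    """Simulate the onCreate handler: `doc` is the new Firestore
    document's data; `run_inference` stands in for the Hugging Face
    inference API call."""
    response = run_inference(doc["inputs"])
    # The extension would write the model's result back to the document.
    return {**doc, "response": response}

# Stubbed API call returning a text-classification-style result.
fake_api = lambda text: [{"label": "POSITIVE", "score": 0.99}]

updated = trigger_inference({"inputs": "I love this movie!"}, fake_api)
print(updated["response"][0]["label"])
```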
**Access Required**:

This extension will operate with the following project IAM roles:

- datastore.user (Reason: This role is required to read/write from the Cloud Firestore database.)