Home
Nicolay Rusnachenko edited this page Dec 10, 2025 · 14 revisions
A lightweight, no-strings-attached framework for your LLM that lets you apply a Chain-of-Thought-like prompt schema (see the related section) to massive textual collections.
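To make the schema idea concrete: each step's prompt is a template whose placeholders are filled from the input record and from the outputs of earlier steps, so later prompts can build on earlier answers. The following is a minimal, self-contained sketch of that substitution in plain Python; the schema contents, `run_schema` helper, and dummy model are illustrative and are not part of the bulk-chain API.

```python
# Hypothetical illustration of Chain-of-Thought-style schema chaining:
# each step's "out" value becomes available as a placeholder for later prompts.
schema = [
    {"prompt": "Summarize: {text}", "out": "summary"},
    {"prompt": "Translate to French: {summary}", "out": "translation"},
]

def run_schema(record, schema, llm):
    """Fill each prompt from the record, call the model, store the output."""
    for step in schema:
        prompt = step["prompt"].format(**record)
        record[step["out"]] = llm(prompt)
    return record

# Dummy "model" that just echoes its prompt, for demonstration only.
result = run_schema({"text": "hello"}, schema, llm=lambda p: f"<{p}>")
# result["translation"] now embeds the step-1 output inside the step-2 prompt.
```

The key property is that `record` accumulates every `out` field, which is what allows step 3 in the example below to reference both `{intent}` and `{entities}`.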
- Download this provider:

  ```bash
  wget https://raw.githubusercontent.com/nicolay-r/nlp-thirdgate/refs/heads/master/llm/replicate_104.py
  ```

- Usage (version 1.2.0):
```python
from bulk_chain.core.utils import dynamic_init
from bulk_chain.api import iter_content

content_it = iter_content(
    # 1. Your schema.
    schema=[
        {"prompt": "Given customer message: {text}, detect the customer's intent?", "out": "intent"},
        {"prompt": "Given customer message: {text}, extract relevant entities?", "out": "entities"},
        {"prompt": "Given intent: {intent} and entities: {entities}, generate a concise response or action recommendation for the support agent.", "out": "action"}
    ],
    # 2. Your third-party model implementation.
    llm=dynamic_init(class_filepath="replicate_104.py")(
        api_token="<API-KEY>",
        model_name="meta/meta-llama-3-70b-instruct"),
    # 3. Customize your inference and result-providing modes.
    infer_mode="batch_async",
    # 4. Your iterator of dictionaries.
    input_dicts_it=YOUR_DATA_IT,
)
```
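The call above only builds an iterator; nothing is sent to the model until you consume it. A hedged sketch of draining the results to JSONL follows. The per-record dictionary shape (original fields plus each schema `out` key) is an assumption inferred from the schema above, and `content_it` is stubbed here so the snippet is self-contained.

```python
import io
import json

# Stub standing in for the real content_it, which would yield one dict per
# input record with the original fields plus each schema "out" key filled in.
content_it = iter([
    {"text": "refund my order", "intent": "refund",
     "entities": "order", "action": "issue refund"},
])

# Stand-in for open("results.jsonl", "w"); use a real file in practice.
buffer = io.StringIO()
for record in content_it:
    buffer.write(json.dumps(record) + "\n")

lines = buffer.getvalue().splitlines()
```

Writing one JSON object per line keeps memory flat no matter how large the input collection is, which matches the framework's focus on massive textual collections.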