A Python package for transforming unstructured news headlines or short text snippets into structured, domain-specific summaries. Ideal for business, logistics, and transportation sectors, this tool extracts key details (entity, action, reason, location, and temporal context) and outputs them in a standardized JSON-like format.
- Extracts structured information from noisy text inputs
- Outputs results in a consistent JSON-like format
- Supports custom LLM backends via LangChain
- Easy integration with existing workflows
Install the package via pip:
```bash
pip install logistics_headline_extractor
```

Basic usage:

```python
from logistics_headline_extractor import logistics_headline_extractor
user_input = "Waymo temporarily suspends service in SF amid power outage"
result = logistics_headline_extractor(user_input=user_input)
print(result)
```

You can use any LangChain-compatible LLM by passing it to the `llm` parameter. For example, with OpenAI:

```python
from langchain_openai import ChatOpenAI
from logistics_headline_extractor import logistics_headline_extractor
llm = ChatOpenAI()
user_input = "Waymo temporarily suspends service in SF amid power outage"
result = logistics_headline_extractor(user_input=user_input, llm=llm)
```

With Anthropic:

```python
from langchain_anthropic import ChatAnthropic
from logistics_headline_extractor import logistics_headline_extractor
llm = ChatAnthropic()
user_input = "Waymo temporarily suspends service in SF amid power outage"
result = logistics_headline_extractor(user_input=user_input, llm=llm)
```

With Google Gemini:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from logistics_headline_extractor import logistics_headline_extractor
llm = ChatGoogleGenerativeAI()
user_input = "Waymo temporarily suspends service in SF amid power outage"
result = logistics_headline_extractor(user_input=user_input, llm=llm)
```

For LLM7 (the default provider), you can provide your API key directly:

```python
from logistics_headline_extractor import logistics_headline_extractor
user_input = "Waymo temporarily suspends service in SF amid power outage"
result = logistics_headline_extractor(user_input=user_input, api_key="your_api_key_here")
```

Or set it as an environment variable:

```bash
export LLM7_API_KEY="your_api_key_here"
```

Parameters:

- `user_input` (str): The text input to process
- `llm` (Optional[BaseChatModel]): LangChain LLM instance (defaults to ChatLLM7)
- `api_key` (Optional[str]): API key for LLM7 (if using the default provider)
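As a minimal sketch of the environment-variable option (assuming the default ChatLLM7 backend picks up `LLM7_API_KEY` from the environment, per the shell snippet above):

```python
from logistics_headline_extractor import logistics_headline_extractor

# Assumes LLM7_API_KEY was exported beforehand, so no api_key argument is
# passed and the default ChatLLM7 backend is used.
result = logistics_headline_extractor(
    user_input="Waymo temporarily suspends service in SF amid power outage"
)
print(result)
```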
This package uses ChatLLM7 by default. The free tier rate limits are sufficient for most use cases. For higher rate limits, you can:
- Get a free API key by registering at https://token.llm7.io/
- Pass your API key via the `api_key` parameter or the `LLM7_API_KEY` environment variable
- Use a different LLM provider by passing a custom LangChain LLM instance
The function returns a list of strings, each matching the pattern:

```json
{"entity": "...", "action": "...", "reason": "...", "location": "...", "temporal": "..."}
```

If the LLM call fails, the function raises a RuntimeError with details about the failure.
For issues and feature requests, please create an issue on our GitHub repository.
Eugene Evstafev
Email: hi@eugene.plus
GitHub: chigwell