A Python package designed to assist users in making informed purchasing decisions for servers, RAM, and flash storage during market price fluctuations.
- Hardware Recommendations: Get tailored hardware configurations based on your specific needs.
- Price Trend Analysis: Understand market price trends to make informed decisions.
- Optimal Purchasing Strategies: Receive data-driven advice to secure the best value for your investments.
- Flexible LLM Integration: Use any LangChain-compatible LLM for processing.
```bash
pip install pricewise_hardware
```

```python
from pricewise_hardware import pricewise_hardware
user_input = "I need a server with 64GB RAM and 1TB SSD, my budget is $2000."
response = pricewise_hardware(user_input)
print(response)
```

You can use any LangChain-compatible LLM by passing it to the `pricewise_hardware` function.

```python
from langchain_openai import ChatOpenAI
from pricewise_hardware import pricewise_hardware
llm = ChatOpenAI()
response = pricewise_hardware(user_input, llm=llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from pricewise_hardware import pricewise_hardware
llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")  # recent langchain_anthropic releases require an explicit model name
response = pricewise_hardware(user_input, llm=llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from pricewise_hardware import pricewise_hardware
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # ChatGoogleGenerativeAI requires an explicit model name
response = pricewise_hardware(user_input, llm=llm)
print(response)
```

By default, the package uses `ChatLLM7` from `langchain_llm7`.
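The default model can also be constructed and passed in explicitly, just like the other LangChain integrations above. A minimal sketch, assuming `ChatLLM7` can be instantiated without arguments:

```python
from langchain_llm7 import ChatLLM7
from pricewise_hardware import pricewise_hardware

user_input = "I need a server with 64GB RAM and 1TB SSD, my budget is $2000."

# Build the default LLM7-backed chat model explicitly (no-argument constructor
# assumed here) and hand it in via the llm parameter.
llm = ChatLLM7()
response = pricewise_hardware(user_input, llm=llm)
print(response)
```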
If you want to use a specific API key, you can pass it directly or set it as an environment variable:

```bash
export LLM7_API_KEY="your_api_key"
```

```python
from pricewise_hardware import pricewise_hardware
user_input = "I need a server with 64GB RAM and 1TB SSD, my budget is $2000."
response = pricewise_hardware(user_input, api_key="your_api_key")
print(response)
```

Parameters:

- `user_input` (str): The user input text to process.
- `llm` (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key` (Optional[str]): The API key for LLM7. If not provided, the environment variable `LLM7_API_KEY` will be used.
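Putting the parameters together, a minimal sketch (the OpenAI model and the explicit environment lookup are illustrative assumptions, not requirements of the package):

```python
import os

from langchain_openai import ChatOpenAI
from pricewise_hardware import pricewise_hardware

user_input = "I need a server with 64GB RAM and 1TB SSD, my budget is $2000."

# Option 1: hand in any LangChain-compatible chat model via `llm`.
response = pricewise_hardware(user_input, llm=ChatOpenAI())

# Option 2: keep the default ChatLLM7 and pass the LLM7 key via `api_key`
# (if omitted, the package falls back to the LLM7_API_KEY environment variable).
response = pricewise_hardware(user_input, api_key=os.environ.get("LLM7_API_KEY"))

print(response)
```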
The default rate limits of the LLM7 free tier are sufficient for most use cases of this package. If you need higher rate limits, you can get a free API key by registering at LLM7.
If you encounter any issues, please report them on the GitHub issues page.
- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell