A Streamlit application that generates content for blog posts, LinkedIn, and X (Twitter) based on a given subject and target audience. The application uses AI to create tailored content for different platforms in parallel.
This project is a Python/LangChain implementation of a simple n8n content-creation workflow. For visual demonstrations of the workflow, check out these YouTube videos:
- English: AI Automation by Nate Herk
- German: KI Automatisierung by Sebastian Claes
This application requires Docker Desktop to run. If you don't have Docker Desktop installed, follow these steps:

**Windows:**
- Download Docker Desktop from Docker's official website
- Run the installer and follow the installation wizard
- Start Docker Desktop from the Start menu
- Wait for Docker to start (you'll see the Docker icon in the system tray turn solid)

**macOS:**
- Download Docker Desktop from Docker's official website
- Open the downloaded `.dmg` file and drag Docker to your Applications folder
- Start Docker from your Applications folder
- Wait for Docker to start (you'll see the Docker icon in the menu bar)

**Linux:**
- Follow the instructions for your specific distribution in Docker's documentation
- Start Docker with `sudo systemctl start docker`
- Verify the installation with `docker --version`
1. Clone the repository:

   ```bash
   git clone https://github.com/ankoehn/ai-content-writer.git
   cd ai-content-writer
   ```

2. Create your environment file:

   ```bash
   cp .env_template .env
   ```

3. Edit the `.env` file and add your API keys:
   - Required: `TAVILY_API_KEY` for search functionality
   - Required: `OPENAI_API_KEY` for OpenAI models (default provider)
   - Optional: `DEEPSEEK_API_KEY` if using DeepSeek models

4. Configure your LLM provider in the `.env` file:
   - For OpenAI (default):

     ```
     LLM_PROVIDER=openai
     LLM_MODEL=gpt-4o
     ```

   - For DeepSeek:

     ```
     LLM_PROVIDER=deepseek
     LLM_MODEL=deepseek-chat
     DEEPSEEK_API_KEY=your_deepseek_api_key
     ```
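The provider selection above can be sketched in plain Python. This is an illustrative example only, assuming the settings are read from the `.env` file as environment variables; the function name `select_llm` and its exact behavior are hypothetical, not the project's actual code.

```python
def select_llm(env: dict) -> tuple[str, str]:
    """Return (provider, model) based on .env-style settings.

    Hypothetical sketch: the real app's internals may differ.
    """
    provider = env.get("LLM_PROVIDER", "openai")  # OpenAI is the default provider
    if provider == "openai":
        model = env.get("LLM_MODEL", "gpt-4o")
    elif provider == "deepseek":
        model = env.get("LLM_MODEL", "deepseek-chat")
        # DeepSeek needs its own API key in addition to the provider setting
        if not env.get("DEEPSEEK_API_KEY"):
            raise ValueError("DEEPSEEK_API_KEY is required when using DeepSeek")
    else:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider}")
    return provider, model

print(select_llm({"LLM_PROVIDER": "openai"}))  # ('openai', 'gpt-4o')
```

Either provider works with the same application code; only the environment variables change.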
Start the application using Docker Compose:

```bash
# Run in foreground
docker compose up

# Or run in background
docker compose up -d
```

Access the application in your web browser at: http://localhost:8085
1. Open your browser and navigate to http://localhost:8085
2. Fill in the form with:
   - Campaign name
   - Content subject (what you want to write about)
   - Target audience (who the content is for)
3. Click "Create" to generate content
4. View the generated content for Blog, LinkedIn, and X
5. Access previously generated content from the history sidebar
6. Export your content history to Excel using the export button
- The application uses the Tavily search engine to find relevant information about the content subject
- It then processes this information using three AI agents, each specialized for one platform:
  - Blog Agent: creates a concise two-paragraph blog post
  - LinkedIn Agent: creates an engaging LinkedIn post with emojis and hashtags
  - X Agent: creates a short, impactful tweet with emojis and hashtags
- Generated content is saved to a local JSON file for persistence
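The fan-out to three platform agents can be sketched with Python's standard `concurrent.futures`. This is a minimal illustration of the parallel pattern, not the project's actual code: the agent functions below are stand-ins for the real LangChain agents, and `generate_all` is a hypothetical name.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in agent functions; in the real app these would call an LLM.
def blog_agent(research: str) -> str:
    return f"Blog post based on: {research}"

def linkedin_agent(research: str) -> str:
    return f"LinkedIn post based on: {research}"

def x_agent(research: str) -> str:
    return f"Tweet based on: {research}"

def generate_all(research: str) -> dict:
    """Run all three platform agents in parallel and collect the results."""
    agents = {"blog": blog_agent, "linkedin": linkedin_agent, "x": x_agent}
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = {name: pool.submit(fn, research) for name, fn in agents.items()}
        return {name: future.result() for name, future in futures.items()}

results = generate_all("Tavily search summary")
print(sorted(results))  # ['blog', 'linkedin', 'x']
```

Because the three agents are independent of each other (they share only the search results as input), running them concurrently keeps total latency close to that of the slowest single agent.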
- `app.py`: Main Streamlit application
- `docker-compose.yml`: Docker Compose configuration
- `Dockerfile`: Docker container configuration
- `requirements.txt`: Python dependencies
- `writer/`: Core content generation functionality
- `ai/`: AI agents and LLM processing
- `searchengine/`: Search engine integration
- `model.py`: Data models
- `config.py`: Configuration settings
- `utils/`: Utility functions
This project is licensed under the MIT License - see the LICENSE file for details.