hewwo :3
This is a learning project for backend stuff. (Tech stack below).
Premise is something like:
- user queues a job through the dashboard (generate a playlist name based on an input description)
- task is put in a Redis queue
- a containerized worker picks up the job
- worker does the job (asks the Ollama server, running llama3.2, to generate a playlist name)
- result is written to the PostgreSQL db
- displayed in the dashboard! wow!
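The queue round-trip above can be sketched roughly like this. The key name, the redis-py-style client methods (`rpush`/`blpop`), and the two injected helpers are assumptions for illustration, not the project's actual code:

```python
import json

QUEUE_KEY = "playlist:jobs"  # hypothetical key name

def enqueue_job(redis_client, description: str) -> None:
    """Dashboard side: push a playlist-name job onto the Redis list."""
    redis_client.rpush(QUEUE_KEY, json.dumps({"description": description}))

def run_worker_once(redis_client, generate_name, save_result) -> str:
    """Worker side: pop one job, ask the LLM for a name, persist the result."""
    _key, raw = redis_client.blpop(QUEUE_KEY)   # blocks until a job arrives
    job = json.loads(raw)
    name = generate_name(job["description"])    # e.g. an HTTP call to Ollama
    save_result(job["description"], name)       # e.g. an INSERT into PostgreSQL
    return name
```

A real worker would wrap `run_worker_once` in a loop and handle failures/retries.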
To run:
```
docker compose up
docker exec -it ollama-server ollama pull llama3.2:latest
```
- or do the `ollama pull` from Docker Desktop
- only need to do this once!
    - unless you delete the image :3
- go to `localhost:5000` in your browser
no (jk), more details in my Obsidian devlog
Tech stack:
- Gunicorn
- Eventlet
- PostgreSQL
- Redis
- Docker
- Flask
- Flask-WTF
- Flask-SocketIO
TODO:
- make sure `worker` and `flask-app` containers don't start until `postgres` is fully set up
- smaller images possible?
    - e.g., alpine linux instead of full python:3?
- use MCP, and maybe the Spotify API, to make actual playlists?
- nginx reverse proxy?
- ansible, make, kubernetes?
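For the start-order item above, one common pattern is a Compose healthcheck on `postgres` plus `depends_on` with `condition: service_healthy` — a sketch, where the image tag and the `-U postgres` user are assumptions:

```yaml
services:
  postgres:
    image: postgres:16
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 3s
      retries: 5

  flask-app:
    depends_on:
      postgres:
        condition: service_healthy

  worker:
    depends_on:
      postgres:
        condition: service_healthy
```

With this, Compose only starts `flask-app` and `worker` once `pg_isready` succeeds, rather than as soon as the `postgres` container exists.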
Notes:
- needed to add a Flask-SocketIO connection (`flask-app` to the user's browser)
    - otherwise it's just a non-permanent browser session, and there were issues with Flask's `flash()`
- same issue with `return redirect(url_for('dashboard'))` if it's triggered by `worker`

