Audits your content library using a Job / Evidence / Gap framework. Connects to your analytics, pulls real data, auto-flags everything, and gives you a prioritized CSV with actions.
You need Python 3 and requests:
```bash
pip3 install requests
```
If using Google Search Console or GA4, also install:
```bash
pip3 install google-auth google-api-python-client  # for GSC
pip3 install google-analytics-data google-auth     # for GA4
```
Pick your platform and set up the connection.
You need three things:
- API key: Go to PostHog → Settings → Personal API Keys → Create. Give it read-only scopes for Query, Insight, Event definition, and Person.
- Project ID: It's in your PostHog URL: `posthog.com/project/<THIS_NUMBER>`
- Host: `https://us.posthog.com` (US) or `https://eu.posthog.com` (EU)
Set them in your terminal:
```bash
export POSTHOG_API_KEY="phx_your_key_here"
export POSTHOG_PROJECT_ID="12345"
export POSTHOG_HOST="https://us.posthog.com"
```
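If you want to see what the pull script does under the hood (or debug a failing connection), here is a minimal sketch of querying PostHog's HogQL query endpoint with `requests`, reading the three environment variables above. The endpoint path and request shape follow PostHog's public query API docs; verify them against your PostHog version.

```python
import os


def posthog_query_url(host: str, project_id: str) -> str:
    """Build the HogQL query endpoint URL for a PostHog project."""
    return f"{host.rstrip('/')}/api/projects/{project_id}/query/"


def run_hogql(sql: str) -> dict:
    """POST a HogQL query using the env vars set above; returns parsed JSON."""
    import requests  # imported here so the URL helper stays dependency-free

    resp = requests.post(
        posthog_query_url(os.environ["POSTHOG_HOST"], os.environ["POSTHOG_PROJECT_ID"]),
        headers={"Authorization": f"Bearer {os.environ['POSTHOG_API_KEY']}"},
        json={"query": {"kind": "HogQLQuery", "query": sql}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```

A quick connectivity test is `run_hogql("SELECT 1")`; a 401 means the key or its scopes are wrong, a 404 usually means the project ID or host region is wrong.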
You need a Google Cloud service account:
- Go to Google Cloud Console → create a project (or use existing)
- Enable the "Search Console API" in APIs & Services → Library
- Go to APIs & Services → Credentials → Create Credentials → Service Account
- Click the service account → Keys tab → Add Key → JSON (downloads a .json file)
- Go to Google Search Console → your property → Settings → Users and permissions → Add user → paste the service account email (the `@...iam.gserviceaccount.com` one) → set to Full
Set in your terminal:
```bash
export GSC_CREDENTIALS_PATH="/path/to/your-credentials.json"
```
Same service account setup as GSC, but:
- Enable the "Analytics Data API" (not the regular Analytics API)
- In GA4 → Admin → Property → Property Access Management → add the service account email as Viewer
- Your property ID is in GA4 → Admin → Property Settings (just the number)
Set in your terminal:
```bash
export GA4_CREDENTIALS_PATH="/path/to/your-credentials.json"
```
First, see what's available:
```bash
python3 scripts/pull_posthog_data.py --discover
python3 scripts/pull_gsc_data.py --discover --site-url "https://yoursite.com"
python3 scripts/pull_ga4_data.py --discover --property-id "123456789"
```
Then pull the CSV:
```bash
# PostHog
python3 scripts/pull_posthog_data.py --output audit-data.csv --url-pattern "/blog%" --conversion-events "sign_up,download"

# GSC
python3 scripts/pull_gsc_data.py --output audit-data.csv --site-url "https://yoursite.com" --url-filter "/blog/"

# GA4
python3 scripts/pull_ga4_data.py --output audit-data.csv --property-id "123456789" --url-filter "/blog/"
```
Replace the conversion events and URL filters with your own.
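Before handing the CSV to the AI, a quick sanity check that the pull actually produced data saves a round trip. A stdlib-only sketch (the exact column names depend on which pull script you ran):

```python
import csv


def csv_columns(path: str) -> list[str]:
    """Return the header row of a CSV file."""
    with open(path, newline="") as f:
        return next(csv.reader(f))


def row_count(path: str) -> int:
    """Count data rows, excluding the header."""
    with open(path, newline="") as f:
        return sum(1 for _ in f) - 1
```

If `row_count("audit-data.csv")` is zero, loosen the URL filter or widen the date range before going further.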
If you use Warp, install the .skill file and prompt: "Run a content audit on my blog content."
- Create a project (or custom GPT, or system prompt)
- Add `SKILL.md` and `references/framework.md` as context
- Attach your `audit-data.csv`
- Prompt: "Run a content audit on this CSV"
The instructions will guide the AI to ask you about your product and ICP before starting.
- A flagged CSV with every piece of content rated RED / YELLOW / GREEN
- Grouped summary by content type
- Deep analysis on your top 20-30 pieces
- Gap analysis: what to create next, what to scale, what to kill
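The RED / YELLOW / GREEN flagging can be pictured as a simple pass over the CSV. The thresholds and column names below are purely illustrative assumptions; the skill's actual rules come from `references/framework.md` and the AI's judgment, not a fixed script.

```python
import csv


def flag_row(row: dict) -> str:
    """Toy Job/Evidence/Gap flag -- thresholds are illustrative, not the skill's."""
    views = int(row.get("pageviews", 0) or 0)
    conversions = int(row.get("conversions", 0) or 0)
    if views == 0:
        return "RED"      # no evidence the piece does any job
    if conversions == 0:
        return "YELLOW"   # gets traffic but converts nobody: needs a decision
    return "GREEN"        # earns its keep


def flag_csv(in_path: str, out_path: str) -> None:
    """Copy the CSV, appending a 'flag' column per row."""
    with open(in_path, newline="") as f_in, open(out_path, "w", newline="") as f_out:
        reader = csv.DictReader(f_in)
        writer = csv.DictWriter(f_out, fieldnames=list(reader.fieldnames) + ["flag"])
        writer.writeheader()
        for row in reader:
            row["flag"] = flag_row(row)
            writer.writerow(row)
```

In practice the AI weighs far more signal than this (content type, ICP fit, decay trends), which is why the framework file and your answers about product and ICP matter.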