harshikaalagh-netizen/Content-audit

Content Audit Skill — Setup Guide

What this does

Audits your content library using a Job / Evidence / Gap framework. It connects to your analytics platform, pulls real traffic data, flags every piece of content automatically, and produces a prioritized CSV of recommended actions.

Quick start

1. Install dependencies

You need Python 3 and requests:

pip3 install requests

If using Google Search Console or GA4, also install:

pip3 install google-auth google-api-python-client  # for GSC
pip3 install google-analytics-data google-auth     # for GA4

2. Connect your analytics

Pick your platform and set up the connection.

PostHog

You need three things:

  • API key: Go to PostHog → Settings → Personal API Keys → Create. Give it read-only scopes for Query, Insight, Event definition, and Person.
  • Project ID: the number after /project/ in your PostHog URL
  • Host: https://us.posthog.com (US) or https://eu.posthog.com (EU)

Set them in your terminal:

export POSTHOG_API_KEY="phx_your_key_here"
export POSTHOG_PROJECT_ID="12345"
export POSTHOG_HOST="https://us.posthog.com"
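Once the variables are exported, it's worth sanity-checking them before pulling data. The sketch below is one way to do that, assuming the standard PostHog REST route /api/projects/<id>/ and a personal API key sent as a Bearer token; adjust if your instance differs:

```python
import os

import requests


def posthog_headers(api_key: str) -> dict:
    """PostHog personal API keys are sent as a Bearer token."""
    return {"Authorization": f"Bearer {api_key}"}


def check_posthog_connection() -> bool:
    """Hit the project endpoint with the exported credentials.

    A 200 means the key, project ID, and host all line up.
    """
    host = os.environ["POSTHOG_HOST"].rstrip("/")
    project_id = os.environ["POSTHOG_PROJECT_ID"]
    resp = requests.get(
        f"{host}/api/projects/{project_id}/",
        headers=posthog_headers(os.environ["POSTHOG_API_KEY"]),
        timeout=10,
    )
    return resp.status_code == 200


if __name__ == "__main__" and "POSTHOG_API_KEY" in os.environ:
    print("PostHog connection OK" if check_posthog_connection()
          else "Check your key, project ID, and host")
```

A 401 here usually means the key lacks the read-only scopes listed above; a 404 usually means the wrong project ID or region host.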

Google Search Console

You need a Google Cloud service account:

  1. Go to Google Cloud Console → create a project (or use existing)
  2. Enable the "Search Console API" in APIs & Services → Library
  3. Go to APIs & Services → Credentials → Create Credentials → Service Account
  4. Click the service account → Keys tab → Add Key → JSON (downloads a .json file)
  5. Go to Google Search Console → your property → Settings → Users and permissions → Add user → paste the service account email (the @...iam.gserviceaccount.com one) → set to Full

Set in your terminal:

export GSC_CREDENTIALS_PATH="/path/to/your-credentials.json"

Google Analytics 4

Same service account setup as GSC, but:

  1. Enable the "Analytics Data API" (not the regular Analytics API)
  2. In GA4 → Admin → Property → Property Access Management → add the service account email as Viewer
  3. Your property ID is in GA4 → Admin → Property Settings (just the number)

Set in your terminal:

export GA4_CREDENTIALS_PATH="/path/to/your-credentials.json"
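To confirm the service account actually has Viewer access, you can run a tiny report against the Analytics Data API. This is a sketch, not part of the bundled scripts; the imports live inside the function so the file still loads before google-analytics-data is installed, and the hardcoded property ID is a placeholder you'd replace with your own:

```python
import os


def ga4_property(property_id: str) -> str:
    """GA4's Data API expects the numeric ID prefixed with 'properties/'."""
    return f"properties/{property_id}"


def smoke_test_report(property_id: str, credentials_path: str):
    """Run a minimal 28-day pageview report to verify access."""
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
    )

    client = BetaAnalyticsDataClient.from_service_account_file(credentials_path)
    request = RunReportRequest(
        property=ga4_property(property_id),
        dimensions=[Dimension(name="pagePath")],
        metrics=[Metric(name="screenPageViews")],
        date_ranges=[DateRange(start_date="28daysAgo", end_date="today")],
    )
    return client.run_report(request)


if __name__ == "__main__" and "GA4_CREDENTIALS_PATH" in os.environ:
    # Replace "123456789" with your own property ID from step 3.
    report = smoke_test_report("123456789", os.environ["GA4_CREDENTIALS_PATH"])
    print(f"{len(report.rows)} pages returned")
```

A PermissionDenied error here means step 2 (Property Access Management) hasn't taken effect for this service account.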

3. Pull your data

First, see what's available:

python3 scripts/pull_posthog_data.py --discover
python3 scripts/pull_gsc_data.py --discover --site-url "https://yoursite.com"
python3 scripts/pull_ga4_data.py --discover --property-id "123456789"

Then pull the CSV:

# PostHog
python3 scripts/pull_posthog_data.py --output audit-data.csv --url-pattern "/blog%" --conversion-events "sign_up,download"

# GSC
python3 scripts/pull_gsc_data.py --output audit-data.csv --site-url "https://yoursite.com" --url-filter "/blog/"

# GA4
python3 scripts/pull_ga4_data.py --output audit-data.csv --property-id "123456789" --url-filter "/blog/"

Replace the conversion events and URL filters with your own.
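Before moving on to step 4, a quick look at the export confirms the pull worked. This assumes the default output name audit-data.csv used above:

```shell
# sanity-check the export before running the audit
if [ -f audit-data.csv ]; then
  head -n 5 audit-data.csv   # header row plus the first few pages
  wc -l audit-data.csv       # one line per URL, plus the header
fi
```

If the row count is just 1 (header only), loosen your URL filter and re-run the pull.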

4. Run the audit

Option A: Warp

If you use Warp, install the .skill file and prompt: "Run a content audit on my blog content."

Option B: Claude / ChatGPT / any AI tool

  1. Create a project (or custom GPT, or system prompt)
  2. Add SKILL.md and references/framework.md as context
  3. Attach your audit-data.csv
  4. Prompt: "Run a content audit on this CSV"

The instructions will guide the AI to ask you about your product and ICP before starting.

5. What you get

  • A flagged CSV with every piece of content rated RED / YELLOW / GREEN
  • Grouped summary by content type
  • Deep analysis on your top 20-30 pieces
  • Gap analysis: what to create next, what to scale, what to kill
