YouTube Comment Deleter By Holmes & Watson (c) 2025
Why I Created This Script: This script was developed to automate the deletion of YouTube comments, addressing the deliberate lack of an efficient manual process. In the UK, concerns about freedom of expression have motivated the need to manage one's online history easily, and manual deletion (one comment at a time) is unrealistic. YouTube's API quota effectively limits deletions to about 200 per day, a restriction this script cannot bypass, but it streamlines the task for users.
Note: The views expressed are the author's and may not reflect all users' experiences. Use this tool responsibly within YouTube's terms of service.
This script was created to help users manage their online presence efficiently. YouTube's interface only allows manual deletion of comments one-by-one, which is impractical for users with hundreds or thousands of comments. This tool respects YouTube's API rate limits and operates within their terms of service.
Use responsibly and at your own discretion.
- Python 3.7 or later - Download from python.org
- Google Cloud Project with YouTube Data API enabled
- Your YouTube comment data from Google Takeout
Install these using Command Prompt:
pip install google-auth-oauthlib google-api-python-client google-auth-httplib2
- Go to python.org and download Python 3.7+
- Important: During installation, check "Add Python to PATH"
- Test installation by opening Command Prompt and typing:
python --version
Open Command Prompt and run:
pip install google-auth-oauthlib google-api-python-client google-auth-httplib2
- Go to Google Cloud Console
- Create a new project (or select existing one)
- Enable the YouTube Data API v3:
- Go to "APIs & Services" β "Library"
- Search for "YouTube Data API v3" and enable it
- Create credentials:
- Go to "APIs & Services" β "Credentials"
- Click "Create Credentials" β "OAuth 2.0 Client IDs"
- Choose "Desktop Application"
- Download the JSON file and rename it to client_secret.json
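Before moving on, you can optionally confirm the credentials file is in place and looks right. This is a minimal sketch, not part of the main script; it assumes the file lives in C:\ytcdl\ and that, as is typical for Desktop application credentials, the JSON is wrapped in an "installed" section:

import json
import os

# Quick check for the OAuth client file; adjust the path if you keep it elsewhere.
path = r"C:\ytcdl\client_secret.json"

if not os.path.exists(path):
    print(f"Missing: {path}")
else:
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)
    # Desktop-application credentials are usually wrapped in an "installed" key.
    if "installed" in data and "client_id" in data["installed"]:
        print("client_secret.json looks like a Desktop app OAuth client.")
    else:
        print("File found, but it does not look like Desktop app credentials.")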
- Go to Google Takeout
- Click "Deselect all", then find and select "YouTube and YouTube Music"
- Click "Multiple formats" and ensure "Comments" is checked
- Choose export format (ZIP recommended)
- Create export and wait for download link (can take hours/days)
- Download and extract the ZIP file
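If you would rather extract the archive with Python than with File Explorer, a minimal sketch using only the standard library is below. The archive path is a placeholder, and the note about the top-level Takeout folder is an assumption about how your export is packaged:

import zipfile

# Placeholder archive name - use the actual file Google Takeout gave you.
archive = r"C:\Users\YourName\Downloads\takeout-20250101T000000Z-001.zip"

# Extract everything into C:\ytcdl\ so the layout matches the structure below.
with zipfile.ZipFile(archive) as zf:
    zf.extractall(r"C:\ytcdl")

# Takeout archives often unpack into a top-level "Takeout" folder. If yours does,
# move the "YouTube and YouTube Music" folder up into C:\ytcdl so the paths match.
print("Extraction complete.")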
Create this folder structure:
C:\ytcdl\
├── client_secret.json (from Step 3)
├── delete_youtube_comments.py (script below)
└── YouTube and YouTube Music\
    └── comments\
        ├── comments.csv
        ├── comments(1).csv
        ├── comments(2).csv
        └── ... (all your comment CSV files)
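Once everything is extracted, you can optionally run a quick sanity check that the CSV files are where the script expects them and see roughly how many comments they hold. This sketch reuses the same folder path and "Comment ID" column name as the main script below:

import csv
import os

comment_dir = r"C:\ytcdl\YouTube and YouTube Music\comments"
total = 0

for name in os.listdir(comment_dir):
    if name.endswith(".csv") and "comments" in name.lower():
        with open(os.path.join(comment_dir, name), "r", encoding="utf-8") as f:
            # Count rows that actually carry a Comment ID, matching the main script's filter.
            total += sum(1 for row in csv.DictReader(f) if row.get("Comment ID"))

print(f"Found {total} comments across the CSV files.")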
Create a new file called delete_youtube_comments.py in C:\ytcdl\ and paste the script from the next section.
import os
import time
import logging
import csv
import google_auth_oauthlib.flow
import googleapiclient.discovery
import googleapiclient.errors
from datetime import datetime
import google.oauth2.credentials
# Set up logging
logging.basicConfig(
    filename="delete_youtube_comments.log",
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s"
)
# API configuration
SCOPES = ["https://www.googleapis.com/auth/youtube.force-ssl"]
API_SERVICE_NAME = "youtube"
API_VERSION = "v3"
CLIENT_SECRETS_FILE = "client_secret.json"
COMMENT_DIR = r"C:\ytcdl\YouTube and YouTube Music\comments"
def get_authenticated_service():
    """Authenticate and return the YouTube API service object."""
    try:
        creds = None
        if os.path.exists("token.json"):
            creds = google.oauth2.credentials.Credentials.from_authorized_user_file("token.json", SCOPES)
        if not creds or not creds.valid:
            flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
                CLIENT_SECRETS_FILE, SCOPES
            )
            creds = flow.run_local_server(port=0)
            with open("token.json", "w") as token_file:
                token_file.write(creds.to_json())
        return googleapiclient.discovery.build(
            API_SERVICE_NAME, API_VERSION, credentials=creds
        )
    except Exception as e:
        logging.error(f"Authentication failed: {e}")
        raise
def load_comments_from_csv():
    """Load comment IDs from all comments.csv files in the directory."""
    comments = []
    if not os.path.exists(COMMENT_DIR):
        logging.error(f"Directory not found: {COMMENT_DIR}")
        return comments
    csv_files = [f for f in os.listdir(COMMENT_DIR) if f.endswith(".csv") and "comments" in f.lower()]
    if not csv_files:
        logging.error(f"No CSV files found in {COMMENT_DIR}")
        return comments
    logging.info(f"Found {len(csv_files)} CSV files to process")
    for filename in csv_files:
        file_path = os.path.join(COMMENT_DIR, filename)
        try:
            with open(file_path, "r", encoding="utf-8") as f:
                reader = csv.DictReader(f)
                file_comments = 0
                for row in reader:
                    comment_id = row.get("Comment ID")
                    published_at = row.get("Comment create timestamp")
                    if comment_id and published_at:
                        comment_id = comment_id.strip()
                        comments.append({"id": comment_id, "publishedAt": published_at})
                        file_comments += 1
                logging.info(f"Loaded {file_comments} comments from {filename}")
        except Exception as e:
            logging.error(f"Error loading {filename}: {e}")
            print(f"Error loading {filename}: {e}")
    logging.info(f"Total comments loaded: {len(comments)}")
    print(f"Total comments loaded: {len(comments)}")
    return comments
def parse_datetime(date_str):
    """Parse datetime string from Google Takeout format."""
    try:
        if '+' in date_str:
            date_str = date_str.split('+')[0]
        elif 'Z' in date_str:
            date_str = date_str.replace('Z', '')
        if '.' in date_str:
            date_part, micro_part = date_str.split('.')
            micro_part = micro_part[:6].ljust(6, '0')
            date_str = f"{date_part}.{micro_part}"
            return datetime.strptime(date_str, "%Y-%m-%dT%H:%M:%S.%f")
        else:
            return datetime.strptime(date_str, "%Y-%m-%dT%H:%M:%S")
    except Exception as e:
        logging.warning(f"Could not parse date: {date_str}, error: {e}")
        return datetime.now()
def sort_comments(comments, order):
    """Sort comments by publishedAt date."""
    try:
        for comment in comments:
            if isinstance(comment["publishedAt"], str):
                comment["publishedAt"] = parse_datetime(comment["publishedAt"])
        if order == "recent":
            return sorted(comments, key=lambda x: x["publishedAt"], reverse=True)
        else:
            return sorted(comments, key=lambda x: x["publishedAt"])
    except Exception as e:
        logging.error(f"Error sorting comments: {e}")
        return comments
def delete_comments(youtube, comments):
    """Delete the specified comments."""
    deleted_count = 0
    failed_count = 0
    for i, comment in enumerate(comments, 1):
        try:
            youtube.comments().delete(id=comment["id"]).execute()
            deleted_count += 1
            logging.info(f"[{i}/{len(comments)}] Deleted comment with ID: {comment['id']} (Posted: {comment['publishedAt']})")
            print(f"[{i}/{len(comments)}] Successfully deleted comment (Posted: {comment['publishedAt']})")
            time.sleep(0.5)
        except googleapiclient.errors.HttpError as e:
            failed_count += 1
            error_reason = e.error_details[0]['reason'] if e.error_details else 'Unknown'
            if e.resp.status == 404:
                logging.warning(f"[{i}/{len(comments)}] Comment {comment['id']} not found (already deleted or invalid ID)")
                print(f"[{i}/{len(comments)}] Comment not found (may already be deleted)")
            elif e.resp.status in [403, 429]:
                logging.warning(f"[{i}/{len(comments)}] Rate limit hit (status {e.resp.status}), waiting 60 seconds...")
                print(f"[{i}/{len(comments)}] Rate limit hit, waiting 60 seconds...")
                time.sleep(60)
                try:
                    youtube.comments().delete(id=comment["id"]).execute()
                    deleted_count += 1
                    failed_count -= 1
                    logging.info(f"[{i}/{len(comments)}] Retry successful - Deleted comment with ID: {comment['id']}")
                    print(f"[{i}/{len(comments)}] Retry successful")
                except Exception:
                    logging.error(f"[{i}/{len(comments)}] Retry failed for comment {comment['id']}")
                    print(f"[{i}/{len(comments)}] Retry failed")
            else:
                logging.error(f"[{i}/{len(comments)}] Failed to delete comment {comment['id']}: {e} (Status: {e.resp.status}, Reason: {error_reason})")
                print(f"[{i}/{len(comments)}] Failed to delete comment: {error_reason}")
    print(f"\nDeletion complete! Successfully deleted: {deleted_count}, Failed: {failed_count}")
    logging.info(f"Deletion summary - Successfully deleted: {deleted_count}, Failed: {failed_count}")
def main():
    print("YouTube Comment Deleter")
    print("=" * 50)
    if not os.path.exists(COMMENT_DIR):
        print(f"Error: Directory not found: {COMMENT_DIR}")
        print("Please make sure you've extracted the Google Takeout data correctly.")
        return
    print("Choose deletion order:")
    print("1. Most recent to oldest")
    print("2. Oldest to most recent")
    choice = input("Enter 1 or 2: ").strip()
    order = "recent" if choice == "1" else "oldest"
    logging.info(f"Selected deletion order: {order}")
    youtube = get_authenticated_service()
    logging.info("Authenticated successfully. Starting comment deletion process.")
    print("\nYouTube API Quota Notice:")
print("The YouTube Data API provides 10,000 units per day (resetting at 8:00 PM BST).")
print("Deleting a comment costs 50 units, allowing approximately 200 deletions daily.")
print("If you exceed this limit, the script will pause for 60 seconds on errors (403/429).")
print("You may need multiple runs over several days for large comment collections.\n")
proceed = input("Do you want to proceed with loading and deleting comments? (y/N): ").strip().lower()
if proceed != 'y':
print("Operation cancelled.")
return
comments = load_comments_from_csv()
if not comments:
logging.info("No comments loaded from CSV files. Exiting.")
print("No comments loaded from CSV files. Please check:")
print("1. CSV files exist in the correct directory")
print("2. CSV files contain valid comment data")
print("3. File permissions allow reading")
return
comments = sort_comments(comments, order)
logging.info(f"Sorted {len(comments)} comments in {order} order.")
print(f"\nReady to delete {len(comments)} comments in {order} order.")
final_confirm = input("Are you sure you want to delete ALL these comments? This cannot be undone! (type 'DELETE' to confirm): ").strip()
if final_confirm != 'DELETE':
print("Operation cancelled for safety.")
return
print("\nStarting deletion process...")
logging.info("Starting comment deletion...")
delete_comments(youtube, comments)
logging.info("Deletion process complete.")
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
logging.info("Script interrupted by user.")
print("\nScript stopped by user.")
except Exception as e:
logging.error(f"Unexpected error: {e}")
print(f"An unexpected error occurred. Check delete_youtube_comments.log for details.")
print(f"Error: {e}")- Open Command Prompt
- Navigate to your folder:
cd C:\ytcdl
- Run the script:
python delete_youtube_comments.py
- Follow the prompts:
- Choose deletion order (recent to old, or old to recent)
- Confirm you want to proceed
- Your browser will open - log in to Google and grant permissions
- Type 'DELETE' to confirm final deletion
- Daily Quota: 10,000 units per day
- Reset Time: midnight Pacific Time each day (around 8:00 AM in the UK)
- Cost per Deletion: 50 units
- Daily Deletion Limit: ~200 comments maximum
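From those figures you can estimate how many daily runs a full clean-up will take. A rough sketch (the 10,000-unit quota and 50-unit deletion cost come from the notice above; the comment count is an example figure, so use the total the script prints when it loads your CSVs):

import math

daily_quota_units = 10_000
units_per_deletion = 50
deletions_per_day = daily_quota_units // units_per_deletion  # roughly 200

total_comments = 1000  # example figure; replace with the total the script reports
days_needed = math.ceil(total_comments / deletions_per_day)
print(f"Roughly {days_needed} daily runs needed for {total_comments} comments.")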
- The script will show progress:
[45/1000] Successfully deleted comment
- If you hit rate limits, it automatically waits 60 seconds and retries
- All activity is logged to delete_youtube_comments.log
- For large collections, you'll need to run the script on multiple days
You can watch the log file in real-time:
- PowerShell: Get-Content delete_youtube_comments.log -Wait
- Command Prompt: type delete_youtube_comments.log
- Check that your Google Takeout data is extracted to the correct location
- Ensure CSV files are in: C:\ytcdl\YouTube and YouTube Music\comments\
- Verify files are named like comments.csv, comments(1).csv, etc.
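If the folder and file names look correct but the script still loads zero comments, check that the CSVs contain the columns the script reads ("Comment ID" and "Comment create timestamp"). A quick sketch, assuming your first export file is named comments.csv:

import csv

# Adjust the file name if your first export file is named differently.
path = r"C:\ytcdl\YouTube and YouTube Music\comments\comments.csv"

with open(path, "r", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    print("Columns found:", reader.fieldnames)
    print("First row:", next(reader, None))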
- Make sure client_secret.json is in the C:\ytcdl\ folder
- Delete token.json and re-run to re-authenticate
- Ensure YouTube Data API v3 is enabled in Google Cloud Console
- This is normal - you've hit the daily quota
- Wait until the quota resets (midnight Pacific Time, around 8:00 AM in the UK), or continue tomorrow
- The script handles this automatically
- Run Command Prompt as Administrator
- Check that Python has permission to read/write files in C:\ytcdl\
C:\ytcdl\
├── client_secret.json (Google API credentials)
├── delete_youtube_comments.py (This script)
├── token.json (Created automatically)
├── delete_youtube_comments.log (Deletion log)
└── YouTube and YouTube Music\
    └── comments\
        ├── comments.csv
        ├── comments(1).csv
        └── ... (all CSV files from Google Takeout)
- This cannot be undone - deleted comments are permanently removed
- The script respects YouTube's terms of service and API limits
- Updates to your Google My Activity page may take 15-30 minutes to reflect the deletions
- Keep your delete_youtube_comments.log file for reference
Use this tool responsibly and ensure you understand the consequences of mass comment deletion.
For additional support or questions, consult the log files and double-check your setup against this guide.