Monitor and visualize your Snowflake Snowpipe data ingestion pipelines using Apache Superset dashboards.
## Prerequisites

- Docker and Docker Compose
- Snowflake account with access to pipe usage data
- Snowpipe data models deployed (snowflake_pipe_* tables)
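The environment-variables step below expects Snowflake connection details in a `.env` file. A minimal sketch is shown here; every variable name is an assumption and should be matched to what your docker-compose configuration actually reads:

```sh
# Hypothetical .env — adjust names and values to your docker-compose setup
SNOWFLAKE_ACCOUNT=xy12345.us-east-1
SNOWFLAKE_USER=superset_ro
SNOWFLAKE_PASSWORD=********
SNOWFLAKE_DATABASE=ANALYTICS
SNOWFLAKE_SCHEMA=PIPE_MONITORING
SNOWFLAKE_WAREHOUSE=REPORTING_WH
```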
## Setup

- Configure Environment Variables

  ```bash
  # Create .env file with database connection details
  # Ensure your Snowflake database contains the required pipe monitoring tables
  ```

- Start Superset

  ```bash
  docker-compose up -d
  ```

- Import Dashboard Assets

  ```bash
  docker exec -it apache-superset /bin/bash
  cd /usr/src/
  superset import-directory assets --overwrite
  ```

## Dashboard Metrics

- Pipe Performance: Files processed, data loaded (GB), processing times
- Cost Analysis: Credits used, daily costs, cost trends
- Error Tracking: Failed file counts, error patterns
- Status Overview: Current pipe status and health
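The per-pipe metrics above amount to a simple roll-up over copy-history rows. The sketch below illustrates that aggregation in plain Python; the row fields, pipe names, and the `credit_price_usd` default are all illustrative assumptions, not the actual table schema or your contract pricing:

```python
from collections import defaultdict

# Illustrative copy-history rows; the real columns come from the
# snowflake_pipe_copy_history model, whose schema may differ.
rows = [
    {"pipe": "ORDERS_PIPE", "files": 12, "bytes": 3_500_000_000, "credits": 0.8, "errors": 0},
    {"pipe": "ORDERS_PIPE", "files": 7,  "bytes": 1_200_000_000, "credits": 0.3, "errors": 1},
    {"pipe": "EVENTS_PIPE", "files": 40, "bytes": 9_000_000_000, "credits": 2.1, "errors": 0},
]

def summarize(rows, credit_price_usd=3.0):
    """Roll copy-history rows up into per-pipe dashboard metrics:
    files processed, GB loaded, credits used, errors, and cost."""
    stats = defaultdict(lambda: {"files": 0, "gb": 0.0, "credits": 0.0, "errors": 0})
    for r in rows:
        s = stats[r["pipe"]]
        s["files"] += r["files"]
        s["gb"] += r["bytes"] / 1e9
        s["credits"] += r["credits"]
        s["errors"] += r["errors"]
    for s in stats.values():
        s["cost_usd"] = round(s["credits"] * credit_price_usd, 2)
    return dict(stats)

summary = summarize(rows)
```

In the dashboard itself this aggregation happens in SQL against the pipe tables; the Python version is only meant to make the metric definitions concrete.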
## Data Models

- snowflake_pipe_copy_history: Detailed copy operation history
- snowflake_pipe_stats: Aggregated pipe statistics
- snowflake_pipe_status: Current status of all pipes
## Troubleshooting

- Verify Snowpipe usage tables exist in your database
- Check that pipes have recent activity to display
- Ensure proper permissions on PIPE_USAGE_HISTORY views
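The "recent activity" check above can be sketched as a staleness test on each pipe's last load timestamp. The threshold, dictionary shape, and pipe names here are assumptions for illustration:

```python
from datetime import datetime, timedelta, timezone

def stale_pipes(last_loads, max_age=timedelta(hours=6), now=None):
    """Return names of pipes whose most recent load is older than max_age
    (or that have never loaded) — the pipes likely to show empty charts."""
    now = now or datetime.now(timezone.utc)
    return sorted(
        name for name, ts in last_loads.items()
        if ts is None or now - ts > max_age
    )

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
last_loads = {
    "ORDERS_PIPE": now - timedelta(minutes=30),  # fresh
    "EVENTS_PIPE": now - timedelta(hours=12),    # stale
    "AUDIT_PIPE": None,                          # never loaded
}
```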
## Dashboard Filters

- Time Range filter for historical analysis
- Pipe Name filter to focus on specific pipelines
- Database/Schema filters for multi-tenant environments
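One way to picture how these filters narrow the charts is as optional predicates composed into a WHERE clause. This is only an illustrative sketch: the table and column names are placeholders, not the dashboard's actual queries:

```python
def pipe_history_query(start=None, end=None, pipe_name=None,
                       database=None, schema=None):
    """Compose a parameterized query from the dashboard filters.
    Table and column names are hypothetical."""
    clauses, params = [], []
    if start:
        clauses.append("load_time >= %s"); params.append(start)
    if end:
        clauses.append("load_time < %s"); params.append(end)
    if pipe_name:
        clauses.append("pipe_name = %s"); params.append(pipe_name)
    if database:
        clauses.append("database_name = %s"); params.append(database)
    if schema:
        clauses.append("schema_name = %s"); params.append(schema)
    sql = "SELECT * FROM snowflake_pipe_copy_history"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params
```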
## Data Refresh

- Pipe usage data typically updates every few hours
- Dashboard auto-refreshes at its configured interval
- Manual refresh available via dashboard controls