This guide provides examples and best practices for using the PostgreSQL MCP server in your applications.
- Install the server:

  ```bash
  npm install
  npm run build
  ```
- Test the server:

  ```bash
  # On Unix/Linux/macOS
  ./test-client.js get_schema_info '{"connectionString":"postgresql://user:password@localhost:5432/dbname"}'

  # On Windows
  node test-client.js get_schema_info '{"connectionString":"postgresql://user:password@localhost:5432/dbname"}'
  ```
PostgreSQL connection strings follow this format:
```
postgresql://[user[:password]@][host][:port][/dbname][?param1=value1&...]
```
Examples:
```
postgresql://postgres:password@localhost:5432/mydb
postgresql://postgres@localhost/mydb
postgresql://postgres:password@localhost/mydb?sslmode=require
```
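Passwords containing reserved characters such as `@` or `:` must be percent-encoded before they go into the connection string. A minimal sketch of assembling one safely (the `buildConnectionString` helper is illustrative, not part of the server):

```javascript
// Build a PostgreSQL connection string, URL-encoding the credentials so
// characters like '@' or ':' in the password do not break URI parsing.
function buildConnectionString({ user, password, host = "localhost", port = 5432, dbname, params = {} }) {
  const auth = password
    ? `${encodeURIComponent(user)}:${encodeURIComponent(password)}`
    : encodeURIComponent(user);
  const query = new URLSearchParams(params).toString();
  return `postgresql://${auth}@${host}:${port}/${dbname}${query ? `?${query}` : ""}`;
}

console.log(buildConnectionString({
  user: "postgres",
  password: "p@ss:word",
  dbname: "mydb",
  params: { sslmode: "require" },
}));
// postgresql://postgres:p%40ss%3Aword@localhost:5432/mydb?sslmode=require
```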
List all tables in a database:
```json
{
  "name": "get_schema_info",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb"
  }
}
```

Get detailed information about a specific table:
```json
{
  "name": "get_schema_info",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users"
  }
}
```

Create a new users table:
```json
{
  "name": "create_table",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "columns": [
      { "name": "id", "type": "SERIAL", "nullable": false },
      { "name": "username", "type": "VARCHAR(100)", "nullable": false },
      { "name": "email", "type": "VARCHAR(255)", "nullable": false },
      { "name": "created_at", "type": "TIMESTAMP", "default": "NOW()" }
    ]
  }
}
```

Add, modify, and drop columns:
```json
{
  "name": "alter_table",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "operations": [
      { "type": "add", "columnName": "last_login", "dataType": "TIMESTAMP" },
      { "type": "alter", "columnName": "email", "nullable": false },
      { "type": "drop", "columnName": "temporary_field" }
    ]
  }
}
```

Export to JSON:
```json
{
  "name": "export_table_data",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "outputPath": "./exports/users.json",
    "where": "created_at > '2023-01-01'",
    "limit": 1000
  }
}
```

Export to CSV:
```json
{
  "name": "export_table_data",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "outputPath": "./exports/users.csv",
    "format": "csv"
  }
}
```

Import from JSON:
```json
{
  "name": "import_table_data",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "inputPath": "./imports/users.json",
    "truncateFirst": true
  }
}
```

Import from CSV:
```json
{
  "name": "import_table_data",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "tableName": "users",
    "inputPath": "./imports/users.csv",
    "format": "csv",
    "delimiter": ","
  }
}
```

Copy data between databases:
```json
{
  "name": "copy_between_databases",
  "arguments": {
    "sourceConnectionString": "postgresql://postgres:password@localhost:5432/source_db",
    "targetConnectionString": "postgresql://postgres:password@localhost:5432/target_db",
    "tableName": "users",
    "where": "active = true",
    "truncateTarget": false
  }
}
```

Basic monitoring:
```json
{
  "name": "monitor_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb"
  }
}
```

Advanced monitoring with alerts:
```json
{
  "name": "monitor_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "includeTables": true,
    "includeQueries": true,
    "includeLocks": true,
    "includeReplication": true,
    "alertThresholds": {
      "connectionPercentage": 80,
      "longRunningQuerySeconds": 30,
      "cacheHitRatio": 0.95,
      "deadTuplesPercentage": 10,
      "vacuumAge": 7
    }
  }
}
```

Analyze configuration:
```json
{
  "name": "analyze_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "analysisType": "configuration"
  }
}
```

Analyze performance:
```json
{
  "name": "analyze_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "analysisType": "performance"
  }
}
```

Analyze security:
```json
{
  "name": "analyze_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "analysisType": "security"
  }
}
```

Debug connection issues:
```json
{
  "name": "debug_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "issue": "connection",
    "logLevel": "debug"
  }
}
```

Debug performance issues:
```json
{
  "name": "debug_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "issue": "performance",
    "logLevel": "debug"
  }
}
```

Debug lock issues:
```json
{
  "name": "debug_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "issue": "locks",
    "logLevel": "debug"
  }
}
```

Debug replication issues:
```json
{
  "name": "debug_database",
  "arguments": {
    "connectionString": "postgresql://postgres:password@localhost:5432/mydb",
    "issue": "replication",
    "logLevel": "debug"
  }
}
```
- Connection Pooling: The server implements connection pooling internally, but you should still close connections when done.

- Error Handling: Always check the `success` field in responses and handle errors appropriately.

- Security:
  - Never hardcode connection strings with passwords in your code
  - Use environment variables or secure vaults for credentials
  - Use SSL connections in production environments

- Performance:
  - Limit the amount of data returned by using WHERE clauses and LIMIT
  - For large data exports/imports, consider using batching
  - Monitor query performance regularly

- Monitoring:
  - Set up regular monitoring to catch issues early
  - Configure appropriate alert thresholds based on your application needs
  - Pay special attention to connection usage and cache hit ratio
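The error-handling and credentials practices above can be sketched as a small wrapper, assuming responses carry a `success` flag and an `error` message; `callTool` is a placeholder for whatever function your MCP client uses to invoke a tool:

```javascript
// Read credentials from the environment rather than hardcoding them.
const connectionString = process.env.DATABASE_URL;

// Invoke a tool and surface failures, assuming the response shape
// described above ({ success, error, ... }). `callTool` stands in
// for your MCP client's tool-invocation function.
async function runTool(callTool, name, args) {
  const response = await callTool(name, args);
  if (!response.success) {
    throw new Error(`${name} failed: ${response.error ?? "unknown error"}`);
  }
  return response;
}
```

A caller would pass its client's invoke function along with the tool name and arguments, e.g. `runTool(invoke, "get_schema_info", { connectionString })`.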
- Connection Errors:
  - Check that the PostgreSQL server is running
  - Verify connection string parameters
  - Ensure network connectivity between the MCP server and PostgreSQL

- Permission Errors:
  - Verify that the user has appropriate permissions for the requested operations
  - Check schema and table permissions

- Performance Issues:
  - Use the `analyze_database` and `debug_database` tools to identify bottlenecks
  - Check for long-running queries
  - Verify proper indexing on tables

- Data Migration Issues:
  - Ensure table schemas match when copying between databases
  - Check disk space for large exports
  - Verify file permissions for import/export paths