Create a .env file in the backend root directory with the following configuration:
# Database Configuration
DB_HOST=localhost
DB_PORT=5432
DB_USERNAME=postgres
DB_PASSWORD=password
DB_DATABASE=manage_assets
DB_SYNCHRONIZE=true
NODE_ENV=development
# Document Storage
UPLOAD_DIR=./uploads/documents
MAX_FILE_SIZE=524288000
# JWT Configuration
JWT_SECRET=your_jwt_secret_key
JWT_EXPIRATION=24h
# Server Configuration
PORT=3000
API_PREFIX=api
# CORS Configuration
CORS_ORIGIN=http://localhost:3000,http://localhost:3001
# File Upload Configuration
ALLOWED_MIME_TYPES=application/pdf,image/jpeg,image/png,image/gif,text/plain,application/msword,application/vnd.openxmlformats-officedocument.wordprocessingml.document,application/vnd.ms-excel,application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
# Audit Configuration
ENABLE_AUDIT_LOGGING=true
AUDIT_LOG_RETENTION_DAYS=365
# Storage Configuration
ENABLE_FILE_COMPRESSION=false
ENABLE_FILE_ENCRYPTION=false
ENCRYPTION_KEY=your_encryption_key
# Feature Flags
ENABLE_DOCUMENT_PREVIEW=false
ENABLE_OCR=false
ENABLE_CLOUD_STORAGE=false

The tables are automatically created by TypeORM synchronization when DB_SYNCHRONIZE=true (suitable for development; use migrations in production).
Tables created:
- documents
- document_versions
- document_access_permissions
- document_audit_logs
Indexes are automatically created during table synchronization.
No initial data is required. The system is ready to use once tables are created.
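The upload-related variables above can be read and validated at startup. A minimal sketch in plain TypeScript (the fallback defaults here are illustrative assumptions, not the project's actual ones):

```typescript
// Sketch: read and validate upload-related settings at startup.
// Keys mirror the .env file above; fallback values are illustrative.
type UploadConfig = { maxFileSize: number; allowedMimeTypes: string[] };

function parseUploadConfig(env: Record<string, string | undefined>): UploadConfig {
  const maxFileSize = Number(env.MAX_FILE_SIZE ?? 524288000); // 500 MB
  if (!Number.isFinite(maxFileSize) || maxFileSize <= 0) {
    throw new Error(`Invalid MAX_FILE_SIZE: ${env.MAX_FILE_SIZE}`);
  }
  const allowedMimeTypes = (env.ALLOWED_MIME_TYPES ?? 'application/pdf')
    .split(',')
    .map((t) => t.trim())
    .filter(Boolean);
  return { maxFileSize, allowedMimeTypes };
}

// Reject uploads that are too large or of a disallowed MIME type.
function isUploadAllowed(cfg: UploadConfig, mimeType: string, sizeBytes: number): boolean {
  return sizeBytes <= cfg.maxFileSize && cfg.allowedMimeTypes.includes(mimeType);
}
```

At startup this would be called with `process.env`.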
cd backend
npm install

# For development with synchronize: true
npm run start:dev
# For production with migrations
npm run migration:run

# Development with watch mode
npm run start:dev
# Production build and run
npm run build
npm run start:prod

The DocumentsModule is already integrated into the AppModule. No additional setup is needed.
To verify the integration:
- Check that app.module.ts imports the DocumentsModule
- Verify that all entities are included in TypeOrmModule.forRoot()
- Confirm that DocumentsModule is exported from the documents module
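The first checklist item can be sanity-checked with a small script. A rough sketch that scans the contents of app.module.ts (the regexes are illustrative and assume conventional formatting):

```typescript
// Rough sanity check: given the text of app.module.ts, confirm that
// DocumentsModule is both imported and registered in the imports array.
function isDocumentsModuleWired(source: string): boolean {
  const imported = /import\s*\{[^}]*\bDocumentsModule\b[^}]*\}\s*from/.test(source);
  const registered = /imports\s*:\s*\[[^\]]*\bDocumentsModule\b/.test(source);
  return imported && registered;
}
```

It would typically be run against `fs.readFileSync('src/app.module.ts', 'utf8')`.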
Once the server is running, access the Swagger documentation at:
http://localhost:3000/api/docs
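For scripted testing, the same endpoints can also be called from TypeScript. A sketch of request helpers (the base URL and bearer-token handling are assumptions mirroring the curl examples below):

```typescript
// Helpers for calling the documents API from TypeScript.
// BASE_URL reflects the default PORT above; adjust as needed.
const BASE_URL = 'http://localhost:3000';

function authHeaders(token: string): Record<string, string> {
  return { Authorization: `Bearer ${token}` };
}

function documentUrl(id: string, action?: 'download' | 'audit-logs'): string {
  return `${BASE_URL}/documents/${id}${action ? '/' + action : ''}`;
}

function listDocumentsUrl(limit = 10): string {
  return `${BASE_URL}/documents?limit=${limit}`;
}
```

With Node 18+ these plug straight into fetch, e.g. `fetch(documentUrl(id, 'download'), { headers: authHeaders(token) })`.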
# 1. Upload a document
curl -X POST http://localhost:3000/documents/upload \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
-F "file=@test-file.pdf" \
-F "assetId=550e8400-e29b-41d4-a716-446655440000" \
-F "documentType=invoice" \
-F "name=Test Invoice"
# 2. List documents
curl -X GET "http://localhost:3000/documents?limit=10" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
# 3. Get document details
curl -X GET http://localhost:3000/documents/DOC_ID \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
# 4. Download document
curl -X GET http://localhost:3000/documents/DOC_ID/download \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
-o downloaded-file.pdf
# 5. Grant access
curl -X POST http://localhost:3000/documents/DOC_ID/permissions/grant \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"userId": "user-uuid",
"permissions": ["view", "download"],
"expiresAt": "2025-12-31"
}'

Solution: Ensure the UPLOAD_DIR exists and is accessible:
mkdir -p ./uploads/documents
chmod 755 ./uploads/documents

Solution: Increase MAX_FILE_SIZE in the environment variables:
MAX_FILE_SIZE=1073741824 # 1 GB

Solution: Ensure proper file permissions:
chmod -R 755 ./uploads

Solution: Verify the database configuration:
# Test connection
psql -h localhost -U postgres -d manage_assets

Solution: Ensure JWT_SECRET is configured and the token is valid:
JWT_SECRET=your_secure_secret_key
JWT_EXPIRATION=24h

Storage performance:
- Use an SSD for the upload directory
- Implement file cleanup policies
- Consider cloud storage integration (S3, Azure Blob)
Database performance:
- Add indexes for frequently searched columns
- Implement partitioning for audit logs
- Regular maintenance and vacuuming
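The audit-log retention configured above (AUDIT_LOG_RETENTION_DAYS) implies a periodic cleanup job. A sketch of the cutoff computation, assuming audit rows carry a creation timestamp:

```typescript
// Compute the cutoff timestamp for purging audit logs older than the
// configured retention window (AUDIT_LOG_RETENTION_DAYS). A scheduled
// job would then delete rows created before this cutoff (e.g. with
// TypeORM's LessThan(cutoff) condition; illustrative, not the module's
// actual cleanup code).
function auditLogCutoff(retentionDays: number, now: Date = new Date()): Date {
  const MS_PER_DAY = 24 * 60 * 60 * 1000;
  return new Date(now.getTime() - retentionDays * MS_PER_DAY);
}
```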
API performance:
- Implement caching for frequently accessed documents
- Use pagination for list operations
- Compress responses
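Pagination for list operations can be sketched as a small helper that converts page/limit parameters into take/skip values (the cap of 100 is an assumed maximum, not a documented one):

```typescript
// Translate page/limit query parameters into TypeORM-style take/skip,
// clamping limit so a single request cannot scan the whole table.
function paginate(page: number, limit: number, maxLimit = 100): { take: number; skip: number } {
  const take = Math.min(Math.max(1, Math.floor(limit)), maxLimit);
  const safePage = Math.max(1, Math.floor(page));
  return { take, skip: (safePage - 1) * take };
}
```

The take/skip pair maps directly onto TypeORM's findAndCount options.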
Monitoring:
- Monitor disk usage
- Track API response times
- Log error rates
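Response-time tracking can start as simply as an in-memory recorder; a sketch (a production setup would export these figures to a metrics backend instead):

```typescript
// Minimal in-memory response-time tracker: record durations per route,
// then report count / average / max for each route.
class ResponseTimes {
  private samples = new Map<string, number[]>();

  record(route: string, ms: number): void {
    const arr = this.samples.get(route) ?? [];
    arr.push(ms);
    this.samples.set(route, arr);
  }

  stats(route: string): { count: number; avg: number; max: number } {
    const arr = this.samples.get(route) ?? [];
    const count = arr.length;
    const sum = arr.reduce((a, b) => a + b, 0);
    return { count, avg: count ? sum / count : 0, max: count ? Math.max(...arr) : 0 };
  }
}
```

A NestJS interceptor or middleware would call record() with the elapsed time of each request.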
To implement in upload validation:
- Validate file type via magic bytes, not just extension
- Scan uploaded files for malware
- Implement rate limiting

Already implemented features:
- Permission-based access control
- User identity verification
- Audit logging of all access
- Permission expiration

Recommended additions:
- Encrypt sensitive files at rest
- Use HTTPS for all communications
- Implement TLS/SSL
- Use secure headers

#!/bin/bash
# Daily backup script
DATE=$(date +%Y%m%d)
tar -czf /backups/documents_$DATE.tar.gz ./uploads/documents

# PostgreSQL backup
pg_dump -h localhost -U postgres manage_assets | gzip > backup_$(date +%Y%m%d).sql.gz

# Restore files
tar -xzf /backups/documents_$DATE.tar.gz -C ./
# Restore database
gunzip < backup_$DATE.sql.gz | psql -h localhost -U postgres manage_assets

Logs are output to the console in development and to a file in production.
Query audit logs via API:
curl -X GET "http://localhost:3000/documents/DOC_ID/audit-logs" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"Configure error tracking service (Sentry, DataDog):
import * as Sentry from "@sentry/node";
// In main.ts
Sentry.init({
dsn: process.env.SENTRY_DSN,
});

-- Create temporary table
CREATE TABLE temp_documents (
asset_id UUID,
file_path VARCHAR,
document_type VARCHAR,
name VARCHAR,
description TEXT
);
-- Import from CSV
COPY temp_documents FROM 'documents.csv' WITH (FORMAT csv);

// Use DocumentService to import
// Handle file migration to the new storage system

# Verify document count
curl -X GET "http://localhost:3000/documents?limit=1000" \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
| jq '.total'

- Cloud storage integration (S3, Azure, GCS)
- Full-text search with Elasticsearch
- Document preview generation
- OCR capability
- E-signature integration
- Workflow approvals
- Advanced analytics
- Mobile app sync
- Real-time collaboration
- Advanced version comparison
- API Documentation: http://localhost:3000/api/docs
- Module README: ./README.md
- Source Code: ./src
- Entity Definitions: ./entities
- DTOs: ./dto
- Services: ./services
- Controllers: ./controllers