
Commit 4d8086e (first commit, 0 parents)

4,995 files changed: +710,033 additions, 0 deletions


.github/workflows/ci-cd.yml

Lines changed: 131 additions & 0 deletions
```yaml
name: CI/CD Pipeline

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run ESLint
        run: npx eslint . --ext .js

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint
    services:
      postgres:
        image: postgres:14
        env:
          POSTGRES_PASSWORD: postgres
          POSTGRES_USER: postgres
          POSTGRES_DB: claim_db_test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run tests
        run: npm test
        env:
          NODE_ENV: test
          DB_HOST: localhost
          DB_USER: postgres
          DB_PASSWORD: postgres
          DB_NAME: claim_db_test
          JWT_SECRET: test_secret_key

  security-scan:
    name: Security Scan
    runs-on: ubuntu-latest
    needs: lint
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run npm audit
        run: npm audit --audit-level=high

  build-and-publish:
    name: Build and Publish Docker Image
    runs-on: ubuntu-latest
    needs: [test, security-scan]
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Login to DockerHub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          tags: |
            yourusername/claim-api:latest
            yourusername/claim-api:${{ github.sha }}
          cache-from: type=registry,ref=yourusername/claim-api:latest
          cache-to: type=inline

  deploy:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build-and-publish
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - name: Deploy to Production
        run: echo "Deploy to production - This step would use a deployment tool or script"
        # In a real setup, you would use a deployment tool like:
        # - AWS CLI for ECS/EKS
        # - kubectl for Kubernetes
        # - SSH into the server and deploy with Docker Compose
```

.gitignore

Lines changed: 1 addition & 0 deletions
```
.env
```

Dockerfile

Lines changed: 26 additions & 0 deletions
```dockerfile
FROM node:18-alpine

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
COPY package*.json ./

# Install dependencies
RUN npm ci --only=production

# Bundle app source
COPY . .

# Create volume for logs
VOLUME [ "/usr/src/app/logs" ]

# Expose the port the app runs on
EXPOSE 5000

# Set environment variable
ENV NODE_ENV=production

# Command to run the application
CMD ["node", "index.js"]
```

README.md

Lines changed: 187 additions & 0 deletions
# Claim Processing API

A RESTful API for processing healthcare claims, built with Node.js, Express.js, and PostgreSQL. The API features comprehensive monitoring, logging, and alerting mechanisms.

## Features

- **Claim Processing**: Submit and retrieve healthcare claims
- **JWT Authentication**: Secure API endpoints
- **Robust Logging**: Structured logs for debugging and auditing
- **Metrics Monitoring**: Prometheus integration for real-time system metrics
- **Visualization**: Grafana dashboards for monitoring system health
- **Docker Support**: Containerized application and database
- **Rate Limiting**: Protection against API abuse (see the sketch after this list)
- **Security Headers**: Enhanced API security
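Rate limiting and security headers are listed as features but are not shown elsewhere in this excerpt. The sketch below illustrates one common way to wire them into an Express app, assuming the `express-rate-limit` and `helmet` packages; the package choice and the limits are assumptions, not taken from this commit.

```js
// Hypothetical sketch: rate limiting and security headers for an Express app.
// Assumes express-rate-limit and helmet; this commit does not show which
// middleware the API actually uses.
const express = require('express');
const helmet = require('helmet');
const rateLimit = require('express-rate-limit');

const app = express();

// Set common security headers (CSP, HSTS, X-Content-Type-Options, ...)
app.use(helmet());

// Limit each client IP to 100 requests per 15-minute window
app.use(rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  standardHeaders: true,
  legacyHeaders: false,
}));

app.use(express.json());
```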
## Tech Stack

- **Backend**: Node.js, Express.js
- **Database**: PostgreSQL
- **Authentication**: JWT
- **Logging**: Winston
- **Monitoring**: Prometheus, Grafana
- **Containerization**: Docker, Docker Compose

## Setup Instructions

### Prerequisites

- Node.js (v14 or higher; v18 is used in the Dockerfile and CI)
- Docker and Docker Compose (for containerized setup)
- PostgreSQL (if running locally)

### Environment Variables

Create a `.env` file in the root directory with the following variables:

```
DB_NAME=claim_db
DB_USER=postgres
DB_PASSWORD=postgres
DB_HOST=localhost
JWT_SECRET=your_secret_key
PORT=5000
NODE_ENV=development
```

### Local Development

1. Install dependencies:
   ```
   npm install
   ```

2. Start the PostgreSQL database (if not using Docker)

3. Run the application:
   ```
   npm run dev
   ```

### Docker Setup

1. Build and start the containers:
   ```
   docker-compose up -d
   ```

2. The API will be available at `http://localhost:5000`
3. Prometheus will be available at `http://localhost:9090`
4. Grafana will be available at `http://localhost:3000`

## API Documentation

### Authentication

All endpoints require a valid JWT token in the Authorization header:

```
Authorization: Bearer <your-token>
```
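How tokens are verified is not shown in this excerpt; the following is a minimal sketch of Express middleware that checks the `Authorization: Bearer` header against `JWT_SECRET`, assuming the `jsonwebtoken` package. Names such as `authenticate` are illustrative, not taken from the repository.

```js
// Hypothetical sketch of JWT verification middleware, assuming the
// jsonwebtoken package; the repository's actual middleware may differ.
const jwt = require('jsonwebtoken');

function authenticate(req, res, next) {
  const header = req.headers.authorization || '';
  const token = header.startsWith('Bearer ') ? header.slice(7) : null;

  if (!token) {
    return res.status(401).json({ error: 'Missing token' });
  }

  try {
    // Throws if the token is invalid or expired
    req.user = jwt.verify(token, process.env.JWT_SECRET);
    return next();
  } catch (err) {
    return res.status(401).json({ error: 'Invalid token' });
  }
}

module.exports = authenticate;
```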
### Endpoints

#### Submit a Claim

```
POST /claims
```

Request body:
```json
{
  "payer": "Insurance Company",
  "amount": 500.00,
  "procedure_codes": ["P1", "P2"]
}
```
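As a usage illustration, the request above could be sent from Node.js 18+ with the built-in `fetch`. The token value is a placeholder, and the response shape is not documented in this commit.

```js
// Hypothetical usage sketch: submit a claim with Node.js 18+ built-in fetch.
// Replace <your-token> with a real JWT issued for this API.
(async () => {
  const response = await fetch('http://localhost:5000/claims', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Bearer <your-token>',
    },
    body: JSON.stringify({
      payer: 'Insurance Company',
      amount: 500.0,
      procedure_codes: ['P1', 'P2'],
    }),
  });

  console.log(response.status, await response.json());
})();
```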
#### Get Claim Details

```
GET /claims/:id
```

Returns the claim details for the specified ID.

#### Check Claim Status

```
GET /claims/status/:id
```

Returns the current status of the claim.

## Monitoring and Logging

### Logs

Logs are stored in the `logs` directory with the following files:

- `combined.log`: All logs
- `error.log`: Error logs only
- `access.log`: HTTP request logs
- `exceptions.log`: Uncaught exceptions
- `rejections.log`: Unhandled promise rejections
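A minimal sketch of a Winston logger that writes to the files listed above (the repository's actual logger configuration is not part of this excerpt, so the transports and formats here are assumptions; `access.log` is typically produced by separate request-logging middleware, not shown):

```js
// Hypothetical sketch of a Winston logger matching the log files listed above.
const winston = require('winston');

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.json()
  ),
  transports: [
    new winston.transports.File({ filename: 'logs/error.log', level: 'error' }),
    new winston.transports.File({ filename: 'logs/combined.log' }),
  ],
  // Capture crashes that would otherwise be lost
  exceptionHandlers: [
    new winston.transports.File({ filename: 'logs/exceptions.log' }),
  ],
  rejectionHandlers: [
    new winston.transports.File({ filename: 'logs/rejections.log' }),
  ],
});

logger.info('logger initialised');

module.exports = logger;
```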
### Metrics

Metrics are exposed at the `/metrics` endpoint in Prometheus format.

Key metrics include:

- HTTP request duration
- Claim processing duration
- Database query performance
- System metrics (CPU, memory)
- Request counts and error rates
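A minimal sketch of how such an endpoint is commonly exposed with the `prom-client` package (the metric names and histogram buckets below are illustrative; the repository's actual instrumentation is not shown in this excerpt):

```js
// Hypothetical sketch: Prometheus metrics with prom-client and Express.
const express = require('express');
const client = require('prom-client');

const app = express();
const register = new client.Registry();

// Default process metrics (CPU, memory, event loop lag, ...)
client.collectDefaultMetrics({ register });

// Histogram for HTTP request duration, labelled by method, route and status
const httpRequestDuration = new client.Histogram({
  name: 'http_request_duration_seconds',
  help: 'Duration of HTTP requests in seconds',
  labelNames: ['method', 'route', 'status_code'],
  buckets: [0.05, 0.1, 0.25, 0.5, 1, 2.5, 5],
  registers: [register],
});

app.use((req, res, next) => {
  const end = httpRequestDuration.startTimer();
  res.on('finish', () => {
    end({ method: req.method, route: req.path, status_code: res.statusCode });
  });
  next();
});

// Expose metrics in Prometheus text format
app.get('/metrics', async (req, res) => {
  res.set('Content-Type', register.contentType);
  res.end(await register.metrics());
});

app.listen(5000);
```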
### Alerting

Prometheus can be configured with alerting rules to notify about system issues:

- High error rates
- Slow response times
- System resource constraints

## Log Storage Strategies

Several strategies can be used for log storage:

1. **Local File System**: Simple but not scalable
   - Pros: Easy to set up, good for development
   - Cons: Limited storage, not suitable for distributed systems

2. **Centralized Logging Service** (ELK Stack or similar)
   - Pros: Searchable, supports high volumes, visualization
   - Cons: Requires additional infrastructure, more complex

3. **Cloud-based Logging** (AWS CloudWatch, GCP Logging)
   - Pros: Managed service, scalable, integrated with cloud services
   - Cons: Vendor lock-in, potential costs

4. **Log Aggregation Tools** (Fluentd, Logstash)
   - Pros: Flexible, supports multiple destinations
   - Cons: Requires configuration and maintenance

The recommended approach for production environments is the Elastic Stack (Elasticsearch, Logstash, Kibana), which offers powerful search capabilities and visualization tools for log analysis.

## CI/CD Pipeline

The repository includes a GitHub Actions workflow for CI/CD:

- Runs linting and tests
- Builds a Docker image
- Publishes the image to a container registry
- Deploys to the target environment

Pipeline stages:

1. **Build**: Compile code and create artifacts
2. **Test**: Run unit and integration tests
3. **Scan**: Security and vulnerability scanning
4. **Package**: Build container images
5. **Deploy**: Push to the target environment

## License

[MIT License](LICENSE)

config/db.js

Lines changed: 9 additions & 0 deletions
```js
const { Sequelize } = require('sequelize');
require('dotenv').config();

const sequelize = new Sequelize(process.env.DB_NAME, process.env.DB_USER, process.env.DB_PASSWORD, {
  host: process.env.DB_HOST,
  dialect: 'postgres',
});

module.exports = sequelize;
```
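As a usage illustration only: the exported `sequelize` instance would typically be used to define models and verify connectivity. The `Claim` model below is a hypothetical sketch whose fields are inferred from the README's request body; it is not part of this commit.

```js
// Hypothetical sketch: using the exported sequelize instance.
// The Claim model and its fields are illustrative, not from the repository.
const { DataTypes } = require('sequelize');
const sequelize = require('./config/db');

const Claim = sequelize.define('Claim', {
  payer: { type: DataTypes.STRING, allowNull: false },
  amount: { type: DataTypes.DECIMAL(10, 2), allowNull: false },
  procedure_codes: { type: DataTypes.ARRAY(DataTypes.STRING) }, // PostgreSQL array column
  status: { type: DataTypes.STRING, defaultValue: 'submitted' },
});

(async () => {
  await sequelize.authenticate(); // throws if the database is unreachable
  await sequelize.sync();         // creates tables that do not yet exist
  console.log('Database connection OK');
})();
```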
