# Activity Timeline Analyzer

A command-line utility for analyzing timestamped event logs and producing time-based activity summaries.

This tool was built as a foundational analytics utility and learning project, with a focus on schema-first design, separation of concerns, and incremental feature development.
## Features

* Parses structured JSON event logs
* Validates the input schema before processing
* Sorts events chronologically
* Aggregates activity by hour
* Optional per-user and per-hour breakdowns
* Console output with selectable modes
* Exports results to:
  * **JSON**
  * **CSV**
* Clean separation between:
  * Data loading
  * Analysis
  * Output formatting
## Input Format

The tool expects a JSON file containing a list of events, each with the following schema:

```json
{
  "timestamp": "2025-01-01T09:15:00",
  "user": "alice",
  "status": "success"
}
```

* `timestamp` must be ISO-8601 formatted
* `status` is treated as categorical (no assumption of boolean)
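For orientation, validating this schema can be done with the standard library alone. The sketch below is an illustrative helper, not code from the script itself; the function name `load_events` is a hypothetical:

```python
import json
from datetime import datetime

REQUIRED_FIELDS = {"timestamp", "user", "status"}

def load_events(path):
    """Load events from a JSON file and validate each one against the schema."""
    with open(path) as f:
        events = json.load(f)
    for event in events:
        missing = REQUIRED_FIELDS - event.keys()
        if missing:
            raise ValueError(f"Event missing fields: {missing}")
        # Raises ValueError if the timestamp is not ISO-8601 formatted.
        datetime.fromisoformat(event["timestamp"])
    return events
```

Validating up front like this keeps schema errors out of the analysis stage.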
## Usage

Basic analysis:

```bash
python activity_timeline_analyzer_v2_export.py \
  --input events.json
```

Hourly breakdown:

```bash
python activity_timeline_analyzer_v2_export.py \
  --input events.json \
  --mode hour
```

Export to JSON:

```bash
python activity_timeline_analyzer_v2_export.py \
  --input events.json \
  --output report.json \
  --format json
```

Export hourly counts to CSV:

```bash
python activity_timeline_analyzer_v2_export.py \
  --input events.json \
  --mode hour \
  --output report.csv \
  --format csv
```

## Output Formats

### JSON
* Structured, machine-readable output
* Includes per-hour counts (and additional breakdowns when enabled)
### CSV
* Tabular export of **events per hour**
* Designed for spreadsheets and reporting tools
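The per-hour aggregation behind both formats can be sketched in a few lines. This is a simplified illustration of the general technique, not the script's actual implementation:

```python
from collections import Counter
from datetime import datetime

def events_per_hour(events):
    """Count events by the hour their timestamp falls in."""
    counts = Counter()
    for event in events:
        ts = datetime.fromisoformat(event["timestamp"])
        # Truncate the timestamp to the start of its hour.
        counts[ts.replace(minute=0, second=0, microsecond=0)] += 1
    # Return hours in chronological order.
    return dict(sorted(counts.items()))
```

The same hour-keyed dictionary serializes naturally to JSON or to two-column CSV rows.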
## Design Philosophy

This project intentionally prioritizes:

* Clear schema contracts
* Predictable data flow
* Minimal side effects
* Easy extensibility
It is designed to be adapted, not treated as a finished product.
## Limitations

* CSV export currently supports hourly aggregation only
* No real-time ingestion (file-based input)
* No visualization layer
* Designed for batch analysis, not streaming
These are deliberate trade-offs to keep the core logic simple and robust.
## Possible Extensions

* Per-user CSV export
* Time-window filtering
* Threshold alerts
* Exit codes for monitoring pipelines
* Integration with log collectors or schedulers
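As one example of how these extensions might slot in, time-window filtering could be a small pre-processing step before aggregation. The function below is a hypothetical sketch, not part of the current tool:

```python
from datetime import datetime

def filter_window(events, start, end):
    """Keep only events whose timestamp falls in [start, end)."""
    return [
        e for e in events
        if start <= datetime.fromisoformat(e["timestamp"]) < end
    ]
```

Because loading, analysis, and output are separated, a filter like this could be inserted between loading and aggregation without touching the other stages.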
## License

MIT License
## About

This tool was built as part of an ongoing learning journey in Python, data processing, and AI-assisted development. It reflects a focus on building reliable foundations before adding complexity.
## Author

**Darren Williamson**
Python Utility Development · Automation · Data Analysis
UK citizen / Spain-based / Remote
LinkedIn: https://www.linkedin.com/in/darren-williamson3/