feat: write batch history data to InfluxDB with original timestamps#71
Open
wxtry wants to merge 22 commits into mash2k3:main from
Conversation
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Options were silently wiped by async_create_entry(data={})
- Now: async_update_entry for model data, async_create_entry for options
- Use config_entry.entry_id instead of private _config_entry_id
- Guard _auto_switch_report_mode_on_battery_state() with options check
- Default initial TLV config to Historic mode
- Change JSON default update_interval from 15s to 300s
- TLV Historic mode uses config_entry.options timeout (default 65 min)
- JSON devices use config_entry.options timeout (default 65 min)
- Real-time mode keeps fixed 300s timeout
- Add _import_batch_statistics() helper
- CMD 0x42 history: all points imported via async_import_statistics
- Latest point still updates entity current state
- Statistics aligned to 5-minute boundaries
- Fix test_batch_history alignment test math (1709500123 % 300 = 223, not 123)
- Type 17: parse batch sensorData, update entities from latest point
- Import all batch points to HA long-term statistics
- Send ACK when need_ack=1
- Type 13 explicitly ignored (was silently dropped before)
HA's async_import_statistics requires statistic_id to be lowercase.
async_import_statistics validates statistic_id as entity_id format and requires source="recorder". External statistics need async_add_external_statistics, which accepts the colon-separated format. Also adds enhanced status logging and error handling for statistics import to aid debugging offline timeout behavior.
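The two id formats above can be told apart mechanically. A small sketch (the helper name and regexes are illustrative only; HA performs this validation internally, and per the earlier commit the id must also be lowercase):

```python
import re

# Internal statistics use an entity-id style id ("sensor.outdoor_temp") and
# source "recorder"; external statistics use a colon-separated id
# ("my_domain:outdoor_temp"). Lowercase is required in both cases.
ENTITY_ID_RE = re.compile(r"^[a-z0-9_]+\.[a-z0-9_]+$")
EXTERNAL_ID_RE = re.compile(r"^[a-z0-9_]+:[a-z0-9_]+$")

def statistics_api_for(statistic_id: str) -> str:
    """Pick which statistics import API a statistic_id belongs to."""
    if ENTITY_ID_RE.match(statistic_id):
        return "async_import_statistics"        # source must be "recorder"
    if EXTERNAL_ID_RE.match(statistic_id):
        return "async_add_external_statistics"  # external source
    raise ValueError(f"invalid statistic_id: {statistic_id}")
```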
HA requires timestamps at the top of the hour (minutes=0, seconds=0). Changed from 5-minute alignment to 1-hour alignment, and aggregate multiple data points within the same hour by averaging.
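The alignment-plus-averaging step can be sketched as a pure function over (unix timestamp, value) pairs; the function name is hypothetical, not taken from the PR:

```python
from collections import defaultdict

def aggregate_hourly(points):
    """Bucket (unix_ts, value) samples to top-of-hour boundaries and
    average the values in each bucket, since HA statistics require
    timestamps with minutes=0 and seconds=0."""
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % 3600].append(value)  # floor to the hour start
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}
```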
Firmware 2.0.6 sends historical data via CMD 0x31 instead of CMD 0x42. Both carry identical sensorData[] arrays. Treat CMD 0x31 the same as CMD 0x42 for batch statistics import.
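A minimal sketch of that dispatch decision (names are illustrative, not the integration's actual helpers):

```python
# Pre-2.0.6 firmware sends history as CMD 0x42; firmware 2.0.6 uses
# CMD 0x31. Both carry identical sensorData[] arrays, so both route
# through the same batch statistics import path.
BATCH_HISTORY_CMDS = {0x42, 0x31}

def is_batch_history(cmd: int) -> bool:
    """True if this TLV command should take the batch statistics path."""
    return cmd in BATCH_HISTORY_CMDS
```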
Keep recorder dependency addition but revert fork-specific name, documentation URL, issue tracker, and version to upstream values.
Reuse HA's built-in InfluxDB integration connection to write historical sensor data directly, preserving original device timestamps for Grafana. Works for both JSON Type 17 and TLV CMD 0x42/0x31 batch data paths. Gracefully no-ops if InfluxDB integration is not configured.
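The graceful no-op comes down to one lookup. A sketch, with the caveat that the `"influxdb"` key and the `write` attribute are HA internals, not a public API:

```python
def get_influx_write(hass_data):
    """Return the InfluxDB integration's internal write handle, or None
    when the influxdb component is not configured, so callers can skip
    batch writes entirely. These names are HA internals and may change
    between releases."""
    influx = hass_data.get("influxdb")
    return getattr(influx, "write", None)
```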
HA's InfluxDB integration uses unit_of_measurement (e.g. °C, %, ppm) as the measurement name, not "state". Match this behavior so batch historical data merges with normal state_changed event data in Grafana. Also add debug logging for InfluxDB batch writes.
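A sketch of how such a point could be shaped (the helper name is hypothetical, and the tag/field layout is an assumption based on the PR's description of `event_to_json`, where the `entity_id` tag holds the object id without the domain prefix):

```python
def batch_history_point(entity_id, unit, value, timestamp):
    """Build an InfluxDB point shaped like HA's event_to_json output:
    measurement = unit_of_measurement (e.g. "°C"), tags = domain and
    entity object id, original device timestamp preserved."""
    domain, object_id = entity_id.split(".", 1)
    return {
        "measurement": unit,                 # unit, not "state"
        "tags": {"domain": domain, "entity_id": object_id},
        "time": timestamp,                   # original sample time
        "fields": {"value": float(value)},
    }
```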
Summary
Optional companion to #70 — writes batch historical data points to InfluxDB with their original timestamps, enabling high-resolution historical data in Grafana.
When devices report batch history (JSON Type 17, TLV CMD 0x42/0x31), the integration already imports hourly aggregates into HA Statistics (#70). This PR additionally writes the raw data points to InfluxDB, preserving per-sample timestamps and full precision.
How it works
- Reuses the connection from HA's built-in InfluxDB integration (hass.data["influxdb"])
- Formats points like event_to_json (matching measurement name = unit_of_measurement, tags = domain + entity_id)
Known limitations
- Relies on the hass.data["influxdb"] internal API (InfluxThread.influx.write): not a public interface, may break on HA updates
- Requires InfluxDB to be configured via the built-in influxdb component
I understand this feature couples to InfluxDB internals, which may not be desirable for the project. Happy to adjust or withdraw if you'd prefer not to take on this maintenance burden.
Depends on
Test plan
🤖 Generated with Claude Code