Closed
Labels: bug (Something isn't working)
Description
Bug description
After struggling to renew my consent with Enedis, I was finally able to run this module again and retrieve all my consumption data. However, I remember that I previously also retrieved my production and injection data (solar panels).
Looking at the module's log, I found this error (I replaced the PDL and the passwords with XXXXXX):
Everything else works perfectly.
2024-12-13 07:06:24.318 - INFO : Effacement des données importées dans Energy.
2024-12-13 07:06:24.318 - INFO : - myelectricaldata:XXXXXXXXXXXXX_production
2024-12-13 07:06:28.250 - INFO : 127.0.0.1:49058 - "GET / HTTP/1.1" 200
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/websocket/_socket.py", line 118, in recv
bytes_ = _recv()
^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_socket.py", line 97, in _recv
return sock.recv(bufsize)
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/ssl.py", line 1233, in recv
return self.read(buflen)
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/ssl.py", line 1106, in read
return self._sslobj.read(len)
^^^^^^^^^^^^^^^^^^^^^^
TimeoutError: The read operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/models/export_home_assistant_ws.py", line 496, in import_data
self.clear_data(list_statistic_ids)
File "/app/models/export_home_assistant_ws.py", line 181, in clear_data
clear_stat = self.send(clear_statistics)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/models/export_home_assistant_ws.py", line 138, in send
output = json.loads(self.websocket.recv())
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_core.py", line 388, in recv
opcode, data = self.recv_data()
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_core.py", line 416, in recv_data
opcode, frame = self.recv_data_frame(control_frame)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_core.py", line 437, in recv_data_frame
frame = self.recv_frame()
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_core.py", line 478, in recv_frame
return self.frame_buffer.recv_frame()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_abnf.py", line 363, in recv_frame
self.recv_header()
File "/usr/local/lib/python3.12/site-packages/websocket/_abnf.py", line 319, in recv_header
header = self.recv_strict(2)
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_abnf.py", line 398, in recv_strict
bytes_ = self.recv(min(16384, shortage))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_core.py", line 563, in _recv
return recv(self.sock, bufsize)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/websocket/_socket.py", line 120, in recv
raise WebSocketTimeoutException("Connection timed out")
websocket._exceptions.WebSocketTimeoutException: Connection timed out
2024-12-13 07:06:29.337 - ERROR : Connection timed out
2024-12-13 07:06:29.337 - CRITICAL : Erreur lors de l'export des données vers Home Assistant
config.yaml
cycle: 14400 # 4h
debug: true
log2file: false
wipe_influxdb: false # Works only with InfluxDB > 2.X
# By default the backend is local, in the container's /data/cache.db.
# It is also possible to switch to an external SQLite or PostgreSQL database.
# Example for Postgres:
# storage_uri: postgresql://myelectricaldata:myelectricaldata@postgres:5432/myelectricaldata
mqtt:
  enable: true
  hostname: core-mosquitto
  port: 1883
  username: mqtt
  password: XXXXXX
  prefix: myelectricaldata
  client_id: myelectricaldata # MUST BE UNIQUE ACROSS ALL CLIENTS CONNECTED TO THE MQTT SERVER
  retain: true
  qos: 0
home_assistant: # MQTT is required for Home Assistant
  enable: true
  discovery: true
  discovery_prefix: homeassistant
home_assistant_ws:
  enable: true
  ssl: true
  token: XXXXXX
  url: HA_URL
influxdb:
  enable: false
  hostname: influxdb
  port: 8086
  token: myelectricaldata
  org: myelectricaldata
  bucket: myelectricaldata
  # WARNING: enabling asynchronous import will greatly reduce the import time into InfluxDB,
  # but it will increase memory & CPU usage, so enable it only on robust hardware.
  method: synchronous # Available modes: synchronous / asynchronous / batching
  # batching_options only applies to the `batching` method.
  # For more information: https://github.com/influxdata/influxdb-client-python#batching
  batching_options:
    batch_size: 1000
    flush_interval: 1000
    jitter_interval: 0
    retry_interval: 5000
    max_retry_time: 180_000
    max_retries: 5
    max_retry_delay: 125_000
    exponential_base: 2
myelectricaldata:
  "XXXXXXX":
    enable: "true"
    token: "XXXXXXXXX"
    name: "Home"
    addresses: "true"
    cache: "true"
    consumption: "true"
    consumption_detail: "true"
    consumption_price_base: "0.145907"
    consumption_price_hc: "0.124364"
    consumption_price_hp: "0.164915"
    consumption_max_date: "2021-06-01"
    consumption_detail_max_date: "2021-06-01"
    offpeak_hours_0: 22H00-6H00 # MONDAY
    offpeak_hours_1: 22H00-6H00 # TUESDAY
    offpeak_hours_2: 22H00-6H00 # WEDNESDAY
    offpeak_hours_3: 22H00-6H00 # THURSDAY
    offpeak_hours_4: 22H00-6H00 # FRIDAY
    offpeak_hours_5: 22H00-6H00;12H00-14H00 # SATURDAY
    offpeak_hours_6: 22H00-6H00;12H00-14H00 # SUNDAY
    plan: TEMPO
    production: "true"
    production_detail: "true"
    production_price: "0.11"
    production_max_date: "2024-01-01"
    production_detail_max_date: "2024-01-01"
    refresh_addresse: "false"
    refresh_contract: "false"
Any ideas? Is there something I need to do on my end?
Installation type
- [ ] Docker
- [x] HassIO

Version: 0.13.2