Replies: 3 comments 3 replies
-
Can you give examples of which entities are newly ignoring display precision, and where in release notes / blog posts it was described that this was intentional? I don't recall seeing anywhere that this was intentionally degraded.
-
https://developers.home-assistant.io/blog/2025/05/26/sensor-default-display-precision/
-
You're right that the card shown in the screenshot is a custom card. However, as described in the official developer blog post, the actual change was deeper: display precision is now derived from the entity's metadata (`device_class` / `suggested_display_precision`) rather than from the raw state value.
This effectively means that all custom cards need to be updated or redesigned to read and respect that precision handling. But let's be realistic: wouldn't it be more practical to keep the user-configured "Display precision" setting as the final override?
This strikes a balance between developer consistency and user experience, without adding unnecessary complexity for either side.
-
🧩 Problem Summary
After updating to Home Assistant Core 2025.6.0, the "Display precision" setting in the UI no longer affects how values are displayed for many entities, especially those with a `device_class` or a `suggested_display_precision`. This appears to be an intentional design change where the frontend now prioritizes internal logic over the user-defined setting.
❗ Why this is a real problem
This change breaks the expected behavior of the UI and negatively affects user trust: the only way to display, for example, `15.13` instead of `15.13975` is now to create a Template Sensor.
🎯 What the developers intended
Based on release notes and blog posts, the goal seems to be unified formatting across the system, using fallback values derived from `device_class` or `suggested_display_precision`. While that may sound like a smart backend optimization, it removes visible control from the user and makes the UI feel misleading when an option does nothing.
💡 What should be done?
✅ Option 1 (Recommended): Restore full functionality to the "Display precision" setting in the UI, making it override any internal or fallback precision.
This is the preferred solution because it keeps sensible defaults while restoring user control: the default value can still be the full precision (as determined internally), but the user should be allowed to reduce the precision via the UI, without having to create a Template Sensor (as sketched above).
This matters especially for non-expert users who see such workarounds as unnecessary complexity that harms their experience and gives them a negative impression of the platform.
❌ Option 2: Remove the "Display precision" setting entirely and clearly document the reasoning.
This is not a good option, unless it’s paired with a simple and user-friendly alternative.
Real improvements don’t come from removing features — they come from making them easier to use.
No user should have to rely on YAML or templates just to round off a number display.
✅ Summary:
Either restore the "Display precision" setting and make it functional again (with default values showing full internal precision),
or remove it and provide a simpler alternative — without forcing users into YAML.