2 changes: 1 addition & 1 deletion RELEASE_NOTES.md
@@ -10,7 +10,7 @@

## New Features

- <!-- Here goes the main new features and examples or instructions on how to use them -->
+ * Helper function to calculate energy metrics

## Bug Fixes

84 changes: 84 additions & 0 deletions src/frequenz/client/reporting/_client.py
@@ -40,6 +40,11 @@
or a pandas DataFrame.
"""

CumulativeEnergy = namedtuple(
    "CumulativeEnergy", ["start_time", "end_time", "consumption", "production"]
)
"""Type for cumulative energy consumption and production over a specified time."""


@dataclass(frozen=True)
class ComponentsDataBatch:
@@ -252,3 +257,82 @@ def dt2ts(dt: datetime) -> PBTimestamp:
        except grpcaio.AioRpcError as e:
            print(f"RPC failed: {e}")
            return


# pylint: disable=too-many-arguments
async def cumulative_energy(
Contributor:

Maybe this should be part of the client for now, otherwise it could be a separate module.
client: ReportingApiClient,
Contributor:

If it's not part of the client, it needs a URL and key. I guess being part of the client makes things simpler for now.

@llucax Any thoughts on this? This helper function IMO goes beyond a thin client and could be seen as additional tooling on top of the client. We do think it is overkill to start that now, though, and will wait for further cases.

Contributor:

> If it's not part of the client, it needs a URL and key. I guess being part of the client makes things simpler for now.

No, it just needs a client, which you should have already instantiated. It is really exactly the same; it is just a syntax issue. The method would be `async def cumulative_energy(self: ReportingApiClient, ...)` all the same, the first argument is just called `self` by convention. You can actually also call a method like this: `ReportingApiClient.cumulative_energy(client, ...)`.

I like utilities being separated because it keeps the client interface clear; otherwise it is really hard to draw a line between what should be part of the client and what shouldn't.

> @llucax Any thoughts on this? This helper function IMO goes beyond a thin client and could be seen as additional tooling on top of the client. We do think it is overkill to start that now, though, and will wait for further cases.

I think we should adopt the https://github.com/frequenz-floss/frequenz-dispatch-python approach if there is room for a higher level interface. At least IMHO that worked out well. In dispatch we basically have https://github.com/frequenz-floss/frequenz-client-dispatch-python as a thin wrapper over the API, and then frequenz-dispatch-python provides a higher level interface, which makes a lot of assumptions about how the API should be used, so it is much less flexible, but much easier to use for 99% of the cases (scientifically proven :P).
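As an aside, the two call styles mentioned in this thread really are interchangeable in Python. A minimal sketch with a hypothetical stand-in class (the real `ReportingApiClient` is not reproduced here):

```python
class Client:
    """Hypothetical stand-in for ReportingApiClient, just to show call syntax."""

    def greet(self, name: str) -> str:
        return f"hello {name}"


client = Client()

# The usual method-call syntax...
a = client.greet("world")
# ...is the same as passing the instance explicitly as the first argument:
b = Client.greet(client, "world")

assert a == b == "hello world"
```

So turning the helper into a method changes only where the function lives, not what it needs: either way, a constructed client instance carries the URL and key.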

Collaborator (Author):

Okay, I think it will be best to set up frequenz-reporting-python and move the energy metric there :-)

    microgrid_id: int,
    component_id: int,
    start_time: datetime,
    end_time: datetime,
    use_active_power: bool,
    resolution: int | None = None,
) -> CumulativeEnergy:
    """
    Calculate the cumulative energy consumption and production over a specified time range.

    Args:
        client: The client used to fetch the metric samples from the Reporting API.
        microgrid_id: The ID of the microgrid.
        component_id: The ID of the component within the microgrid.
        start_time: The start date and time for the period.
        end_time: The end date and time for the period.
        use_active_power: If True, use the 'AC_ACTIVE_POWER' metric.
            If False, use the 'AC_ACTIVE_ENERGY' metric.
        resolution: The resampling resolution for the data, represented in seconds.
            If None, no resampling is applied.

    Returns:
        CumulativeEnergy: A named tuple with start_time, end_time, consumption, and production.
    """
    metric = Metric.AC_ACTIVE_POWER if use_active_power else Metric.AC_ACTIVE_ENERGY

    metric_samples = [
        sample
        async for sample in client.list_microgrid_components_data(
            microgrid_components=[(microgrid_id, [component_id])],
            metrics=metric,
            start_dt=start_time,
            end_dt=end_time,
            resolution=resolution,
        )
    ]

    values = [
        sample.value
        for sample in metric_samples
        if start_time <= sample.timestamp <= end_time
Contributor:

Shouldn't happen anyway, but typically exclusive timestamps are used for the end time, i.e. `< end_time`.

Contributor:

Why do we actually need to extract the values again and not just loop over the metric_samples (see example for power)?

    ]

    if values:
        if use_active_power:
            # Convert power to energy if using AC_ACTIVE_POWER
            time_interval_hours = (
                resolution or (end_time - start_time).total_seconds()
Contributor:

The second part looks off to me, since each power value needs to be integrated over its own time period to get energy, so this would only work for a given resolution. You could just use:

consumption = sum(
    m1.value * (m2.timestamp - m1.timestamp).total_seconds()
    for m1, m2 in zip(metric_samples, metric_samples[1:])
    if m1.value > 0
) / 3600.0
            ) / 3600.0
            consumption = sum(
                max(e2 - e1, 0) * time_interval_hours
                for e1, e2 in zip(values, values[1:])
            )
            production = sum(
                min(e2 - e1, 0) * time_interval_hours
                for e1, e2 in zip(values, values[1:])
            )
        else:
            # Directly use energy values if using AC_ACTIVE_ENERGY
            consumption = sum(
                e2 - e1 for e1, e2 in zip(values, values[1:]) if e2 - e1 > 0
            )
            production = sum(
                e2 - e1 for e1, e2 in zip(values, values[1:]) if e2 - e1 < 0
            )
    else:
        consumption = production = 0.0

    return CumulativeEnergy(
        start_time=start_time,
        end_time=end_time,
        consumption=consumption,
        production=production,
    )
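To make the integration approach suggested in the review concrete, here is a self-contained sketch. `Sample` is a hypothetical stand-in for the objects returned by the Reporting API (only `timestamp` and `value` are modeled), and holding each power value constant until the next sample's timestamp (a left Riemann sum) is an assumption, not necessarily how the final library handles it:

```python
from collections import namedtuple
from datetime import datetime, timedelta

# Hypothetical stand-in for the samples returned by the Reporting API.
Sample = namedtuple("Sample", ["timestamp", "value"])


def integrate_power_wh(samples: list[Sample]) -> tuple[float, float]:
    """Integrate power samples (W) into consumed and produced energy (Wh).

    Each sample's power is held constant until the next sample's timestamp
    (left Riemann sum), so irregular spacing between samples is handled
    without assuming a fixed resolution.
    """
    consumption = sum(
        s1.value * (s2.timestamp - s1.timestamp).total_seconds()
        for s1, s2 in zip(samples, samples[1:])
        if s1.value > 0
    ) / 3600.0
    production = sum(
        s1.value * (s2.timestamp - s1.timestamp).total_seconds()
        for s1, s2 in zip(samples, samples[1:])
        if s1.value < 0
    ) / 3600.0
    return consumption, production


t0 = datetime(2024, 1, 1)
samples = [
    Sample(t0, 3600.0),                        # 3600 W drawn for one hour
    Sample(t0 + timedelta(hours=1), -1800.0),  # 1800 W fed in for one hour
    Sample(t0 + timedelta(hours=2), 0.0),
]
consumed, produced = integrate_power_wh(samples)
# consumed == 3600.0 (Wh), produced == -1800.0 (Wh)
```

Note this differs from the diff above, which sums differences of consecutive power values scaled by a single interval; integrating each sample over its own interval is what the reviewer proposes instead.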