To train a model that predicts the demand amounts of customers under various conditions, a dataset of features and
labels needs to be created. Because the model may also learn during the course of a running competition, a generator-based structure should be preferred. This means that a generator
exists that creates $x, y$ pairs for the model to train on, instead of creating a large batch of learning data ahead of
the learning process, which is otherwise a common practice. Whenever a round completes and new information is
available, the demand estimator is asked to estimate the demand for all customers subscribed to the tariffs of the
broker for the next 24 timesteps. These estimations are then saved (i.e. they replace any previous estimations) and the
wholesale component as well as other components can act on these newly created estimations.
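
The generator-based structure described above can be sketched as follows. This is a minimal illustration, not the broker's actual implementation; the record fields (`customer_id`, `tariff_rate`, `temperature`, `hour_of_day`, `demand_kwh`) are hypothetical names standing in for the customer, tariff, weather, time and demand information mentioned below.

```python
def demand_sample_generator(records):
    """Yield (x, y) training pairs one at a time instead of
    materializing the full learning dataset up front.

    `records` is a hypothetical iterable of per-customer,
    per-timestep observations from completed simulation rounds.
    """
    for rec in records:
        # Features: everything known before the demand realizes.
        x = (rec["customer_id"], rec["tariff_rate"],
             rec["temperature"], rec["hour_of_day"])
        # Label: the realized demand for that timestep.
        y = rec["demand_kwh"]
        yield x, y

# Usage: the generator can be consumed directly by a training loop,
# so new pairs can be produced as fresh round data becomes available.
records = [{"customer_id": 1, "tariff_rate": 0.12,
            "temperature": 18.5, "hour_of_day": h,
            "demand_kwh": 2.0 + 0.1 * h} for h in range(24)]
pairs = list(demand_sample_generator(records))
```

Because samples are produced lazily, the same generator can keep serving pairs during a running competition without rebuilding a batch dataset.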
According to the simulation specification, the customer models generate their demand pattern based on their internal
structure, broker factors and game factors \citep[]{ketter2018powertac}. The preprocessing pipeline of the generator therefore generates
feature-label pairs that include customer, tariff, weather, time and demand information. The realized demand is the
label while all other components are part of the features that are used to train the model. The intuitive model class
for demand pattern prediction is the \ac{RNN} due to the sequential nature of the problem \citep[]{EvalGRU2014}. However,
as will be shown later, the implementation of a relatively shallow, dense, classic \ac{NN} also yields decent results.
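
To make the "shallow dense network" alternative concrete, the following is a minimal sketch in plain Python of a forward pass through one hidden layer. The layer sizes (4 input features, 8 hidden units, 1 output) and the weight initialization are illustrative assumptions, not values taken from the actual model; the point is only that a non-recurrent model can map a flat feature vector directly to a demand estimate.

```python
import random

random.seed(0)  # deterministic toy weights for illustration

def dense(x, weights, biases, activation=None):
    """One fully connected layer: out_j = act(sum_i x_i * w_ji + b_j).

    `weights` is a list of per-output-unit rows, each of length len(x).
    """
    out = [sum(xi * w for xi, w in zip(x, row)) + b
           for row, b in zip(weights, biases)]
    if activation:
        out = [activation(v) for v in out]
    return out

def relu(v):
    return max(0.0, v)

# Hypothetical shapes: 4 input features -> 8 hidden units -> 1 estimate.
w1 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(8)]
b1 = [0.0] * 8
w2 = [[random.uniform(-1, 1) for _ in range(8)]]
b2 = [0.0]

# One flattened feature vector: customer, tariff, weather, time features.
x = [1.0, 0.12, 18.5, 7.0]
hidden = dense(x, w1, b1, relu)
demand_estimate = dense(hidden, w2, b2)[0]
```

In practice such a model would be trained with backpropagation in a framework like Keras or PyTorch; the sketch only shows why a flat feature representation suffices for a dense architecture, in contrast to the sequence inputs an \ac{RNN} would consume.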