GRU for timeseries forecasting #2997
charlie1986 asked this question in Q&A (Unanswered)
Replies: 1 comment
This error ("Expected layout: NTC, but got: ??") is saying that the recurrent block expects an input Shape that carries the NTC layout, i.e. axes marked as batch (N), time/sequence (T) and channel (C), but the Shape it received has no layout information, which is why it shows up as "??".
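I can't see how you initialize or feed the network, but the usual cause is that the input Shape is built from plain dimensions, so it carries no layout. Here is a minimal sketch of how the block from your post could be initialized with an explicit NTC layout; the batch size, sequence length and hidden-unit count are placeholder values, and createNeuralNetwork refers to the method in your post:
```java
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.DataType;
import ai.djl.ndarray.types.Shape;
import ai.djl.nn.SequentialBlock;

public static void initNetwork() {
    try (NDManager manager = NDManager.newBaseManager()) {
        SequentialBlock net = createNeuralNetwork(64); // 64 hidden units, placeholder
        long batchSize = 32;   // placeholder
        long seqLength = 60;   // corresponds to x_train.shape[1] in the Keras code
        long features = 1;     // one feature per time step
        // "NTC" marks the axes as batch (N), time (T), channel (C).
        // A Shape created without a layout reports "??", which is what the exception complains about.
        Shape inputShape = new Shape(new long[] {batchSize, seqLength, features}, "NTC");
        net.initialize(manager, DataType.FLOAT32, inputShape);
    }
}
```
If you initialize through a Trainer instead, the same idea applies: pass a Shape that declares the NTC layout to trainer.initialize(...).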
I am trying to translate Python code that uses TensorFlow/Keras into equivalent DJL code. Below is the Python code:
```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import GRU, Dropout, Dense

# Architecture: Gated Recurrent Unit
regressorGRU = Sequential()
# First GRU layer with dropout
regressorGRU.add(GRU(units=hidden_unit, return_sequences=True, input_shape=(x_train.shape[1], 1), activation='tanh'))
regressorGRU.add(Dropout(0.2))
# Second GRU layer with dropout
regressorGRU.add(GRU(units=hidden_unit, return_sequences=True, activation='tanh'))
regressorGRU.add(Dropout(0.2))
# Third GRU layer (returns only the last output) with dropout
regressorGRU.add(GRU(units=hidden_unit, return_sequences=False, activation='tanh'))
regressorGRU.add(Dropout(0.2))
# Output layer
regressorGRU.add(Dense(units=1))
```
And here is the DJL code:
```java
import ai.djl.nn.Activation;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;
import ai.djl.nn.norm.Dropout;
import ai.djl.nn.recurrent.GRU;

public static SequentialBlock createNeuralNetwork(int hiddenUnit) {
    // Architecture: Gated Recurrent Unit
    var regressorGRU = new SequentialBlock();
    // First GRU layer with dropout
    var gru1 = GRU.builder()
            .setNumLayers(1)
            .setStateSize(hiddenUnit)
            .optReturnState(true)
            .build();
    regressorGRU.add(gru1);
    regressorGRU.add(Activation::tanh);
    regressorGRU.add(Dropout.builder().optRate(0.2f).build());
    // Second GRU layer with dropout
    var gru2 = GRU.builder()
            .setNumLayers(1)
            .setStateSize(hiddenUnit)
            .optReturnState(true)
            .build();
    regressorGRU.add(gru2);
    regressorGRU.add(Activation::tanh);
    regressorGRU.add(Dropout.builder().optRate(0.2f).build());
    // Third GRU layer with dropout
    var gru3 = GRU.builder()
            .setNumLayers(1)
            .setStateSize(hiddenUnit)
            .optReturnState(false)
            .build();
    regressorGRU.add(gru3);
    regressorGRU.add(Activation::tanh);
    regressorGRU.add(Dropout.builder().optRate(0.2f).build());
    // Output layer
    regressorGRU.add(Linear.builder().setUnits(1).build());
    return regressorGRU;
}
```
But this error occurs: `java.lang.UnsupportedOperationException: Expected layout: NTC, but got: ??`
How can I implement the equivalent code in DJL?