-
You cannot use `subi`; you have to use the `sub` function. What is the error message when you use `sub`?
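The distinction behind this suggestion is that `subi` mutates the receiver in place (which PyTorch's autograd rejects on a leaf tensor that requires grad), while `sub` returns a new array and leaves the original untouched. A minimal plain-Java sketch of the two behaviors, using `double[]` as a stand-in for an `NDArray` — the class and method names below are illustrative, not DJL's actual API:

```java
public class SubVsSubi {
    // In-place update: overwrites param's storage, like NDArray.subi(...)
    static void subi(double[] param, double[] delta) {
        for (int i = 0; i < param.length; i++) {
            param[i] -= delta[i];
        }
    }

    // Out-of-place update: allocates a fresh array, like NDArray.sub(...)
    static double[] sub(double[] param, double[] delta) {
        double[] result = new double[param.length];
        for (int i = 0; i < param.length; i++) {
            result[i] = param[i] - delta[i];
        }
        return result;
    }

    public static void main(String[] args) {
        double[] param = {1.0, 2.0};
        double[] delta = {0.5, 0.5};

        double[] updated = sub(param, delta); // param is still {1.0, 2.0}
        subi(param, delta);                   // param is now {0.5, 1.5}

        System.out.println(updated[0] + " " + param[0]);
    }
}
```

With `sub` you then have to store the returned array back into wherever the parameter lives (e.g. replace the entry in the `NDList`), since the original is not modified.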
-
I am currently working through the e-book Dive into Deep Learning (Java edition) in IntelliJ IDEA, learning how to update the parameters of linear regression with gradients. But I ran into a big problem when training the parameters.

I use the instruction `param.subi(param.getGradient().mul(lr).div(batchSize));` in IDEA, but when I run it the console warns me that "a leaf Variable that requires grad is being used in an in-place operation".
Searching for information about this problem, I learned that it is triggered by performing math operations directly on an NDArray, but I don't know how to solve it with other instructions. I have tried `param.sub()` and `param = param - param.getGradient().mul(lr).div(batchSize)`, but both fail.
The full warning message is below:

```
Exception in thread "main" ai.djl.engine.EngineException: a leaf Variable that requires grad is being used in an in-place operation.
	at ai.djl.pytorch.jni.PyTorchLibrary.torchSubi(Native Method)
	at ai.djl.pytorch.jni.JniUtils.subi(JniUtils.java:532)
	at ai.djl.pytorch.engine.PtNDArray.subi(PtNDArray.java:535)
	at ai.djl.pytorch.engine.PtNDArray.subi(PtNDArray.java:37)
	at org.example.Test.sgd(Test.java:136)
	at org.example.Test.main(Test.java:110)
```
My DJL versions:

- ai.djl.pytorch:pytorch-engine 0.17.0
- ai.djl.pytorch:pytorch-native-cpu:win-x86_64 1.11.0
- ai.djl.pytorch:pytorch-jni 1.11.0-0.17.0
- ai.djl:api 0.17.0
- ai.djl:basicdataset 0.17.0
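For reference, the update the `sgd` method in the question is trying to apply is the standard minibatch SGD rule, `param ← param − lr · grad / batchSize`. A minimal plain-Java sketch of that arithmetic, computed out-of-place so the original array is never mutated (the mutation is what the in-place error objects to); the names `sgdStep`, `w`, and `g` are illustrative, and real DJL code would operate on `NDArray`s rather than `double[]`:

```java
public class SgdSketch {
    // One minibatch SGD step, computed out-of-place:
    // next[i] = param[i] - grad[i] * lr / batchSize
    static double[] sgdStep(double[] param, double[] grad, double lr, int batchSize) {
        double[] next = new double[param.length];
        for (int i = 0; i < param.length; i++) {
            next[i] = param[i] - grad[i] * lr / batchSize;
        }
        return next;
    }

    public static void main(String[] args) {
        double[] w = {1.0, -1.0};
        double[] g = {10.0, 10.0};
        // step size per element: 10.0 * 0.5 / 10 = 0.5
        double[] w2 = sgdStep(w, g, 0.5, 10);
        System.out.println(w2[0] + " " + w2[1]); // w itself is unchanged
    }
}
```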