would it be possible to replace gradient(calculated from autograd function) backpropagation by scalar loss backpropagation? #4

@Alisa-de

Description

Hello,
I am currently using your code for surface normal estimation. I was wondering: what is the benefit of computing the gradient (df) in the loss function and calling output.backward(gradient=df) during training, instead of computing the scalar loss value and calling loss.backward()?
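For context, here is a minimal PyTorch sketch (not the repo's actual code; the toy loss and variable names are illustrative) of why the two routes should agree when df is the analytic derivative of the loss with respect to the network output, by the chain rule:

```python
import torch

torch.manual_seed(0)
x = torch.randn(3)
w = torch.randn(3, requires_grad=True)

# Route 1: scalar loss backprop via loss.backward()
output = w * x
loss = (output ** 2).sum()          # toy loss L = sum(output^2)
loss.backward()
grad_scalar = w.grad.clone()

# Route 2: hand autograd the precomputed dL/d(output) directly
w.grad = None
output = w * x
df = 2 * output.detach()            # analytic dL/d(output) for this toy loss
output.backward(gradient=df)
grad_vector = w.grad.clone()

# Both routes produce identical parameter gradients
assert torch.allclose(grad_scalar, grad_vector)
```

If the gradients match, the choice is mostly an implementation detail: supplying `gradient=df` skips building the autograd graph for the loss computation itself, which can matter when the loss gradient is cheaper to write analytically than to differentiate automatically.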

By the way, have you ever tried letting the depth branch refine the raw depth?

Thank you in advance for your help!
Best regards
Alisa
