[luci] Use static dimensions from new_shape attribute #14844
Changes from all commits, in `compiler/luci/service/src/Nodes/CircleReshape.cpp`:

```diff
@@ -120,8 +120,7 @@ loco::TensorShape Algorithm::visit(const luci::CircleReshape *node)
   // for non-existing `shape`, we can use `newShape` if it's valid
   auto new_shape = node->newShape();
   auto rank = new_shape->rank();
-  auto shape_dummy = dynamic_cast<luci::CircleOutputDummy *>(node->shape());
-  if (shape_dummy && rank > 0)
+  if (rank > 0)
   {
     is_static_shape = true;
     shape_by_input.rank(rank);
```
```diff
@@ -154,7 +153,14 @@ loco::TensorShape Algorithm::visit(const luci::CircleReshape *node)
   for (uint32_t axis = 0; axis < shape_by_attr.rank(); ++axis)
   {
-    shape_by_attr.dim(axis) = node->newShape()->dim(axis);
+    if (node->newShape()->dim(axis) > 0)
+    {
+      shape_by_attr.dim(axis) = node->newShape()->dim(axis);
+    }
+    else
+    {
+      shape_by_attr.dim(axis).unset(); // unset means unknown dimension
+    }
   }
 }
```
**Comment on lines +156 to +163**

**Contributor:** How is this change related to removing the `shape_dummy` check?

**Author:** The purpose of the change is to align the way dynamic shapes are represented between `shape_by_input` and `shape_by_attr`.

**Contributor:** I don't understand this. Please explain your understanding of it.

**Author:** Consider even my case, from `compiler/luci/service/src/Nodes/CircleReshape.cpp` lines 130 to 135 in e5e151d: `shape_by_input` comes out as `[0,2,0]` (zeros standing for unknown dimensions) while `shape_by_attr` is `[-1,2,-1]`, and `TensorShape::operator==` doesn't handle `-1` as a dynamic dimension.

**Contributor:** Not this. I'm asking about the relation with the removal of `shape_dummy`. And I really don't understand why you are mentioning the log.

**Author:** IMO, incorrect information about a mismatch would be shown just because of the different representation of dynamic dimensions. It is related to this change because after the call at `compiler/luci/service/src/Nodes/CircleReshape.cpp` lines 130 to 135 (without the `shape_dummy` condition), the comparison `shape_by_input == shape_by_attr` would incorrectly be negative.
In the shape-rule test file:

```diff
@@ -197,3 +197,39 @@ TEST(ShapeRuleTest, reshape_by_newShape)
   ASSERT_EQ(2, output_shape.dim(0).value());
   ASSERT_EQ(12, output_shape.dim(1).value());
 }
+
+TEST(ShapeRuleTest, reshape_by_newShape_dynamic)
+{
+  auto g = loco::make_graph();
+  auto node_reshape = g->nodes()->create<luci::CircleReshape>();
+  auto tensor_input = g->nodes()->create<luci::CircleInput>();
+  auto target_shape = g->nodes()->create<luci::CircleInput>();
+
+  tensor_input->dtype(loco::DataType::S32);
+  tensor_input->shape({2, 3, 4});
+  tensor_input->shape_status(luci::ShapeStatus::VALID);
+
+  target_shape->dtype(loco::DataType::S32);
+  target_shape->rank(1);
+  target_shape->shape_status(luci::ShapeStatus::VALID);
+
+  node_reshape->tensor(tensor_input);
+  node_reshape->shape(target_shape);
+
+  // reshape to {dynamic, 2, dynamic}
+  node_reshape->newShape()->rank(3);
+  node_reshape->newShape()->dim(0) = -1;
+  node_reshape->newShape()->dim(1) = 2;
+  node_reshape->newShape()->dim(2) = -1;
+
+  loco::TensorShape output_shape;
+  luci::sinf::Rule shape_inf_rule;
+
+  ASSERT_TRUE(shape_inf_rule.infer(node_reshape, output_shape));
+
+  ASSERT_EQ(3, output_shape.rank());
+  ASSERT_FALSE(output_shape.dim(0).known());
+  ASSERT_TRUE(output_shape.dim(1).known());
+  ASSERT_EQ(2, output_shape.dim(1).value());
+  ASSERT_FALSE(output_shape.dim(2).known());
+}
```

*(seanshpark marked a conversation on this test as resolved.)*

**Contributor:** Now you changed …

**Author:** The idea was to set data that is easier to understand (especially after shape specialization).
**Contributor:** IIUC, you are to reshape …

**Author:** It probably can be reshaping to …

**Contributor:** Have you tried running, with PyTorch or TF, a reshape from `(2, 3, 4)` to `(6, 4, 1)`?

**Author:** For both …
**Contributor:** Q) Why did you remove the check for `shape_dummy`?

**Author:** For my model converted from PyTorch I would like to use the dimensions stored in `newShape`, but models converted from PyTorch have no `luci::CircleOutputDummy`.

**Contributor:** Do you know what `CircleOutputDummy` does and how it works?

**Author:** I only know (based on the spec) that it's a "Temporary DummyNode used with dangle CircleNode", but to be honest I don't know why it is used here ;)

**Contributor:** If you don't know the purpose of this node, why did you delete it?

**Author:** Just for clarification: I believe I understand the concept of `CircleOutputDummy` here -> we use values from the attribute only if the shape input is missing (the dummy node is assigned to avoid a dangling operand) -> the attribute here is to reflect the TF implementation of the Reshape node. I just don't understand why we cannot relax the algorithm -> if, based on the shape input, we can only calculate the rank, let's check whether we can extract something valuable from the attribute (still treating the input shape with the higher priority).
CircleOutputDummyhere -> we are using values from attribute only if shape input to missing (dummy node assign to avoid dangling operand) -> attribute here is to reflect TF impl of Reshape node.I am just not understand why we cannot relax the algorithm -> if based on shape input we can only calculate rank, let's check if maybe we can extract sth valuable from attribute (still treating input shape with the higher priority)