- Vector Math: Support for list literals `[v1, v2, ...]` and operations between lists, scalars, and tensors.
- Custom functions `funcname(variable,variable,...)->expression;`. They can be used in any later-defined custom function or in the final expression. Shadowing built-in functions does not work. **Be careful with recursion: there is no stack limit. I got to 700 000 iterations before I got bored.** (A sketch appears after this list.)
- Custom variables `varname=expression;`. They can be used in any later assignment or in the final expression.
- Support for **indexed assignment**: `a[i, j, ...] = expression;`. Supports multidimensional tensors and nested lists.
  - **Scalar Filling**: If the assigned value has only 1 element (scalar, 1-element list/tensor), it fills the entire selected slice.
  - **Rank Matching**: Automatically squeezes leading ones from the value to match the rank of the target slice (e.g., assigning a 4D tensor with `dim0=1` to a 3D slice).
  - **Available Variables**:
    - `V0`, `V1`, ...: Individual input variables.
    - `V`: A stacked tensor of all input variables (shape: `[num_variables, ...]`). Only available when the input shapes match.
    - `Vcnt` or `V_count`: Number of input variables.
- Support for control flow statements including `if/else`, `while` loops, blocks `{}`, and `return` statements. `if`/`else`/`while` do not behave like the ternary operator or other built-ins: they collapse tensors and lists to a single value using `any`.
- Support for a stack. The stack survives between field evaluations, but not between nodes or past the end of a node's execution.
  - Useful in the GuiderMath node to store variables between steps.
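
A minimal sketch of the syntax above, combining a custom function, custom variables, and an indexed assignment with scalar filling. The function name `lerp`, the variable names, and the final bare-variable result line are illustrative assumptions rather than documented examples:

```
lerp(a,b,t)->a+(b-a)*t;
strength=0.25;
mix=lerp(V0,V1,strength);
mix[0,0]=0;
mix
```

Here `lerp` is a user-defined function used in a later assignment, `strength` and `mix` are custom variables, and `mix[0,0]=0;` relies on scalar filling to zero the whole selected slice. Exact whitespace and semicolon rules may differ from what the parser actually accepts.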
- Modifications to existing variables persist to the outer scope.
- **Return Statements**: `return [expression];`
  - Early return from functions or top-level expressions.
- **For Loops**: `for (variable in expression) statement`
  - Iterates over elements of a list or a tensor (along dimension 0).
- **Break/Continue**: `break;`, `continue;`
  - Control loop execution (works in `while` and `for` loops); see the sketch below.
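
A sketch of the control-flow constructs listed above (a `for` loop with `break`, an `if` block, and an early `return`). It assumes the inputs share a shape so that the stacked tensor `V` exists, that a `>` comparison operator is available (operators are documented in the next section), and that a bare `if` without `else` is allowed; all of these are assumptions on top of this excerpt:

```
total=0;
count=0;
for (v in V) {
    count=count+1;
    if (count > 4) { break; }
    total=total+v;
}
if (total > 0) { return total; }
return 0-total;
```

Because `if` collapses tensors to a single value using `any`, `total > 0` triggers the early return if any element of `total` is positive. The changes to `total` and `count` made inside the loop body persist to the outer scope, as noted above.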
## Operators
  - `k_expr` can be a math expression (using `kX`, `kY`, `kZ`) or a list literal.
- `convolution(tensor, kw, [kh], [kd], k_expr)` or `conv`: Applies a convolution to `tensor`. Does not perform automatic permutations. Expects standard PyTorch layout `(Batch, Channel, Spatial...)`.
  - `k_expr` can be a math expression (using `kX`, `kY`, `kZ`) or a list literal.
- **`get_value(tensor, position)`**: Retrieves a value from a tensor at the specified N-dimensional position (provided as a list or tensor). Uses the formula `pos0*strides[0] + pos1*strides[1] + ...` to find the linear index. (See the sketch at the end of this section.)
- **`crop(tensor, position, size)`**: Extracts a sub-tensor of specified `size` starting at `position` (both provided as lists/tensors). Areas outside the input tensor are filled with zeros.
- `permute(tensor, dims)` or `perm`: Rearranges the dimensions of the tensor (e.g., `perm(a, [2, 3, 0, 1])`).
- `reshape(tensor, shape)` or `rshp`: Reshapes the tensor to a new shape (e.g., `rshp(a, [S0*S1, S2, S3])`).
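
A sketch combining the tensor helpers above. It assumes `V0` is a 4D `(Batch, Channel, H, W)` tensor with at least 4 channels and spatial size of at least 80x80; the positions, sizes, and variable names are arbitrary illustration:

```
patch=crop(V0, [0, 0, 16, 16], [1, 4, 64, 64]);
corner=get_value(patch, [0, 0, 0, 0]);
flat=rshp(patch, [4, 64*64]);
swapped=perm(flat, [1, 0]);
corner+swapped
```

`crop` cuts a `1x4x64x64` window starting at position `(0, 0, 16, 16)` (zero-filled if it runs past the edge), `get_value` reads the single element at the given position, and the `rshp`/`perm` pair flattens the spatial dimensions and then swaps the two remaining dimensions. The final line broadcasts the scalar `corner` over the reshaped tensor, relying on the scalar/tensor operations mentioned at the top of the feature list.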