src/ansys/dpf/core/operators/result/migrate_to_h5dpf.py (+25 −17 lines)
@@ -18,31 +18,33 @@
 class migrate_to_h5dpf(Operator):
     r"""Read mesh properties from the results files contained in the streams or
     data sources and make those properties available through a mesh
-    selection manager in output.
+    selection manager in output.User can input a GenericDataContainer that
+    will map an item to a result name. Example of Map: {{ default: wf1},
+    {EUL: wf2}, {ENG_SE: wf3}}.
 
     Parameters
     ----------
-    dataset_size_compression_threshold: int, optional
+    dataset_size_compression_threshold: int or GenericDataContainer, optional
         Integer value that defines the minimum dataset size (in bytes) to use h5 native compression Applicable for arrays of floats, doubles and integers.
-    h5_native_compression: int or DataTree, optional
+    h5_native_compression: int or DataTree or GenericDataContainer, optional
         Integer value / DataTree that defines the h5 native compression used For Integer Input {0: No Compression (default); 1-9: GZIP Compression : 9 provides maximum compression but at the slowest speed.}For DataTree Input {type: None / GZIP / ZSTD; level: GZIP (1-9) / ZSTD (1-20); num_threads: ZSTD (>0)}
-    export_floats: bool, optional
+    export_floats: bool or GenericDataContainer, optional
         Converts double to float to reduce file size (default is true).If False, nodal results are exported as double precision and elemental results as single precision.
     filename: str
         filename of the migrated file
     comma_separated_list_of_results: str, optional
         list of results (source operator names) separated by semicolons that will be stored. (Example: U;S;EPEL). If empty, all available results will be converted.
     all_time_sets: bool, optional
-        default is false
+        Deprecated. Please use filtering workflows instead to select time scoping. Default is false.
     streams_container: StreamsContainer, optional
         streams (result file container) (optional)
     data_sources: DataSources, optional
         if the stream is null then we need to get the file path from the data sources
     compression_workflow: Workflow or GenericDataContainer, optional
-        BETA Option: Applies input compression workflow. User can input a GenericDataContainer that will map a compression workflow to a result name. Example of Map: {{ default: wf1}, {EUL: wf2}, {ENG_SE: wf3}}
+        BETA Option: Applies input compression workflow.
     filtering_workflow: Workflow or GenericDataContainer, optional
-        Applies input filtering workflow. User can input a GenericDataContainer of the format described for Pin(6) that will map a filtering workflow to a result name.
+        Applies input filtering workflow.
 
     Returns
     -------
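Not part of the diff, but for orientation: a minimal sketch of how the per-result mapping described in the new docstring might be driven from ansys-dpf-core. The result-name keys (default, EUL) follow the docstring example; the file paths are placeholders, and it is assumed here that GenericDataContainer.set_property accepts Workflow values.

# Sketch only: map filtering workflows to result names through a GenericDataContainer.
from ansys.dpf import core as dpf

op = dpf.operators.result.migrate_to_h5dpf()
op.inputs.data_sources.connect(dpf.DataSources(r"model.rst"))  # placeholder result file
op.inputs.filename.connect(r"model_migrated.h5dpf")            # placeholder output file

wf_default = dpf.Workflow()  # filtering workflow applied to every result (contents not shown)
wf_eul = dpf.Workflow()      # filtering workflow specific to the EUL result (contents not shown)

result_map = dpf.GenericDataContainer()
result_map.set_property("default", wf_default)  # assumed: keys are result names, per the docstring
result_map.set_property("EUL", wf_eul)

op.inputs.filtering_workflow.connect(result_map)
op.run()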
@@ -142,26 +144,32 @@ def __init__(
     def _spec() -> Specification:
         description = r"""Read mesh properties from the results files contained in the streams or
 data sources and make those properties available through a mesh
-selection manager in output.
+selection manager in output.User can input a GenericDataContainer that
+will map an item to a result name. Example of Map: {{ default: wf1},
+{EUL: wf2}, {ENG_SE: wf3}}.
 """
         spec = Specification(
             description=description,
             map_input_pin_spec={
                 -5: PinSpecification(
                     name="dataset_size_compression_threshold",
-                    type_names=["int32"],
+                    type_names=["int32", "generic_data_container"],
                     optional=True,
                     document=r"""Integer value that defines the minimum dataset size (in bytes) to use h5 native compression Applicable for arrays of floats, doubles and integers.""",
                 ),
                 -2: PinSpecification(
                     name="h5_native_compression",
-                    type_names=["int32", "abstract_data_tree"],
+                    type_names=[
+                        "int32",
+                        "abstract_data_tree",
+                        "generic_data_container",
+                    ],
                     optional=True,
                     document=r"""Integer value / DataTree that defines the h5 native compression used For Integer Input {0: No Compression (default); 1-9: GZIP Compression : 9 provides maximum compression but at the slowest speed.}For DataTree Input {type: None / GZIP / ZSTD; level: GZIP (1-9) / ZSTD (1-20); num_threads: ZSTD (>0)}""",
                 ),
                 -1: PinSpecification(
                     name="export_floats",
-                    type_names=["bool"],
+                    type_names=["bool", "generic_data_container"],
                     optional=True,
                     document=r"""Converts double to float to reduce file size (default is true).If False, nodal results are exported as double precision and elemental results as single precision.""",
                 ),
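A short sketch of how the compression pins listed above might be fed. The DataTree attribute names (type, level, num_threads) are taken from the pin documentation; the DataTree.add call is an assumption about the ansys-dpf-core API rather than something shown in this diff.

# Sketch only: compression settings via the documented DataTree schema
# {type: None / GZIP / ZSTD; level; num_threads}.
from ansys.dpf import core as dpf

op = dpf.operators.result.migrate_to_h5dpf()

compression = dpf.DataTree()
compression.add(type="ZSTD", level=5, num_threads=4)  # assumed attribute names, per the pin doc

op.inputs.h5_native_compression.connect(compression)
op.inputs.dataset_size_compression_threshold.connect(1024)  # only datasets above 1024 bytes are compressed
op.inputs.export_floats.connect(True)  # documented default: store doubles as floats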
@@ -181,7 +189,7 @@ def _spec() -> Specification:
                     name="all_time_sets",
                     type_names=["bool"],
                     optional=True,
-                    document=r"""default is false""",
+                    document=r"""Deprecated. Please use filtering workflows instead to select time scoping. Default is false.""",
document=r"""BETA Option: Applies input compression workflow. User can input a GenericDataContainer that will map a compression workflow to a result name. Example of Map: {{ default: wf1}, {EUL: wf2}, {ENG_SE: wf3}}""",
document=r"""Applies input filtering workflow. User can input a GenericDataContainer of the format described for Pin(6) that will map a filtering workflow to a result name.""",
r"""Allows to connect compression_workflow input to the operator.
502
510
503
-
BETA Option: Applies input compression workflow. User can input a GenericDataContainer that will map a compression workflow to a result name. Example of Map: {{ default: wf1}, {EUL: wf2}, {ENG_SE: wf3}}
r"""Allows to connect filtering_workflow input to the operator.
523
531
524
-
Applies input filtering workflow. User can input a GenericDataContainer of the format described for Pin(6) that will map a filtering workflow to a result name.