Commit d230db2

Merge branch 'master' into ci/drop-py39-tests
2 parents: 325f296 + 68f0755


42 files changed: +738 −17886 lines

.gitattributes

Lines changed: 63 additions & 0 deletions
@@ -0,0 +1,63 @@
+###############################################################################
+# Set default behavior to automatically normalize line endings.
+###############################################################################
+* text=auto
+
+###############################################################################
+# Set default behavior for command prompt diff.
+#
+# This is need for earlier builds of msysgit that does not have it on by
+# default for csharp files.
+# Note: This is only used by command line
+###############################################################################
+#*.cs diff=csharp
+
+###############################################################################
+# Set the merge driver for project and solution files
+#
+# Merging from the command prompt will add diff markers to the files if there
+# are conflicts (Merging from VS is not affected by the settings below, in VS
+# the diff markers are never inserted). Diff markers may cause the following
+# file extensions to fail to load in VS. An alternative would be to treat
+# these files as binary and thus will always conflict and require user
+# intervention with every merge. To do so, just uncomment the entries below
+###############################################################################
+#*.sln merge=binary
+#*.csproj merge=binary
+#*.vbproj merge=binary
+#*.vcxproj merge=binary
+#*.vcproj merge=binary
+#*.dbproj merge=binary
+#*.fsproj merge=binary
+#*.lsproj merge=binary
+#*.wixproj merge=binary
+#*.modelproj merge=binary
+#*.sqlproj merge=binary
+#*.wwaproj merge=binary
+
+###############################################################################
+# behavior for image files
+#
+# image files are treated as binary by default.
+###############################################################################
+*.jpg binary
+*.png binary
+*.gif binary
+
+###############################################################################
+# diff behavior for common document formats
+#
+# Convert binary document formats to text before diffing them. This feature
+# is only available from the command line. Turn it on by uncommenting the
+# entries below.
+###############################################################################
+#*.doc diff=astextplain
+#*.DOC diff=astextplain
+#*.docx diff=astextplain
+#*.DOCX diff=astextplain
+#*.dot diff=astextplain
+#*.DOT diff=astextplain
+#*.pdf diff=astextplain
+#*.PDF diff=astextplain
+#*.rtf diff=astextplain
+#*.RTF diff=astextplain

.github/workflows/test_docker.yml

Lines changed: 8 additions & 0 deletions
@@ -229,3 +229,11 @@ jobs:
           token: ${{ secrets.CODECOV_TOKEN }} # required
           name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ inputs.ANSYS_VERSION }}_docker.xml
           flags: docker,${{ inputs.ANSYS_VERSION }},${{ matrix.os }},${{ matrix.python-version }}
+
+      - name: "Upload test analytics results to Codecov"
+        if: ${{ !cancelled() }}
+        uses: codecov/test-results-action@v1
+        with:
+          token: ${{ secrets.CODECOV_TOKEN }}
+          name: test_results_${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_${{ inputs.ANSYS_VERSION }}${{ inputs.test_any == 'true' && '_any' || '' }}
+          flags: ${{ inputs.ANSYS_VERSION }},${{ matrix.os }},${{ matrix.python-version }}${{ inputs.test_any == 'true' && ',any' || '' }}

.github/workflows/tests.yml

Lines changed: 8 additions & 0 deletions
@@ -309,3 +309,11 @@ jobs:
           file: ./.tox/.cov/coverage.xml
           name: ${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_pytest_${{ inputs.ANSYS_VERSION }}${{ inputs.test_any == 'true' && '_any' || '' }}.xml
           flags: ${{ inputs.ANSYS_VERSION }},${{ matrix.os }},${{ matrix.python-version }}${{ inputs.test_any == 'true' && ',any' || '' }}
+
+      - name: "Upload test analytics results to Codecov"
+        if: ${{ !cancelled() }}
+        uses: codecov/test-results-action@v1
+        with:
+          token: ${{ secrets.CODECOV_TOKEN }}
+          name: test_results_${{ env.PACKAGE_NAME }}_${{ matrix.python-version }}_${{ matrix.os }}_${{ inputs.ANSYS_VERSION }}${{ inputs.test_any == 'true' && '_any' || '' }}
+          flags: ${{ inputs.ANSYS_VERSION }},${{ matrix.os }},${{ matrix.python-version }}${{ inputs.test_any == 'true' && ',any' || '' }}

doc/source/_static/dpf_operators.html

Lines changed: 5 additions & 5 deletions (large diff not rendered)

doc/source/getting_started/dpf_server.rst

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ simulation workflow.
 The DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.
 
 It is also available as a standalone package that contains all the necessary files to run, enabling DPF capabilities.
-The standalone DPF Server is available on the `DPF Pre-Release page <https://download-archive.ansys.com/Others/DPF%20Pre-Release>`_ of the Ansys Customer Portal.
+The standalone DPF Server is available on the `DPF Pre-Release page <https://download.ansys.com/Others/DPF%20Pre-Release>`_ of the Ansys Customer Portal.
 The first standalone version of DPF Server available is 6.0 (2023 R2).
 
 The sections on this page describe how to install and use a standalone DPF Server.

src/ansys/dpf/core/operators/averaging/__init__.py

Lines changed: 1 addition & 0 deletions
@@ -14,6 +14,7 @@
 from .extend_to_mid_nodes import extend_to_mid_nodes
 from .extend_to_mid_nodes_fc import extend_to_mid_nodes_fc
 from .force_summation import force_summation
+from .force_summation_psd import force_summation_psd
 from .gauss_to_node_fc import gauss_to_node_fc
 from .nodal_difference import nodal_difference
 from .nodal_difference_fc import nodal_difference_fc

src/ansys/dpf/core/operators/averaging/force_summation.py

Lines changed: 44 additions & 2 deletions
@@ -30,7 +30,10 @@ class force_summation(Operator):
         Nodal Scoping. Set of nodes in which elemental contribution forces will be accumulated (default = all nodes)
     elemental_scoping: Scoping, optional
         Elemental Scoping. Set of elements contributing to the force calcuation. (default = all elements)
+    streams_container: StreamsContainer, optional
+        Streams container. Optional if using data sources.
     data_sources: DataSources
+        Data sources. Optional if using a streams container.
     force_type: int, optional
         Type of force to be processed (0 - default: Total forces (static, damping, and inertia)., 1: Static forces, 2: Damping forces, 3: Inertia forces)
     spoint: Field, optional
@@ -59,6 +62,8 @@ class force_summation(Operator):
     >>> op.inputs.nodal_scoping.connect(my_nodal_scoping)
     >>> my_elemental_scoping = dpf.Scoping()
     >>> op.inputs.elemental_scoping.connect(my_elemental_scoping)
+    >>> my_streams_container = dpf.StreamsContainer()
+    >>> op.inputs.streams_container.connect(my_streams_container)
     >>> my_data_sources = dpf.DataSources()
     >>> op.inputs.data_sources.connect(my_data_sources)
     >>> my_force_type = int()
@@ -71,6 +76,7 @@ class force_summation(Operator):
     ...     time_scoping=my_time_scoping,
     ...     nodal_scoping=my_nodal_scoping,
     ...     elemental_scoping=my_elemental_scoping,
+    ...     streams_container=my_streams_container,
     ...     data_sources=my_data_sources,
     ...     force_type=my_force_type,
     ...     spoint=my_spoint,
@@ -90,6 +96,7 @@ def __init__(
         time_scoping=None,
         nodal_scoping=None,
         elemental_scoping=None,
+        streams_container=None,
         data_sources=None,
         force_type=None,
         spoint=None,
@@ -105,6 +112,8 @@ def __init__(
             self.inputs.nodal_scoping.connect(nodal_scoping)
         if elemental_scoping is not None:
             self.inputs.elemental_scoping.connect(elemental_scoping)
+        if streams_container is not None:
+            self.inputs.streams_container.connect(streams_container)
         if data_sources is not None:
             self.inputs.data_sources.connect(data_sources)
         if force_type is not None:
@@ -124,7 +133,7 @@ def _spec() -> Specification:
             map_input_pin_spec={
                 0: PinSpecification(
                     name="time_scoping",
-                    type_names=["scoping"],
+                    type_names=["scoping", "vector<int32>"],
                     optional=True,
                     document=r"""default = all time steps""",
                 ),
@@ -140,11 +149,17 @@ def _spec() -> Specification:
                     optional=True,
                     document=r"""Elemental Scoping. Set of elements contributing to the force calcuation. (default = all elements)""",
                 ),
+                3: PinSpecification(
+                    name="streams_container",
+                    type_names=["streams_container"],
+                    optional=True,
+                    document=r"""Streams container. Optional if using data sources.""",
+                ),
                 4: PinSpecification(
                     name="data_sources",
                     type_names=["data_sources"],
                     optional=False,
-                    document=r"""""",
+                    document=r"""Data sources. Optional if using a streams container.""",
                 ),
                 5: PinSpecification(
                     name="force_type",
@@ -258,6 +273,8 @@ class InputsForceSummation(_Inputs):
     >>> op.inputs.nodal_scoping.connect(my_nodal_scoping)
     >>> my_elemental_scoping = dpf.Scoping()
     >>> op.inputs.elemental_scoping.connect(my_elemental_scoping)
+    >>> my_streams_container = dpf.StreamsContainer()
+    >>> op.inputs.streams_container.connect(my_streams_container)
     >>> my_data_sources = dpf.DataSources()
     >>> op.inputs.data_sources.connect(my_data_sources)
     >>> my_force_type = int()
@@ -274,6 +291,8 @@ def __init__(self, op: Operator):
         self._inputs.append(self._nodal_scoping)
         self._elemental_scoping = Input(force_summation._spec().input_pin(2), 2, op, -1)
         self._inputs.append(self._elemental_scoping)
+        self._streams_container = Input(force_summation._spec().input_pin(3), 3, op, -1)
+        self._inputs.append(self._streams_container)
         self._data_sources = Input(force_summation._spec().input_pin(4), 4, op, -1)
         self._inputs.append(self._data_sources)
         self._force_type = Input(force_summation._spec().input_pin(5), 5, op, -1)
@@ -344,10 +363,33 @@ def elemental_scoping(self) -> Input:
         """
         return self._elemental_scoping
 
+    @property
+    def streams_container(self) -> Input:
+        r"""Allows to connect streams_container input to the operator.
+
+        Streams container. Optional if using data sources.
+
+        Returns
+        -------
+        input:
+            An Input instance for this pin.
+
+        Examples
+        --------
+        >>> from ansys.dpf import core as dpf
+        >>> op = dpf.operators.averaging.force_summation()
+        >>> op.inputs.streams_container.connect(my_streams_container)
+        >>> # or
+        >>> op.inputs.streams_container(my_streams_container)
+        """
+        return self._streams_container
+
     @property
     def data_sources(self) -> Input:
         r"""Allows to connect data_sources input to the operator.
 
+        Data sources. Optional if using a streams container.
+
         Returns
         -------
         input:
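
Taken together, the force_summation changes add an optional streams_container input (pin 3) and document data_sources (pin 4) as "Optional if using a streams container", so either source of result data can be supplied. Below is a minimal sketch of that wiring, based only on the doctest lines added in this diff; the empty StreamsContainer and DataSources instances are placeholders, not a working setup, and no output retrieval is shown because the operator's outputs are unchanged here.

from ansys.dpf import core as dpf

# Instantiate the operator, as in the docstring examples added by this commit.
op = dpf.operators.averaging.force_summation()

# Option A: connect a streams container (the new optional pin 3).
my_streams_container = dpf.StreamsContainer()  # placeholder; normally obtained from a result file
op.inputs.streams_container.connect(my_streams_container)
# The shorthand call form shown in the docstring is equivalent:
# op.inputs.streams_container(my_streams_container)

# Option B: connect data sources instead (pin 4).
my_data_sources = dpf.DataSources()  # placeholder; normally points at a result file
op.inputs.data_sources.connect(my_data_sources)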
