Commit 05dba50 (1 parent: 50874be)

Patch CI for Public Release (#22)

* update CI for public
* fix doc build
* update doc build and api docs
* update dpf docs
* make documentation deployment conditional

File tree: 22 files changed (+197 −166 lines)

.ci/azure-pipelines.yml

Lines changed: 11 additions & 43 deletions

```diff
@@ -20,7 +20,6 @@ pr:
   exclude:
   - '*'
 
-
 jobs:
 - job: Windows
   variables:
@@ -130,7 +129,7 @@ jobs:
       docker pull $(DPF_IMAGE)
      docker run --restart always --name dpf -v `pwd`:/dpf -v /tmp:/dpf/_cache -p $(DPF_PORT):50054 $(DPF_IMAGE) > log.txt &
       grep -q 'server started on ip' <(timeout 60 tail -f log.txt)
-      python -c "from ansys.dpf import core; core.connect_to_server(port=$(DPF_PORT)); print('Python Connected')"
+      python -c "from ansys.dpf import core as dpf; dpf.connect_to_server(port=$(DPF_PORT)); print('Python Connected')"
     displayName: Pull, launch, and validate DPF service
 
   - script: |
@@ -140,44 +139,13 @@ jobs:
       pytest -v --junitxml=junit/test-results.xml --cov ansys.dpf.core --cov-report=xml --cov-report=html
     displayName: Test Core API
-
-
-  # - script: |
-  #     .ci/setup_headless_display.sh
-  #     pip install -r .ci/requirements_test_xvfb.txt
-  #     python .ci/display_test.py
-  #   displayName: Install and start a virtual framebuffer
-
-  # - script: |
-  #     set -ex
-  #     echo $(PAT) | docker login -u $(GH_USERNAME) --password-stdin docker.pkg.github.com
-  #     docker pull $(MAPDL_IMAGE)
-  #     docker run -e ANSYSLMD_LICENSE_FILE=1055@$(LICENSE_SERVER) --restart always --name mapdl -p $(PYMAPDL_PORT):50052 $(MAPDL_IMAGE) -smp &
-  #     python -c "from ansys.mapdl import launch_mapdl; print(launch_mapdl())"
-  #   displayName: Pull, launch, and validate MAPDL service
-
-  # - script: |
-  #     pip install -r requirements_test.txt
-  #     pip install pytest-azurepipelines
-  #     pytest -v --junitxml=junit/test-results.xml --cov --cov-report=xml --cov-report=html
-  #   displayName: 'Test Core API'
-
-  # - template: build_documentation.yml  # path is relative
-
-  # - script: |
-  #     bash <(curl -s https://codecov.io/bash)
-  #   displayName: 'Upload coverage to codecov.io'
-  #   condition: eq(variables['python.version'], '3.7')
-
-  # - script: |
-  #     pip install twine
-  #     python setup.py sdist
-  #     twine upload --skip-existing dist/pyvista*
-  #   displayName: 'Upload to PyPi'
-  #   condition: and(eq(variables['python.version'], '3.7'), contains(variables['Build.SourceBranch'], 'refs/tags/'))
-  #   env:
-  #     TWINE_USERNAME: $(twine.username)
-  #     TWINE_PASSWORD: $(twine.password)
-  #     TWINE_REPOSITORY_URL: "https://upload.pypi.org/legacy/"
-
+  - script: |
+      pip install twine
+      python setup.py sdist
+      twine upload --skip-existing dist/*
+    displayName: 'Upload to PyPi'
+    condition: contains(variables['Build.SourceBranch'], 'refs/tags/')
+    env:
+      TWINE_USERNAME: __token__
+      TWINE_PASSWORD: $(PYPI_TOKEN)
+      TWINE_REPOSITORY_URL: "https://upload.pypi.org/legacy/"
```
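The "Pull, launch, and validate" step above greps the container log and then makes a one-shot connection. A rough standalone sketch of the same health check, using plain sockets instead of the real `ansys.dpf.core.connect_to_server` call (`wait_for_port` is a hypothetical helper, not part of the repo):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Poll until a TCP server accepts connections, mimicking the
    pipeline's 'grep the log, then connect' validation."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # connection succeeds only once the service is listening
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

The pipeline's `timeout 60 tail -f log.txt` plays the same role as the 60-second deadline here.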

.github/workflows/ci-build.yml

Lines changed: 12 additions & 3 deletions

```diff
@@ -3,8 +3,8 @@ name: Documentation Build
 
 on: [push, pull_request, workflow_dispatch]
 
+
 jobs:
-  # This workflow contains a single job called "build"
   build:
     runs-on: ubuntu-20.04
@@ -50,17 +50,26 @@ jobs:
       env:
         GH_USERNAME: ${{ secrets.GH_USERNAME }}
         PAT: ${{ secrets.REPO_DOWNLOAD_PAT }}
-
-
+
     - name: Build Documentation
       run: |
         sudo apt install pandoc -qy
         pip install -r requirements_docs.txt
         make -C docs html
+        touch docs/build/html/.nojekyll
 
     - name: Upload Documentation
       uses: actions/upload-artifact@…
       with:
         name: Documentation
         path: docs/build/html
         retention-days: 7
+
+    - name: Deploy
+      uses: JamesIves/github-pages-deploy-action@…
+      if: startsWith(github.ref, 'refs/tags/')
+      with:
+        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        BRANCH: gh-pages
+        FOLDER: docs/build/html
+        CLEAN: true
```

(The action version numbers were mangled by the page scrape and are marked `…` rather than guessed.)
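Both CI configs now gate publishing on tag refs: Azure's `contains(variables['Build.SourceBranch'], 'refs/tags/')` and this workflow's `startsWith(github.ref, 'refs/tags/')`. The shared intent, as a tiny sketch:

```python
def should_deploy(ref: str) -> bool:
    """Mirror the workflow's startsWith(github.ref, 'refs/tags/') guard:
    deploy docs and packages only for tag builds."""
    return ref.startswith("refs/tags/")
```

Branch pushes and pull requests still build and upload the documentation artifact; only tagged releases deploy to `gh-pages` or PyPI.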

README.md

Lines changed: 63 additions & 41 deletions

````diff
@@ -1,44 +1,89 @@
-# DPF - ANSYS Data Processing Framework
+# DPF - Ansys Data Processing Framework
+
+The Data Processing Framework (DPF) is designed to provide numerical
+simulation users/engineers with a toolbox for accessing and
+transforming simulation data. DPF can access data from solver result
+files as well as several neutral formats (csv, hdf5, vtk,
+etc.). Various operators are available allowing the manipulation and
+the transformation of this data.
+
+DPF is a workflow-based framework which allows simple and/or complex
+evaluations by chaining operators. The data in DPF is defined based on
+physics agnostic mathematical quantities described in a
+self-sufficient entity called field. This allows DPF to be a modular
+and easy to use tool with a large range of capabilities. It's a
+product designed to handle large amount of data.
+
+The Python ``ansys.dpf.core`` module provides a Python interface to
+the powerful DPF framework enabling rapid post-processing of a variety
+of Ansys file formats and physics solutions without ever leaving a
+Python environment.
 
 
 ## Installation
 
-Clone and install this repository with:
+Install this repository with:
 
 ```
-git clone https://github.com/pyansys/DPF-Core
-cd DPF-Core
-pip install . --user
+pip install ansys-dpf-core
 ```
 
-Install any missing libraries from Artifactory with:
+You can also clone and install this repository with:
 
 ```
-pip install --extra-index-url=http://canartifactory.ansys.com:8080/artifactory/api/pypi/pypi/simple --trusted-host canartifactory.ansys.com ansys-grpc-dpf
+git clone https://github.com/pyansys/DPF-Core
+cd DPF-Core
+pip install . --user
 ```
 
-This step will be eliminated once DPF is live on PyPi.
-
 
 ## Running DPF
 
+### Brief Demo
 Provided you have ANSYS 2021R1 installed, a DPF server will start
-automatically once you start using DPF:
+automatically once you start using DPF.
 
+Opening a result file generated from Ansys workbench or MAPDL is as easy as:
 
-```py
-from ansys.dpf import core
+```
+>>> from ansys.dpf.core import Model
+>>> model = Model('file.rst')
+>>> print(model)
+DPF Model
+------------------------------
+Static analysis
+Unit system: Metric (m, kg, N, s, V, A)
+Physics Type: Mecanic
+Available results:
+ - displacement
+ - element_nodal_forces
+ - volume
+ - energy_stiffness_matrix
+ - hourglass_energy
+ - thermal_dissipation_energy
+ - kinetic_energy
+ - co_energy
+ - incremental_energy
+ - temperature
+```
+
+Open up an result with:
 
-norm = core.Operator('norm_fc')
+```py
+>>> model.displacement
+```
 
-# or open up a model
-model = core.Model('file.rst')
+Then start linking operators with:
 
+```py
+>>> norm = core.Operator('norm_fc')
 ```
 
-The `ansys.dpf.core` module takes care of starting your local server
-for you so you don't have to. If you need to connect to a remote DPF
-instance, use the ``connect_to_server`` function:
+### Starting the Service
+
+The `ansys.dpf.core` automatically starts the DPF service in the
+background and connects to it. If you need to connect to an existing
+remote DPF instance, use the ``connect_to_server`` function:
 
 ```py
 from ansys.dpf import core
@@ -48,26 +48,3 @@ connect_to_server('10.0.0.22, 50054)
 Once connected, this connection will remain for the duration of the
 module until you exit python or connect to a different server.
 
-
-## Unit Testing
-
-Unit tests can be run by first installing the testing requirements with `pip install -r requirements_test.txt` and then running pytest with:
-
-```
-pytest
-```
-
-If you have ANSYS v2021R1 installed locally, the unit tests will
-automatically start up the DPF server and run the tests. If you need
-to disable this and have the unit tests run against a remote server,
-setup the following environment variables:
-
-```
-set DPF_START_SERVER=False
-set DPF_IP=<IP of Remote Computer>
-set DPF_PORT=<Port of Remote DPF Server>
-```
-
-
-## Examples
-See the example scripts in the examples folder for some basic examples.
````
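The removed Unit Testing section documents the `DPF_START_SERVER`, `DPF_IP`, and `DPF_PORT` environment variables. A minimal sketch of how a test suite might resolve them — `server_config` is a hypothetical helper, not the real `ansys.dpf.core` startup code, and the defaults (localhost, port 50054 as seen in the CI config) are assumptions:

```python
import os

def server_config():
    """Resolve DPF test-server settings from the environment:
    DPF_START_SERVER, DPF_IP, DPF_PORT (hypothetical helper)."""
    # any value other than a case-insensitive "false" starts a local server
    start = os.environ.get("DPF_START_SERVER", "True").lower() != "false"
    ip = os.environ.get("DPF_IP", "127.0.0.1")
    port = int(os.environ.get("DPF_PORT", "50054"))
    return start, ip, port
```

With `DPF_START_SERVER=False` set, the tests would connect to the remote address instead of launching a local server.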

ansys/dpf/core/data_sources.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -9,8 +9,8 @@ class DataSources:
     """Represent the file sources of a model.
 
     Initialize the data_sources with either optional data_sources
-    message, or by connecting to a stub A Result path can be
-    directly set
+    message, or by connecting to a stub. Result path can also be
+    directly set.
 
     Parameters
     ----------
```
ansys/dpf/core/operators_helper.py

Lines changed: 5 additions & 1 deletion

```diff
@@ -1,4 +1,8 @@
-"""Wrappers for operators"""
+"""Wrappers for DPF operators.
+
+These operators are available as functions from ``dpf.operators`` and
+simplify the creation of new chained operators.
+"""
 from ansys import dpf
 from ansys.dpf.core.common import types as dpf_types
 
```
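The new docstring's "chained operators" idea, reduced to a toy sketch with hypothetical names (this is not the DPF `Operator` API): each operator wraps a function and pulls data from its upstream source when evaluated.

```python
class Op:
    """Toy stand-in for a chained operator: holds a function and an
    upstream source, purely to illustrate operator chaining."""
    def __init__(self, func, source=None):
        self.func = func
        self.source = source

    def connect(self, source):
        """Attach an upstream operator (or raw data) and return self."""
        self.source = source
        return self

    def eval(self):
        """Evaluate upstream first, then apply this operator's function."""
        data = self.source.eval() if isinstance(self.source, Op) else self.source
        return self.func(data)

# chain: square each value in the input, then take the maximum
square = Op(lambda xs: [x * x for x in xs], source=[1, -3, 2])
maximum = Op(max).connect(square)
```

In real DPF the wrappers in this module build such chains for you, so users call a single function instead of wiring operators by hand.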
ansys/dpf/core/plotter.py

Lines changed: 5 additions & 2 deletions

```diff
@@ -1,6 +1,7 @@
 """Dpf plotter class is contained in this module.
-Allows to plot a mesh and a fields container
-using pyvista."""
+
+Allows to plot a mesh and a fields container using pyvista.
+"""
 import tempfile
 
 import pyvista as pv
@@ -16,6 +17,8 @@
 
 
 class Plotter:
+    """Internal class used by DPF-Core to plot fields and meshed regions"""
+
     def __init__(self, mesh):
         self._mesh = mesh
```
docker/env.sh

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+# Setup the testing environment using docker
+# run with:
+# source env.sh
+export DPF_START_SERVER=FALSE
+export DPF_DOCKER=True
```
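Note the script exports `FALSE` and `True` with different casing, so any consumer of these flags should compare case-insensitively. A minimal sketch of such a parser (`env_flag` is a hypothetical helper, not part of the repo):

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Parse boolean env vars such as DPF_START_SERVER / DPF_DOCKER,
    accepting any casing of the truthy spellings."""
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes")
```

With `source env.sh` applied, `env_flag("DPF_START_SERVER")` is false and `env_flag("DPF_DOCKER")` is true, regardless of casing.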

docs/source/api/data_sources.rst

Lines changed: 6 additions & 0 deletions

```diff
@@ -0,0 +1,6 @@
+******************
+Data Sources Class
+******************
+.. autoclass:: ansys.dpf.core.data_sources.DataSources
+    :members:
+    :private-members:
```

docs/source/api/element.rst

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+*************
+Element Class
+*************
+.. autoclass:: ansys.dpf.core.meshed_region.Element
+    :members:
```

docs/source/api/elements.rst

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+**************
+Elements Class
+**************
+.. autoclass:: ansys.dpf.core.meshed_region.Elements
+    :members:
```
