Commit 393435e (parent 6a0374a)

Change the scans to be pyansys scans (#375)

Change the scans to use pyansys and public actions instead of the ansys-internal scans.

26 files changed: +278 −134 lines

.github/workflows/nightly.yml (30 additions, 2 deletions)

@@ -63,7 +63,7 @@ jobs:
         with:
           dpf-standalone-TOKEN: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
           standalone_suffix: ${{ env.DPF_STANDALONE_SUFFIX }}
-          ANSYS_VERSION : ${{ env.ANSYS_VERSION }}
+          ANSYS_VERSION: ${{ env.ANSYS_VERSION }}

       - name: Run pytest
         run: make test
@@ -94,10 +94,38 @@ jobs:
       TWINE_PASSWORD: ${{ secrets.PYANSYS_PYPI_PRIVATE_PAT }}
       TWINE_REPOSITORY_URL: ${{ secrets.PRIVATE_PYPI_URL }}

+  vulnerabilities:
+    name: Vulnerabilities
+    needs: [ nightly_test ]
+    runs-on: ubuntu-latest
+
+    steps:
+
+      - name: PyAnsys Vulnerability check (on main)
+        if: github.ref == 'refs/heads/main'
+        uses: ansys/actions/[email protected]
+        with:
+          python-version: ${{ env.MAIN_PYTHON_VERSION }}
+          python-package-name: ${{ env.PACKAGE_NAME }}
+          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
+          run-bandit: false
+          hide-log: false
+
+      - name: PyAnsys Vulnerability check (on dev)
+        if: github.ref != 'refs/heads/main'
+        uses: ansys/actions/[email protected]
+        with:
+          python-version: ${{ env.MAIN_PYTHON_VERSION }}
+          python-package-name: ${{ env.PACKAGE_NAME }}
+          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
+          run-bandit: false
+          dev-mode: true
+          hide-log: false
+
   ci-failure:
     name: Teams notify on failure
     if: failure()
-    needs: [ nightly_and_upload ]
+    needs: [ nightly_and_upload, vulnerabilities ]
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v5

.github/workflows/nightly_scan.yml (0 additions, 18 deletions)

This file was deleted.

.github/workflows/scan_sbom.yml (79 additions, 0 deletions; new file)

name: Security Scan

env:
  MAIN_PYTHON_VERSION: '3.13'
  PACKAGE_NAME: 'ansys-dynamicreporting-core'

on:
  push:
    branches:
      - main
      - maint/*
      - release/*

jobs:

  sbom:
    name: Generate SBOM
    runs-on: ubuntu-latest

    steps:

      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}

      - name: Build wheelhouse
        uses: ansys/actions/build-wheelhouse@v10
        with:
          library-name: ${{ env.PACKAGE_NAME }}
          operating-system: ubuntu-latest
          python-version: ${{ env.MAIN_PYTHON_VERSION }}

      - name: Install from wheelhouse
        run: python -m pip install --no-index --find-links=wheelhouse ${{ env.PACKAGE_NAME }}

      - name: Generate SBOM with Syft
        uses: anchore/[email protected]
        with:
          format: cyclonedx-json
          output-file: sbom.cyclonedx.json
          upload-artifact: false

      - name: Upload SBOM as artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ env.PACKAGE_NAME }}-sbom
          path: sbom.cyclonedx.json

  vulnerabilities:
    name: Vulnerabilities
    runs-on: ubuntu-latest

    steps:

      - name: PyAnsys Vulnerability check (on main)
        if: github.ref == 'refs/heads/main'
        uses: ansys/actions/[email protected]
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
          python-package-name: ${{ env.PACKAGE_NAME }}
          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
          run-bandit: false
          hide-log: false

      - name: PyAnsys Vulnerability check (on dev)
        if: github.ref != 'refs/heads/main'
        uses: ansys/actions/[email protected]
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
          python-package-name: ${{ env.PACKAGE_NAME }}
          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
          run-bandit: false
          dev-mode: true
          hide-log: false

codegen/adr_utils.txt (1 addition, 1 deletion)

@@ -41,7 +41,7 @@ def in_ipynb():
             return True
         if "terminal" in ipy_str:
             return False
-    except Exception:  # todo: please specify the possible exceptions here.
+    except Exception as e:  # todo: please specify the possible exceptions here.
        return False

src/ansys/dynamicreporting/core/adr_service.py (11 additions, 10 deletions)

@@ -241,7 +241,7 @@ def connect(
         username: str = "nexus",
         password: str = "cei",
         session: str | None = "",
-    ) -> None:
+    ) -> None:  # nosec B107
         """
         Connect to a running service.

@@ -282,8 +282,8 @@ def connect(
             )
         try:
             self.serverobj.validate()
-        except Exception:
-            self.logger.error("Can not validate dynamic reporting server.\n")
+        except Exception as e:
+            self.logger.error(f"Can not validate dynamic reporting server.\nError: {str(e)}")
             raise NotValidServer
         # set url after connection succeeds
         self._url = url
@@ -301,7 +301,7 @@ def start(
         error_if_create_db_exists: bool = False,
         exit_on_close: bool = False,
         delete_db: bool = False,
-    ) -> str:
+    ) -> str:  # nosec B107
         """
         Start a new service.

@@ -392,11 +392,11 @@ def start(
         if self._docker_launcher:
             try:
                 create_output = self._docker_launcher.create_nexus_db()
-            except Exception:  # pragma: no cover
+            except Exception as e:  # pragma: no cover
                 self._docker_launcher.cleanup()
                 self.logger.error(
                     f"Error creating the database at the path {self._db_directory} in the "
-                    "Docker container.\n"
+                    f"Docker container.\nError: {str(e)}"
                 )
                 raise CannotCreateDatabaseError
         for f in ["db.sqlite3", "view_report.nexdb"]:
@@ -511,10 +511,11 @@ def stop(self) -> None:
         v = False
         try:
             v = self.serverobj.validate()
-        except Exception:
+        except Exception as e:
+            self.logger.error(f"Error: {str(e)}")
             pass
         if v is False:
-            self.logger.error("Error validating the connected service. Can't shut it down.\n")
+            self.logger.error("Error validating the connected service. Can't shut it down.")
         else:
             # If coming from a docker image, clean that up
             try:
@@ -814,7 +815,7 @@ def delete(self, items: list) -> None:
         try:
             _ = self.serverobj.del_objects(items_to_delete)
         except Exception as e:
-            self.logger.warning(f"Error in deleting items: {e}")
+            self.logger.warning(f"Error in deleting items: {str(e)}")

     def get_report(self, report_name: str) -> Report:
         """
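The recurring change in adr_service.py is to bind the caught exception (`except Exception as e`) and include its text in the log before re-raising. A minimal sketch of that pattern, with illustrative names (`NotValidServerError` and `validate_server` are stand-ins, not the project's API):

```python
import logging

logger = logging.getLogger("adr_example")


class NotValidServerError(RuntimeError):
    """Hypothetical stand-in for the project's NotValidServer exception."""


def validate_server(serverobj):
    # Bind the exception so str(e) can be logged before re-raising,
    # mirroring the change made to Service.connect() in this commit.
    try:
        return serverobj.validate()
    except Exception as e:
        logger.error(f"Can not validate dynamic reporting server.\nError: {str(e)}")
        raise NotValidServerError from e
```

The `from e` chaining preserves the original traceback so the root cause is visible alongside the logged message.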

src/ansys/dynamicreporting/core/docker_support.py (6 additions, 6 deletions)

@@ -64,8 +64,8 @@ def __init__(self, image_url: str | None = None, use_dev: bool = False) -> None:
         # Load up Docker from the user's environment
         try:
             self._client: docker.client.DockerClient = docker.from_env()
-        except Exception:  # pragma: no cover
-            raise RuntimeError("Can't initialize Docker")
+        except Exception as e:  # pragma: no cover
+            raise RuntimeError(f"Can't initialize Docker: {str(e)}")
         self._container: docker.models.containers.Container = None
         self._image: docker.models.images.Image = None
         # the Ansys / EnSight version we found in the container
@@ -92,8 +92,8 @@ def pull_image(self) -> docker.models.images.Image:
         """
         try:
             self._image = self._client.images.pull(self._image_url)
-        except Exception:
-            raise RuntimeError(f"Can't pull Docker image: {self._image_url}")
+        except Exception as e:
+            raise RuntimeError(f"Can't pull Docker image: {self._image_url}\n\n{str(e)}")
         return self._image

     def create_container(self) -> docker.models.containers.Container:
@@ -119,7 +119,7 @@ def copy_to_host(self, src: str, *, dest: str = ".") -> None:
                     tar_file.write(chunk)
             # Extract the tar archive
             with tarfile.open(tar_file_path) as tar:
-                tar.extractall(path=output_path)
+                tar.extractall(path=output_path)  # nosec B202
             # Remove the tar archive
             tar_file_path.unlink()
         except Exception as e:
@@ -176,7 +176,7 @@ def start(self, host_directory: str, db_directory: str, port: int, ansys_version
         existing_names = [x.name for x in self._client.from_env().containers.list()]
         container_name = "nexus"
         while container_name in existing_names:
-            container_name += random.choice(string.ascii_letters)
+            container_name += random.choice(string.ascii_letters)  # nosec B311
         if len(container_name) > 500:
             raise RuntimeError("Can't determine a unique Docker container name.")
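The `# nosec B311` marker in the last hunk tells bandit that `random` (rather than the `secrets` module) is acceptable here: the container name only needs to be unique, not unpredictable. A self-contained sketch of that naming loop (`unique_name` is an illustrative helper, not part of docker_support.py):

```python
import random
import string


def unique_name(existing, base="nexus"):
    # Append random letters until the name is not taken, as in
    # docker_support.start(). random.choice is fine here because the
    # name is not security sensitive -- hence the "# nosec B311" marker.
    name = base
    while name in existing:
        name += random.choice(string.ascii_letters)  # nosec B311
        if len(name) > 500:
            raise RuntimeError("Can't determine a unique Docker container name.")
    return name
```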

src/ansys/dynamicreporting/core/examples/downloads.py (4 additions, 3 deletions)

@@ -42,9 +42,10 @@ def check_url_exists(url: str) -> bool:
         logging.debug(f"Passed url is invalid: {url}\n")
         return False
     try:
-        with request.urlopen(url) as response:
+        with request.urlopen(url) as response:  # nosec B310
             return response.status == 200
-    except Exception:
+    except Exception as e:
+        logging.debug(f"Check url error: {str(e)}\n")
        return False

@@ -61,7 +62,7 @@ def get_url_content(url: str) -> str:
     str
         content of the URL
     """
-    with request.urlopen(url) as response:
+    with request.urlopen(url) as response:  # nosec B310
         return response.read()
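Bandit's B310 warns that `urlopen` accepts `file://` and custom schemes; the `# nosec B310` markers assert the URL has already been screened. A hedged sketch of the full check (the scheme guard is inferred from the "Passed url is invalid" context above; the exact validation in downloads.py may differ):

```python
import logging
from urllib import request


def check_url_exists_sketch(url: str) -> bool:
    # Reject non-http(s) schemes up front; this is what makes the
    # "# nosec B310" suppression on urlopen defensible.
    if not url.startswith(("http://", "https://")):
        logging.debug(f"Passed url is invalid: {url}\n")
        return False
    try:
        with request.urlopen(url, timeout=5) as response:  # nosec B310
            return response.status == 200
    except Exception as e:
        logging.debug(f"Check url error: {str(e)}\n")
        return False
```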

src/ansys/dynamicreporting/core/serverless/item.py (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 from html.parser import HTMLParser as BaseHTMLParser
 import io
 from pathlib import Path
-import pickle
+import pickle  # nosec B403
 import platform
 import uuid

src/ansys/dynamicreporting/core/utils/encoders.py (4 additions, 1 deletion)

@@ -34,7 +34,10 @@ def default(self, obj):
         cls = list if isinstance(obj, (list, tuple)) else dict
         try:
             return cls(obj)
-        except Exception:
+        except Exception as e:  # nosec
+            error_str = f"Object of type {type(obj).__name__} is not JSON serializable: "
+            error_str += str(e)
+            raise TypeError(error_str)
             pass
     elif hasattr(obj, "__iter__"):
         return tuple(item for item in obj)
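The encoders.py change turns a silent failure into a `TypeError` that carries the original error text. A simplified, self-contained re-creation of the pattern (`LenientEncoder` is an illustrative name; the surrounding branches of the real `default()` are omitted):

```python
import json


class LenientEncoder(json.JSONEncoder):
    # Coerce unknown objects to list/dict; on failure, raise a TypeError
    # that includes the underlying exception instead of swallowing it.
    def default(self, obj):
        cls = list if isinstance(obj, (list, tuple)) else dict
        try:
            return cls(obj)
        except Exception as e:  # nosec
            error_str = f"Object of type {type(obj).__name__} is not JSON serializable: "
            error_str += str(e)
            raise TypeError(error_str)
```

For example, a `zip` object coerces cleanly to a dict, while a complex number now fails with a message naming its type.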

src/ansys/dynamicreporting/core/utils/extremely_ugly_hacks.py (6 additions, 6 deletions)

@@ -1,6 +1,6 @@
 # All Python3 migration-related ugly hacks go here.
 import base64
-import pickle
+import pickle  # nosec B403
 from uuid import UUID

 from .report_utils import text_type
@@ -54,18 +54,18 @@ def safe_unpickle(input_data, item_type=None):
     try:
         # by default, we follow python3's way of loading: default encoding is ascii
         # this will work if the data was dumped using python3's pickle. Just do the usual.
-        data = pickle.loads(bytes_data)
-    except Exception:
+        data = pickle.loads(bytes_data)  # nosec B301 B502
+    except Exception:  # nosec
         try:
-            data = pickle.loads(bytes_data, encoding="utf-8")
+            data = pickle.loads(bytes_data, encoding="utf-8")  # nosec B301 B502
         except Exception:
             # if it fails, which it will if the data was dumped using python2's pickle, then:
             # As per https://docs.python.org/3/library/pickle.html#pickle.loads,
             # "Using encoding='latin1' is required for unpickling NumPy arrays and instances of datetime,
             # date and time pickled by Python 2."
             # The data does contain a numpy array. So:
             try:
-                data = pickle.loads(bytes_data, encoding="latin-1")
+                data = pickle.loads(bytes_data, encoding="latin-1")  # nosec B301 B502

                 # if the stream contains international characters which were 'loaded' with latin-1,
                 # we get garbage text. We have to detect that and then use a workaround.
@@ -80,7 +80,7 @@ def safe_unpickle(input_data, item_type=None):
         # this is a tree item ONLY case that has a pickled datetime obj,
         # we use bytes as the encoding to workaround this issue, because
         # other encodings will not work.
-        data = pickle.loads(bytes_data, encoding="bytes")
+        data = pickle.loads(bytes_data, encoding="bytes")  # nosec B301 B502
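The `# nosec B301 B502` markers suppress bandit's pickle warnings on each fallback in `safe_unpickle`. A condensed sketch of that fallback chain (`fallback_loads` is an illustrative name; the real function also handles base64 decoding and tree-item special cases):

```python
import pickle  # nosec B403 - illustration only; never unpickle untrusted data


def fallback_loads(bytes_data):
    # Try Python 3's default (ASCII) first, then utf-8, then latin-1,
    # which the pickle docs require for NumPy arrays and datetimes
    # pickled by Python 2; finally fall back to encoding="bytes".
    for kwargs in ({}, {"encoding": "utf-8"}, {"encoding": "latin-1"}):
        try:
            return pickle.loads(bytes_data, **kwargs)  # nosec B301 B502
        except Exception:  # nosec
            continue
    return pickle.loads(bytes_data, encoding="bytes")  # nosec B301 B502
```

Data written by Python 3's pickle succeeds on the first attempt; only legacy Python 2 streams fall through to the alternate encodings.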
