Commit 2238372

Merge master into enable_amd_build
2 parents b27a8a1 + 2966ae6 commit 2238372

106 files changed: 4376 additions, 3438 deletions

Lines changed: 0 additions & 3 deletions

```diff
@@ -1,8 +1,5 @@
 # array API tests to be skipped

-# hypothesis found failures
-array_api_tests/test_operators_and_elementwise_functions.py::test_clip
-
 # unexpected result is returned - unmute when dpctl-1986 is resolved
 array_api_tests/test_operators_and_elementwise_functions.py::test_asin
 array_api_tests/test_operators_and_elementwise_functions.py::test_asinh
```

.github/workflows/build-sphinx.yml

Lines changed: 20 additions & 4 deletions

```diff
@@ -42,6 +42,7 @@ jobs:
       oneapi-pkgs-env: ''
       # Enable env when it's required to use only conda packages without OneAPI installation
       # oneapi-pkgs-env: '${{ github.workspace }}/environments/oneapi_pkgs.yml'
+      dpctl-pkg-txt: 'environments/dpctl_pkg.txt'

     steps:
       - name: Cancel Previous Runs
@@ -135,11 +136,26 @@ jobs:
           environment-file: ${{ env.environment-file }}
           activate-environment: 'docs'

-      - name: Conda info
-        run: mamba info
+      # We can't install dpctl as a conda package when the environment is created through
+      # installing of Intel OneAPI packages because the dpctl conda package has a runtime
+      # dependency on DPC++ RT one. Whereas the DPC++ RT package has been already installed
+      # by the apt command above and its version has been matched with the DPC++ compiler.
+      # In case where we install the DPC++ compiler with the apt (including DPC++ RT) and
+      # install the DPC++ RT conda package while resolving dependencies, this can lead
+      # to a versioning error, i.e. compatibility issue as the DPC++ compiler only guarantees
+      # backwards compatibility, not forward compatibility (DPC++ RT may not run a binary built
+      # with a newer version of the DPC++ compiler).
+      # Installing dpctl via the pip manager has no such limitation, as the package has no
+      # run dependency on the DPC++ RT pip package, so this is why the step is necessary here.
+      - name: Install dpctl
+        if: env.oneapi-pkgs-env == ''
+        run: |
+          pip install -r ${{ env.dpctl-pkg-txt }}

-      - name: Conda list
-        run: mamba list
+      - name: Conda info
+        run: |
+          mamba info
+          mamba list

       - name: Build library
         run: |
```
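The workflow comment above describes a one-way compatibility rule. The sketch below is purely illustrative (the version tuples and the `rt_can_run` helper are made up for this note; they are not part of dpctl or the workflow) and shows why pairing an apt-installed compiler with a conda-resolved runtime can fail:

```python
# Hypothetical illustration of the rule the workflow comment describes:
# a DPC++ runtime (RT) is only guaranteed to run binaries built with the
# same or an OLDER compiler (backward compatibility), never a newer one.

def rt_can_run(rt_version: tuple, compiler_version: tuple) -> bool:
    """True if a binary built by `compiler_version` is expected to run on `rt_version`."""
    return rt_version >= compiler_version

# apt installs the compiler together with a matching RT, e.g. 2025.1:
assert rt_can_run((2025, 1), (2025, 1))      # matching versions: fine
assert rt_can_run((2025, 1), (2025, 0))      # binary from an older compiler: fine
# the failure mode: conda resolves an older RT next to the newer apt compiler,
# and forward compatibility is not guaranteed
assert not rt_can_run((2025, 0), (2025, 1))
```

Installing dpctl with pip sidesteps the issue entirely because the pip package declares no runtime dependency that could drag in a second, mismatched DPC++ RT.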

.github/workflows/check-mkl-interfaces.yaml

Lines changed: 7 additions & 5 deletions

```diff
@@ -13,6 +13,7 @@ env:
   environment-file-name: 'environment.yml'
   environment-file-loc: '${{ github.workspace }}/environments'
   build-with-oneapi-env: 'environments/build_with_oneapi.yml'
+  dpctl-pkg-env: 'environments/dpctl_pkg.yml'
   oneapi-pkgs-env: 'environments/oneapi_pkgs.yml'
   test-env-name: 'test_onemkl_interfaces'
   rerun-tests-on-failure: 'true'
@@ -47,10 +48,11 @@ jobs:

      - name: Merge conda env files
        run: |
-         conda-merge ${{ env.build-with-oneapi-env }} ${{ env.oneapi-pkgs-env }} > ${{ env.environment-file }}
+         conda-merge ${{ env.dpctl-pkg-env }} ${{ env.oneapi-pkgs-env }} ${{ env.build-with-oneapi-env }} > ${{ env.environment-file }}
+         cat ${{ env.environment-file }}

      - name: Upload artifact
-       uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4.6.1
+       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: ${{ env.environment-file-name }}
          path: ${{ env.environment-file }}
@@ -81,7 +83,7 @@ jobs:
          fetch-depth: 0

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.environment-file-name }}
          path: ${{ env.environment-file-loc }}
@@ -174,7 +176,7 @@ jobs:
          fetch-depth: 0

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.environment-file-name }}
          path: ${{ env.environment-file-loc }}
@@ -227,7 +229,7 @@ jobs:
          python -c "import dpnp; print(dpnp.__version__)"

      - name: Run tests
-       if: env.rerun-tests-on-failure == 'true'
+       if: env.rerun-tests-on-failure != 'true'
        run: |
          python -m pytest -ra --pyargs dpnp.tests
        env:
```
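For context, `conda-merge` combines several conda environment files into one, and the argument order matters: entries from earlier files come first in the merged result, which is why `dpctl-pkg-env` is now passed first. A conceptual sketch of the merge (the file contents and the `merge_envs` helper are hypothetical and heavily simplified; the real tool parses YAML and handles more fields and conflict rules):

```python
def merge_envs(*envs: dict) -> dict:
    """Naive merge of conda environment specs: concatenate lists, drop duplicates."""
    merged = {"channels": [], "dependencies": []}
    for env in envs:
        for key in merged:
            for item in env.get(key, []):
                if item not in merged[key]:
                    merged[key].append(item)
    return merged

# hypothetical contents of dpctl_pkg.yml and build_with_oneapi.yml:
dpctl_pkg = {"channels": ["dppy/label/dev"], "dependencies": ["dpctl"]}
build_env = {"channels": ["conda-forge"], "dependencies": ["cmake", "cython"]}

merged = merge_envs(dpctl_pkg, build_env)
# dpctl ends up first because its file is listed first on the command line
assert merged["dependencies"] == ["dpctl", "cmake", "cython"]
```

The added `cat ${{ env.environment-file }}` simply echoes the merged file into the job log so the resolved environment can be inspected.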

.github/workflows/conda-package.yml

Lines changed: 13 additions & 9 deletions

```diff
@@ -82,6 +82,9 @@ jobs:
          environment-file: ${{ env.build-conda-pkg-env }}
          activate-environment: ${{ env.build-env-name }}

+     - name: List installed packages
+       run: mamba list
+
      - name: Store conda paths as envs
        shell: bash -el {0}
        run: |
@@ -93,22 +96,22 @@ jobs:
        continue-on-error: true
        run: conda build --no-test --python ${{ matrix.python }} --numpy 2.0 ${{ env.channels-list }} conda-recipe
        env:
-         MAX_BUILD_CMPL_MKL_VERSION: '2025.1a0'
+         MAX_BUILD_CMPL_MKL_VERSION: '2025.2a0'

      - name: ReBuild conda package
        if: steps.build_conda_pkg.outcome == 'failure'
        run: conda build --no-test --python ${{ matrix.python }} --numpy 2.0 ${{ env.channels-list }} conda-recipe
        env:
-         MAX_BUILD_CMPL_MKL_VERSION: '2025.1a0'
+         MAX_BUILD_CMPL_MKL_VERSION: '2025.2a0'

      - name: Upload artifact
-       uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4.6.1
+       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Python ${{ matrix.python }}
          path: ${{ env.CONDA_BLD }}${{ env.package-name }}-*.conda

      - name: Upload wheels artifact
-       uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4.6.1
+       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Wheels Python ${{ matrix.python }}
          path: ${{ env.WHEELS_OUTPUT_FOLDER }}${{ env.package-name }}-*.whl
@@ -146,7 +149,7 @@ jobs:
          path: ${{ env.dpnp-repo-path }}

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Python ${{ matrix.python }}
          path: ${{ env.pkg-path-in-channel }}
@@ -278,7 +281,7 @@ jobs:
          path: ${{ env.dpnp-repo-path }}

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Python ${{ matrix.python }}
          path: ${{ env.pkg-path-in-channel }}
@@ -442,12 +445,12 @@ jobs:
          fetch-depth: ${{ env.fetch-depth }}

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Python ${{ matrix.python }}

      - name: Download wheels artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Wheels Python ${{ matrix.python }}

@@ -530,7 +533,7 @@ jobs:
          path: ${{ env.dpnp-repo-path }}

      - name: Download artifact
-       uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4.1.9
+       uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # v4.2.1
        with:
          name: ${{ env.package-name }} ${{ runner.os }} Python ${{ env.python-ver }}
          path: ${{ env.pkg-path-in-channel }}
@@ -625,6 +628,7 @@ jobs:
          python -m pytest --json-report --json-report-file=${{ env.json-report-file }} --disable-deadline --skips-file ${{ env.array-api-skips-file }} array_api_tests || true
        env:
          ARRAY_API_TESTS_MODULE: 'dpnp'
+         ARRAY_API_TESTS_VERSION: '2024.12'
          SYCL_CACHE_PERSISTENT: 1
        working-directory: ${{ env.array-api-tests-path }}
```

.github/workflows/generate_coverage.yaml

Lines changed: 17 additions & 0 deletions

```diff
@@ -26,6 +26,7 @@ jobs:
       oneapi-pkgs-env: ''
       # Enable env when it's required to use only conda packages without OneAPI installation
       # oneapi-pkgs-env: '${{ github.workspace }}/environments/oneapi_pkgs.yml'
+      dpctl-pkg-txt: 'environments/dpctl_pkg.txt'

     steps:
       - name: Cancel Previous Runs
@@ -94,6 +95,22 @@ jobs:
           environment-file: ${{ env.environment-file }}
           activate-environment: 'coverage'

+      # We can't install dpctl as a conda package when the environment is created through
+      # installing of Intel OneAPI packages because the dpctl conda package has a runtime
+      # dependency on DPC++ RT one. Whereas the DPC++ RT package has been already installed
+      # by the apt command above and its version has been matched with the DPC++ compiler.
+      # In case where we install the DPC++ compiler with the apt (including DPC++ RT) and
+      # install the DPC++ RT conda package while resolving dependencies, this can lead
+      # to a versioning error, i.e. compatibility issue as the DPC++ compiler only guarantees
+      # backwards compatibility, not forward compatibility (DPC++ RT may not run a binary built
+      # with a newer version of the DPC++ compiler).
+      # Installing dpctl via the pip manager has no such limitation, as the package has no
+      # run dependency on the DPC++ RT pip package, so this is why the step is necessary here.
+      - name: Install dpctl
+        if: env.oneapi-pkgs-env == ''
+        run: |
+          pip install -r ${{ env.dpctl-pkg-txt }}
+
       - name: Conda info
         run: |
           mamba info
```

.github/workflows/openssf-scorecard.yml

Lines changed: 2 additions & 2 deletions

```diff
@@ -60,14 +60,14 @@ jobs:
      # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
      # format to the repository Actions tab.
      - name: "Upload artifact"
-       uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4.6.1
+       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: SARIF file
          path: results.sarif
          retention-days: 14

      # Upload the results to GitHub's code scanning dashboard.
      - name: "Upload to code-scanning"
-       uses: github/codeql-action/upload-sarif@6bb031afdd8eb862ea3fc1848194185e076637e5 # v3.28.11
+       uses: github/codeql-action/upload-sarif@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
        with:
          sarif_file: results.sarif
```

.github/workflows/pre-commit.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -29,7 +29,7 @@ jobs:
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Set up python
-       uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
+       uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
        with:
          python-version: '3.13'
```

.pre-commit-config.yaml

Lines changed: 14 additions & 8 deletions

```diff
@@ -2,13 +2,13 @@
 # See https://pre-commit.com/hooks.html for more hooks
 repos:
   - repo: https://github.com/PyCQA/bandit
-    rev: '1.7.9'
+    rev: '1.8.3'
     hooks:
       - id: bandit
         pass_filenames: false
         args: ["-r", "dpnp", "-lll"]
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.6.0
+    rev: v5.0.0
     hooks:
       - id: check-ast
       - id: check-builtin-literals
@@ -44,17 +44,18 @@ repos:
       - id: rst-inline-touching-normal
       - id: text-unicode-replacement-char
   - repo: https://github.com/codespell-project/codespell
-    rev: v2.3.0
+    rev: v2.4.1
     hooks:
       - id: codespell
+        args: ["-L", "abd"] # ignore "abd" used in einsum tests
         additional_dependencies:
           - tomli
   - repo: https://github.com/psf/black
-    rev: 24.4.2
+    rev: 25.1.0
     hooks:
       - id: black
   - repo: https://github.com/pycqa/isort
-    rev: 5.13.2
+    rev: 6.0.1
     hooks:
       - id: isort
         name: isort (python)
@@ -65,20 +66,20 @@ repos:
         name: isort (pyi)
         types: [pyi]
   - repo: https://github.com/pycqa/flake8
-    rev: 7.1.0
+    rev: 7.1.2
     hooks:
       - id: flake8
         args: ["--config=.flake8"]
         additional_dependencies:
           - flake8-docstrings==1.7.0
-          - flake8-bugbear==24.4.26
+          - flake8-bugbear==24.12.12
   - repo: https://github.com/pocc/pre-commit-hooks
     rev: v1.3.5
     hooks:
       - id: clang-format
         args: ["-i"]
   - repo: https://github.com/gitleaks/gitleaks
-    rev: v8.18.4
+    rev: v8.24.0
     hooks:
       - id: gitleaks
   - repo: https://github.com/jumanjihouse/pre-commit-hooks
@@ -102,3 +103,8 @@ repos:
           "--disable=unused-wildcard-import"
         ]
         files: '^dpnp/(dpnp_iface.*|fft|linalg)'
+  - repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
+    rev: v2.14.0
+    hooks:
+      - id: pretty-format-toml
+        args: [--autofix]
```

CHANGELOG.md

Lines changed: 15 additions & 0 deletions

```diff
@@ -6,19 +6,34 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## [0.18.0] - MM/DD/2025

+This release achieves 100% compliance with Python Array API specification (revision [2024.12](https://data-apis.org/array-api/2024.12/)).
+
 ### Added

 * Added implementation of `dpnp.hamming` [#2341](https://github.com/IntelPython/dpnp/pull/2341), [#2357](https://github.com/IntelPython/dpnp/pull/2357)
 * Added implementation of `dpnp.hanning` [#2358](https://github.com/IntelPython/dpnp/pull/2358)
 * Added implementation of `dpnp.blackman` [#2363](https://github.com/IntelPython/dpnp/pull/2363)
 * Added implementation of `dpnp.bartlett` [#2366](https://github.com/IntelPython/dpnp/pull/2366)
+* Added implementation of `dpnp.convolve` [#2205](https://github.com/IntelPython/dpnp/pull/2205)
+* Added implementation of `dpnp.kaiser` [#2387](https://github.com/IntelPython/dpnp/pull/2387)

 ### Changed

+* Improved performance of `dpnp.nansum`, `dpnp.nanprod`, `dpnp.nancumsum`, and `dpnp.nancumprod` by reusing `dpnp.nan_to_num` function in implementation of the functions [#2339](https://github.com/IntelPython/dpnp/pull/2339)
 * Allowed input array of `uint64` dtype in `dpnp.bincount` [#2361](https://github.com/IntelPython/dpnp/pull/2361)
+* The vector norms `ord={None, 1, 2, inf}` and the matrix norms `ord={None, 1, 2, inf, "fro", "nuc"}` now consistently return zero for empty arrays, which are arrays with at least one axis of size zero. This change affects `dpnp.linalg.norm`, `dpnp.linalg.vector_norm`, and `dpnp.linalg.matrix_norm`. Previously, dpnp would either raise errors or return zero depending on the parameters provided [#2371](https://github.com/IntelPython/dpnp/pull/2371)
+* Extended `dpnp.fft.fftfreq` and `dpnp.fft.rfftfreq` functions to support `dtype` keyword per Python Array API spec 2024.12 [#2384](https://github.com/IntelPython/dpnp/pull/2384)
+* Updated `dpnp.fix` to return output with the same data-type of input [#2392](https://github.com/IntelPython/dpnp/pull/2392)
+* Updated `dpnp.einsum` to add support for `order=None` [#2411](https://github.com/IntelPython/dpnp/pull/2411)
+* Updated Python Array API specification version supported to `2024.12` [#2416](https://github.com/IntelPython/dpnp/pull/2416)
+* Removed `einsum_call` keyword from `dpnp.einsum_path` signature [#2421](https://github.com/IntelPython/dpnp/pull/2421)

 ### Fixed

+* Resolved an issue with an incorrect result returned due to missing dependency from the strided kernel on a copy event in `dpnp.erf` [#2378](https://github.com/IntelPython/dpnp/pull/2378)
+* Updated `conda create` commands build and install instructions of `Quick start guide` to avoid a compilation error [#2395](https://github.com/IntelPython/dpnp/pull/2395)
+* Added handling of empty string passed to a test env variable defining data type scope as a `False` value [#2415](https://github.com/IntelPython/dpnp/pull/2415)
+

 ## [0.17.0] - 02/26/2025

```
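The `dpnp.nansum` changelog entry rests on a simple identity: once NaNs are replaced by zero, a plain sum gives the nan-aware result. A pure-Python sketch of the idea (illustrative only; dpnp performs the equivalent with `dpnp.nan_to_num` on device arrays, and `nanprod` uses a fill value of one instead):

```python
import math

def nan_to_num(xs, nan=0.0):
    """Replace NaNs with a fill value, mirroring the role of dpnp.nan_to_num."""
    return [nan if math.isnan(x) else x for x in xs]

def nansum(xs):
    """nansum expressed as 'replace NaNs, then reuse the ordinary sum reduction'."""
    return sum(nan_to_num(xs))

assert nansum([1.0, math.nan, 2.0, math.nan]) == 3.0
```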

conda-recipe/meta.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 {% set max_compiler_and_mkl_version = environ.get("MAX_BUILD_CMPL_MKL_VERSION", "2026.0a0") %}
 {% set required_compiler_and_mkl_version = "2025.0" %}
-{% set required_dpctl_version = "0.19.0*" %}
+{% set required_dpctl_version = "0.20.0*" %}

 package:
   name: dpnp
```
