
Commit 25286eb

Merge remote-tracking branch 'upstream/main' into better-msg-delta-series_timestamp

2 parents: 17ace06 + f165bd5


78 files changed: +555 / -570 lines

.gitattributes

Lines changed: 2 additions & 1 deletion
@@ -85,4 +85,5 @@ pandas/tests/io/parser/data export-ignore
 
 # Include cibw script in sdist since it's needed for building wheels
 scripts/cibw_before_build.sh -export-ignore
-scripts/cibw_before_test.sh -export-ignore
+scripts/cibw_before_build_windows.sh -export-ignore
+scripts/cibw_before_test_windows.sh -export-ignore

.github/workflows/unit-tests.yml

Lines changed: 2 additions & 2 deletions
@@ -387,8 +387,8 @@ jobs:
       - name: Build Environment
         run: |
           python --version
-          python -m pip install --upgrade pip setuptools wheel meson[ninja]==1.2.1 meson-python==0.13.1
-          python -m pip install --pre --extra-index-url https://pypi.anaconda.org/scientific-python-nightly-wheels/simple numpy cython
+          python -m pip install --upgrade pip setuptools wheel numpy meson[ninja]==1.2.1 meson-python==0.13.1
+          python -m pip install --pre --extra-index-url https://pypi.anaconda.org/scientific-python-nightly-wheels/simple cython
           python -m pip install versioneer[toml]
           python -m pip install python-dateutil pytz tzdata hypothesis>=6.84.0 pytest>=7.3.2 pytest-xdist>=3.4.0 pytest-cov
           python -m pip install -ve . --no-build-isolation --no-index --no-deps -Csetup-args="--werror"

.github/workflows/wheels.yml

Lines changed: 0 additions & 18 deletions
@@ -111,10 +111,6 @@ jobs:
           - buildplat: [ubuntu-22.04, pyodide_wasm32]
             python: ["cp312", "3.12"]
             cibw_build_frontend: 'build'
-        # TODO: Build free-threaded wheels for Windows
-        exclude:
-          - buildplat: [windows-2022, win_amd64]
-            python: ["cp313t", "3.13"]
 
     env:
       IS_PUSH: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
@@ -181,20 +177,6 @@ jobs:
         shell: bash -el {0}
         run: for whl in $(ls wheelhouse); do wheel unpack wheelhouse/$whl -d /tmp; done
 
-      # Testing on windowsservercore instead of GHA runner to fail on missing DLLs
-      - name: Test Windows Wheels
-        if: ${{ matrix.buildplat[1] == 'win_amd64' }}
-        shell: pwsh
-        run: |
-          $TST_CMD = @"
-          python -m pip install hypothesis>=6.84.0 pytest>=7.3.2 pytest-xdist>=3.4.0;
-          python -m pip install `$(Get-Item pandas\wheelhouse\*.whl);
-          python -c `'import pandas as pd; pd.test(extra_args=[`\"--no-strict-data-files`\", `\"-m not clipboard and not single_cpu and not slow and not network and not db`\"])`';
-          "@
-          # add rc to the end of the image name if the Python version is unreleased
-          docker pull python:${{ matrix.python[1] == '3.13' && '3.13-rc' || format('{0}-windowsservercore', matrix.python[1]) }}
-          docker run --env PANDAS_CI='1' -v ${PWD}:C:\pandas python:${{ matrix.python[1] == '3.13' && '3.13-rc' || format('{0}-windowsservercore', matrix.python[1]) }} powershell -Command $TST_CMD
-
       - uses: actions/upload-artifact@v4
         with:
           name: ${{ matrix.python[0] }}-${{ matrix.buildplat[1] }}

MANIFEST.in

Lines changed: 2 additions & 0 deletions
@@ -65,3 +65,5 @@ graft pandas/_libs/include
 
 # Include cibw script in sdist since it's needed for building wheels
 include scripts/cibw_before_build.sh
+include scripts/cibw_before_build_windows.sh
+include scripts/cibw_before_test_windows.sh

ci/code_checks.sh

Lines changed: 0 additions & 1 deletion
@@ -84,7 +84,6 @@ if [[ -z "$CHECK" || "$CHECK" == "docstrings" ]]; then
         -i "pandas.Timestamp.resolution PR02" \
         -i "pandas.Timestamp.tzinfo GL08" \
         -i "pandas.api.types.is_re_compilable PR07,SA01" \
-        -i "pandas.api.types.pandas_dtype PR07,RT03,SA01" \
         -i "pandas.arrays.ArrowExtensionArray PR07,SA01" \
         -i "pandas.arrays.IntegerArray SA01" \
         -i "pandas.arrays.IntervalArray.length SA01" \

doc/source/whatsnew/v2.3.0.rst

Lines changed: 2 additions & 2 deletions
@@ -106,10 +106,10 @@ Conversion
 Strings
 ^^^^^^^
 - Bug in :meth:`Series.rank` for :class:`StringDtype` with ``storage="pyarrow"`` incorrectly returning integer results in case of ``method="average"`` and raising an error if it would truncate results (:issue:`59768`)
+- Bug in :meth:`Series.replace` with :class:`StringDtype` when replacing with a non-string value was not upcasting to ``object`` dtype (:issue:`60282`)
 - Bug in :meth:`Series.str.replace` when ``n < 0`` for :class:`StringDtype` with ``storage="pyarrow"`` (:issue:`59628`)
 - Bug in ``ser.str.slice`` with negative ``step`` with :class:`ArrowDtype` and :class:`StringDtype` with ``storage="pyarrow"`` giving incorrect results (:issue:`59710`)
 - Bug in the ``center`` method on :class:`Series` and :class:`Index` object ``str`` accessors with pyarrow-backed dtype not matching the python behavior in corner cases with an odd number of fill characters (:issue:`54792`)
--
 
 Interval
 ^^^^^^^^
@@ -118,7 +118,7 @@ Interval
 
 Indexing
 ^^^^^^^^
--
+- Fixed bug in :meth:`Index.get_indexer` round-tripping through string dtype when ``infer_string`` is enabled (:issue:`55834`)
 -
 
 Missing
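
As a quick illustration of the Series.replace entry added above (GH 60282), here is a minimal sketch of the intended behavior; the exact dtype spelling and the printed result are assumptions rather than part of this diff:

import pandas as pd

# Replacing a string with a non-string value on a StringDtype Series;
# per the whatsnew entry, the result is expected to upcast to object dtype.
s = pd.Series(["a", "b"], dtype="string")
result = s.replace("a", 1)
print(result.dtype)  # expected: object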

doc/source/whatsnew/v3.0.0.rst

Lines changed: 2 additions & 0 deletions
@@ -702,6 +702,8 @@ I/O
 - Bug in :meth:`read_stata` raising ``KeyError`` when input file is stored in big-endian format and contains strL data. (:issue:`58638`)
 - Bug in :meth:`read_stata` where extreme value integers were incorrectly interpreted as missing for format versions 111 and prior (:issue:`58130`)
 - Bug in :meth:`read_stata` where the missing code for double was not recognised for format versions 105 and prior (:issue:`58149`)
+- Bug in :meth:`set_option` where setting the pandas option ``display.html.use_mathjax`` to ``False`` has no effect (:issue:`59884`)
+- Bug in :meth:`to_excel` where :class:`MultiIndex` columns would be merged to a single row when ``merge_cells=False`` is passed (:issue:`60274`)
 
 Period
 ^^^^^^
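
For the to_excel entry above (GH 60274), a hedged usage sketch: with merge_cells=False, each level of the MultiIndex column header is expected to be written on its own row instead of being collapsed into one. The output file name is illustrative, and an Excel writer engine such as openpyxl is assumed to be installed:

import pandas as pd

# MultiIndex columns written without merging header cells.
columns = pd.MultiIndex.from_product([["A", "B"], ["x", "y"]])
df = pd.DataFrame([[1, 2, 3, 4]], columns=columns)
df.to_excel("multiindex_columns.xlsx", merge_cells=False)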

pandas/core/array_algos/replace.py

Lines changed: 2 additions & 0 deletions
@@ -151,4 +151,6 @@ def re_replacer(s):
     if mask is None:
         values[:] = f(values)
     else:
+        if values.ndim != mask.ndim:
+            mask = np.broadcast_to(mask, values.shape)
         values[mask] = f(values[mask])
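
The two added lines align a lower-dimensional mask with a 2-D block of values before the masked assignment. A self-contained NumPy sketch of that broadcasting step (the data and replacement function are illustrative, not pandas internals):

import numpy as np

# A 2-D object array and a 1-D boolean mask; np.broadcast_to expands the mask
# to values.shape so the boolean indexing lines up element-wise.
values = np.array([["foo", "bar"], ["foo", "baz"]], dtype=object)
mask = np.array([True, False])

if values.ndim != mask.ndim:
    mask = np.broadcast_to(mask, values.shape)

# Apply a replacement only where the mask is True (here: uppercase the matches).
values[mask] = [s.upper() for s in values[mask]]
print(values)  # [['FOO' 'bar'] ['FOO' 'baz']]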

pandas/core/arrays/arrow/array.py

Lines changed: 5 additions & 1 deletion
@@ -1644,7 +1644,11 @@ def _accumulate(
         else:
             data_to_accum = data_to_accum.cast(pa.int64())
 
-        result = pyarrow_meth(data_to_accum, skip_nulls=skipna, **kwargs)
+        try:
+            result = pyarrow_meth(data_to_accum, skip_nulls=skipna, **kwargs)
+        except pa.ArrowNotImplementedError as err:
+            msg = f"operation '{name}' not supported for dtype '{self.dtype}'"
+            raise TypeError(msg) from err
 
         if convert_to_int:
             result = result.cast(pa_dtype)
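
The new try/except translates pyarrow's ArrowNotImplementedError into a TypeError with a pandas-style message. A standalone sketch of the same pattern written directly against pyarrow.compute (the helper name is hypothetical; this is not the pandas implementation itself):

import pyarrow as pa
import pyarrow.compute as pc

def cumulative_sum_or_typeerror(values: pa.Array, name: str = "cumsum"):
    # Re-raise a missing pyarrow kernel as a TypeError, chaining the original error.
    try:
        return pc.cumulative_sum(values, skip_nulls=True)
    except pa.ArrowNotImplementedError as err:
        msg = f"operation '{name}' not supported for dtype '{values.type}'"
        raise TypeError(msg) from err

print(cumulative_sum_or_typeerror(pa.array([1, 2, 3])))  # [1, 3, 6]
# cumulative_sum_or_typeerror(pa.array(["a", "b"]))      # expected to raise TypeError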

pandas/core/arrays/string_.py

Lines changed: 25 additions & 18 deletions
@@ -730,20 +730,9 @@ def _values_for_factorize(self) -> tuple[np.ndarray, libmissing.NAType | float]:
 
         return arr, self.dtype.na_value
 
-    def __setitem__(self, key, value) -> None:
-        value = extract_array(value, extract_numpy=True)
-        if isinstance(value, type(self)):
-            # extract_array doesn't extract NumpyExtensionArray subclasses
-            value = value._ndarray
-
-        key = check_array_indexer(self, key)
-        scalar_key = lib.is_scalar(key)
-        scalar_value = lib.is_scalar(value)
-        if scalar_key and not scalar_value:
-            raise ValueError("setting an array element with a sequence.")
-
-        # validate new items
-        if scalar_value:
+    def _maybe_convert_setitem_value(self, value):
+        """Maybe convert value to be pyarrow compatible."""
+        if lib.is_scalar(value):
             if isna(value):
                 value = self.dtype.na_value
             elif not isinstance(value, str):
@@ -753,8 +742,11 @@ def __setitem__(self, key, value) -> None:
                     "instead."
                 )
         else:
+            value = extract_array(value, extract_numpy=True)
             if not is_array_like(value):
                 value = np.asarray(value, dtype=object)
+            elif isinstance(value.dtype, type(self.dtype)):
+                return value
             else:
                 # cast categories and friends to arrays to see if values are
                 # compatible, compatibility with arrow backed strings
@@ -764,11 +756,26 @@ def __setitem__(self, key, value) -> None:
                     "Invalid value for dtype 'str'. Value should be a "
                     "string or missing value (or array of those)."
                 )
+        return value
 
-        mask = isna(value)
-        if mask.any():
-            value = value.copy()
-            value[isna(value)] = self.dtype.na_value
+    def __setitem__(self, key, value) -> None:
+        value = self._maybe_convert_setitem_value(value)
+
+        key = check_array_indexer(self, key)
+        scalar_key = lib.is_scalar(key)
+        scalar_value = lib.is_scalar(value)
+        if scalar_key and not scalar_value:
+            raise ValueError("setting an array element with a sequence.")
+
+        if not scalar_value:
+            if value.dtype == self.dtype:
+                value = value._ndarray
+            else:
+                value = np.asarray(value)
+                mask = isna(value)
+                if mask.any():
+                    value = value.copy()
+                    value[isna(value)] = self.dtype.na_value
 
         super().__setitem__(key, value)
 
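The refactor above splits value validation out into _maybe_convert_setitem_value so __setitem__ only handles indexing and NA filling. A hedged usage sketch of the setter behavior on a StringDtype-backed array (error types and messages vary by pandas version and string storage, so the expectations below are assumptions):

import pandas as pd

arr = pd.array(["a", "b", "c"], dtype="string")
arr[0] = pd.NA           # missing values are accepted as-is
arr[1:3] = ["x", None]   # array-likes are validated element-wise; None becomes NA
try:
    arr[0] = 1           # a non-string scalar is expected to be rejected
except TypeError as err:
    print(err)
print(arr)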