
Commit 89ab94e

Merge remote-tracking branch 'upstream/main' into td-construction
2 parents: 9bf564b + 9222cb0

30 files changed (+111, -165 lines)

.github/workflows/32-bit-linux.yml

Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
+name: 32 Bit Linux
+
+on:
+  push:
+    branches:
+      - main
+      - 1.4.x
+  pull_request:
+    branches:
+      - main
+      - 1.4.x
+    paths-ignore:
+      - "doc/**"
+
+jobs:
+  pytest:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 0
+
+      - name: Run 32-bit manylinux2014 Docker Build / Tests
+        run: |
+          docker pull quay.io/pypa/manylinux2014_i686
+          docker run --platform linux/386 -v $(pwd):/pandas quay.io/pypa/manylinux2014_i686 \
+            /bin/bash -xc "cd pandas && \
+            /opt/python/cp38-cp38/bin/python -m venv ~/virtualenvs/pandas-dev && \
+            . ~/virtualenvs/pandas-dev/bin/activate && \
+            python -m pip install --no-deps -U pip wheel 'setuptools<60.0.0' && \
+            pip install cython numpy python-dateutil pytz pytest pytest-xdist pytest-asyncio>=0.17 hypothesis && \
+            python setup.py build_ext -q -j2 && \
+            python -m pip install --no-build-isolation --no-use-pep517 -e . && \
+            export PANDAS_CI=1 && \
+            pytest -m 'not slow and not network and not clipboard and not single_cpu' pandas --junitxml=test-data.xml"
+
+      - name: Publish test results for Python 3.8-32 bit full Linux
+        uses: actions/upload-artifact@v3
+        with:
+          name: Test results
+          path: test-data.xml
+        if: failure()

.github/workflows/code-checks.yml

Lines changed: 1 addition & 1 deletion
@@ -74,7 +74,7 @@ jobs:

    - name: Install pyright
      # note: keep version in sync with .pre-commit-config.yaml
-     run: npm install -g pyright@1.1.245
+     run: npm install -g pyright@1.1.247

    - name: Build Pandas
      id: build

.github/workflows/python-dev.yml

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 # Unfreeze (by commenting the if: false() condition) once the
 # next Python Dev version has released beta 1 and both Cython and numpy support it
 # After that Python has released, migrate the workflows to the
-# posix GHA workflows/Azure pipelines and "freeze" this file by
+# posix GHA workflows and "freeze" this file by
 # uncommenting the if: false() condition
 # Feel free to modify this comment as necessary.


.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion
@@ -89,7 +89,7 @@ repos:
        types: [python]
        stages: [manual]
        # note: keep version in sync with .github/workflows/code-checks.yml
-       additional_dependencies: ['pyright@1.1.245']
+       additional_dependencies: ['pyright@1.1.247']
 -   repo: local
     hooks:
     -   id: flake8-rst

LICENSES/ULTRAJSON_LICENSE

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ Portions of code from MODP_ASCII - Ascii transformations (upper/lower, etc)
 https://github.com/client9/stringencoders
 Copyright (c) 2007 Nick Galbreath -- nickg [at] modp [dot] com. All rights reserved.

-Numeric decoder derived from from TCL library
+Numeric decoder derived from TCL library
 http://www.opensource.apple.com/source/tcl/tcl-14/tcl/license.terms
  * Copyright (c) 1988-1993 The Regents of the University of California.
  * Copyright (c) 1994 Sun Microsystems, Inc.

README.md

Lines changed: 0 additions & 1 deletion
@@ -10,7 +10,6 @@
 [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.3509134.svg)](https://doi.org/10.5281/zenodo.3509134)
 [![Package Status](https://img.shields.io/pypi/status/pandas.svg)](https://pypi.org/project/pandas/)
 [![License](https://img.shields.io/pypi/l/pandas.svg)](https://github.com/pandas-dev/pandas/blob/main/LICENSE)
-[![Azure Build Status](https://dev.azure.com/pandas-dev/pandas/_apis/build/status/pandas-dev.pandas?branch=main)](https://dev.azure.com/pandas-dev/pandas/_build/latest?definitionId=1&branch=main)
 [![Coverage](https://codecov.io/github/pandas-dev/pandas/coverage.svg?branch=main)](https://codecov.io/gh/pandas-dev/pandas)
 [![Downloads](https://static.pepy.tech/personalized-badge/pandas?period=month&units=international_system&left_color=black&right_color=orange&left_text=PyPI%20downloads%20per%20month)](https://pepy.tech/project/pandas)
 [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/pydata/pandas)

asv_bench/benchmarks/sparse.py

Lines changed: 2 additions & 2 deletions
@@ -146,10 +146,10 @@ def setup(self, fill_value):

     def make_block_array(self, length, num_blocks, block_size, fill_value):
         arr = np.full(length, fill_value)
-        indicies = np.random.choice(
+        indices = np.random.choice(
             np.arange(0, length, block_size), num_blocks, replace=False
         )
-        for ind in indicies:
+        for ind in indices:
             arr[ind : ind + block_size] = np.random.randint(0, 100, block_size)
         return SparseArray(arr, fill_value=fill_value)


azure-pipelines.yml

Lines changed: 0 additions & 50 deletions
This file was deleted.

ci/setup_env.sh

Lines changed: 1 addition & 1 deletion
@@ -104,6 +104,6 @@ echo "Build extensions"
 python setup.py build_ext -q -j3

 echo "Install pandas"
-python -m pip install --no-build-isolation -e .
+python -m pip install --no-build-isolation --no-use-pep517 -e .

 echo "done"

doc/source/development/contributing_codebase.rst

Lines changed: 2 additions & 4 deletions
@@ -289,13 +289,11 @@ library. This makes type checkers aware of the type annotations shipped with pan
 Testing with continuous integration
 -----------------------------------

-The pandas test suite will run automatically on `GitHub Actions <https://github.com/features/actions/>`__ and
-`Azure Pipelines <https://azure.microsoft.com/en-us/services/devops/pipelines/>`__
+The pandas test suite will run automatically on `GitHub Actions <https://github.com/features/actions/>`__
 continuous integration services, once your pull request is submitted.
 However, if you wish to run the test suite on a branch prior to submitting the pull request,
 then the continuous integration services need to be hooked to your GitHub repository. Instructions are here
-for `GitHub Actions <https://docs.github.com/en/actions/>`__ and
-`Azure Pipelines <https://docs.microsoft.com/en-us/azure/devops/pipelines/?view=azure-devops>`__.
+for `GitHub Actions <https://docs.github.com/en/actions/>`__.

 A pull-request will be considered for merging when you have an all 'green' build. If any tests are failing,
 then you will get a red 'X', where you can click through to see the individual failed tests.

doc/source/user_guide/io.rst

Lines changed: 2 additions & 2 deletions
@@ -3529,7 +3529,7 @@ See the :ref:`cookbook<cookbook.excel>` for some advanced strategies.
 **Please do not report issues when using ``xlrd`` to read ``.xlsx`` files.**
 This is no longer supported, switch to using ``openpyxl`` instead.

-Attempting to use the the ``xlwt`` engine will raise a ``FutureWarning``
+Attempting to use the ``xlwt`` engine will raise a ``FutureWarning``
 unless the option :attr:`io.excel.xls.writer` is set to ``"xlwt"``.
 While this option is now deprecated and will also raise a ``FutureWarning``,
 it can be globally set and the warning suppressed. Users are recommended to
@@ -5470,7 +5470,7 @@ See the documentation for `pyarrow <https://arrow.apache.org/docs/python/>`__ an
 .. note::

    These engines are very similar and should read/write nearly identical parquet format files.
-   Currently ``pyarrow`` does not support timedelta data, ``fastparquet>=0.1.4`` supports timezone aware datetimes.
+   ``pyarrow>=8.0.0`` supports timedelta data, ``fastparquet>=0.1.4`` supports timezone aware datetimes.
    These libraries differ by having different underlying dependencies (``fastparquet`` by using ``numba``, while ``pyarrow`` uses a c-library).

 .. ipython:: python
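
Not part of the commit, but for context on the updated note: a minimal sketch of a timedelta round-trip through parquet with the pyarrow engine, assuming a pandas environment with pyarrow>=8.0.0 installed (the file name and example frame are illustrative).

import pandas as pd

# Illustrative timedelta column; pyarrow>=8.0.0 can round-trip this through parquet.
df = pd.DataFrame({"delta": pd.to_timedelta(["1 days", "2 days 03:00:00"])})
df.to_parquet("deltas.parquet", engine="pyarrow")
roundtrip = pd.read_parquet("deltas.parquet", engine="pyarrow")
assert roundtrip["delta"].dtype == "timedelta64[ns]"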

doc/source/user_guide/style.ipynb

Lines changed: 1 addition & 1 deletion
@@ -612,7 +612,7 @@
    "source": [
     "### Acting on the Index and Column Headers\n",
     "\n",
-    "Similar application is acheived for headers by using:\n",
+    "Similar application is achieved for headers by using:\n",
     " \n",
     "- [.applymap_index()][applymapindex] (elementwise): accepts a function that takes a single value and returns a string with the CSS attribute-value pair.\n",
     "- [.apply_index()][applyindex] (level-wise): accepts a function that takes a Series and returns a Series, or numpy array with an identical shape where each element is a string with a CSS attribute-value pair. This method passes each level of your Index one-at-a-time. To style the index use `axis=0` and to style the column headers use `axis=1`.\n",

doc/source/whatsnew/v1.5.0.rst

Lines changed: 1 addition & 1 deletion
@@ -360,7 +360,7 @@ If installed, we now require:
 +-----------------+-----------------+----------+---------+
 | Package         | Minimum Version | Required | Changed |
 +=================+=================+==========+=========+
-| mypy (dev)      | 0.941           |          | X       |
+| mypy (dev)      | 0.950           |          | X       |
 +-----------------+-----------------+----------+---------+


environment.yml

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ dependencies:
   - flake8-bugbear=21.3.2  # used by flake8, find likely bugs
   - flake8-comprehensions=3.7.0  # used by flake8, linting of unnecessary comprehensions
   - isort>=5.2.1  # check that imports are in the right order
-  - mypy=0.941
+  - mypy=0.950
   - pre-commit>=2.9.2
   - pycodestyle  # used by flake8
   - pyupgrade

pandas/_config/localization.py

Lines changed: 3 additions & 1 deletion
@@ -45,7 +45,9 @@ def set_locale(
         locale.setlocale(lc_var, new_locale)
         normalized_locale = locale.getlocale()
         if all(x is not None for x in normalized_locale):
-            yield ".".join(normalized_locale)
+            # error: Argument 1 to "join" of "str" has incompatible type
+            # "Tuple[Optional[str], Optional[str]]"; expected "Iterable[str]"
+            yield ".".join(normalized_locale)  # type: ignore[arg-type]
         else:
             yield new_locale
     finally:

pandas/_libs/tslibs/__init__.py

Lines changed: 1 addition & 3 deletions
@@ -58,9 +58,7 @@
 )
 from pandas._libs.tslibs.timestamps import Timestamp
 from pandas._libs.tslibs.timezones import tz_compare
-from pandas._libs.tslibs.tzconversion import (
-    py_tz_convert_from_utc_single as tz_convert_from_utc_single,
-)
+from pandas._libs.tslibs.tzconversion import tz_convert_from_utc_single
 from pandas._libs.tslibs.vectorized import (
     dt64arr_to_periodarr,
     get_resolution,

pandas/_libs/tslibs/offsets.pyx

Lines changed: 3 additions & 9 deletions
@@ -54,10 +54,7 @@ from pandas._libs.tslibs.ccalendar cimport (
     get_firstbday,
     get_lastbday,
 )
-from pandas._libs.tslibs.conversion cimport (
-    convert_datetime_to_tsobject,
-    localize_pydatetime,
-)
+from pandas._libs.tslibs.conversion cimport localize_pydatetime
 from pandas._libs.tslibs.nattype cimport (
     NPY_NAT,
     c_NaT as NaT,
@@ -68,7 +65,6 @@ from pandas._libs.tslibs.np_datetime cimport (
     npy_datetimestruct,
     pydate_to_dtstruct,
 )
-from pandas._libs.tslibs.tzconversion cimport tz_convert_from_utc_single

 from .dtypes cimport PeriodDtypeCode
 from .timedeltas cimport (
@@ -270,10 +266,8 @@ cdef _to_dt64D(dt):
     if getattr(dt, 'tzinfo', None) is not None:
         # Get the nanosecond timestamp,
         # equiv `Timestamp(dt).value` or `dt.timestamp() * 10**9`
-        nanos = getattr(dt, "nanosecond", 0)
-        i8 = convert_datetime_to_tsobject(dt, tz=None, nanos=nanos).value
-        dt = tz_convert_from_utc_single(i8, dt.tzinfo)
-        dt = np.int64(dt).astype('datetime64[ns]')
+        naive = dt.astimezone(None)
+        dt = np.datetime64(naive, "D")
     else:
         dt = np.datetime64(dt)
     if dt.dtype.name != "datetime64[D]":

pandas/_libs/tslibs/timestamps.pyi

Lines changed: 2 additions & 2 deletions
@@ -113,7 +113,7 @@ class Timestamp(datetime):
     def time(self) -> _time: ...
     def timetz(self) -> _time: ...
     def replace(
-        self,
+        self: _DatetimeT,
         year: int = ...,
         month: int = ...,
         day: int = ...,
@@ -123,7 +123,7 @@ class Timestamp(datetime):
         microsecond: int = ...,
         tzinfo: _tzinfo | None = ...,
         fold: int = ...,
-    ) -> datetime: ...
+    ) -> _DatetimeT: ...
     def astimezone(self: _DatetimeT, tz: _tzinfo | None = ...) -> _DatetimeT: ...
     def ctime(self) -> str: ...
     def isoformat(self, sep: str = ..., timespec: str = ...) -> str: ...
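
The stub change above only affects type checking: at runtime ``Timestamp.replace`` already returns a ``Timestamp``, and the annotation now says so instead of ``datetime``. A tiny illustrative check (example values are not from the commit):

import pandas as pd

# With the updated stub, type checkers agree with the runtime type here.
ts = pd.Timestamp("2022-05-01 12:30", tz="UTC")
shifted = ts.replace(year=2023)
assert isinstance(shifted, pd.Timestamp)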

pandas/_libs/tslibs/timestamps.pyx

Lines changed: 12 additions & 15 deletions
@@ -1981,22 +1981,19 @@ default 'raise'
             value = tz_localize_to_utc_single(self.value, tz,
                                               ambiguous=ambiguous,
                                               nonexistent=nonexistent)
-            out = Timestamp(value, tz=tz)
-            if out is not NaT:
-                out._set_freq(self._freq)  # avoid warning in constructor
-            return out
+        elif tz is None:
+            # reset tz
+            value = tz_convert_from_utc_single(self.value, self.tz)
+
         else:
-            if tz is None:
-                # reset tz
-                value = tz_convert_from_utc_single(self.value, self.tz)
-                out = Timestamp(value, tz=tz)
-                if out is not NaT:
-                    out._set_freq(self._freq)  # avoid warning in constructor
-                return out
-            else:
-                raise TypeError(
-                    "Cannot localize tz-aware Timestamp, use tz_convert for conversions"
-                )
+            raise TypeError(
+                "Cannot localize tz-aware Timestamp, use tz_convert for conversions"
+            )
+
+        out = Timestamp(value, tz=tz)
+        if out is not NaT:
+            out._set_freq(self._freq)  # avoid warning in constructor
+        return out

     def tz_convert(self, tz):
         """

pandas/_libs/tslibs/tzconversion.pxd

Lines changed: 2 additions & 2 deletions
@@ -6,8 +6,8 @@ from numpy cimport (
 )


-cdef int64_t tz_convert_from_utc_single(
-    int64_t utc_val, tzinfo tz, bint* fold=?, Py_ssize_t* outpos=?
+cpdef int64_t tz_convert_from_utc_single(
+    int64_t utc_val, tzinfo tz
 ) except? -1
 cdef int64_t tz_localize_to_utc_single(
     int64_t val, tzinfo tz, object ambiguous=*, object nonexistent=*

pandas/_libs/tslibs/tzconversion.pyi

Lines changed: 2 additions & 2 deletions
@@ -8,8 +8,8 @@ import numpy as np

 from pandas._typing import npt

-# py_tz_convert_from_utc_single exposed for testing
-def py_tz_convert_from_utc_single(val: np.int64, tz: tzinfo) -> np.int64: ...
+# tz_convert_from_utc_single exposed for testing
+def tz_convert_from_utc_single(val: np.int64, tz: tzinfo) -> np.int64: ...
 def tz_localize_to_utc(
     vals: npt.NDArray[np.int64],
     tz: tzinfo | None,
