BUG: NaT instead of error for timestamp concat with None #53042

Closed
1 change: 1 addition & 0 deletions doc/source/whatsnew/v2.1.0.rst
@@ -363,6 +363,7 @@ Datetimelike
^^^^^^^^^^^^
- :meth:`DatetimeIndex.map` with ``na_action="ignore"`` now works as expected. (:issue:`51644`)
- Bug in :class:`DateOffset` which had inconsistent behavior when multiplying a :class:`DateOffset` object by a constant (:issue:`47953`)
- Bug in :func:`concat` raising ``AttributeError`` when concatenating a ``None`` dtype :class:`DataFrame` with a ``timestamp`` dtype :class:`DataFrame` (:issue:`52093`)
- Bug in :func:`date_range` when ``freq`` was a :class:`DateOffset` with ``nanoseconds`` (:issue:`46877`)
- Bug in :meth:`Timestamp.date`, :meth:`Timestamp.isocalendar`, :meth:`Timestamp.timetuple`, and :meth:`Timestamp.toordinal` were returning incorrect results for inputs outside those supported by the Python standard library's datetime module (:issue:`53668`)
- Bug in :meth:`Timestamp.round` with values close to the implementation bounds returning incorrect results instead of raising ``OutOfBoundsDatetime`` (:issue:`51494`)
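For context, here is a minimal reproduction of the behavior this entry describes, mirroring the regression test added later in this PR (the frames and column name come from that test rather than from the original issue report):

import pandas as pd

df1 = pd.DataFrame([{"A": None}], dtype="datetime64[ns, UTC]")
df2 = pd.DataFrame([{"A": pd.to_datetime("1990-12-20 00:00:00+00:00")}])

# Before this fix the concat path raised AttributeError; with the fix the
# None entry comes back as NaT in a datetime64[ns, UTC] column.
print(pd.concat([df1, df2]))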
3 changes: 2 additions & 1 deletion pandas/core/internals/managers.py
@@ -26,6 +26,7 @@
    BlockPlacement,
    BlockValuesRefs,
)
from pandas._libs.tslibs import Timestamp
from pandas.errors import PerformanceWarning
from pandas.util._decorators import cache_readonly
from pandas.util._exceptions import find_stack_level
@@ -2389,7 +2390,7 @@ def _preprocess_slice_or_indexer(
def make_na_array(dtype: DtypeObj, shape: Shape, fill_value) -> ArrayLike:
    if isinstance(dtype, DatetimeTZDtype):
        # NB: exclude e.g. pyarrow[dt64tz] dtypes
        i8values = np.full(shape, fill_value._value)
        i8values = np.full(shape, Timestamp(fill_value)._value)
Member:

i think this may mess up if your dtype has a different unit than the fill_value. can you add a test where this is the case? also needs a test for the original bug

Contributor Author:

> also needs a test for the original bug

My bad! I forgot to add that. It is added now 👍
Thanks for the reminder.

Contributor Author:

> i think this may mess up if your dtype has a different unit than the fill_value. can you add a test where this is the case?

Are you talking about something like this, where the value will be a timestamp but its dtype will be different?

df = pd.DataFrame([{'A':None}])
df_2 = pd.DataFrame([{'A': pd.to_datetime("1990-12-20 00:00:00+00:00")}], dtype="int32")
cd = pd.concat([df, df_2])

Sorry if I did not get it correctly. Could you please explain it in detail?

        return DatetimeArray(i8values, dtype=dtype)

    elif is_1d_only_ea_dtype(dtype):
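For context on the line discussed above, a small hedged illustration of why wrapping the fill value in Timestamp helps: None has no _value attribute, while Timestamp(None) collapses to NaT, whose integer _value is the missing-value sentinel that DatetimeArray understands. A Timestamp's _value is expressed in that Timestamp's own resolution, which appears to be the unit mismatch the reviewer is asking about.

import pandas as pd

# Timestamp(None) is the NaT singleton, so the fixed line has a valid integer
# sentinel to broadcast; plain None would raise AttributeError instead.
print(pd.Timestamp(None) is pd.NaT)  # True
print(hasattr(None, "_value"))       # False: the pre-fix failure mode
print(pd.NaT._value)                 # the i8 sentinel stored for missing datetimes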
11 changes: 11 additions & 0 deletions pandas/tests/reshape/concat/test_concat.py
@@ -826,3 +826,14 @@ def test_concat_mismatched_keys_length():
        concat((x for x in sers), keys=(y for y in keys), axis=1)
    with tm.assert_produces_warning(FutureWarning, match=msg):
        concat((x for x in sers), keys=(y for y in keys), axis=0)


def test_concat_none_with_datetime():
    # GH 52093
    df1 = DataFrame([{"A": None}], dtype="datetime64[ns, UTC]")
    df2 = DataFrame([{"A": pd.to_datetime("1990-12-20 00:00:00+00:00")}])
    result = concat([df1, df2])
    expected = DataFrame(
        [{"A": None}, {"A": pd.to_datetime("1990-12-20 00:00:00+00:00")}], index=[0, 0]
    )
    tm.assert_frame_equal(result, expected)
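Finally, a hedged sketch (not part of the PR) of the kind of additional case the reviewer asks about, where the shared column uses a non-nanosecond timezone-aware dtype; the property to check is that the filled-in missing values are still NaT and respect that dtype's unit. Column names and values here are illustrative assumptions:

import pandas as pd

df1 = pd.DataFrame({"A": [1]})  # has no column "B"
df2 = pd.DataFrame(
    {"A": [2], "B": pd.to_datetime(["1990-12-20"], utc=True).as_unit("s")}
)

result = pd.concat([df1, df2], ignore_index=True)
# The missing "B" entry from df1 should come back as NaT, and the column
# should keep df2's second-resolution, timezone-aware dtype.
print(result["B"])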