Description
Currently, we have the docstring at dataframe-api/spec/purpose_and_scope.md, lines 270 to 276 (commit ee1fe4c).

I can't really make sense of it, so let's work through an example.
Suppose we've defined the following (non-beta/alpha) versions of the spec:
- 2023.11
- 2023.12 (which adds `DataFrame.new_feature_1`)
- 2024.01 (which adds `DataFrame.new_feature_2`)
Suppose:
- cudf 0.6.1 supports versions 2023.11 and 2023.12 of the spec
- cudf 0.7 supports versions 2023.11, 2023.12, and 2024.01 of the spec
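To make this concrete, here is a minimal sketch (every name in it is hypothetical, taken neither from the spec nor from cudf) of the one hard constraint in play: a release can only know about spec versions that existed when it shipped.

```python
# Hypothetical: the spec versions cudf 0.6.1 would know about.
# It shipped before 2024.01 existed, so it cannot possibly list it.
SUPPORTED_VERSIONS = {"2023.11", "2023.12"}


def dataframe_consortium_standard(df, api_version=None):
    # Toy stand-in for DataFrame.__dataframe_consortium_standard__:
    # an explicit but unknown version can at least fail loudly...
    if api_version is not None and api_version not in SUPPORTED_VERSIONS:
        raise ValueError(f"Unsupported api_version: {api_version!r}")
    # ...but what should api_version=None resolve to? See issue 1 below.
    return df  # placeholder: a real implementation returns a compliant wrapper
```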
I then run

```python
def my_func(df):
    df = df.__dataframe_consortium_standard__(api_version=None)
    df = df.new_feature_2()
    return df.dataframe
```

with cudf 0.6.1.
I see two issues:
1. Which version of the spec will it use?
If I read the docstring linked above, it should try using version 2024.01, and then raise an error. Problem is, how is the code above supposed to know that 2024.01 is the latest version, given that the package the user is using (cudf 0.6.1) only knows about versions 2023.11 and 2023.12?
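To see the difficulty, here is a hedged sketch of the only resolution rule a release could actually implement (a hypothetical helper, relying on YYYY.MM version strings sorting correctly as text):

```python
def resolve_api_version(requested, supported):
    # "Latest" can only mean the latest version *this release knows about*:
    # there is no registry to consult for spec versions published later.
    if requested is None:
        return max(supported)  # YYYY.MM strings sort lexicographically
    if requested not in supported:
        raise ValueError(f"Unsupported api_version: {requested!r}")
    return requested


resolve_api_version(None, {"2023.11", "2023.12"})  # -> "2023.12", not "2024.01"
```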
2. What happens on the day that the consortium releases a new spec?

Suppose we release 2024.02. DataFrame libraries haven't had time to implement api version 2024.02, so on that day no released library could honour "use the latest spec version".
Answer I anticipate
An answer I anticipate receiving is that the package would use the latest api version it has implemented. But then the snippet above would break with cudf 0.6.1, because the latest available api version would be 2023.12, which doesn't have `DataFrame.new_feature_2`.
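Concretely (a self-contained toy with hypothetical names, mimicking what cudf 0.6.1 could do under that answer):

```python
class StandardDataFrame:
    """Toy stand-in for a 2023.12-compliant wrapper: it has
    new_feature_1 but, by construction, cannot have new_feature_2."""

    def new_feature_1(self):
        return self


df = StandardDataFrame()  # pretend api_version=None resolved to 2023.12
df.new_feature_1()        # fine
df.new_feature_2()        # AttributeError: the 2023.12 API stops here
```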
Solutions?
One solution could be to not allow `api_version=None`: the api version would always have to be specified.
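Under that rule the snippet above would have to pin a version, e.g. (hypothetical usage):

```python
def my_func(df):
    # Pinning the version makes failures explicit and immediate: cudf 0.6.1
    # would raise straight away when asked for 2024.01, instead of silently
    # resolving None to a version that lacks new_feature_2.
    df = df.__dataframe_consortium_standard__(api_version="2024.01")
    df = df.new_feature_2()
    return df.dataframe
```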
Another could be to take inspiration from Rust Editions (https://doc.rust-lang.org/book/appendix-05-editions.html):
- if `api_version=None`, use the earliest api version (sketched below)
- given that each api version is a strict subset of the following one (Perfect backwards compatibility #220), this should work (note: we would only make such guarantees after we're out of alpha/beta releases)
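A sketch of that editions-style rule (the same hypothetical helper as above, with `min` in place of `max`):

```python
def resolve_api_version(requested, supported):
    # Editions-style default: None means the *earliest* version, which every
    # conforming implementation supports thanks to the strict-subset
    # guarantee, so unpinned code keeps working on any future release.
    if requested is None:
        return min(supported)
    if requested not in supported:
        raise ValueError(f"Unsupported api_version: {requested!r}")
    return requested


resolve_api_version(None, {"2023.11", "2023.12", "2024.01"})  # -> "2023.11"
```

Code that wants `new_feature_2` would then opt in explicitly with `api_version="2024.01"`, much as Rust crates opt in to a newer edition.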