fix(deps): update dependency ibis-framework to v10 #1255
This PR contains the following updates:

| Package | Change |
| --- | --- |
| ibis-framework | `>=9,<10` -> `>=10,<11` |
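For a quick sanity check after the bump, the installed distribution's major version can be read with the standard library. This is an illustrative sketch only, not part of the PR itself:

```python
# Illustrative check that the environment picked up the new ">=10,<11" range.
from importlib.metadata import version

installed = version("ibis-framework")  # PyPI distribution name; the import name is `ibis`
assert installed.split(".")[0] == "10", f"expected ibis-framework 10.x, found {installed}"
```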
Release Notes
ibis-project/ibis (ibis-framework)
v10.8.0
Compare Source
Features
- `Schema.from_sqlglot` method to produce an Ibis schema from SQLGlot (#11351) (f6641e7)
Bug Fixes
- `InSubquery` operations (69be180)
Documentation
v10.7.0
Compare Source
Features
- `Expr.to_sql()` method (#11357) (493d6e5) (see the sketch below this section)
Bug Fixes
- `on` is unspecified (#11383) (4902ec6)
- `.sql` calls are compiled in topological order (#11439) (8e227ac)
- `DateDiff` operands are in the correct order (#11434) (e29881c)
Documentation
- `Value.substitute` (#11413) (1c05d2f)
- `See Also` section to fix rendered formatting (#11412) (d381536)
- `Value.to_sql` to `ibis.to_sql` (7e0042c)
Refactors
- `to_sqlglot` to `to_sqlglot_columns_definition` (#11455) (0e4cf43)
Performance
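A minimal sketch of the new `Expr.to_sql()` method noted above, assuming it produces the same compiled SQL string as the existing `ibis.to_sql(expr)` helper; the table and column names here are hypothetical:

```python
import ibis

# Hypothetical unbound table used only to build an expression.
t = ibis.table({"a": "int64", "b": "string"}, name="t")
expr = t.filter(t.b == "x").select(t.a)

print(ibis.to_sql(expr))  # existing top-level helper
print(expr.to_sql())      # method form added in v10.7.0 (#11357)
```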
v10.6.0
Compare Source
Features
- `create_table` (6ea5cd4)
- `includes` in `create_source` (f88c0dc)
- `raw_type` to `dt.Unknown` (#11337) (9296107)
Bug Fixes
- `Struct.__getitem__` (#11299) (ccd9359)
- `job_id_prefix` functionality from `raw_sql` (#11349) (cf404c2)
Documentation
- `database` param (#11112) (458f06d)
Refactors
- `connect` function in `ibis/__init__.py` (#11305) (77a6dff)
Performance
v10.5.0
Compare Source
Features
Bug Fixes
- `AS JSON` for programmatic output of schema information (d55a5ee)
- `temp.main` (#11092) (20bec13)
- `sqlglot.DataType`, not `str`, when compiling string dtype (#11124) (99be73b)
- `value` is resolvable with no underlying projection (#11125) (a92c3cb)
- `ibis.struct()` typing (#11070) (9d3aece)
Documentation
- `SECURITY.md` formatting (63aeff4)

v10.4.0
Compare Source
Features
- `StringSplit` (#11049) (b83a88e)
- `asof_join` API via a lateral join (#11024) (8eb9d33)
Bug Fixes
- `None` across all supported backends (#10913) (a02a392)
- `raw_sql` (#11001) (c3097a7)
Documentation
Refactors
- `get_schema` invocation to simplify code (#11037) (9b25ab1)
Performance
v10.3.1
Compare Source
Bug Fixes
- `BLOB` type is not accessed if it does not exist (6452a5e)
- `pytz` from dependencies (#10976) (3ecf731)
- `.transaction` method instead of managing our own (0edbab9)
Documentation
v10.3.0
Compare Source
Features
- `read_xlsx` implementation (705aa16)
- `to_xlsx` implementation (1800abd)
Bug Fixes
Documentation
- `just` recipe and setup entry for `uv` (#10959) (dd33f47)
Refactors
- `owner` column from `select` metadata query (#10935) (56dcc42)
- `hstore` extension registration into separate method (0bdb5a0)
Performance
- `to_pyarrow_batches` by using server-side cursors (#10954) (cb17b8b), closes #10938

v10.2.0
Compare Source
Features
Bug Fixes
- `nullable` arg for type hints (#10893) (601aabe)
Documentation
- `literal` in "Getting started" (#10918) (b69061b)
Refactors
Performance
v10.1.0
Compare Source
Features
Bug Fixes
- `params` with `raw_sql` (#10874) (0a684c3)
- `delta` extension for reading deltalake data (#10833) (beeaa29), closes #10829
- `database` parameter to `list_tables` are used as path delineators (#10863) (cdbbcb9)
- `None` (e589344)
- `get` instead of `get_path`; `get_path` does not support columns with spaces (#10836) (50c978b), closes #10835
- `sge.Median` is only accessed when it exists (dc6b7e0)
Documentation
- `__getitem__` docs so that quarto publishes them (#10870) (269cdfe)
Refactors
- `read_parquet` fallback (5fa0103)

v10.0.0
Compare Source
⚠ BREAKING CHANGES
- `as_interval` `unit` argument to be positional-only
- `as_timestamp` `unit` argument to be positional-only
- `Table.relabel` method
- `StringValue` method signatures
- `NumericValue` methods
- `GeoSpatialValue.contains` positional-only
- `Table.describe` `quantile` argument keyword-only
- `Table.relabel` method
- `Table.drop_null`/`Table.fill_null`/`Table.window_by`/`Table.alias` argument positional-only
- `Table.sample` `fraction` argument positional-only
- `Table.aggregate` `metrics` argument positional-only
- `Table` set operation methods positional-only
- `Table.cast` and `Table.try_cast` methods positional-only
- `nth` positional-only
- `isin`/`notin`/`cases`/`identical_to` positional-only
- `null` function positional-only
- `Value.cast` and `Value.try_cast` positional-only
- `Value.name` positional-only
- `Expr.pipe` positional-only
- `Expr.equals` positional-only
- `to_json` methods
- `to_delta` methods
- `to_csv`/`to_csv_dir` methods
- `to_parquet`/`to_parquet_dir` methods
- `.sql` method signatures across polars and sql as well as the `Table` method
- `connect` method now takes its first argument as positional-only
- `read_sqlite`/`read_mysql`/`read_postgres` methods in the duckdb backend
- `read_delta` method; sources are positional-only, everything else is required-keyword
- `has_operation` backend method; single argument is positional-only
- `read_kafka` and `to_kafka` methods of the PySpark backend
- `drop_table_or_view` method of the impala backend
- `to_geo` signature of the DuckDB backend
- `read_geo` signature of the DuckDB backend
- `list_catalogs`; `like` argument is now keyword-only
- `set_database` signature
- `list_databases` arguments all required-keyword
- `create_*` methods
- `read_pandas` method
- `drop_table` method; `name` is positional-only; the rest are keyword-only
- `create_catalog` and `drop_catalog` methods; `name` is positional-only; the rest are keyword-only
- `compile` method is now the same across backends
- `create_table` method; `name` is positional-only; `obj` is positional-or-keyword; the rest are keyword-only
- `create_view` method; `name` is positional-only; `obj` is positional-or-keyword; the rest are keyword-only
- `drop_view` method; `name` is positional-only; the rest are keyword-only
- `truncate_table` method; `name` is positional-only; the rest are keyword-only
- `insert` method; `name` is positional-only; `obj` is positional-or-keyword; the rest are keyword-only
- `read_json` method; sources are positional-only, everything else is required-keyword
- `read_csv` method; sources are positional-only, everything else is required-keyword
- `read_parquet` method; sources are positional-only, everything else is required-keyword
- `to_torch` method
- `to_polars` method
- `Backend.list_tables` method; all arguments are now keyword-only
- `Backend.table` method; `name` is positional-only; everything else is required-keyword
- `create_database` and `drop_database`; `name` is positional-only; everything else is required-keyword
- `MapValue` method signatures
- `ArrayValue` method signatures
- `type` argument of `struct` function is now required-keyword
- `TemporalValue` APIs
- `where` argument of aggregate functions is now required-keyword
- `hashbytes` and `hexdigest` are now positional-only
- `how` argument to `join` methods as keyword-only and standardize remaining arguments
- `ibis.coalesce`/`ibis.greatest`/`ibis.least` are now positional-only
- `Expr.ifelse` is now positional-only
- `set_backend` and `get_backend` functions are now positional-only
- `ntile` function and method is now positional-only
- /`ibis.following` are now positional-only
- `expr` argument of `ibis.asc`/`ibis.desc` is now positional-only; `nulls_first` is keyword-only
- `data` argument of `ibis.memtable` is now positional-only; the rest are keyword-only
- `pairs` argument of `ibis.schema` is now positional-only; the rest are keyword-only
- `ibis.param` is now positional-only
- `n` argument in `Table.limit` and `Table.head` is now required-positional
- `offset` argument in `Table.limit` is now required-keyword
- `to_pyarrow` and `to_pyarrow_batches` require `expr` as positional-only and keyword for everything else
- `to_pandas_batches` requires `expr` as positional-only
- `execute` and `to_pandas` methods now require `expr` as positional-only
- `distance` is now a required keyword argument for the `d_within` API
- `read_csv` method accepts only DuckDB types for the values components of the `columns` and `types` arguments. You may need to adjust existing code. For example, the string `"float64"` should be replaced with the string `"double"`.
- `read_in_memory` method is removed from the duckdb backend. Use `ibis.memtable` instead.
- `how` parameter of the `Value.arbitrary` method is removed. Call `Value.first` or `Value.last` explicitly.
- `StringValue.initcap` method is removed. Use `StringValue.capitalize` instead.
- `IntegerValue.label` is redundant with the `IntegerValue.cases` method, use that instead. Replace `expr.label(labels)` with `expr.cases(*enumerate(labels))`.
- `register` method has been removed. Please use the file-specific `read_*` methods instead. For in-memory objects, pass them to `ibis.memtable` or `create_table`.
- `temp_directory` argument passed to Ibis is removed in favor of passing the argument through directly to `duckdb.connect`. Interior nodes of directory trees must be created, e.g., using `Path.mkdir(exist_ok=True, parents=True)`, `mkdir -p`, etc.
- `option_context` is removed. Use `contextlib.contextmanager` to create your own version of this functionality if necessary.
- `has_name` has always returned `True` since 9.0. It is safe to remove any calls to `has_name`.
- `execute` now returns non-numpy objects for scalar values.
- `ibis.negate` is removed. Use the `negate` method on a specific column, instead.
- `ibis.geo_*` functions are removed. Equivalent methods are available on all geo columns.
- `where` is removed. Use `ibis.ifelse` instead.
- `Value.greatest` and `Value.least` are removed. Use `ibis.greatest` and `ibis.least`, instead.
- `pyarrow.Table` or a `pandas.DataFrame` as the right-hand-side of a join is no longer supported. To join against in-memory data, you can pass the in-memory object to `ibis.memtable` or `con.create_table` and use the resulting table object instead.
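A short migration sketch for a few of the breaking changes listed above; the data, table, and column names are hypothetical, and only the API changes named in these notes are assumed:

```python
import ibis
import pandas as pd

# Hypothetical in-memory data used purely for illustration.
events = ibis.memtable({"user_id": [1, 2, 2], "amount": [10.0, 5.0, 7.5]})
users_df = pd.DataFrame({"user_id": [1, 2], "name": ["a", "b"]})

# 1. A pandas.DataFrame can no longer be the right-hand side of a join:
#    wrap it in ibis.memtable (or con.create_table) first.
users = ibis.memtable(users_df)
joined = events.join(users, "user_id", how="inner")  # `how` is now keyword-only

# 2. Value.arbitrary(how=...) is removed: call first()/last() explicitly.
agg = joined.group_by("name").aggregate(sample_amount=joined.amount.first())

# 3. The offset argument of Table.limit is now keyword-only.
page = joined.order_by("user_id").limit(10, offset=0)

# 4. IntegerValue.label(labels) is removed: use cases instead,
#    i.e. expr.label(labels) -> expr.cases(*enumerate(labels)).
labels = ["low", "high"]
bucketed = events.mutate(bucket=(events.amount > 6).cast("int8").cases(*enumerate(labels)))
```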
Issues closed
- api: Removed hierarchical usage of schema. Ibis uses the following naming conventions:
- mysql: Ibis now uses the `MySQLdb` driver. You may need to install MySQL client libraries to build the extension.
- padding: String padding operations now follow Python semantics and leave strings greater than the padding length untouched.
- pandas: The `pandas` backend is removed. Note that pandas DataFrames are STILL VALID INPUTS AND OUTPUTS and will remain so for the foreseeable future. Please use one of the other local backends like DuckDB, Polars, or DataFusion to perform operations directly on pandas DataFrames.
- dask: The `dask` backend is removed. Please use one of the other backends that Ibis supports.
- api: remove deprecated `where` methodism (886b2d1)
- api: remove top-level `negate` function (c8c37dd)
- api: remove top-level geo functions (6b187c3)
- backends: convert scalars to non-numpy python objects (#10233) (df08d5e)
- duckdb: bump version lower bound to 0.10 (8dbbc8b)
- mysql: port to MySQLdb instead of pymysql (#10077) (2b6633c), closes #10055
- value: remove deprecated `greatest` and `least` methods (65f0973)
Features
- `distinct` option to `collect` (13cf036)
- `name` kwarg to `Table.value_counts()` (#10361) (12e6057)
- `StringValue.as_time` for parsing strings into times (#10278) (9134ef5)
- `to_geo` methods for writing geospatial output (#10299) (9f565a9), closes #10296
- `modes` array aggregation (#10737) (6603c6c)
- `Table.sample` to native `TABLESAMPLE` syntax when possible (321a3b5)
- `read_csv` (#10317) (b57be01)
Configuration
📅 Schedule: Branch creation - "after 5pm on friday" (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.