* Allow truncating specific tables from a database, filtered with the `LIKE` keyword (example after this list). [#78597](https://github.com/ClickHouse/ClickHouse/pull/78597) ([Yarik Briukhovetskyi](https://github.com/yariks5s)).
* Support `_part_starting_offset` virtual column in `MergeTree`-family tables. This column represents the cumulative row count of all preceding parts, calculated at query time based on the current part list. The cumulative values are retained throughout query execution and remain effective even after part pruning (see the sketch after this list). Related internal logic has been refactored to support this behavior. [#79417](https://github.com/ClickHouse/ClickHouse/pull/79417) ([Amos Bird](https://github.com/amosbird)).
* Add functions `divideOrNull`, `moduloOrNull`, `intDivOrNull`, and `positiveModuloOrNull`, which return `NULL` when the right argument is zero (illustrated below). [#78276](https://github.com/ClickHouse/ClickHouse/pull/78276) ([kevinyhzou](https://github.com/KevinyhZou)).
* Add [`icebergHash`](https://iceberg.apache.org/spec/#appendix-b-32-bit-hash-requirements) and [`icebergBucketTransform`](https://iceberg.apache.org/spec/#bucket-transform-details) functions. Support data file pruning in `Iceberg` tables partitioned with the [`bucket` transform](https://iceberg.apache.org/spec/#partitioning) (sketched below). [#79262](https://github.com/ClickHouse/ClickHouse/pull/79262) ([Daniil Ivanik](https://github.com/divanik)).
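
For the `LIKE`-filtered truncation, a minimal sketch; the database name and pattern are illustrative:

```sql
-- Truncate every table in the `analytics` database whose name contains 'log'.
TRUNCATE ALL TABLES FROM analytics LIKE '%log%';
```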
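
For `_part_starting_offset`, a sketch of how the new virtual column can be inspected (the table name is hypothetical):

```sql
-- Each part reports the cumulative row count of all parts that precede it.
SELECT _part, _part_starting_offset, count() AS rows_in_part
FROM events
GROUP BY _part, _part_starting_offset
ORDER BY _part_starting_offset;
```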
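
For the `*OrNull` arithmetic functions, a quick illustration of the zero-divisor behavior:

```sql
SELECT
    divideOrNull(10, 0),         -- NULL (plain divide returns inf here)
    moduloOrNull(10, 0),         -- NULL (plain modulo throws)
    intDivOrNull(10, 0),         -- NULL (plain intDiv throws)
    positiveModuloOrNull(-3, 0); -- NULL (plain positiveModulo throws)
```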
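
For the Iceberg helpers, a sketch; the argument order of `icebergBucketTransform` is an assumption, so check the function reference:

```sql
-- 32-bit Iceberg hash of a value.
SELECT icebergHash(42);

-- Bucket transform: maps the hash into one of 16 buckets, the same way
-- Iceberg assigns rows to bucket partitions (argument order assumed).
SELECT icebergBucketTransform(16, 42);
```
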
#### Experimental Feature
* Hive metastore catalog for Iceberg datalake. [#77677](https://github.com/ClickHouse/ClickHouse/pull/77677) ([scanhex12](https://github.com/scanhex12)).
- `format` — The [format](/sql-reference/formats) of the file.
- `structure` — Structure of the table. Format `'column1_name column1_type, column2_name column2_type, ...'`.
- `compression_method` — Optional. Supported values: `none`, `gzip` or `gz`, `brotli` or `br`, `xz` or `LZMA`, `zstd` or `zst`. By default, the compression method is autodetected from the file extension.
- `headers` — Optional. Allows headers to be passed in the S3 request. Pass in the format `headers(key=value)`, e.g. `headers('x-amz-request-payer' = 'requester')`. See [here](/sql-reference/table-functions/s3#accessing-requester-pays-buckets) for an example.
- `extra_credentials` — Optional. `roleARN` can be passed via this parameter. See [here](/cloud/security/secure-s3#access-your-s3-bucket-with-the-clickhouseaccess-role) for an example, and the combined sketch after this list.
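
A combined sketch of these parameters; the bucket, schema, and role ARN are hypothetical, and the positional arguments follow `s3(url, format, structure, compression_method)`:

```sql
-- Requester-pays bucket: gzip-compressed CSV read with an explicit schema.
SELECT count()
FROM s3(
    'https://example-bucket.s3.amazonaws.com/data.csv.gz',
    'CSVWithNames',
    'id UInt64, payload String',
    'gzip',
    headers('x-amz-request-payer' = 'requester')
);

-- Assuming an IAM role via the optional extra_credentials clause.
SELECT count()
FROM s3(
    'https://example-bucket.s3.amazonaws.com/data.json',
    'JSONEachRow',
    extra_credentials(role_arn = 'arn:aws:iam::111111111111:role/ClickHouseAccess')
);
```
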
Arguments can also be passed using [named collections](operations/named-collections.md). In this case, `url`, `access_key_id`, `secret_access_key`, `format`, `structure`, and `compression_method` work in the same way, and some extra parameters are supported: