docs/config.md (+57 −66: 57 additions & 66 deletions)
@@ -1,31 +1,15 @@
 # Configuration Reference

+In the following all possible configuration settings are described.

-## `version`
+## Target Outline

-Configuration schema version. Allows the schema to evolve while still preserving backwards compatibility.
-Its value is `1`.
-Defaults to `1`.
-
-## `zarr_version`
-
-The Zarr version to be used.
-Its value is `2`.
-Defaults to `2`.
-
-## `fixed_dims`
-
-Type _object_.
-Specifies the fixed dimensions of the target dataset. Keys are dimension names, values are dimension sizes.
-The object's values are of type _integer_.
-
-## `append_dim`
+### `append_dim`

 Type _string_.
 The name of the variadic append dimension.
 Defaults to `"time"`.
-
-## `append_step`
+### `append_step`

 If set, enforces a step size in the append dimension between two slices or just enforces a direction.
 Must be one of the following:
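For orientation, here is a minimal configuration sketch for the target-outline settings above. YAML is assumed as the configuration format, and the values are illustrative rather than taken from the change:

```yaml
# Append along the default "time" dimension and require a positive
# numerical step between consecutive slices (append_step defaults to null).
append_dim: time
append_step: 1
```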
@@ -46,20 +30,22 @@ Must be one of the following:
 A positive or negative numerical delta value.

 Defaults to `null`.
+### `fixed_dims`

-## `included_variables`
+Type _object_.
+Specifies the fixed dimensions of the target dataset. Keys are dimension names, values are dimension sizes.
+The object's values are of type _integer_.
+### `included_variables`

 Type _array_.
 Specifies the names of variables to be included in the target dataset. Defaults to all variables found in the first contributing dataset.
 The items of the array are of type _string_.
-
-## `excluded_variables`
+### `excluded_variables`

 Type _array_.
 Specifies the names of individual variables to be excluded from all contributing datasets.
 The items of the array are of type _string_.
-
-## `variables`
+### `variables`

 Type _object_.
 Defines dimensions, encoding, and attributes for variables in the target dataset. Object property names refer to variable names. The special name `*` refers to all variables, which is useful for defining common values.
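To make the selection settings concrete, a hedged YAML sketch follows; the dimension sizes, variable names, and the attribute under `"*"` are invented for illustration:

```yaml
fixed_dims:          # dimension name -> integer size
  y: 1800
  x: 3600
included_variables: [time, chl]   # illustrative variable names
excluded_variables: [crs]
variables:
  "*":               # applies to all variables; useful for common values
    attrs:
      institution: Example Institute
```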
@@ -149,13 +135,11 @@ Variable metadata.
 *`attrs`:
 Type _object_.
 Arbitrary variable metadata attributes.
-
-## `attrs`
+### `attrs`

 Type _object_.
 Arbitrary dataset attributes. If `permit_eval` is set to `true`, string values may include Python expressions enclosed in `{{` and `}}` to dynamically compute attribute values; in the expression, the current dataset is named `ds`. Refer to the user guide for more information.
-
-## `attrs_update_mode`
+### `attrs_update_mode`

 The mode used to update target attributes from slice attributes. Independently of this setting, extra attributes configured by the `attrs` setting will finally be used to update the resulting target attributes.
 Must be one of the following:
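The interplay of `attrs`, `permit_eval`, and `attrs_update_mode` described above can be sketched as follows; the attribute names and the expression are illustrative, and per the text the current dataset is available as `ds` inside `{{ }}` only when evaluation is explicitly enabled:

```yaml
permit_eval: true           # required before {{ }} expressions are evaluated
attrs_update_mode: keep     # the default; "ignore" is another allowed value
attrs:
  title: Example target dataset
  time_coverage_start: "{{ ds.time[0] }}"   # computed from the current dataset ds
```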
@@ -173,39 +157,37 @@ Must be one of the following:
 Its value is `"ignore"`.

 Defaults to `"keep"`.
+### `zarr_version`

-## `permit_eval`
-
-Type _boolean_.
-Allow for dynamically computed values in dataset attributes `attrs` using the syntax `{{ expression }}`. Executing arbitrary Python expressions is a security risk, therefore this must be explicitly enabled. Refer to the user guide for more information.
-Defaults to `false`.
+The Zarr version to be used.
+Its value is `2`.
+Defaults to `2`.
+## Data I/O - Target

-## `target_dir`
+### `target_dir`

 Type _string_.
 The URI or local path of the target Zarr dataset. Must specify a directory whose parent directory must exist.
-
-## `target_storage_options`
+### `target_storage_options`

 Type _object_.
 Options for the filesystem given by the URI of `target_dir`.
+### `force_new`

-## `slice_source`
-
-Type _string_.
-The fully qualified name of a class or function that receives a slice item as argument(s) and provides the slice dataset. If a class is given, it must be derived from `zappend.api.SliceSource`. If the function is a context manager, it must yield an `xarray.Dataset`. If a plain function is given, it must return any valid slice item type. Refer to the user guide for more information.
-
-## `slice_engine`
-
-Type _string_.
-The name of the engine to be used for opening contributing datasets. Refer to the `engine` argument of the function `xarray.open_dataset()`.
+Type _boolean_.
+Force creation of a new target dataset. An existing target dataset (and its lock) will be permanently deleted before appending of slice datasets begins. WARNING: the deletion cannot be rolled back.
+Defaults to `false`.
+## Data I/O - Slices

-## `slice_storage_options`
+### `slice_storage_options`

 Type _object_.
 Options for the filesystem given by the protocol of the URIs of contributing datasets.
+### `slice_engine`

-## `slice_polling`
+Type _string_.
+The name of the engine to be used for opening contributing datasets. Refer to the `engine` argument of the function `xarray.open_dataset()`.
+### `slice_polling`

 Defines how to poll for contributing datasets.
 Must be one of the following:
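As a sketch of the target and slice I/O settings grouped in this hunk: the S3 URI, the `anon` storage option, and the engine name are assumptions for illustration; storage options are passed through to the filesystem implied by the URI, and `slice_engine` takes any engine accepted by `xarray.open_dataset()`:

```yaml
target_dir: s3://example-bucket/target.zarr   # parent directory must exist
target_storage_options:
  anon: false            # fsspec-style option, illustrative
force_new: false         # true deletes an existing target; cannot be rolled back
slice_engine: netcdf4    # any engine accepted by xarray.open_dataset()
slice_storage_options:
  anon: true
```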
@@ -230,36 +212,52 @@ Must be one of the following:
 Polling timeout in seconds.
 Defaults to `60`.

+### `slice_source`
+
+Type _string_.
+The fully qualified name of a class or function that receives a slice item as argument(s) and provides the slice dataset. If a class is given, it must be derived from `zappend.api.SliceSource`. If the function is a context manager, it must yield an `xarray.Dataset`. If a plain function is given, it must return any valid slice item type. Refer to the user guide for more information.
+### `slice_source_kwargs`

-## `persist_mem_slices`
+Type _object_.
+Extra keyword arguments passed to a configured `slice_source` together with each slice item.
+### `persist_mem_slices`

 Type _boolean_.
 Persist in-memory slices and reopen from a temporary Zarr before appending them to the target dataset. This can prevent expensive re-computation of dask chunks at the cost of additional I/O.
 Defaults to `false`.
+## Data I/O - Transactions

-## `temp_dir`
+### `temp_dir`

 Type _string_.
 The URI or local path of the directory that will be used to temporarily store rollback information.
-
-## `temp_storage_options`
+### `temp_storage_options`

 Type _object_.
 Options for the filesystem given by the protocol of `temp_dir`.
-
-## `force_new`
+### `disable_rollback`

 Type _boolean_.
-Force creation of a new target dataset. An existing target dataset (and its lock) will be permanently deleted before appending of slice datasets begins. WARNING: the deletion cannot be rolled back.
+Disable rolling back dataset changes on failure. Effectively disables transactional dataset modifications, so use this setting with care.
 Defaults to `false`.
+## Misc.
+
+### `version`

-## `disable_rollback`
+Configuration schema version. Allows the schema to evolve while still preserving backwards compatibility.
+Its value is `1`.
+Defaults to `1`.
+### `dry_run`

 Type _boolean_.
-Disable rolling back dataset changes on failure. Effectively disables transactional dataset modifications, so use this setting with care.
+If `true`, log only what would have been done, but don't apply any changes.
 Defaults to `false`.
+### `permit_eval`

-## `profiling`
+Type _boolean_.
+Allow for dynamically computed values in dataset attributes `attrs` using the syntax `{{ expression }}`. Executing arbitrary Python expressions is a security risk, therefore this must be explicitly enabled. Refer to the user guide for more information.
+Defaults to `false`.
+### `profiling`

 Profiling configuration. Allows for runtime profiling of the processing.
 Must be one of the following:
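Finally, a hedged sketch of the slice-source, transaction, and miscellaneous settings covered above. The module path, keyword argument, and temporary directory are hypothetical, and the nested `timeout` key under `slice_polling` is assumed from the "Polling timeout in seconds" entry; `slice_source` must name a class derived from `zappend.api.SliceSource` or a suitable function, as stated in the text:

```yaml
slice_source: mypackage.sources.MySliceSource   # hypothetical fully qualified name
slice_source_kwargs:       # passed to the slice source with each slice item
  resolution: 0.25         # hypothetical keyword argument
persist_mem_slices: false
temp_dir: /tmp/zappend-rollback   # hypothetical; holds rollback information
disable_rollback: false    # keep transactional behaviour
slice_polling:
  timeout: 60              # polling timeout in seconds (the default)
version: 1                 # configuration schema version
dry_run: false             # set true to only log what would be done
```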
@@ -307,8 +305,7 @@ Must be one of the following:
 Pattern-match the standard name that is printed.


-
-## `logging`
+### `logging`

 Logging configuration.
 Must be one of the following:
@@ -402,9 +399,3 @@ Must be one of the following:
 The items of the array are of type _string_.


-## `dry_run`
-
-Type _boolean_.
-If `true`, log only what would have been done, but don't apply any changes.
-Defaults to `false`.