
Commit 65df68e

m-mohr and soxofaan authored
Clarify handling of workspace paths (#545)
* `load_uploaded_files` and `run_udf`: Clarify handling of file paths and added `FileNotFound` exception. #461
  Clarify workspace naming: user file workspace and cloud workspace.
* Apply suggestions from code review
  Co-authored-by: Stefaan Lippens <[email protected]>
* Update proposals/load_uploaded_files.json

---------

Co-authored-by: Stefaan Lippens <[email protected]>
1 parent 7974ff2 commit 65df68e

File tree

4 files changed, +67 -60 lines changed


CHANGELOG.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -52,6 +52,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `filter_spatial`: Clarified that masking is applied using the given geometries. [#469](https://github.com/Open-EO/openeo-processes/issues/469)
 - `load_stac`: Clarify handling of the `properties` parameter in the context of STAC APIs and static catalogs. [#536](https://github.com/Open-EO/openeo-processes/issues/536)
 - `load_collection` and `load_stac`: Clarified that scale and offset are not applied automatically when loading the data. [#503](https://github.com/Open-EO/openeo-processes/issues/503)
+- `load_uploaded_files` and `run_udf`: Clarify handling of file paths and added `FileNotFound` exception. [#461](https://github.com/Open-EO/openeo-processes/issues/461)
 - `mask`: Add missing exception `IncompatibleDataCubes` [#538](https://github.com/Open-EO/openeo-processes/issues/538)
 - `mod`: Clarified behavior for y = 0
 - `run_udf`: Simplified and clarified the schema for `data` - no functional change.
```

proposals/export_workspace.json

Lines changed: 4 additions & 4 deletions
```diff
@@ -1,7 +1,7 @@
 {
     "id": "export_workspace",
-    "summary": "Export data to a cloud user workspace",
-    "description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given user workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
+    "summary": "Export data to a cloud workspace",
+    "description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given cloud workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
     "categories": [
         "export",
         "stac"
@@ -10,15 +10,15 @@
     "parameters": [
         {
             "name": "data",
-            "description": "The data to export to the user workspace as a STAC resource.",
+            "description": "The data to export to the cloud workspace as a STAC resource.",
             "schema": {
                 "type": "object",
                 "subtype": "stac"
             }
         },
         {
             "name": "workspace",
-            "description": "The identifier of the workspace to export to.",
+            "description": "The identifier of the cloud workspace to export to.",
             "schema": {
                 "type": "string",
                 "pattern": "^[\\w\\-\\.~]+$",
```

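For illustration only (not part of the commit), a minimal sketch of how the renamed `export_workspace` process might appear as a node in an openEO process graph. The node ids `save1`/`export1` and the workspace identifier `my-workspace` are hypothetical placeholders; `data` would normally come from an earlier `save_result`-style node that produces a STAC resource.

```python
import json

# Hypothetical export_workspace node (sketch only).
# "save1" is assumed to be an earlier node whose result is a STAC resource,
# and "my-workspace" is a placeholder for a cloud workspace identifier
# registered with the back-end (it must match the ^[\w\-\.~]+$ pattern).
export_node = {
    "export1": {
        "process_id": "export_workspace",
        "arguments": {
            "data": {"from_node": "save1"},  # STAC resource to export
            "workspace": "my-workspace"      # identifier of the cloud workspace
        },
        "result": True
    }
}

print(json.dumps(export_node, indent=2))
```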
proposals/load_uploaded_files.json

Lines changed: 58 additions & 55 deletions
```diff
@@ -1,55 +1,58 @@
-{
-    "id": "load_uploaded_files",
-    "summary": "Load files from the user workspace",
-    "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
-    "categories": [
-        "cubes",
-        "import"
-    ],
-    "experimental": true,
-    "parameters": [
-        {
-            "name": "paths",
-            "description": "The files to read. Folders can't be specified, specify all files instead. An exception is thrown if a file can't be read.",
-            "schema": {
-                "type": "array",
-                "subtype": "file-paths",
-                "items": {
-                    "type": "string",
-                    "subtype": "file-path",
-                    "pattern": "^[^\r\n\\:'\"]+$"
-                }
-            }
-        },
-        {
-            "name": "format",
-            "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
-            "schema": {
-                "type": "string",
-                "subtype": "input-format"
-            }
-        },
-        {
-            "name": "options",
-            "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
-            "schema": {
-                "type": "object",
-                "subtype": "input-format-options"
-            },
-            "default": {},
-            "optional": true
-        }
-    ],
-    "returns": {
-        "description": "A data cube for further processing.",
-        "schema": {
-            "type": "object",
-            "subtype": "datacube"
-        }
-    },
-    "exceptions": {
-        "FormatUnsuitable": {
-            "message": "Data can't be loaded with the requested input format."
-        }
-    }
-}
+{
+    "id": "load_uploaded_files",
+    "summary": "Load files from the user workspace",
+    "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
+    "categories": [
+        "cubes",
+        "import"
+    ],
+    "experimental": true,
+    "parameters": [
+        {
+            "name": "paths",
+            "description": "The files to read. Folders can't be specified, specify all files instead. An exception is thrown if a file can't be read.\n\nAs the workspace acts as an isolated root folder, the absolute path `/folder/file.txt` and relative paths `folder/file.txt` and `./folder/file.txt` are all equivalent. Likewise, specifying a path outside of the workspace results in a `FileNotFound` error.",
+            "schema": {
+                "type": "array",
+                "subtype": "file-paths",
+                "items": {
+                    "type": "string",
+                    "subtype": "file-path",
+                    "pattern": "^[^\r\n\\:'\"]+$"
+                }
+            }
+        },
+        {
+            "name": "format",
+            "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
+            "schema": {
+                "type": "string",
+                "subtype": "input-format"
+            }
+        },
+        {
+            "name": "options",
+            "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
+            "schema": {
+                "type": "object",
+                "subtype": "input-format-options"
+            },
+            "default": {},
+            "optional": true
+        }
+    ],
+    "returns": {
+        "description": "A data cube for further processing.",
+        "schema": {
+            "type": "object",
+            "subtype": "datacube"
+        }
+    },
+    "exceptions": {
+        "FormatUnsuitable": {
+            "message": "Data can't be loaded with the requested input format."
+        },
+        "FileNotFound": {
+            "message": "The specified file does not exist."
+        }
+    }
+}
```

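As a usage illustration (not part of the commit), a minimal Python sketch of a `load_uploaded_files` node in a process graph. The path `folder/file.tiff` and the `GTiff` format are hypothetical placeholders; per the clarified description, `/folder/file.tiff`, `folder/file.tiff` and `./folder/file.tiff` would all resolve to the same file in the user workspace, while a path outside the workspace would trigger the new `FileNotFound` exception.

```python
import json

# Hypothetical load_uploaded_files node (sketch only). Because the user workspace
# acts as an isolated root folder, "/folder/file.tiff", "folder/file.tiff" and
# "./folder/file.tiff" all address the same uploaded file; only one spelling is
# used here. A path pointing outside the workspace would raise FileNotFound.
load_node = {
    "load1": {
        "process_id": "load_uploaded_files",
        "arguments": {
            "paths": ["folder/file.tiff"],  # placeholder path inside the workspace
            "format": "GTiff"               # must be a back-end supported input format
        },
        "result": True
    }
}

print(json.dumps(load_node, indent=2))
```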
run_udf.json

Lines changed: 4 additions & 1 deletion
```diff
@@ -27,7 +27,7 @@
                     "pattern": "^https?://"
                 },
                 {
-                    "description": "Path to a UDF uploaded to the server.",
+                    "description": "Path to a UDF uploaded to the user workspace.\n\nAs the workspace acts as an isolated root folder, the absolute path `/folder/file.txt` and relative paths `folder/file.txt` and `./folder/file.txt` are all equivalent. Likewise, specifying a path outside of the workspace results in a `FileNotFound` error.",
                     "type": "string",
                     "subtype": "file-path",
                     "pattern": "^[^\r\n\\:'\"]+$"
@@ -80,6 +80,9 @@
         },
         "InvalidVersion": {
             "message": "The specified UDF runtime version is not supported."
+        },
+        "FileNotFound": {
+            "message": "The specified file does not exist."
         }
     },
     "returns": {
```

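For illustration (not part of the commit), a minimal Python sketch of a `run_udf` node that references a UDF file stored in the user workspace rather than inline source code. The path `udfs/my_udf.py`, the runtime name `Python` and the upstream node `load1` are hypothetical placeholders.

```python
import json

# Hypothetical run_udf node (sketch only). The "udf" argument points to a file in
# the user workspace; "/udfs/my_udf.py" and "./udfs/my_udf.py" would be equivalent
# spellings, and a path outside the workspace would raise FileNotFound.
udf_node = {
    "udf1": {
        "process_id": "run_udf",
        "arguments": {
            "data": {"from_node": "load1"},  # data cube from an earlier node
            "udf": "udfs/my_udf.py",         # placeholder workspace path to the UDF
            "runtime": "Python"              # must match a runtime reported by the back-end
        },
        "result": True
    }
}

print(json.dumps(udf_node, indent=2))
```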