
Commit 3b38a69

load_uploaded_files and run_udf: Clarify handling of file paths and added FileNotFound exception. #461
Clarify workspace naming: user file workspace and cloud workspace.
1 parent cf7e2f5 commit 3b38a69

File tree: 4 files changed, +67 -60 lines changed

CHANGELOG.md

Lines changed: 1 addition & 0 deletions

@@ -36,6 +36,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `filter_bbox`, `load_collection`, `load_stac`: Clarified that the bounding box is reprojected to the CRS of the spatial data cube dimensions if required.
 - `filter_spatial`: Clarified that masking is applied using the given geometries. [#469](https://github.com/Open-EO/openeo-processes/issues/469)
 - `load_collection` and `load_stac`: Clarified that scale and offset are not applied automatically when loading the data. [#503](https://github.com/Open-EO/openeo-processes/issues/503)
+- `load_uploaded_files` and `run_udf`: Clarify handling of file paths and added `FileNotFound` exception. [#461](https://github.com/Open-EO/openeo-processes/issues/461)
 - `mod`: Clarified behavior for y = 0
 - `sqrt`: Clarified that NaN is returned for negative numbers.
 - Clarify allowed `FeatureCollection` geometries in `load_collection`, `mask_polygon`, `apply_polygon`, and `load_stac` [#527](https://github.com/Open-EO/openeo-processes/issues/527)

proposals/export_workspace.json

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
 {
     "id": "export_workspace",
-    "summary": "Export data to a cloud user workspace",
-    "description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given user workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
+    "summary": "Export data to a cloud workspace",
+    "description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given cloud workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
     "categories": [
         "export",
         "stac"
@@ -10,15 +10,15 @@
     "parameters": [
         {
             "name": "data",
-            "description": "The data to export to the user workspace as a STAC resource.",
+            "description": "The data to export to the cloud workspace as a STAC resource.",
             "schema": {
                 "type": "object",
                 "subtype": "stac"
             }
         },
         {
             "name": "workspace",
-            "description": "The identifier of the workspace to export to.",
+            "description": "The identifier of the cloud workspace to export to.",
             "schema": {
                 "type": "string",
                 "pattern": "^[\\w\\-\\.~]+$",
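The `pattern` in the hunk above constrains workspace identifiers to word characters plus `-`, `.` and `~`. A minimal sketch of that check in Python (the sample identifiers are made up for illustration):

```python
import re

# Workspace identifier pattern from the "workspace" parameter above.
WORKSPACE_ID = re.compile(r"^[\w\-\.~]+$")

for candidate in ["my-workspace", "eo.data~v2", "my workspace", "ws/extra"]:
    print(candidate, "->", bool(WORKSPACE_ID.fullmatch(candidate)))
```

Spaces and path separators fail the match, so identifiers stay safe to use as path or URL components.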

proposals/load_uploaded_files.json

Lines changed: 58 additions & 55 deletions

@@ -1,55 +1,58 @@
-{
-    "id": "load_uploaded_files",
-    "summary": "Load files from the user workspace",
-    "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
-    "categories": [
-        "cubes",
-        "import"
-    ],
-    "experimental": true,
-    "parameters": [
-        {
-            "name": "paths",
-            "description": "The files to read. Folders can't be specified, specify all files instead. An exception is thrown if a file can't be read.",
-            "schema": {
-                "type": "array",
-                "subtype": "file-paths",
-                "items": {
-                    "type": "string",
-                    "subtype": "file-path",
-                    "pattern": "^[^\r\n\\:'\"]+$"
-                }
-            }
-        },
-        {
-            "name": "format",
-            "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
-            "schema": {
-                "type": "string",
-                "subtype": "input-format"
-            }
-        },
-        {
-            "name": "options",
-            "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
-            "schema": {
-                "type": "object",
-                "subtype": "input-format-options"
-            },
-            "default": {},
-            "optional": true
-        }
-    ],
-    "returns": {
-        "description": "A data cube for further processing.",
-        "schema": {
-            "type": "object",
-            "subtype": "datacube"
-        }
-    },
-    "exceptions": {
-        "FormatUnsuitable": {
-            "message": "Data can't be loaded with the requested input format."
-        }
-    }
-}
+{
+    "id": "load_uploaded_files",
+    "summary": "Load files from the user workspace",
+    "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
+    "categories": [
+        "cubes",
+        "import"
+    ],
+    "experimental": true,
+    "parameters": [
+        {
+            "name": "paths",
+            "description": "The files to read. Folders can't be specified, specify all files instead. An exception is thrown if a file can't be read.\n\nFile paths are relative to the file workspace of the user. The workspace is the root folder, i.e. the paths `/folder/file.txt` and `folder/file.txt` and `./folder/file.txt` are all equivalent. Specifying paths outside of the workspace is not allowed and throws a `FileNotFound` exception.",
+            "schema": {
+                "type": "array",
+                "subtype": "file-paths",
+                "items": {
+                    "type": "string",
+                    "subtype": "file-path",
+                    "pattern": "^[^\r\n\\:'\"]+$"
+                }
+            }
+        },
+        {
+            "name": "format",
+            "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
+            "schema": {
+                "type": "string",
+                "subtype": "input-format"
+            }
+        },
+        {
+            "name": "options",
+            "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
+            "schema": {
+                "type": "object",
+                "subtype": "input-format-options"
+            },
+            "default": {},
+            "optional": true
+        }
+    ],
+    "returns": {
+        "description": "A data cube for further processing.",
+        "schema": {
+            "type": "object",
+            "subtype": "datacube"
+        }
+    },
+    "exceptions": {
+        "FormatUnsuitable": {
+            "message": "Data can't be loaded with the requested input format."
+        },
+        "FileNotFound": {
+            "message": "The specified file does not exist."
+        }
+    }
+}
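The path semantics added to the `paths` description can be sketched as follows. This is a hypothetical back-end helper, not part of the specification: `resolve_workspace_path` and the in-memory `files` set stand in for however a back-end actually lists the user's uploaded files.

```python
from pathlib import PurePosixPath


class FileNotFound(Exception):
    """Mirrors the `FileNotFound` process exception."""


def resolve_workspace_path(user_path: str, files: set) -> str:
    """Resolve a user-supplied path against the file workspace root.

    The workspace is the root folder, so "/folder/file.txt",
    "folder/file.txt" and "./folder/file.txt" are all equivalent.
    `files` contains workspace-relative paths of the uploaded files.
    """
    # Drop the root anchor; pathlib already drops single-dot components.
    parts = [p for p in PurePosixPath(user_path).parts if p != "/"]
    # Reject any attempt to escape the workspace, e.g. "../other/file".
    if ".." in parts:
        raise FileNotFound(f"Path escapes the workspace: {user_path}")
    resolved = str(PurePosixPath(*parts))
    if resolved not in files:
        raise FileNotFound(f"The specified file does not exist: {user_path}")
    return resolved


files = {"folder/file.txt"}
print(resolve_workspace_path("/folder/file.txt", files))   # folder/file.txt
print(resolve_workspace_path("./folder/file.txt", files))  # folder/file.txt
```

Treating `..` as out-of-bounds (rather than normalizing it away) keeps the check simple and makes traversal attempts fail loudly with the same exception as a genuinely missing file.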

run_udf.json

Lines changed: 4 additions & 1 deletion

@@ -38,7 +38,7 @@
             "pattern": "^https?://"
         },
         {
-            "description": "Path to a UDF uploaded to the server.",
+            "description": "Path to a UDF uploaded to the server.\n\nFile paths are relative to the file workspace of the user. The workspace is the root folder, i.e. the paths `/folder/file.txt` and `folder/file.txt` and `./folder/file.txt` are all equivalent. Specifying paths outside of the workspace is not allowed and throws a `FileNotFound` exception.",
             "type": "string",
             "subtype": "file-path",
             "pattern": "^[^\r\n\\:'\"]+$"
@@ -91,6 +91,9 @@
         },
         "InvalidVersion": {
             "message": "The specified UDF runtime version is not supported."
+        },
+        "FileNotFound": {
+            "message": "The specified file does not exist."
         }
     },
     "returns": {
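Both `load_uploaded_files` and `run_udf` share the `file-path` pattern seen above. Reading the pattern as excluding line breaks, backslashes, colons and quotes, a sketch of what it accepts and rejects (sample paths are made up):

```python
import re

# file-path pattern shared by load_uploaded_files and run_udf.
FILE_PATH = re.compile(r"^[^\r\n\\:'\"]+$")

for candidate in ["udfs/my_udf.py", "./folder/file.txt",
                  "C:\\temp\\udf.py", "a'b.txt"]:
    print(candidate, "->", bool(FILE_PATH.fullmatch(candidate)))
```

Rejecting colons and backslashes rules out Windows-style absolute paths, which fits the workspace-relative POSIX-style paths the descriptions require.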

0 commit comments
