Handle ConcurrentJobLimit errors more intelligently #838

@soxofaan

Description

geopyspark driver deployments typically have a limit on the maximum number of parallel jobs, and going over that limit might cause something like:

ERROR openeo.extra.job_management: 400 ConcurrentJobLimit: Job was not started because concurrent job limit (10) is reached.

The client could handle this more intelligently in some situations,
e.g. back off a bit (instead of failing hard) when creating/starting new jobs in the job manager loop, as sketched below.

Note however that ConcurrentJobLimit is not an official error code (yet) in the openEO API, so pushing for that might be part of the work here.
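A minimal sketch of the back-off idea, as a wrapper around starting a batch job. It assumes the backend reports the limit as an `OpenEoApiError` with code `ConcurrentJobLimit` (not standardized, see above); `start_with_backoff`, `max_attempts` and `initial_delay` are just illustrative names, not existing job manager API:

```python
import logging
import time

from openeo.rest import OpenEoApiError

_log = logging.getLogger(__name__)


def start_with_backoff(job, max_attempts=5, initial_delay=60):
    """Start a batch job, backing off and retrying while the backend
    reports that its concurrent job limit is reached."""
    delay = initial_delay
    for attempt in range(1, max_attempts + 1):
        try:
            job.start_job()
            return job
        except OpenEoApiError as e:
            # "ConcurrentJobLimit" is not an official openEO error code (yet),
            # so matching on it is backend-specific.
            if e.code != "ConcurrentJobLimit" or attempt == max_attempts:
                raise
            _log.warning(
                f"Concurrent job limit hit (attempt {attempt}/{max_attempts}): "
                f"retrying in {delay}s"
            )
            time.sleep(delay)
            delay *= 2  # exponential back-off between retries
```

Something along these lines could be built into the job manager loop itself, so that a whole batch run does not fail hard on a temporary limit.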

related to
