DAB unable to validate container local file:///path-to-jar #3891

@sshpuntoff

Description

Describe the issue

The CLI treats file:// URIs in job libraries as local paths and attempts to validate they exist in the bundle directory, causing deployment to fail even though these URIs reference files on the cluster's runtime filesystem.

Configuration

resources:
  jobs:
    my_job:
      tasks:
        - task_key: my_task
          libraries:
            - jar: file:///opt/spark/jars/my-library.jar

Steps to reproduce the behavior

  1. Create a bundle with a job that references a file:// URI in libraries
  2. Run databricks bundle deploy
  3. See error: file doesn't exist file:///opt/spark/jars/my-library.jar

Expected Behavior

The CLI should pass file:// URIs through to the Jobs API without validation or upload. These URIs reference files already present on the cluster's filesystem (via init scripts, container images, or pre-installed dependencies), and the Jobs API supports this pattern.

Actual Behavior

The CLI validates file:// paths as if they were local files in the bundle directory and fails deployment with a "file doesn't exist" error.

OS and CLI version

macOS, CLI version [run databricks --version to fill this in]

Is this a regression?

Unknown - this may never have worked.

Debug Logs

Error: file doesn't exist file:///opt/spark/jars/my-library.jar
  at resources.jobs.my_job.tasks[0].libraries[0].jar

Note: The issue appears to be in bundle/libraries/local_path.go where file:// URIs are incorrectly treated as local paths.
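One way the path classification could distinguish these cases is by checking for an explicit URI scheme before treating a library path as bundle-local. The sketch below is illustrative only, assuming a hypothetical helper `isRemoteURI` rather than the CLI's actual code in local_path.go:

```go
package main

import (
	"fmt"
	"net/url"
)

// isRemoteURI reports whether a library path carries an explicit URI
// scheme (e.g. file://, dbfs:/, s3://). Such paths would be passed
// through to the Jobs API untouched instead of being validated as
// files inside the bundle directory. Hypothetical helper, not the
// actual CLI implementation.
func isRemoteURI(path string) bool {
	u, err := url.Parse(path)
	if err != nil {
		// Unparseable strings fall back to local-path handling.
		return false
	}
	return u.Scheme != ""
}

func main() {
	// A file:// URI references the cluster's runtime filesystem.
	fmt.Println(isRemoteURI("file:///opt/spark/jars/my-library.jar"))
	// A bare relative path is a bundle-local file.
	fmt.Println(isRemoteURI("libs/my-library.jar"))
}
```

Under this scheme-based check, the configuration in this report would skip local validation entirely and the deploy would no longer fail.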

Labels

DABs