Examples lack a serverless job with environments #55

@haschdl

Description

For Serverless jobs, Python dependencies must be declared as part of an environment.

Customers are following the private_wheel_packages and job_with_multiple_wheels examples and discovering that they don't work with Serverless. New examples that combine Serverless and wheels are needed.

Below is an example of a working serverless job with an environment that deploys wheels, a direct adaptation of job_with_multiple_wheels:

resources:
  jobs:
    serverless_job:
      name: "Example with multiple wheels"
      tasks:
        - task_key: task
          spark_python_task:
            python_file: ../src/call_wheel.py
          environment_key: default
      # A list of task execution environment specifications that can be referenced by tasks of this job.
      environments:
        - environment_key: default

          # Full documentation of this spec can be found at:
          # https://docs.databricks.com/api/workspace/jobs/create#environments-spec
          spec:
            client: "1"
            dependencies:
              - ../my_custom_wheel2/dist/my_custom_wheel2-0.0.1-py3-none-any.whl
              - ../my_custom_wheel1/dist/my_custom_wheel1-0.0.1-py3-none-any.whl
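
Note that the referenced .whl files must exist under dist/ at deploy time. A bundle can build them automatically with an artifacts section — a sketch, assuming each wheel project is buildable with python -m build (i.e. contains a pyproject.toml); the artifact names here are illustrative and simply mirror the paths above:

artifacts:
  my_custom_wheel1:
    type: whl
    build: python -m build
    path: ../my_custom_wheel1
  my_custom_wheel2:
    type: whl
    build: python -m build
    path: ../my_custom_wheel2

With this in place, databricks bundle deploy builds both wheels before uploading, so the dist/ paths in the environment spec resolve.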
