
Examples lack a serverless job with environments

haschdl opened this issue 1 year ago · 0 comments

For Serverless jobs, Python dependencies must be declared as part of an environment.

Customers follow the private_wheel_packages and job_with_multiple_wheels examples and then discover they don't work with Serverless compute. New examples that combine Serverless and wheels are needed.

Below is an example of a working serverless job with an environment that deploys wheels, a direct adaptation of job_with_multiple_wheels:

resources:
  jobs:
    serverless_job:
      name: "Example with multiple wheels"
      tasks:
        - task_key: task
          spark_python_task:
            python_file: ../src/call_wheel.py
          environment_key: default
      # A list of task execution environment specifications that can be referenced by tasks of this job.
      environments:
        - environment_key: default

          # Full documentation of this spec can be found at:
          # https://docs.databricks.com/api/workspace/jobs/create#environments-spec
          spec:
            client: "1"
            dependencies:
              - ../my_custom_wheel2/dist/my_custom_wheel2-0.0.1-py3-none-any.whl
              - ../my_custom_wheel1/dist/my_custom_wheel1-0.0.1-py3-none-any.whl

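For completeness, here is a hypothetical sketch of what the `call_wheel.py` entry point referenced by `spark_python_task` might look like. The module names `my_custom_wheel1` and `my_custom_wheel2` are assumptions inferred from the wheel filenames in the config above, and `check_wheel` is an illustrative helper, not part of any existing example:

```python
# Hypothetical sketch of call_wheel.py, the file referenced by
# spark_python_task in the bundle config above.
# Assumption: each wheel exposes a top-level module matching its
# distribution name (my_custom_wheel1, my_custom_wheel2).
from importlib.util import find_spec


def check_wheel(module_name: str) -> str:
    """Report whether a wheel's top-level module is importable."""
    return "available" if find_spec(module_name) is not None else "missing"


if __name__ == "__main__":
    # On Serverless, these modules are importable only if the environment's
    # `dependencies` list installed the corresponding wheels.
    for name in ("my_custom_wheel1", "my_custom_wheel2"):
        print(f"{name}: {check_wheel(name)}")
```

When run outside the Serverless environment (where the wheels are not installed), the script simply reports each module as missing rather than failing with an import error.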
haschdl · Dec 22 '24 12:12