[Feedback] docs/components/pipelines/user-guides/core-functions/execute-kfp-pipelines-locally.md
The documentation claims:

> Local execution doesn't support authentication mechanisms. If your component interacts with cloud resources or requires other privileged actions, you must test your pipeline in the cloud.

This is not entirely accurate. On Google Cloud, with Application Default Credentials configured, I can successfully run the following snippet from my local machine:
```python
# Before running this script, set up Application Default Credentials
# (https://cloud.google.com/docs/authentication/provide-credentials-adc)
# with the command `gcloud auth application-default login`
# to authenticate to Google Cloud.
from kfp import dsl, local

PROJECT_ID = "YOUR-PROJECT-ID"

local.init(runner=local.SubprocessRunner())


@dsl.component(packages_to_install=["google-cloud-storage"])
def list_buckets(project_id: str) -> str:
    import json

    from google.cloud import storage

    storage_client = storage.Client(project=project_id)
    buckets = storage_client.list_buckets()
    return json.dumps([bucket.name for bucket in buckets])


@dsl.pipeline
def gcs_list_pipeline(project_id: str) -> str:
    list_task = list_buckets(project_id=project_id)
    return list_task.output


if __name__ == "__main__":
    gcs_list_pipeline(project_id=PROJECT_ID)
```
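This works because `SubprocessRunner` executes the component in a subprocess that inherits the local environment, so the Google Cloud client libraries find credentials through the standard ADC lookup. A minimal stdlib sketch of that lookup order (assuming the documented default gcloud config locations; this is an illustration, not the actual `google-auth` implementation):

```python
import os


def adc_credentials_path() -> str:
    """Sketch of where Application Default Credentials are discovered,
    mirroring the documented lookup order."""
    # An explicit override via this environment variable takes precedence.
    override = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if override:
        return override
    # Otherwise, the well-known gcloud file is used:
    # %APPDATA%\gcloud on Windows, ~/.config/gcloud elsewhere.
    if os.name == "nt":
        base = os.environ.get("APPDATA", "")
    else:
        base = os.path.join(os.path.expanduser("~"), ".config")
    return os.path.join(base, "gcloud",
                        "application_default_credentials.json")


print(adc_credentials_path())
```

Running `gcloud auth application-default login` writes the credentials file to that well-known location, which is why the component above can list buckets without any extra configuration.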
Could we update the documentation page to reflect this information?