ssr8998
I tried to follow your steps. Here is what my deployment.yaml.j2 looks like:

    {% set db_name = env['db_name'] | default('name_of_my_db') %}
    ...basic config etc. etc....
        spark_python_task:
          python_file: "file://my_path_/name_of_python_notebook_converted_to_job.py"
          parameters: ["db_name", "{{ env['db_name'] }}"]
    ...
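One detail I noticed while tidying the file: the {% set db_name = ... %} line defines a Jinja variable with a default, but the parameters entry reads env['db_name'] directly, so default('name_of_my_db') only applies to the unused variable. If the default is supposed to kick in when the variable isn't exported, the task fragment could reference the set variable instead. A sketch (surrounding keys trimmed, indentation assumed):

    {% set db_name = env['db_name'] | default('name_of_my_db') %}
    ...
        spark_python_task:
          python_file: "file://my_path_/name_of_python_notebook_converted_to_job.py"
          parameters: ["db_name", "{{ db_name }}"]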
If I use export DATABASE_NAME=dev and then run dbx deploy -e dev --deployment-file conf/deployment.yml.j2 "my-workflow", it complains that "environment dev not found in the project file .dbx/project.json". In my project.json I've...
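As far as I understand, the name passed to -e has to exist as a key under "environments" in .dbx/project.json, and the deployment file typically needs a matching environment key as well. A minimal sketch of what I would expect .dbx/project.json to contain for a dev environment (the profile name and paths are placeholders, and the exact keys can differ between dbx versions):

    {
      "environments": {
        "dev": {
          "profile": "my-dev-profile",
          "storage_type": "mlflow",
          "properties": {
            "workspace_directory": "/Shared/dbx/projects/my-project",
            "artifact_location": "dbfs:/dbx/my-project"
          }
        }
      }
    }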
Thanks for your reply. Well, I converted the notebook to a pure Python file, so there are no #magic lines and no #widget calls, and no dbutils can or should be used...
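To make the conversion concrete, here is a minimal sketch of what the job entry point looks like on my side (the file name, the key/value argument handling, and the final spark.sql call are placeholders rather than my actual code); it assumes the parameters arrive exactly as the flat list ["db_name", "<value>"] from the deployment file:

    # name_of_python_notebook_converted_to_job.py -- minimal sketch, not the real job
    import sys

    from pyspark.sql import SparkSession


    def main(args):
        # parameters are passed as a flat ["key", "value", ...] list,
        # so pair them up into a dict
        params = dict(zip(args[0::2], args[1::2]))
        db_name = params.get("db_name", "name_of_my_db")

        # getOrCreate() picks up the cluster's Spark session when the file
        # runs as a spark_python_task; no dbutils or widgets needed
        spark = SparkSession.builder.getOrCreate()

        # placeholder for the real job logic
        spark.sql(f"USE {db_name}")


    if __name__ == "__main__":
        main(sys.argv[1:])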