Python files being converted to notebooks when using the Databricks CLI
Hello, I am using the attached piece of code to import both notebooks and Python files from my Git repo and write them to a Workspace folder. However, when running the attached code, all of my Python files get converted to notebooks, which is not what I want. I have tried several approaches but still cannot make it work. Has someone found a way to avoid this conversion? I want my Python files to remain Python files so I can use them as Python modules. Thanks a lot in advance, Sacha
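For context, the legacy (Python-based) Databricks CLI's recursive import converts source files into notebooks, which matches the behavior described above. A typical invocation looks like this (the local and workspace paths are illustrative):

```shell
# Legacy Databricks CLI (the databricks-cli PyPI package).
# import_dir treats files with source extensions (.py, .scala, .sql, .r)
# as notebook sources: it strips the extension and imports them as
# notebooks, so plain Python modules do not survive the round trip.
databricks workspace import_dir ./src /Shared/my-project --overwrite
```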
You need to use the new CLI! https://github.com/databricks/cli & https://docs.databricks.com/en/dev-tools/cli/install.html
From the docs:
The legacy Databricks CLI is in an Experimental state. Databricks plans no new feature work for the legacy Databricks CLI at this time.
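The new CLI is a standalone Go binary, so it is not installed with pip. The install page linked above documents an install script; on Linux/macOS it can be run like this:

```shell
# Official install script for the new Databricks CLI (Go binary).
# Downloads the latest release and places the `databricks` executable
# on the system; see the install docs for Homebrew and Windows options.
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Verify the installed version (should be well above 0.18).
databricks --version
```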
Hi @alexott, I have tried to use the latest version, but it seems I am unable to use a version above 0.18. Do you know a way around this issue? Thanks for your help, Sacha
The new CLI isn't Python-based; follow the documentation on how to install it.
@alexott I still have not been able to find a solution. Is there a way to use the new CLI within an Azure DevOps pipeline the way that's shown in my screenshot? Thanks for the help!
you can achieve the same, just install it differently: https://github.com/alexott/dabs-playground/blob/main/jdemo/azure-pipelines.yml#L48
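Based on the linked pipeline, the idea is to install the CLI with the official install script inside a pipeline step rather than via pip. A hedged sketch of such a step (the step name and the authentication variables are illustrative, not taken from the linked file):

```yaml
# Azure DevOps pipeline step: install the new Databricks CLI via the
# official install script, then verify it is available.
- script: |
    curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
    databricks --version
  displayName: 'Install Databricks CLI'
  env:
    # Standard CLI authentication environment variables; the pipeline
    # variable names on the right are assumptions for this sketch.
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```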
The installation indeed worked! However, after that I get "databricks command not found" errors when trying to configure the CLI.
You still need to load the corresponding environment, because Homebrew puts the binary into its own folder.
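In other words, the Homebrew bin directory has to be on `PATH` in the shell or pipeline step that calls `databricks`. A minimal sketch, assuming a Homebrew-based install (the exact prefix varies by platform):

```shell
# Homebrew installs binaries under its own prefix, which may not be on
# PATH in a fresh pipeline step. `brew shellenv` prints the export
# statements for that prefix; eval-ing them updates PATH in this shell.
eval "$(brew shellenv)"

# The CLI should now resolve.
databricks --version
```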