Add execution context API
This PR adds the functionality from REST API 1.2 that is still supported.
It allows users to create execution contexts and run Python, SQL, and R code on a cluster.
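For context, the 1.2 surface in question is small. Assuming this PR wraps the documented execution-context and command endpoints, those are:

```
POST /api/1.2/contexts/create    # start an execution context on a cluster
GET  /api/1.2/contexts/status    # poll until the context is Running
POST /api/1.2/contexts/destroy   # tear the context down
POST /api/1.2/commands/execute   # run a snippet of Python, SQL, or R
GET  /api/1.2/commands/status    # poll for a command's result
POST /api/1.2/commands/cancel    # cancel a running command
```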
Note that this PR is still missing tests and can't be merged as-is.
I moved the dev container code out into https://github.com/databricks/databricks-cli/pull/450 and made sure the linter is happy.
@nfx what do you think about the following changes:
- Keep the execution-context group, but limit it to the 1:1 mapping of commands to the API. These are the low-level commands for advanced usage, e.g. running multiple commands, potentially in different languages, in the same context. This lets us simulate notebooks from the CLI.
- Add a single `execute` command to the `clusters` group that is smart and convenient and covers the 80% use case. This could look like what's described in https://github.com/databricks/databricks-cli/issues/341#issuecomment-1118302184; see the sketch of both command shapes after this list.
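To make the proposed split concrete, here is a rough sketch of what the two shapes could look like. All command and flag names below are hypothetical placeholders, not the PR's actual surface:

```
# Low level: one CLI command per API call; the context id is reused across
# invocations, which is what lets us simulate a notebook from the shell.
databricks execution-context create --cluster-id 0123-456789-abcdef --language python
databricks execution-context execute --cluster-id 0123-456789-abcdef --context-id <id> --command "print(1 + 1)"
databricks execution-context destroy --cluster-id 0123-456789-abcdef --context-id <id>

# High level: a single command that creates a context, runs the code, waits
# for the result, and destroys the context again.
databricks clusters execute --cluster-id 0123-456789-abcdef --language sql --command "SELECT 1"
```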
@fjakobs it's just that I've literally never heard of any customer asking for individual interactive-notebook commands to be exposed in the CLI in any of their workflows.
For the SDK it's quite the opposite: yes, we need to expose a Python context manager for command execution in the SDK; that has plenty of real-world scenarios. Some of the applications are https://github.com/databrickslabs/jupyterlab-integration and https://github.com/databrickslabs/migrate.
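To make that concrete, here is a minimal sketch of what such a context manager could look like, talking to the 1.2 endpoints directly via `requests`. The `ExecutionContext` and `run` names are hypothetical, not an existing SDK surface:

```python
import time

import requests


class ExecutionContext:
    """Create a REST 1.2 execution context on entry, destroy it on exit."""

    def __init__(self, host, token, cluster_id, language="python"):
        self._base = f"{host}/api/1.2"
        self._headers = {"Authorization": f"Bearer {token}"}
        self._cluster_id = cluster_id
        self._language = language
        self._context_id = None

    def __enter__(self):
        resp = requests.post(
            f"{self._base}/contexts/create",
            headers=self._headers,
            json={"clusterId": self._cluster_id, "language": self._language},
        )
        resp.raise_for_status()
        self._context_id = resp.json()["id"]
        # A production version would poll /contexts/status here until the
        # context reaches the Running state before accepting commands.
        return self

    def __exit__(self, exc_type, exc, tb):
        requests.post(
            f"{self._base}/contexts/destroy",
            headers=self._headers,
            json={"clusterId": self._cluster_id, "contextId": self._context_id},
        )

    def run(self, command, poll_interval=1.0):
        """Submit a command, block until it terminates, return its results."""
        resp = requests.post(
            f"{self._base}/commands/execute",
            headers=self._headers,
            json={
                "clusterId": self._cluster_id,
                "contextId": self._context_id,
                "language": self._language,
                "command": command,
            },
        )
        resp.raise_for_status()
        command_id = resp.json()["id"]
        while True:
            status = requests.get(
                f"{self._base}/commands/status",
                headers=self._headers,
                params={
                    "clusterId": self._cluster_id,
                    "contextId": self._context_id,
                    "commandId": command_id,
                },
            ).json()
            if status["status"] in ("Finished", "Error", "Cancelled"):
                return status.get("results")
            time.sleep(poll_interval)


# Usage (hypothetical values):
# with ExecutionContext("https://<workspace-url>", token, "0123-456789-abcdef") as ctx:
#     print(ctx.run("1 + 1"))
```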
And of course, there have been high-level workflow requests, like #122 (10 votes since 2018) / #341 (back then there were too many issues in one place, so I followed the broken-window theory and created more).