Andre Mesarovic
- I've added a note to the README files that the copy mode is deprecated.
- Since the MLflowClient constructor doesn't allow you to pass in credentials, you can't have two (or...
Fix: https://github.com/amesar/mlflow-export-import/commit/24b9efb4dd03e1a2c89f2caa140834475ea9612b
@wqp89324 and @hebo-yang. Just checked in a fix. There was an edge case for nested run tags when importing into Databricks. Let me know if it works.
Not enough information to diagnose. Can you send me the exported zip file by email to andre at databricks?
Try using these Databricks notebooks: https://github.com/amesar/mlflow-export-import#databricks-notebooks
To serve the TF model locally see: [Keras Local Web server](https://github.com/amesar/mlflow-examples/tree/master/python/keras_tf_mnist#1-local-web-server)
https://github.com/mlflow/mlflow-export-import This package provides tools to copy MLflow objects (runs, experiments or registered models) from one MLflow tracking server (Databricks workspace) to another.
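A typical copy workflow with this package looks like the sketch below. This is a hedged illustration, not a definitive recipe: the server URL, experiment names, and output directory are placeholders, and the exact console-script options may differ by package version — check the README linked above for the authoritative usage.

```shell
# Illustrative sketch of copying an experiment between tracking servers.
# Server URLs, experiment names, and paths below are placeholders.
pip install mlflow-export-import

# Export an experiment from the source tracking server to a local directory
export MLFLOW_TRACKING_URI=http://source-server:5000
export-experiment --experiment my-experiment --output-dir /tmp/export

# Import it into the destination server (for a Databricks workspace,
# the experiment name must be a workspace path)
export MLFLOW_TRACKING_URI=databricks
import-experiment --experiment-name /Users/me@example.com/my-experiment \
  --input-dir /tmp/export
```

The export step writes the runs and their artifacts to the output directory, so the same exported directory can also be imported into several destination servers.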
> experiment_name: xxxxxx

You need to specify a legal workspace path (e.g. `/Users/me@example.com/my-experiment`) as the experiment name.
Permissions for experiments and registered models are exported for all main programs and notebooks. TODO: figure out import semantics for users who do not exist in the target workspace.
The issue is that relative links, such as the link to the architecture diagram, do not display on the PyPI doc page.