maruppel

Results 9 comments of maruppel

It seems we can set the table property (`databricks.labs.ucx.skip`, `true`), but as far as I can tell there is no way to unset it without running an `ALTER` in the UI. Additionally...
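A minimal sketch of how the set/unset statements could be built outside the UI, using standard Spark SQL `TBLPROPERTIES` syntax. The helper name and table name are hypothetical; only the property key `databricks.labs.ucx.skip` comes from the comment above.

```python
def skip_property_sql(full_table_name: str, set_skip: bool) -> str:
    """Build an ALTER TABLE statement to set or unset the UCX skip marker.

    Hypothetical helper: full_table_name is assumed to be a fully
    qualified `catalog.schema.table` (or `schema.table`) identifier.
    """
    prop = "databricks.labs.ucx.skip"
    if set_skip:
        return f"ALTER TABLE {full_table_name} SET TBLPROPERTIES ('{prop}' = 'true')"
    # UNSET ... IF EXISTS is standard Spark SQL and avoids an error
    # when the property was never set.
    return f"ALTER TABLE {full_table_name} UNSET TBLPROPERTIES IF EXISTS ('{prop}')"

# Example: the unset statement one would run via spark.sql(...) instead of the UI.
print(skip_property_sql("hive_metastore.db.my_table", False))
```

Running either statement through `spark.sql(...)` in a notebook would avoid the round trip to the UI mentioned above.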

All the tables with the error are dbfs root tables.

Seems like the issue persists; see this log running version 0.37.0:

```
details = "INVALID_PARAMETER_VALUE: Invalid input: RPC CreateTable Field managedcatalog.ColumnInfo.name: At columns.0: name "" is not a valid name" debug_error_string...
```
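The error above indicates the `CreateTable` RPC rejects the first column because its name is empty. A minimal diagnostic sketch for spotting such columns before migration; the helper and the dict-based schema shape are hypothetical, only the "empty name at a column position" condition comes from the log.

```python
def empty_name_columns(fields):
    """Return the positions of schema fields whose name is empty or blank.

    The CreateTable RPC in the log rejects such columns
    ('At columns.0: name "" is not a valid name').
    """
    return [i for i, field in enumerate(fields) if not field.get("name", "").strip()]

# Example: a schema whose first column lost its name is flagged at index 0.
schema = [{"name": ""}, {"name": "id"}, {"name": "value"}]
print(empty_name_columns(schema))  # → [0]
```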

> hmm, do you get more information in the debug logs? See the `logs` folder in the UCX installation folder in your workspace No other details other than the stack...

> Could you share the stack trace? I would like to see which table migration method is called

```
JVM stacktrace: org.apache.spark.sql.AnalysisException
  at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException(ErrorDetailsHandler.scala:62)
  at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException$(ErrorDetailsHandler.scala:35)
  at com.databricks.managedcatalog.ManagedCatalogClientImpl.wrapServiceException(ManagedCatalogClientImpl.scala:171)
  at com.databricks.managedcatalog.ManagedCatalogClientImpl.recordAndWrapException(ManagedCatalogClientImpl.scala:5648)
  at...
```

> Are these Delta/Non-Delta tables?

All are DBFS root Delta tables.

> Is there also the Python stack trace to see which Python code triggered this

```python
15:20:12 ERROR [p.s.c.client.logging][migrate_tables_7] GRPC Error received:
Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/sql/connect/client/core.py", line...
```

> @maruppel : What Databricks runtime is the `migrate-tables` workflow failing for?

The DBR is 15.3. Also, this is after the assessment has been run, which was a question in the other...

Running the above code, I am getting the same whitelist error (SDK 0.29.0).