Query information_schema not working (Exception thrown in awaitResult)
When I try to query the list of tables in a schema from a Redshift DB, I get the following error. I have tried both the `query` and `dbtable` options with the same result. When I query the DB with, say, DBeaver, I can extract the list of tables with no problem. If I use the script below with a "real" table, it works fine. Databricks: 5.3 (includes Apache Spark 2.4.0, Scala 2.11)
```
java.sql.SQLException: Exception thrown in awaitResult:

/databricks/spark/python/pyspark/sql/dataframe.py in show(self, n, truncate, vertical)
    377         """
    378         if isinstance(truncate, bool) and truncate:
--> 379             print(self._jdf.showString(n, 20, vertical))
    380         else:
    381             print(self._jdf.showString(n, int(truncate), vertical))

/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1255         answer = self.gateway_client.send_command(command)
   1256         return_value = get_return_value(
-> 1257             answer, self.gateway_client, self.target_id, self.name)
   1258
   1259         for temp_arg in temp_args:
```
This is the script I use:

```python
JDBC_URL = "jdbc:redshift://xyz.redshift.amazonaws.com:5439/xyz?user=user&password=pwd"
SQL_QUERY = "SELECT * FROM information_schema.tables t WHERE t.table_schema = 'schema_name' AND t.table_type = 'BASE TABLE'"
REDSHIFT_S3_TEMP_FOLDER = 's3a://xyz'

df = (spark.read
      .format("com.databricks.spark.redshift")
      .option("url", JDBC_URL)
      .option("query", SQL_QUERY)
      .option("tempdir", REDSHIFT_S3_TEMP_FOLDER)
      .option("forward_spark_s3_credentials", "true")
      .load())
df.show()
```
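For context on why the same query works in DBeaver: the spark-redshift connector reads by UNLOADing the query result to the S3 `tempdir`, and Redshift runs `information_schema`/catalog queries only on the leader node, which (as far as I can tell) means they cannot be UNLOADed. A rough workaround sketch, assuming the Redshift JDBC driver is available on the cluster and reusing the `JDBC_URL` and `SQL_QUERY` values above, is to read the catalog query through Spark's plain `jdbc` source, which issues it directly over the JDBC connection:

```python
# Workaround sketch (not confirmed in this thread): read the catalog query over plain
# JDBC instead of through com.databricks.spark.redshift, so no UNLOAD to S3 is involved.
# JDBC_URL and SQL_QUERY are the variables defined above; the driver class name assumes
# the Amazon Redshift JDBC 4.2 driver is installed on the cluster.
info_df = (spark.read
           .format("jdbc")
           .option("url", JDBC_URL)
           .option("driver", "com.amazon.redshift.jdbc42.Driver")
           .option("query", SQL_QUERY)
           .load())
info_df.show()
```

The `query` option on the `jdbc` source requires Spark 2.4 or later; on older runtimes the same query can be wrapped as a subquery via `.option("dbtable", "(" + SQL_QUERY + ") AS t")`.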
Exact same issue!
Hi, do we have a solution for this? I am having exactly the same issue: I am not able to write to Redshift from Databricks, but from DBeaver it works well. I can read from Redshift using Databricks (FYI).
Any solution available?
Installing the Redshift JDBC driver JAR on the Databricks cluster resolves the issue.
https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-previous-driver-version-20.html
Regards, Uday
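If it helps anyone verify that step, one quick sanity check after attaching the driver JAR is to resolve its class from a notebook. This is just a sketch, and the class name is an assumption based on the JDBC 4.2 driver linked above (older builds use `com.amazon.redshift.jdbc41.Driver` and similar):

```python
# Sanity check sketch: resolves the Redshift JDBC driver class in the driver JVM.
# Raises a ClassNotFoundException if the JAR is not visible to the cluster.
# The class name is an assumption for the JDBC 4.2 driver.
spark._jvm.Class.forName("com.amazon.redshift.jdbc42.Driver")
```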