[Bug] Failed to insert data into a Hive partitioned table when using `INSERT INTO` with the Spark Hive connector
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Search before asking
- [X] I have searched in the issues and found no similar issues.
Describe the bug
Failed to insert data into a Hive partitioned table when using `INSERT INTO` with the Spark Hive connector. This can be reproduced by replacing `INSERT OVERWRITE` with `INSERT INTO` in HiveQuerySuite.scala:
```scala
private def readPartitionedTable(format: String, hiveTable: Boolean): Unit = {
  withSparkSession() { spark =>
    val table = "hive.default.employee"
    withTempPartitionedTable(spark, table, format, hiveTable) {
      spark.sql(
        s"""
           | INSERT INTO
           | $table PARTITION(year = '2023')
           | VALUES("zhao", "09")
           |""".stripMargin)
      checkQueryResult(s"select * from $table", spark, Array(Row.apply("zhao", "2023", "09")))
    }
  }
}
```
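For readers without the test harness, a minimal standalone sketch of the same scenario is below. The catalog class name and the table DDL are assumptions standing in for what `withSparkSession()` and `withTempPartitionedTable` set up in the suite; only the two INSERT statements mirror the report, where the existing test's `INSERT OVERWRITE` reportedly passes while the `INSERT INTO` variant fails.

```scala
// Standalone sketch of the failing scenario. The catalog class and the table
// DDL are assumptions; the real test wires these up via withSparkSession()
// and withTempPartitionedTable().
import org.apache.spark.sql.SparkSession

object InsertIntoPartitionedTableRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kshc-insert-into-repro")
      // Assumed: register the Kyuubi Spark Hive connector as the `hive` catalog.
      .config("spark.sql.catalog.hive", "org.apache.kyuubi.spark.connector.hive.HiveTableCatalog")
      .getOrCreate()

    // Assumed layout: a two-column table partitioned by a string `year` column.
    spark.sql(
      """CREATE TABLE IF NOT EXISTS hive.default.employee (
        |  name STRING,
        |  age  STRING
        |) PARTITIONED BY (year STRING)
        |""".stripMargin)

    // Writing with INSERT OVERWRITE and a static partition spec works ...
    spark.sql(
      """INSERT OVERWRITE hive.default.employee PARTITION(year = '2023')
        |VALUES("zhao", "09")
        |""".stripMargin)

    // ... while the equivalent INSERT INTO fails with the
    // "Dynamic partition strict mode" exception thrown from HiveWrite.
    spark.sql(
      """INSERT INTO hive.default.employee PARTITION(year = '2023')
        |VALUES("zhao", "09")
        |""".stripMargin)

    spark.stop()
  }
}
```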
Affects Version(s)
master
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
```
org.apache.kyuubi.spark.connector.hive.KyuubiHiveConnectorException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
  at org.apache.kyuubi.spark.connector.hive.write.HiveWrite.extractAndValidatePartitionCols(HiveWrite.scala:205)
  at org.apache.kyuubi.spark.connector.hive.write.HiveWrite.toBatch(HiveWrite.scala:84)
  at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run(WriteToDataSourceV2Exec.scala:360)
  at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run$(WriteToDataSourceV2Exec.scala:359)
```
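The exception text itself names a possible workaround while the bug is open: relaxing strict dynamic-partition mode. A minimal sketch, reusing `spark` and `table` from the reproduction above; whether KSHC honors the setting when set in the session configuration rather than in hive-site.xml is an assumption, not verified here.

```scala
// Workaround hinted at by the exception message. It is an assumption that
// KSHC reads this setting from the Spark session configuration.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

spark.sql(
  s"""INSERT INTO $table PARTITION(year = '2023')
     |VALUES("zhao", "09")
     |""".stripMargin)
```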
Kyuubi Server Configurations
No response
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?
- [ ] Yes. I would be willing to submit a PR with guidance from the Kyuubi community to fix.
- [ ] No. I cannot submit a PR at this time.
@Yikf do you have time to take a look?
Apache Spark DataSourceV2 writes go through the dynamic partitioning path even when handling static partitions. The exception reported in this issue should be a bug in KSHC; I will take a look.
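For context on why this looks like a connector bug rather than a misconfiguration: strict dynamic-partition mode is meant to reject writes whose partition values come entirely from the data, while the failing statement pins `year` to a constant. A small sketch of the distinction, reusing the `employee` table from the reproduction:

```scala
// Fully dynamic partitioning: the partition value comes from the data.
// This is the case the strict-mode check is designed to reject.
spark.sql(
  """INSERT INTO hive.default.employee PARTITION(year)
    |VALUES("zhao", "09", "2023")
    |""".stripMargin)

// Static partitioning: the partition value is a constant in the PARTITION
// clause. Strict mode should accept this, yet KSHC currently raises the
// strict-mode error for it on the DataSourceV2 write path.
spark.sql(
  """INSERT INTO hive.default.employee PARTITION(year = '2023')
    |VALUES("zhao", "09")
    |""".stripMargin)
```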
Thanks @Yikf and @pan3793 for fixing the issue.