DengWang
> @MrAladdin There is a related fix here - https://github.com/apache/hudi/pull/10883/files Can you try this out?

Version 0.14.1 does not have hudi-client/hudi-client-common/src/main/java/org/apache/hudi/client/timeline/LSMTimelineWriter.java.
> I'm wondering how the table got written, is it written by a Flink streaming pipeline?

Spark Structured Streaming. When using the record_index type index to upsert an MOR table,...
> [@MrAladdin](https://github.com/MrAladdin) Can you please share the timeline and writer configurations.

The writer configuration (truncated):

df.writeStream
.format("hudi")
.option("hoodie.table.base.file.format", "PARQUET")
.option("hoodie.allow.empty.commit", "true")
.option("hoodie.datasource.write.drop.partition.columns", "false")
.option("hoodie.table.services.enabled", "true")
.option("hoodie.datasource.write.streaming.checkpoint.identifier", "lakehouse-dwd-social-kbi-beauty-lower-v1-writer-1")
.option(PRECOMBINE_FIELD.key(), "date_kbiudate")
.option(RECORDKEY_FIELD.key(), "records_key")
.option(PARTITIONPATH_FIELD.key(), "partition_index_date")
...
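For context, here is a minimal runnable sketch of that Structured Streaming writer. The options quoted above are copied verbatim; the source DataFrame, schema, table name, base path, checkpoint location, and the MOR/record-index options (`MERGE_ON_READ`, `hoodie.metadata.record.index.enable`, `hoodie.index.type`) are assumptions filled in from the surrounding discussion, since the original snippet is truncated.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger
import org.apache.spark.sql.types._
import org.apache.hudi.DataSourceWriteOptions._

val spark = SparkSession.builder()
  .appName("hudi-structured-streaming-writer") // hypothetical app name
  .getOrCreate()

// Hypothetical streaming source and schema; the real job's source is not shown in the thread.
// Field types are placeholders that match the key/precombine/partition fields quoted above.
val schema = StructType(Seq(
  StructField("records_key", StringType),
  StructField("date_kbiudate", StringType),
  StructField("partition_index_date", StringType)
))
val df = spark.readStream.schema(schema).json("/tmp/streaming/input") // placeholder input path

val query = df.writeStream
  .format("hudi")
  // Options copied from the snippet quoted above.
  .option("hoodie.table.base.file.format", "PARQUET")
  .option("hoodie.allow.empty.commit", "true")
  .option("hoodie.datasource.write.drop.partition.columns", "false")
  .option("hoodie.table.services.enabled", "true")
  .option("hoodie.datasource.write.streaming.checkpoint.identifier",
    "lakehouse-dwd-social-kbi-beauty-lower-v1-writer-1")
  .option(PRECOMBINE_FIELD.key(), "date_kbiudate")
  .option(RECORDKEY_FIELD.key(), "records_key")
  .option(PARTITIONPATH_FIELD.key(), "partition_index_date")
  // Assumed from the description (MOR table upserted with the record-level index);
  // these options are not visible in the truncated snippet.
  .option(TABLE_TYPE.key(), "MERGE_ON_READ")
  .option(OPERATION.key(), "upsert")
  .option("hoodie.metadata.record.index.enable", "true")
  .option("hoodie.index.type", "RECORD_INDEX")
  // Hypothetical table name, checkpoint location, and base path.
  .option("hoodie.table.name", "dwd_social_kbi_beauty_lower")
  .option("checkpointLocation", "/tmp/checkpoints/dwd_social_kbi_beauty_lower")
  .outputMode("append")
  .trigger(Trigger.ProcessingTime("1 minute"))
  .start("/tmp/hudi/dwd_social_kbi_beauty_lower")

query.awaitTermination()
```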
@ad1happy2go Could you please help answer the question in my reply to you above? Thank you.
> @MrAladdin
>
> 1. Ideally this should not be the reason for this exception; it looks more like the parquet file got corrupted. Are you facing this issue frequently?...
@xushiyan I need your help with the question in my reply to you above, thank you. 2. I have a question: when using Spark Structured Streaming to write data, the number...
The writing job goes through multiple deltacommits, and eventually the Spark UI shows that the data flow is stalled and no further computation is being...
These errors occurred when switching from version 0.14.1 to 0.15.0, and the Hudi write configurations have not changed.
> Thanks for the feedback, the error stacktrace is not very detailed, it would be very helpful if you can paste the "caused by" part, @nsivabalan , not sure if...