donghaihu

7 comments by donghaihu

@codope: Hello, this issue still exists in version 0.14. Why was it closed?

@codope: As stated in the issue, the problem reproduces consistently. The version we are currently using is 0.14. @Limess: Have you encountered this problem again? May I...

> enables

How can we configure it to avoid this issue?

> I mean, do you have both a Spark and a Flink job writing into the same table? If so, you might need to disable the MDT (metadata table) on the Spark writer.

Oh....
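A minimal sketch of what disabling the metadata table on the Spark writer could look like, assuming the standard Hudi config key `hoodie.metadata.enable` (the table name and save path below are hypothetical placeholders):

```python
# Hypothetical Hudi writer options for the Spark job when a Flink job
# also writes the same table. Disabling hoodie.metadata.enable on the
# Spark side leaves only one writer maintaining the metadata table.
hudi_spark_options = {
    "hoodie.table.name": "my_table",    # hypothetical table name
    "hoodie.metadata.enable": "false",  # disable MDT on the Spark writer
}

# Applied to a Spark DataFrame write (sketch, not run here):
# df.write.format("hudi").options(**hudi_spark_options).mode("append").save(path)
```

The idea is that when two engines write concurrently, only one of them should maintain the MDT to avoid conflicting metadata commits.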

@danny0405: I found that this situation tends to occur after Session tasks report errors.