fanfanAlice

**flink sql task1:** read data from a Kafka topic, join it with an HBase table, and insert the result into Hudi table table1. The Kafka topic only keeps data for three hours. There are also hundreds of millions of...
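
For context, a minimal sketch of that kind of pipeline is below; the topic, schemas, paths, and connector options are placeholders I made up for illustration, not the original job:

```sql
-- Hypothetical Kafka source (topic and schema are placeholders)
CREATE TABLE kafka_src (
  id STRING,
  amount DOUBLE,
  proc_time AS PROCTIME()
) WITH (
  'connector' = 'kafka',
  'topic' = 'events',
  'properties.bootstrap.servers' = 'kafka:9092',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json'
);

-- Hypothetical HBase dimension table used for the lookup join
CREATE TABLE hbase_dim (
  rowkey STRING,
  cf ROW<name STRING>,
  PRIMARY KEY (rowkey) NOT ENFORCED
) WITH (
  'connector' = 'hbase-2.2',
  'table-name' = 'dim_table',
  'zookeeper.quorum' = 'zk:2181'
);

-- Hypothetical Hudi sink (table1)
CREATE TABLE table1 (
  id STRING PRIMARY KEY NOT ENFORCED,
  amount DOUBLE,
  name STRING
) WITH (
  'connector' = 'hudi',
  'path' = 'hdfs:///warehouse/table1',
  'table.type' = 'MERGE_ON_READ'
);

-- Lookup-join the Kafka stream against HBase and write to Hudi
INSERT INTO table1
SELECT s.id, s.amount, d.cf.name
FROM kafka_src AS s
JOIN hbase_dim FOR SYSTEM_TIME AS OF s.proc_time AS d
ON s.id = d.rowkey;
```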
Please don't close this issue for now.
I have the same problem. Is there any solution? I use an inner join between the hudi1 table and the hive1 table to write into another Hudi table, hudi2, and then I get...
Yes, I use a concurrency control lock.
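
For reference, here is a rough sketch of what a multi-writer setup with Hudi's optimistic concurrency control and a ZooKeeper lock provider might look like on the sink table. The hosts, lock key, and base path are placeholders, and the `hoodie.*` passthrough in the WITH clause should be checked against your Hudi/Flink version:

```sql
-- Sketch: Hudi sink with optimistic concurrency control and a
-- ZooKeeper-based lock provider (all values are placeholders)
CREATE TABLE table1 (
  id STRING PRIMARY KEY NOT ENFORCED,
  amount DOUBLE,
  name STRING
) WITH (
  'connector' = 'hudi',
  'path' = 'hdfs:///warehouse/table1',
  'table.type' = 'MERGE_ON_READ',
  'hoodie.write.concurrency.mode' = 'optimistic_concurrency_control',
  'hoodie.cleaner.policy.failed.writes' = 'LAZY',
  'hoodie.write.lock.provider' = 'org.apache.hudi.client.transaction.lock.ZookeeperBasedLockProvider',
  'hoodie.write.lock.zookeeper.url' = 'zk_host',
  'hoodie.write.lock.zookeeper.port' = '2181',
  'hoodie.write.lock.zookeeper.lock_key' = 'table1',
  'hoodie.write.lock.zookeeper.base_path' = '/hudi/locks'
);
```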
Yes, set hoodie.embed.timeline.server=false.
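
In case it helps anyone else, one way to try that option without recreating the table might be a dynamic table options hint on the sink; this assumes the Hudi Flink connector forwards `hoodie.*` keys and that OPTIONS hints are enabled in your Flink version:

```sql
-- Sketch: passing the option via a dynamic table option hint on the sink
-- (query body is a placeholder, reuse your own INSERT statement)
INSERT INTO table1 /*+ OPTIONS('hoodie.embed.timeline.server' = 'false') */
SELECT s.id, s.amount, d.cf.name
FROM kafka_src AS s
JOIN hbase_dim FOR SYSTEM_TIME AS OF s.proc_time AS d
ON s.id = d.rowkey;
```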