[Bug] [Data Development] Using CDCSOURCE CDC to synchronize PG data to MSSQL, the task is not submitted to Flink
Search before asking
- [x] I had searched in the issues and found no similar issues.
What happened
I want to use the CDCSOURCE feature to synchronize two tables, but execution only reports a "null" with no detailed error message. Synchronizing each table individually, without CDCSOURCE, works fine. Is the script missing some parameters?
What you expected to happen
No response
How to reproduce
The script is as follows:
```sql
ADD JAR '/opt/dinky/customJar/flink-sql-connector-postgres-cdc-3.0.1.jar'; -- str path

EXECUTE CDCSOURCE cdc_mssql WITH (
  'connector' = 'postgres-cdc',
  'hostname' = '100.22.1.26', -- change to your PG address
  'port' = '15432',
  'username' = 'flink_test',
  'password' = 'flink_test',
  'database-name' = 'his',
  -- 'slot.name' = 'flink_slot_cdc_test_3', -- one slot per job
  -- 'decoding.plugin.name' = 'pgoutput',
  'checkpoint' = '3000',
  'scan.startup.mode' = 'initial',
  'parallelism' = '1',
  'table-name' = 'public\.cdc_test,public\.cdc_test_2',
  'sink.connector' = 'jdbc',
  'sink.url' = 'jdbc:sqlserver://100.22.1.26:11433;DatabaseName=master;encrypt=true;trustServerCertificate=true',
  'sink.username' = 'sa',
  'sink.password' = 'Mssql201912345',
  'sink.table.mapping-routes' = 'public\.cdc_test:dbo\.cdc_test,public\.cdc_test_2:dbo\.cdc_test_2',
  'sink.driver' = 'com.microsoft.sqlserver.jdbc.SQLServerDriver',
  'sink.sink.buffer-flush.interval' = '2s',
  'sink.sink.buffer-flush.max-rows' = '100',
  'sink.sink.max-retries' = '5'
)
```
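For reference, the per-table sync that does work (without CDCSOURCE) looks roughly like this. This is a minimal sketch reusing the same connection values; the column definitions (`id`, `name`) are placeholders, since the real schema of `cdc_test` is not shown in this issue:

```sql
-- Source: Postgres CDC table (hypothetical columns, same connection as above)
CREATE TABLE pg_cdc_test (
  id INT,
  name STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'postgres-cdc',
  'hostname' = '100.22.1.26',
  'port' = '15432',
  'username' = 'flink_test',
  'password' = 'flink_test',
  'database-name' = 'his',
  'schema-name' = 'public',
  'table-name' = 'cdc_test',
  'slot.name' = 'flink_slot_cdc_test',
  'decoding.plugin.name' = 'pgoutput'
);

-- Sink: SQL Server via JDBC (same URL and credentials as the CDCSOURCE script)
CREATE TABLE mssql_cdc_test (
  id INT,
  name STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:sqlserver://100.22.1.26:11433;DatabaseName=master;encrypt=true;trustServerCertificate=true',
  'driver' = 'com.microsoft.sqlserver.jdbc.SQLServerDriver',
  'table-name' = 'dbo.cdc_test',
  'username' = 'sa',
  'password' = 'Mssql201912345'
);

INSERT INTO mssql_cdc_test SELECT * FROM pg_cdc_test;
```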
Anything else
No response
Version
1.2.3
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [x] I agree to follow this project's Code of Conduct
Hello @iFoxox, this issue is about CDC/CDCSOURCE, so I have assigned it to @aiwenmo. If you have any questions, you can comment and reply.
The version is 1.2.4.
Hi. Try deleting the comments in the SQL.
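For example, the same statement with all inline comments and the commented-out options removed (values unchanged) would look like this:

```sql
ADD JAR '/opt/dinky/customJar/flink-sql-connector-postgres-cdc-3.0.1.jar';

EXECUTE CDCSOURCE cdc_mssql WITH (
  'connector' = 'postgres-cdc',
  'hostname' = '100.22.1.26',
  'port' = '15432',
  'username' = 'flink_test',
  'password' = 'flink_test',
  'database-name' = 'his',
  'checkpoint' = '3000',
  'scan.startup.mode' = 'initial',
  'parallelism' = '1',
  'table-name' = 'public\.cdc_test,public\.cdc_test_2',
  'sink.connector' = 'jdbc',
  'sink.url' = 'jdbc:sqlserver://100.22.1.26:11433;DatabaseName=master;encrypt=true;trustServerCertificate=true',
  'sink.username' = 'sa',
  'sink.password' = 'Mssql201912345',
  'sink.table.mapping-routes' = 'public\.cdc_test:dbo\.cdc_test,public\.cdc_test_2:dbo\.cdc_test_2',
  'sink.driver' = 'com.microsoft.sqlserver.jdbc.SQLServerDriver',
  'sink.sink.buffer-flush.interval' = '2s',
  'sink.sink.buffer-flush.max-rows' = '100',
  'sink.sink.max-retries' = '5'
)
```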
That had no effect; the error is still the same.
Hello @, this issue has not been active for more than 30 days. It will be closed in 7 days if there is no response. If you have any questions, you can comment and reply.