
Problem with oracle-connector reading from a read-only standby database in real time

Open wangdabin1216 opened this issue 3 years ago • 8 comments

At the moment our company only provides a read-only Oracle standby database for data synchronization. Because the CDC connector needs to create its related information tables in the Oracle database it connects to during deployment, this cannot be done on the standby. In this situation, how can I use the oracle-connector to complete data synchronization?
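For reference, a minimal sketch of how the oracle-cdc DataStream source is normally wired up (hostname, credentials, schema and table names below are placeholders, not our real setup). As far as I understand, the account it connects with is also the one the embedded Debezium LogMiner engine uses to create its bookkeeping/flush table at startup, which is exactly what fails on a read-only standby:

```java
import com.ververica.cdc.connectors.oracle.OracleSource;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class OracleCdcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; in our case this would have to point at the
        // read-only standby, which is where the connector's table creation fails.
        SourceFunction<String> source = OracleSource.<String>builder()
                .hostname("oracle-standby.example.com")
                .port(1521)
                .database("ORCLCDB")
                .schemaList("FLINKUSER")
                .tableList("FLINKUSER.ORDERS")
                .username("flinkuser")
                .password("flinkpw")
                .deserializer(new JsonDebeziumDeserializationSchema())
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.addSource(source).print();
        env.execute("oracle-cdc-standby-sketch");
    }
}
```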

wangdabin1216 avatar Jun 26 '22 06:06 wangdabin1216

DBZ-3866 is a similar issue.

wangdabin1216 avatar Jun 26 '22 07:06 wangdabin1216

  1. The standby database cannot be used for LogMiner. The core reason is that DBMS_LOGMNR_D and DBMS_LOGMNR need write permission to populate the V$LOGMNR_CONTENTS view, while an active/standby deployment is meant for disaster recovery, not for LogMiner-based data capture and analysis (see the sketch after this list).

  2. The official recommendation is to deploy a separate mining database for data analysis, rather than mining the primary/standby pair. Official documentation: https://docs.oracle.com/goldengate/c1230/gg-winux/GGODB/configuring-downstream-mining-database.htm#GGODB-GUID-E265AB7E-6255-496E-896F-32E943C362D9

  3. The official LogMiner documentation: https://docs.oracle.com/cd/B19306_01/server.102/b14215/logminer.htm
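A minimal sketch of how extra Debezium options can be passed to the source (connection details are placeholders). Setting log.mining.strategy to online_catalog avoids the DBMS_LOGMNR_D dictionary build, but as far as I know it does not remove the need for write access (the connector still maintains its flush table), so the source still has to point at the primary or a dedicated mining database rather than the read-only standby:

```java
import java.util.Properties;

import com.ververica.cdc.connectors.oracle.OracleSource;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class OracleCdcMiningDbSketch {
    public static SourceFunction<String> buildSource() {
        // Extra Debezium properties forwarded to the embedded engine.
        Properties dbzProps = new Properties();
        // online_catalog skips the DBMS_LOGMNR_D dictionary extraction to the redo logs.
        dbzProps.setProperty("log.mining.strategy", "online_catalog");

        return OracleSource.<String>builder()
                .hostname("oracle-primary.example.com") // primary or dedicated mining database, not the standby
                .port(1521)
                .database("ORCLCDB")
                .schemaList("FLINKUSER")
                .tableList("FLINKUSER.ORDERS")
                .username("flinkuser")
                .password("flinkpw")
                .debeziumProperties(dbzProps)
                .deserializer(new JsonDebeziumDeserializationSchema())
                .build();
    }
}
```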

@wangdabin1216 I hope this can help you

molsionmo avatar Jul 11 '22 01:07 molsionmo

Thank you very much. Deploying a separate mining database for data analysis is not a good option for us, because our future data analysis will probably be built on Flink or Spark. At this stage we only synchronize the upstream data in real time, and the later analysis requirements are still undecided. A separate mining database would also double the footprint of the original Oracle database, and we do not do data analysis on Oracle itself. @molsionmo

wangdabin1216 avatar Jul 11 '22 12:07 wangdabin1216

At the moment our company only provides a read-only Oracle standby database for data synchronization. Because the CDC connector needs to create its related information tables in the Oracle database it connects to during deployment, this cannot be done on the standby. In this situation, how can I use the oracle-connector to complete data synchronization?

Did you ever solve this problem? I am also stuck on a read-only standby right now...

afsun1996 avatar Aug 05 '22 07:08 afsun1996

No solution at the moment; we can only use OGG. @afsun1996

wangdabin1216 avatar Aug 08 '22 03:08 wangdabin1216

Oracle 19c supports automatically redirecting DML on the standby to the primary. Could that solve the problem of flink-cdc needing to create its related information tables?
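To illustrate what I mean, a sketch of that 19c feature at the session level (the JDBC URL, credentials, and table are placeholders). As far as I know the redirection only covers DML, so whether it helps with the CREATE TABLE the connector issues is exactly my question:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class AdgDmlRedirectSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL pointing at the read-only Active Data Guard standby.
        String standbyUrl = "jdbc:oracle:thin:@//oracle-standby.example.com:1521/ORCLCDB";

        try (Connection conn = DriverManager.getConnection(standbyUrl, "flinkuser", "flinkpw");
             Statement stmt = conn.createStatement()) {
            // Oracle 19c ADG DML redirection: DML issued on the standby in this session
            // is transparently forwarded to and executed on the primary.
            stmt.execute("ALTER SESSION ENABLE ADG_REDIRECT_DML");

            // Plain DML now succeeds even though the standby itself is read-only.
            stmt.executeUpdate("UPDATE FLINKUSER.ORDERS SET STATUS = 'SYNCED' WHERE ID = 1");

            // DDL (e.g. the CREATE TABLE the connector needs for its flush table) is,
            // to my knowledge, not covered by DML redirection and would still fail here.
        }
    }
}
```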

zhaoqiangaa1 avatar Dec 13 '22 08:12 zhaoqiangaa1

Has 2.3 solved the read-only standby problem?

zhbdesign avatar Jun 17 '23 08:06 zhbdesign

Considering collaboration with developers around the world, please re-create your issue in English on Apache Jira under project Flink with component tag Flink CDC. Thank you! cc @GOODBOY008

gong avatar Feb 03 '24 05:02 gong

Closing this issue because it was created before version 2.3.0 (2022-11-10). Please try the latest version of Flink CDC to see if the issue has been resolved. If the issue is still valid, kindly report it on Apache Jira under project Flink with component tag Flink CDC. Thank you!

PatrickRen avatar Feb 28 '24 15:02 PatrickRen