Re: Tracking DML

From: Lucas Pimentel Lellis <lucaslellis_at_gmail.com>
Date: Tue, 5 Jul 2022 17:31:23 -0300
Message-ID: <CAMWwQgdoNoe2Zm5EqHa7UoXeJcVzft5UHXRYUHR58fOhAdKO6A_at_mail.gmail.com>



Hi,

I suggest taking a look at the GoldenGate for Big Data Kafka Handler
<https://docs.oracle.com/en/middleware/goldengate/big-data/19.1/gadbd/using-kafka-handler.html>
(provided you have a license).
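
On the GoldenGate side most of the work is a replicat properties file. Just as an
illustration, a minimal configuration along the lines of the linked doc might look
roughly like this (please verify the property names against your GoldenGate for Big
Data version; the producer config file name is only a placeholder):

# Big Data replicat properties - route trail data to the Kafka Handler
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
# Standard Kafka producer settings (bootstrap.servers, acks, ...) live here
gg.handler.kafkahandler.kafkaProducerConfigFile=custom_kafka_producer.properties
# One topic per source table, messages keyed on the primary key
gg.handler.kafkahandler.topicMappingTemplate=${tableName}
gg.handler.kafkahandler.keyMappingTemplate=${primaryKeys}
# Emit each captured operation as a JSON message
gg.handler.kafkahandler.format=json
gg.handler.kafkahandler.mode=op

That gives you per-row JSON change records from the redo, without adding triggers or
touching the application code.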

Kind Regards,

Lucas Pimentel Lellis

On Tue, Jul 5, 2022 at 3:55 PM Lok P <loknath.73_at_gmail.com> wrote:

> Hello All, we have a requirement to stream data from an OLTP database
> (Oracle 19c) to a Redshift data warehouse in the AWS cloud through Kafka
> streaming.
>
> The team is planning to move the data in JSON format by combining all the
> required fields. Since the newly inserted data comes from a single source,
> it seems fine to form the JSON and pass it through the Kafka stream for
> newly inserted records. But updates happen in the source Oracle database
> from multiple places (the GUI, batch jobs, etc.), and it would be a
> challenge to add code in all of those places to pass the updates to the
> Kafka stream as JSON.
>
> The team is planning to use triggers for this purpose. We already have a
> number of triggers on many tables (for example, one table has 7 UPDATE
> triggers and 1 DELETE trigger, and a few of them contain business logic),
> and we have seen DML slowness caused by trigger code in this application
> in the past. Even so, the dev team is planning to add more Oracle triggers
> to track the DML, build the JSON elements, and pass them to Kafka through
> Oracle AQ as the data streaming option in this scenario. So my question
> is: is there a better approach for streaming the data here, or is the
> trigger approach a suitable one?
>
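
For completeness, the trigger-plus-AQ pattern described in the quoted message
usually ends up looking something like the sketch below. The ORDERS table, its
columns, and the queue ORDERS_JSON_Q are made-up names used only to show the
shape of the approach; it assumes a JMS text queue (payload type
SYS.AQ$_JMS_TEXT_MESSAGE) with a separate consumer that relays the messages
to Kafka. It is exactly this extra per-row work that tends to cause the DML
slowness mentioned above.

-- Hypothetical sketch: capture row changes on ORDERS as JSON and enqueue them
CREATE OR REPLACE TRIGGER orders_dml_to_aq_trg
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW
DECLARE
    l_op            VARCHAR2(1);
    l_payload       VARCHAR2(4000);
    l_message       SYS.AQ$_JMS_TEXT_MESSAGE;
    l_enqueue_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
    l_msg_props     DBMS_AQ.MESSAGE_PROPERTIES_T;
    l_msg_id        RAW(16);
BEGIN
    -- Record which operation fired the trigger
    IF INSERTING THEN
        l_op := 'I';
    ELSIF UPDATING THEN
        l_op := 'U';
    ELSE
        l_op := 'D';
    END IF;

    -- Build the JSON change record from the row values
    SELECT JSON_OBJECT(
             'op'         VALUE l_op,
             'order_id'   VALUE COALESCE(:NEW.order_id, :OLD.order_id),
             'status'     VALUE :NEW.status,
             'changed_at' VALUE SYSTIMESTAMP)
      INTO l_payload
      FROM dual;

    -- Wrap the JSON in a JMS text message and enqueue it; a separate
    -- consumer dequeues and publishes to the Kafka topic.
    l_message := SYS.AQ$_JMS_TEXT_MESSAGE.construct;
    l_message.set_text(l_payload);

    DBMS_AQ.ENQUEUE(
        queue_name         => 'orders_json_q',
        enqueue_options    => l_enqueue_opts,
        message_properties => l_msg_props,
        payload            => l_message,
        msgid              => l_msg_id);
END;
/

Every row-level trigger adds this kind of work inside the transaction, whereas a
log-based CDC tool such as GoldenGate reads the redo and stays out of the DML
path, which is why it usually scales better for this use case.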

--
http://www.freelists.org/webpage/oracle-l
