Re: Tracking DML
Date: Wed, 6 Jul 2022 16:01:09 +0530
Message-ID: <CAKna9VbGjRczjN3RKwH0MWvsPyQ+tPHN9J2myZTFge1kGNZyEQ_at_mail.gmail.com>
Thank you, Lucas.
The team was trying to achieve the data movement/streaming without any additional licensing option. Creating triggers for each and every DML on many transaction tables looks overwhelming and could cause a performance bottleneck for the application, so we were looking at whether there is any other possible strategy to achieve this.
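Just to put the trigger option in concrete terms, this is roughly what every transaction table would need. It is only a rough sketch on my side: the ORDERS table, its ORDER_ID/STATUS columns and the DML_EVENTS_Q queue are made-up names, and it assumes the queue was created with a SYS.AQ$_JMS_TEXT_MESSAGE payload.

CREATE OR REPLACE TRIGGER orders_dml_capture_trg
  AFTER INSERT OR UPDATE OR DELETE ON orders
  FOR EACH ROW
DECLARE
  l_op           VARCHAR2(1);
  l_json         JSON_OBJECT_T := JSON_OBJECT_T();
  l_msg          SYS.AQ$_JMS_TEXT_MESSAGE := SYS.AQ$_JMS_TEXT_MESSAGE.construct;
  l_enqueue_opts DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_msg_props    DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msg_id       RAW(16);
BEGIN
  -- Work out which DML fired the trigger.
  IF INSERTING THEN
    l_op := 'I';
  ELSIF UPDATING THEN
    l_op := 'U';
  ELSE
    l_op := 'D';
  END IF;

  -- Build the change record as JSON using the PL/SQL JSON API.
  l_json.put('op', l_op);
  l_json.put('order_id', NVL(:NEW.order_id, :OLD.order_id));
  l_json.put('status', :NEW.status);

  -- Enqueue the JSON to an AQ queue inside the same transaction;
  -- a separate consumer would dequeue and publish it to Kafka.
  l_msg.set_text(l_json.to_string());
  DBMS_AQ.ENQUEUE(queue_name         => 'DML_EVENTS_Q',
                  enqueue_options    => l_enqueue_opts,
                  message_properties => l_msg_props,
                  payload            => l_msg,
                  msgid              => l_msg_id);
END;
/

Multiplied across every transaction table and every insert/update/delete path, that per-row work is exactly the overhead we are worried about.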
But I understand the positive side of GoldenGate: it keeps accumulating the changes from the source database redo logs (which are part of Oracle's default architecture), moving them to the target, converting them back into the corresponding DML statements and applying them there, so there is no additional overhead as such on the source application.
On Wed, 6 Jul 2022, 2:01 am Lucas Pimentel Lellis, <lucaslellis_at_gmail.com> wrote:
> Hi,
>
> I suggest taking a look at GoldenGate for Big Data Kafka Handler
> <https://docs.oracle.com/en/middleware/goldengate/big-data/19.1/gadbd/using-kafka-handler.html> (provided
> you have a license).
>
> Kind Regards,
>
> Lucas Pimentel Lellis
>
>
> On Tue, Jul 5, 2022 at 3:55 PM Lok P <loknath.73_at_gmail.com> wrote:
>
>> Hello All, We have a requirement to stream data from an OLTP
>> database (Oracle 19c) to a Redshift data warehouse in the AWS cloud
>> through Kafka streaming.
>>
>> The team is planning to move the data in JSON format by clubbing
>> together all the required fields. Since the newly inserted data comes
>> from a single source, it looks fine to form the JSON and pass the new
>> records through the Kafka stream. But the updates happen in the source
>> Oracle database from multiple places (the GUI, batch jobs, etc.), and it
>> would be a challenge to add code in all of those places to pass the
>> updates to the Kafka stream as JSON.
>>
>> The team is planning to use triggers for this purpose. We already have
>> a bunch of triggers on many tables (for example, one table has 7 update
>> triggers and 1 delete trigger, and a few of them have business logic
>> written in them), and we have seen DML slowness caused by trigger code
>> in this application in the past. Even so, the dev team is planning to
>> use additional Oracle triggers to track the DML here, build the JSON
>> elements, and pass them to Kafka through Oracle AQ as the data streaming
>> option in this scenario. So my question is: is there any better approach
>> for streaming the data here, or is the trigger approach a suitable one?
>>
>
-- http://www.freelists.org/webpage/oracle-l

Received on Wed Jul 06 2022 - 12:31:09 CEST