RE: SQL Loader Vs DBLINK to Migrate Data

From: Mark W. Farnham <mwf_at_rsiz.com>
Date: Thu, 3 Nov 2016 01:37:52 -0400
Message-ID: <0c0001d23594$6c165300$4442f900$_at_rsiz.com>



A more likely scenario is that after a certain point in time most data rows reach an age where they are no longer candidates for change.  

IF that is the case and you can partition by some type of time-based variable (there are built-ins for interval partitioning on DATE), then you can move reasonably sized transportable tablespace sets, where each tablespace set contains a contiguous range of time intervals.
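
A rough sketch of that kind of time-based layout, assuming a hypothetical orders_hist table partitioned by month (interval partitioning adds new partitions automatically as data arrives):

  -- Hypothetical example: monthly interval partitioning on a DATE column,
  -- so each month's rows land in their own partition (which can later be
  -- mapped to its own tablespace for transport).
  CREATE TABLE orders_hist (
    order_id   NUMBER         NOT NULL,
    order_date DATE           NOT NULL,
    payload    VARCHAR2(4000)
  )
  PARTITION BY RANGE (order_date)
  INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
  (
    -- first range partition; Oracle creates later monthly partitions itself
    PARTITION p_before_2016 VALUES LESS THAN (DATE '2016-01-01')
  );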

So when you’re petabytes in total, maybe only 20 or 50 terabytes need to be moved. And the files get moved at the OS-like level of ASM files, not by pumping the data out and sucking it back in through the SQL engine.  
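
The transportable tablespace flow itself is roughly the following (a sketch only; tablespace, directory, and file names are hypothetical, and the exact steps depend on version and whether Data Pump or RMAN is used):

  -- On the source: make the tablespace read only, export only its metadata.
  ALTER TABLESPACE hist_2016_q1 READ ONLY;
  -- $ expdp system DIRECTORY=dp_dir DUMPFILE=hist_2016_q1.dmp
  --        TRANSPORT_TABLESPACES=hist_2016_q1

  -- Copy the datafiles at the ASM/OS level, then plug them in on the target:
  -- $ impdp system DIRECTORY=dp_dir DUMPFILE=hist_2016_q1.dmp
  --        TRANSPORT_DATAFILES='+DATA/targetdb/datafile/hist_2016_q1.dbf'

Only the metadata export/import and the file copy are involved; the row data never passes through the SQL layer.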

At scale you have to pay attention to avoid work that is not needed. NOT moving data that has not changed is one such opportunity.  

mwf  

From: oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Jack van Zanen
Sent: Wednesday, November 02, 2016 6:21 PM
To: william.ndolo_at_intertek.com
Cc: oracle-l
Subject: Re: SQL Loader Vs DBLINK to Migrate Data  

100's of TB doesn't sound like a job for db_link or SQL*Loader or Data Pump to me.

What are you trying to achieve when prod is that big?

If you are looking to refresh whole environments at those sizes, I would not be looking at Oracle tools.

Jack van Zanen




On Wed, Nov 2, 2016 at 12:37 AM, William Ndolo Intertek <william.ndolo_at_intertek.com> wrote:

I am setting up a process of moving data from the UAT environment to the Test environment and eventually to production.

The databases are very small at this time (about 10 GB each for both Test and UAT).

We expect to use the same method in production but the production databases are projected to grow rapidly into hundreds of terabytes.

At this point, DBLINK seems to be doing the job, but we are considering SQL*Loader as an alternative for when the databases get large.
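
A minimal sketch of a db link copy of that sort, with hypothetical link, connection, and table names (a direct-path insert-select over the link):

  -- Hypothetical: pull a table from the UAT database into Test over a db link.
  CREATE DATABASE LINK uat_link
    CONNECT TO app_owner IDENTIFIED BY "app_owner_pwd"
    USING 'UATDB';

  INSERT /*+ APPEND */ INTO customers
    SELECT * FROM customers@uat_link;
  COMMIT;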

There are many other tools/methods; however, we are looking for something simple that can be automated.

Can anyone share their experience with both, and with other Oracle tools?

Can anyone point me to Oracle documentation that makes that kind of comparison, or maybe recommends one over the other?  

Thanks and best regards,  

Bill  

Valued Quality. Delivered.



--

http://www.freelists.org/webpage/oracle-l

Received on Thu Nov 03 2016 - 06:37:52 CET
