
Tips for migration of Records Scaling to 200 Million Rows

From: <seshan.parameswaran_at_gmail.com>
Date: Thu, 29 Nov 2007 13:47:20 -0800 (PST)
Message-ID: <cdda1313-c2b3-46ac-81e9-aad702e6637a@e67g2000hsc.googlegroups.com>


Hello Group

                     Excuse the blast, please. I am seeking an expert
opinion on migrating data from an existing Oracle RDBMS to another. The existing system is proprietary and can only be accessed through the Java API it provides, so I am working on a migration plan around that.

My plan is to use the Java API to extract the essential data, generate SQL files from it, and then load those files directly into the new Oracle system with the NOLOGGING option to minimize redo generation. In doing so, I want to use partitioning as much as possible, since I will be dealing with records spanning 30 years.

I am aware that to update a large number of rows it is advisable to use partitioning: create a new temp table, insert into it the data from the partition you plan to update, apply the update to the temp table, and then replace the partition with the updated rows. Is there a similar approach that can be used for inserting new records as well? In particular, as new rows are inserted the table tends to grow and subsequent inserts become slower.

I am also investigating the option of using Oracle Data Pump, driven from a Java API. I have a limited timeframe of one month to perform this migration, so I am looking for expert advice on the most feasible approach for this scenario. Rough SQL sketches of the load and of the partition-exchange step I have in mind are below. Thanks in advance for all inputs.
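
Below is a rough sketch of the load I have in mind. The table, column, and partition names (txn_history, txn_stage, and so on) are made up purely for illustration. The target is a yearly range-partitioned table created NOLOGGING, and the extracted data is staged and then inserted with the APPEND hint, since NOLOGGING only cuts redo for direct-path operations.

CREATE TABLE txn_history (
    txn_id    NUMBER        NOT NULL,
    txn_date  DATE          NOT NULL,
    payload   VARCHAR2(400)
)
NOLOGGING
PARTITION BY RANGE (txn_date) (
    PARTITION p1978 VALUES LESS THAN (DATE '1979-01-01'),
    PARTITION p1979 VALUES LESS THAN (DATE '1980-01-01'),
    -- ... one partition per year ...
    PARTITION p2007 VALUES LESS THAN (DATE '2008-01-01'),
    PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- txn_stage is assumed to already hold the extracted rows
-- (loaded from the generated SQL files or via SQL*Loader).
-- The APPEND hint requests a direct-path insert, so the NOLOGGING
-- attribute takes effect and redo for the table data stays minimal.
INSERT /*+ APPEND */ INTO txn_history (txn_id, txn_date, payload)
SELECT txn_id, txn_date, payload
FROM   txn_stage;

COMMIT;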

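And here is a sketch of the exchange-partition approach for bulk updates that I described above, again with made-up names (p1998 is one of the yearly partitions from the sketch). The idea is to rebuild one year's rows in a standalone work table with the change applied in bulk, then swap the segments instead of updating the rows in place.

-- 1. Build a work table holding the partition's rows with the change
--    already applied (CTAS is itself a direct-path operation).
CREATE TABLE txn_1998_work NOLOGGING AS
SELECT txn_id,
       txn_date,
       UPPER(payload) AS payload   -- stand-in for whatever bulk change is needed
FROM   txn_history PARTITION (p1998);

-- 2. Recreate any local indexes and constraints on the work table so it
--    matches the partitioned table, then swap the two segments.
ALTER TABLE txn_history
    EXCHANGE PARTITION p1998 WITH TABLE txn_1998_work
    WITHOUT VALIDATION;

-- 3. The work table now holds the old data and can be dropped.
DROP TABLE txn_1998_work;
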
Seshan

Received on Thu Nov 29 2007 - 15:47:20 CST

