RE: copy large amounts of data to MSSQL

From: Mark W. Farnham <mwf_at_rsiz.com>
Date: Fri, 6 Nov 2020 08:59:50 -0500
Message-ID: <326101d6b445$18dd2ea0$4a978be0$_at_rsiz.com>



This sounds useful. If you need to keep track of what was pushed and what was loaded, you probably want the name or number of the file on each row in the arrival loading table. That makes it easy to confirm from your side that a file has arrived, and you can verify with control totals on the rows you think you sent (far less traffic than scrutinizing each row). If the totals don't match, you also have an easy way to query back what did arrive, so you know exactly what is missing. Good luck.
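The control-total check described above could be sketched roughly as follows (a hypothetical Python sketch; the two dicts would come from a per-file row count query on each side, and the file names are made up for illustration):

```python
# Hypothetical sketch: verify that every file pushed was loaded by
# comparing per-file row counts (control totals) instead of row-by-row.
# In practice sent_totals/arrived_totals would come from GROUP BY
# queries against the Oracle side and the MSSQL arrival-loading table.

def find_discrepancies(sent_totals, arrived_totals):
    """Return (missing_files, mismatched_files) given per-file counts."""
    missing = sorted(set(sent_totals) - set(arrived_totals))
    mismatched = sorted(
        f for f in sent_totals
        if f in arrived_totals and sent_totals[f] != arrived_totals[f]
    )
    return missing, mismatched

# Illustration: file_3 never arrived, file_2 arrived short.
sent = {"file_1": 5000, "file_2": 5000, "file_3": 4200}
arrived = {"file_1": 5000, "file_2": 4990}
missing, mismatched = find_discrepancies(sent, arrived)
# missing == ["file_3"], mismatched == ["file_2"]
```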

From: oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Shane Borden (Redacted sender "sborden76" for DMARC)
Sent: Friday, November 06, 2020 8:11 AM
To: backseatdba_at_gmail.com
Cc: Frank Gordon; Mladen Gogala; oracle-l-freelist
Subject: Re: copy large amounts of data to MSSQL

Just do a dump to a delimited file and use the BCP tool to load it.
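The dump half of that approach might look something like this (a minimal sketch using only the Python standard library; the rows would really come from an Oracle cursor, and the table/server names in the bcp comment are placeholders):

```python
import csv
import io

# Minimal sketch of the "dump to a delimited file" half.  In practice
# the rows would come from an Oracle cursor fetch; a plain list stands
# in here.  bcp's character-mode defaults are tab field delimiter and
# newline row terminator, so we match those.
def dump_rows(rows, out):
    writer = csv.writer(out, delimiter="\t", lineterminator="\n",
                        quoting=csv.QUOTE_NONE, escapechar="\\")
    for row in rows:
        writer.writerow(row)

rows = [(1, "widget", 9.99), (2, "gadget", 4.50)]
buf = io.StringIO()
dump_rows(rows, buf)

# The file would then be loaded on the SQL Server side with something
# like (all names are placeholders, not from the thread):
#   bcp MyDb.dbo.MyTable in dump.tsv -c -S myserver -U myuser -P ...
```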

Shane Borden

sborden76_at_yahoo.com

Sent from my iPhone

On Nov 6, 2020, at 7:39 AM, Jeff Chirco <backseatdba_at_gmail.com> wrote:



Thanks for the ideas. We want to avoid giving the cloud instance a direct connection to our database, hence why we want to push instead. Plus that works out better for our users: when they are done processing orders, they initiate the send to the cloud vendor. This will be a multi-day event to push to MSSQL.

I've been testing with 5000 rows; right now it is taking 2.5 minutes, down from 6-8 minutes before. I believe the latency to OCI is fluctuating. Currently I am getting 14ms ping times, but it is early in the morning here in California as I am working on a 19c upgrade. We've had ping latency in the upper 60ms range, and we've experienced much worse around 8:30am, with packet loss.

I was thinking the same thing, that a Perl or Python script might be faster. Does anyone happen to have a sample of that?
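A batched Python approach might look roughly like this (an untested sketch, assuming the third-party python-oracledb and pyodbc packages; the connection strings, table, and column names are all placeholders). The key point is pyodbc's `fast_executemany`, which sends each batch as a single bulk round trip rather than one network round trip per row, which is what matters over a high-latency link:

```python
def chunked(rows, size):
    """Yield lists of up to `size` rows from any iterable of rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def copy_table(ora_conn_str, mssql_conn_str, batch_size=5000):
    # Untested sketch, assuming python-oracledb and pyodbc are
    # installed; table and column names below are illustrative only.
    import oracledb   # pip install oracledb
    import pyodbc     # pip install pyodbc

    src = oracledb.connect(ora_conn_str)
    dst = pyodbc.connect(mssql_conn_str)
    dst_cur = dst.cursor()
    # One bulk round trip per batch instead of one per row.
    dst_cur.fast_executemany = True

    src_cur = src.cursor()
    src_cur.execute("SELECT col1, col2, col3 FROM my_table")
    for batch in chunked(src_cur, batch_size):
        dst_cur.executemany(
            "INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)",
            batch,
        )
        dst.commit()
```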

On Fri, Nov 6, 2020 at 1:35 AM Frank Gordon <frankagordon_at_gmail.com> wrote:

Hello,  

Have you tried doing a pull from the MS-SQL-Server side?

How much data in the table?

How "far" apart are the Oracle and MS sides?

Have you tried sending one row and timing that?

Look at the MS-SQL-Server function OPENQUERY.

Regards,

Frank    

On Fri, Nov 6, 2020 at 12:31 AM Jeff Chirco <backseatdba_at_gmail.com> wrote:

We have a new project where we need to copy a large dataset from Oracle (on prem) to SQL Server (cloud).  

I found that a simple

insert into table_at_mssql

select * from table;  

Returns ORA-02025: all tables in the SQL statement must be at the remote database  

I am trying out Oracle's Database Gateway for SQL Server, because the documentation suggests its Remote Insert Rowsource feature would support this:

https://docs.oracle.com/en/database/oracle/oracle-database/12.2/gmswn/database-gateway-for-sqlserver-features.html#GUID-513FBA3C-3458-4129-93E4-38DB2DF97F7A  

However I get the same error. Does anyone know if this should work?  

We've converted it to a PL/SQL loop inserting row by row but 5000 rows is taking 8 minutes. Oh and the funny thing is that this SQL Server instance is running on Oracle Cloud Infrastructure.  

Any other ways you can think of to do this quick? 8 minutes is killer.  

Thanks,

Jeff  

-- 

+353-86-0695383




--
http://www.freelists.org/webpage/oracle-l
Received on Fri Nov 06 2020 - 14:59:50 CET
