Creating a SQL Loader ctl file with over 1000 columns (Oracle Database 10g Enterprise Edition Release 10.2.0.4.0)
Creating a SQL Loader ctl file with over 1000 columns [message #484921] Fri, 03 December 2010 14:53
rappaj
Messages: 147
Registered: June 2010
Location: NYC
Senior Member
Hi,

Next week I will be getting an input file which will contain over 1000 data columns to be loaded into ORACLE. It's about 6,400 characters in length.

My question is: has anyone ever created a huge ctl file like this for SQL*Loader, using so many columns? I will be sending certain columns to certain tables, so the data is not all going into one table; it will be about 6 tables.

Thanks,
Joe


Re: Creating a SQL Loader ctl file with over 1000 columns [message #484922 is a reply to message #484921] Fri, 03 December 2010 14:54
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
You should consider using an EXTERNAL TABLE instead.

It would be helpful if you followed Posting Guidelines - http://www.orafaq.com/forum/t/88153/0/
Re: Creating a SQL Loader ctl file with over 1000 columns [message #484924 is a reply to message #484922] Fri, 03 December 2010 15:05
rappaj
Thanks BlackSwan.

I'll read up on what that is this weekend. It's Friday and the whistle has blown.

Have a great weekend....
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485201 is a reply to message #484924] Mon, 06 December 2010 14:22
rappaj
Hi,

I read up on EXTERNAL tables a little and I'm still not sure which way to go. I found out today that there are 1,165 data columns that will be sent to me at the end of this month, totalling 6,400 bytes per record.

The txt file will be sent in tilde-delimited format, and certain columns will be placed into one of 5 different tables.

Has anyone ever had a control file with around 1,000 columns to load or insert into multiple tables, and does anyone know how fast it will be to load about 18,000 records like this? I don't think I've ever loaded more than about 60 columns before.

Thank you

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485202 is a reply to message #485201] Mon, 06 December 2010 14:26
BlackSwan
>Columns per table: 1000 columns maximum

http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/limits003.htm#i288032
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485203 is a reply to message #485202] Mon, 06 December 2010 14:33
rappaj
Yes, the DBA just told me that 1000 columns is the max as well. So that's why I guess it's best to use SQL*Loader and for me to split the file into 5 different tables, along with a primary key (in this case a unique ID number) for joining.

Have you ever done something like this before with such a large number of columns? I just don't want to set the server on fire with this one every 2 weeks.

Thanks for your input... Checking out your link now...
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485204 is a reply to message #485203] Mon, 06 December 2010 14:39
BlackSwan
>Have you ever done something like this before with such a large amount of columns?
If data was NORMALIZED, most likely the limit would NEVER be reached.
If I were in your position, I'd write a custom "script" that would carve up the file into multiple smaller chunks.

>So that's why I guess it's best to use SQLldr
AFAIK, for V10+ SQLLDR & EXTERNAL table share a common API.
I expect you'll hit the same column limit with SQLLDR.
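The carve-up suggestion above can be sketched quickly. The following is a hypothetical Python splitter, not the actual feed logic: the column ranges, output file names, and key position are invented for illustration. It reads a tilde-delimited file and writes the key column plus a slice of the fields to one smaller file per target table, so each output stays under the 1,000-column limit and the rows can be joined back on the key.

```python
import csv

# Hypothetical column groupings (slices into the field list); the real
# ranges would come from the record layout, not from this example.
TABLE_SLICES = {
    "taba.txt": slice(1, 10),    # fields 2-10 go to the first table
    "tabb.txt": slice(10, 16),   # fields 11-16 go to the second, etc.
}

def split_file(infile, key_index=0, delim="~"):
    """Carve one wide delimited file into one smaller file per table,
    repeating the key field (e.g. the unique ID) in each output so the
    loaded rows can be joined later."""
    outs = {name: open(name, "w") for name in TABLE_SLICES}
    try:
        with open(infile, newline="") as f:
            for fields in csv.reader(f, delimiter=delim):
                key = fields[key_index]
                for name, sl in TABLE_SLICES.items():
                    outs[name].write(delim.join([key] + fields[sl]) + "\n")
    finally:
        for fh in outs.values():
            fh.close()
```

Each smaller file could then be loaded with its own simple control file, instead of one giant ctl full of filler fields.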
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485205 is a reply to message #485204] Mon, 06 December 2010 14:42
rappaj
>>>If I were in your position, I'd write a custom "script" that would carve up the file into multiple smaller chunks.

That's a good idea. I'll discuss this with the other guys here.

Thanks
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485211 is a reply to message #485205] Mon, 06 December 2010 19:23
Barbara Boehmer
Messages: 9101
Registered: November 2002
Location: California, USA
Senior Member
With an external table, your data file must be on the server, and you have to load the whole thing into one external table, then use insert into your_tables ... select .. from your_external_table where ... to load the data into your tables. Obviously, with more than 1,000 columns you cannot do this, since you cannot create an external table with that many columns.

Presumably, since you cannot have a table with more than 1,000 columns, not every column is going into every table. With SQL*Loader, your data file can be on the server or client and it can handle more than 1,000 columns, using when conditions to distribute the appropriate columns into the appropriate tables. SQL*Loader is fast and can handle 18,000 records easily.

I have provided a script below to demonstrate how SQL*Loader can be used for something like this. If you copy and paste, save, and run the whole script, changing scott/tiger to your username and password, it will create a test.dat data file of four rows of data with 1,165 columns delimited by ~, with each row 5,884 characters long. It will also create a test.ctl control file that loads rows where the first character of the first column is 'a' into taba and rows where the first character of the first column is 'b' into tabb, with each record terminated by 'x'. This is just an example; your conditions for your when clauses and record terminator may be different.

The control file uses filler fields to skip the columns that are not needed for the specific table. The script creates a taba and a tabb and loads the first 3 columns into taba and the last 3 columns into tabb. It drops the tables, so that you are not left with test tables on your system and can re-run the script. It then allows you to view, in spooled files, the results in test.txt, the log file in test.log, the control file in test.ctl, and the data file in test.dat.

Here is the script:
store set saved_settings replace
set echo off feedback off heading off pagesize 0 serveroutput on verify off
spool test.dat
declare
  v_clob  clob;
begin
  v_clob := null;
  for i in 1 .. 1165 loop
    v_clob := v_clob || 'a' || i || '~';    
  end loop;
  v_clob := v_clob || 'x';
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in reverse 1 .. 1165 loop
    v_clob := v_clob || 'b' || i || '~';    
  end loop;
  v_clob := v_clob || 'x';
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in reverse 1 .. 1165 loop
    v_clob := v_clob || 'a' || i || '~';    
  end loop;
  v_clob := v_clob || 'x';
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in 1 .. 1165 loop
    v_clob := v_clob || 'b' || i || '~';    
  end loop;
  v_clob := v_clob || 'x';
  dbms_output.put_line (v_clob);
end;
/
spool off 
spool test.ctl
prompt load data
prompt infile test.dat
prompt continueif last != 'x'
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
prompt  col2, 
prompt  col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1), 
select 'col' || to_char (1 + rownum) || ' filler,'
from   dual
connect by level <= 1161
/
prompt  col1163, 
prompt  col1164, 
prompt  col1165)
spool off
start saved_settings
spool test.txt
create table taba
  (col1  varchar2 (5),
   col2  varchar2 (5),
   col3  varchar2 (5))
/
create table tabb
  (col1163  varchar2 (5),
   col1164  varchar2 (5),
   col1165  varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat


and here is the result you should get in test.txt:
SCOTT@orcl_11gR2> create table taba
  2    (col1  varchar2 (5),
  3  	col2  varchar2 (5),
  4  	col3  varchar2 (5))
  5  /

Table created.

SCOTT@orcl_11gR2> create table tabb
  2    (col1163  varchar2 (5),
  3  	col1164  varchar2 (5),
  4  	col1165  varchar2 (5))
  5  /

Table created.

SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log

SCOTT@orcl_11gR2> select * from taba
  2  /

COL1  COL2  COL3
----- ----- -----
a1    a2    a3
a1165 a1164 a1163

2 rows selected.

SCOTT@orcl_11gR2> select * from tabb
  2  /

COL11 COL11 COL11
----- ----- -----
b3    b2    b1
b1163 b1164 b1165

2 rows selected.

SCOTT@orcl_11gR2> spool off



your control file will look like this:
load data
infile test.dat
continueif last != 'x'
into table taba
when (1) = 'a'
fields terminated by '~'
trailing nullcols
(col1 position (1),
col2,
col3)
into table tabb
when (1) = 'b'
fields terminated by '~'
trailing nullcols
(col1 filler position (1),
col2 filler,
col3 filler,
col4 filler,
... 
col1160 filler,
col1161 filler,
col1162 filler,
col1163,
col1164,
col1165)




Re: Creating a SQL Loader ctl file with over 1000 columns [message #485214 is a reply to message #485211] Mon, 06 December 2010 20:05
Barbara Boehmer
Here is another script that does not use a record terminator, but instead assumes that your linesize is big enough that each row of data is in one row in your test.dat file.

store set saved_settings replace
set echo off feedback off heading off linesize 6000 pagesize 0 serveroutput on verify off
spool test.dat
declare
  v_clob  clob;
begin
  v_clob := null;
  for i in 1 .. 1165 loop
    v_clob := v_clob || 'a' || i || '~';    
  end loop;
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in reverse 1 .. 1165 loop
    v_clob := v_clob || 'b' || i || '~';    
  end loop;
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in reverse 1 .. 1165 loop
    v_clob := v_clob || 'a' || i || '~';    
  end loop;
  dbms_output.put_line (v_clob);
  v_clob := null;
  for i in 1 .. 1165 loop
    v_clob := v_clob || 'b' || i || '~';    
  end loop;
  dbms_output.put_line (v_clob);
end;
/
spool off 
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
prompt  col2, 
prompt  col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1), 
select 'col' || to_char (1 + rownum) || ' filler,'
from   dual
connect by level <= 1161
/
prompt  col1163, 
prompt  col1164, 
prompt  col1165)
spool off
start saved_settings
spool test.txt
create table taba
  (col1  varchar2 (5),
   col2  varchar2 (5),
   col3  varchar2 (5))
/
create table tabb
  (col1163  varchar2 (5),
   col1164  varchar2 (5),
   col1165  varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485250 is a reply to message #485214] Tue, 07 December 2010 01:00
Littlefoot
Messages: 21823
Registered: June 2005
Location: Croatia, Europe
Senior Member
Account Moderator
Just being curious: what information is stored within those "more than 1000 columns"? Must be something very complex (at least, I'd say so).
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485275 is a reply to message #485250] Tue, 07 December 2010 03:53
Barbara Boehmer
If each row of 1,165 columns is about 6,400 characters in length, then they are very short values.
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485320 is a reply to message #485250] Tue, 07 December 2010 08:00
rappaj
Littlefoot, No not so complex, just a ton of employee information. For many years this data has been going into a MAINFRAME system, but early next year they are pulling the plug on that dinosaur. So the next best thing is to get the data into Oracle, which we have here as well. From there I will create a couple of screens using Oracle Application Express for the users to view.

Barbara, Wow. You did a lot of work, and thanks for the test scripts and for your advice. I'd much rather use SQLLDR since I am familiar with it and am glad that it will work. You're right, the values are very short for each column.

I am going to copy and paste your code to give it a try and will let you know when I am finished with it.

Thank you all so much for your help !!

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485368 is a reply to message #485211] Tue, 07 December 2010 13:53
rappaj
Barbara,

Nice script. It created the ctl file really fast. However, something is wrong on my end. Can I specify somewhere in the code for the DBMS_OUTPUT buffer to be more than 2000 bytes?

SQLPlus would hang on me, so I had to X out of it. This was in the test.dat file:

declare
*
ERROR at line 1:
ORA-20000: ORU-10027: buffer overflow, limit of 2000 bytes 
ORA-06512: at "SYS.DBMS_OUTPUT", line 32 
ORA-06512: at "SYS.DBMS_OUTPUT", line 97 
ORA-06512: at "SYS.DBMS_OUTPUT", line 112 
ORA-06512: at line 9 



Thanks,
Joe
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485370 is a reply to message #485368] Tue, 07 December 2010 13:57
BlackSwan
DBMS_OUTPUT.ENABLE(1000000);
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485372 is a reply to message #485370] Tue, 07 December 2010 14:09
Barbara Boehmer
BlackSwan wrote on Tue, 07 December 2010 11:57

DBMS_OUTPUT.ENABLE(1000000);


or, where it says:

set serveroutput on verify off

in the second line of the script, replace it with:

set serveroutput on size 1000000 verify off


Re: Creating a SQL Loader ctl file with over 1000 columns [message #485479 is a reply to message #485372] Wed, 08 December 2010 08:28
rappaj
I am not having any luck with either of the new changes. I set the size to a million, and put the DBMS_OUTPUT enable in several places before that. I still can't get the test.dat file created.

With the serveroutput being set, this is the error msg I get. I didn't see any problems with bind variables, or quotes...

ERROR:
ORA-06502: PL/SQL: numeric or value error: host bind array too small 
ORA-06512: at line 1 


...but it's ok. I know the problem must be over here and not the code. I get what Barbara has written and appreciate the effort. If it's an easy fix, again, I'd appreciate the help. If not, it's ok.

This was where I set the output size, plus the next few lines:
store set saved_settings replace
set echo off feedback off heading off pagesize 0     serveroutput on size 1000000 verify off
spool c:\test.dat
declare
  v_clob  clob;
begin
  v_clob := null;
  for i in 1 .. 1165 loop


Thank you
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485488 is a reply to message #485479] Wed, 08 December 2010 09:35
Barbara Boehmer
I think the difference is due to versions. I am using 11g and you are using 10g. I think in 10g dbms_output could not handle a clob. Try the following script instead.

store set saved_settings replace
set echo off feedback off heading off linesize 6000 pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
spool off 
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
prompt  col2, 
prompt  col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1), 
select 'col' || to_char (1 + rownum) || ' filler,'
from   dual
connect by level <= 1161
/
prompt  col1163, 
prompt  col1164, 
prompt  col1165)
spool off
start saved_settings
spool test.txt
create table taba
  (col1  varchar2 (5),
   col2  varchar2 (5),
   col3  varchar2 (5))
/
create table tabb
  (col1163  varchar2 (5),
   col1164  varchar2 (5),
   col1165  varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485515 is a reply to message #485488] Wed, 08 December 2010 11:49
Barbara Boehmer
Here is another revision. I added long 10000 and longchunksize 10000 to the set commands, because they affect the size of the clob that you are able to select. I also added a set echo on before spooling test.txt.

store set saved_settings replace
set echo off feedback off heading off linesize 6000 long 10000 longchunksize 10000 
set pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
spool off 
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
prompt  col2, 
prompt  col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1), 
select 'col' || to_char (1 + rownum) || ' filler,'
from   dual
connect by level <= 1161
/
prompt  col1163, 
prompt  col1164, 
prompt  col1165)
spool off
start saved_settings
set echo on
spool test.txt
create table taba
  (col1  varchar2 (5),
   col2  varchar2 (5),
   col3  varchar2 (5))
/
create table tabb
  (col1163  varchar2 (5),
   col1164  varchar2 (5),
   col1165  varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485516 is a reply to message #485515] Wed, 08 December 2010 12:12
rappaj
Hi Barbara,

I was not having much success with the other 2 scripts above, but this one worked much better and it's almost there. It created the .dat and .ctl files just right.

Nothing in TABLEB was loaded though. Not sure why. I've attached the log file to show that the 2 rows failed.

Thank you

  • Attachment: testlog.log
    (Size: 114.82KB, Downloaded 1762 times)
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485517 is a reply to message #485516] Wed, 08 December 2010 12:25
rappaj
I have another question when using this ctl file to load into multiple tables.

Let's say, for example, in your TABLEA you have your employee info including the ssn, and in TABLEB other info like the employee address. Later, I would need to join the 2 tables to get them together. Can I set up the ctl file to also include the ssn to be put into TABLEB using a delimited ctl file? Or can that only happen with a ctl file using the positional method?
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485522 is a reply to message #485516] Wed, 08 December 2010 12:35
Barbara Boehmer
I seem to recall that in earlier versions, the column to be checked needed to be in the table, not a filler field, so perhaps that is the problem. I added col1 to tabb in the revised script below. If that doesn't work, then I am running out of ideas.

store set saved_settings replace
set echo off feedback off heading off linesize 6000 long 10000 longchunksize 10000 
set pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in reverse 1 .. 1165 loop
    :v_clob := :v_clob || 'a' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
begin
  :v_clob := null;
  for i in 1 .. 1165 loop
    :v_clob := :v_clob || 'b' || i || '~';    
  end loop;
end;
/
select :v_clob from dual
/
spool off 
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
prompt  col2, 
prompt  col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1), 
select 'col' || to_char (1 + rownum) || ' filler,'
from   dual
connect by level <= 1161
/
prompt  col1163, 
prompt  col1164, 
prompt  col1165)
spool off
start saved_settings
set echo on
spool test.txt
create table taba
  (col1  varchar2 (5),
   col2  varchar2 (5),
   col3  varchar2 (5))
/
create table tabb
  (col1     varchar2 (5),
   col1163  varchar2 (5),
   col1164  varchar2 (5),
   col1165  varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485529 is a reply to message #485517] Wed, 08 December 2010 12:47
Barbara Boehmer
rappaj wrote on Wed, 08 December 2010 10:25

I have another question when using this ctl file to load into multiple tables.

Let's say, for example, in your TABLEA you have your employee info including the ssn, and in TABLEB other info like the employee address. Later, I would need to join the 2 tables to get them together. Can I set up the ctl file to also include the ssn to be put into TABLEB using a delimited ctl file? Or can that only happen with a ctl file using the positional method?


No problem. Please see the demo below.

-- test.dat:
7782~CLARK~NEW YORK
7839~KING~NEW YORK
7934~MILLER~NEW YORK


-- test.ctl:
load data
infile test.dat
into table taba
fields terminated by '~'
trailing nullcols
(ssn position (1),
name)
into table tabb
fields terminated by '~'
trailing nullcols
(ssn position (1),
name filler,
address)


-- load and results:
SCOTT@orcl_11gR2> create table taba
  2    (ssn	 varchar2 (5),
  3  	name	 varchar2 (15))
  4  /

Table created.

SCOTT@orcl_11gR2> create table tabb
  2    (ssn	 varchar2 (5),
  3  	address  varchar2 (15))
  4  /

Table created.

SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log

SCOTT@orcl_11gR2> select * from taba order by ssn
  2  /

SSN   NAME
----- ---------------
7782  CLARK
7839  KING
7934  MILLER

3 rows selected.

SCOTT@orcl_11gR2> select * from tabb order by ssn
  2  /

SSN   ADDRESS
----- ---------------
7782  NEW YORK
7839  NEW YORK
7934  NEW YORK

3 rows selected.

SCOTT@orcl_11gR2>

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485530 is a reply to message #485522] Wed, 08 December 2010 12:49
rappaj
Barbara,

So close and almost perfect. TABLEB is still empty, though. But believe me, what you have done so far was excellent. It was a lot of work to create a test scenario for me, and I see that it can work loading 1,165 columns split into several tables.

They will be sending me this file towards the end of the month so I have plenty of time to continue testing and playing around with it.

I think we can close this topic now if you want. I appreciate all the help. But if you want me to test any other changes, I'd be glad to.

Thank you
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485535 is a reply to message #485529] Wed, 08 December 2010 12:55
rappaj
Ahhh. Ok. So position(1) is the key to both tables. Got it.

Is the NAME FILLER required in TABB in the ctl file?

into table tabb
fields terminated by '~'
trailing nullcols
(ssn position (1),
name filler,
address)


Re: Creating a SQL Loader ctl file with over 1000 columns [message #485537 is a reply to message #485535] Wed, 08 December 2010 13:00
Barbara Boehmer
Quote:

Ahhh. Ok. So position(1) is the key to both tables. Got it.


Position(1) is required for the first column following any when clause after the first when clause. Without it, it does not know where to reposition.

Quote:

Is the NAME FILLER required in TABLEB?

into table tabb
fields terminated by '~'
trailing nullcols
(ssn position (1),
name filler,
address)


Yes. Any columns, prior to the last column loaded, that are not to be loaded into that table, must be marked as filler columns. Otherwise, it would load the name field into the address column of tableb.
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485540 is a reply to message #485537] Wed, 08 December 2010 13:08
rappaj
Ok, I understand. Thanks for the SQLLdr lesson over the last few days.

It looks like with me using that ssn position (1), inserting into 4 different tables, I will be having a LOT of fillers then.

Thanks,
Joe
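Rather than typing hundreds of filler lines by hand, the long filler lists can be generated by a script, in the same spirit as the connect-by trick used earlier in the thread. This is a hypothetical Python sketch; the table name, field names, and column groupings are made up, and the real layout would drive the ranges.

```python
def make_into_table(table, fields, want_from, want_to, key="ssn"):
    """Emit one INTO TABLE clause for a tilde-delimited control file:
    the key field first with position(1), every field before the wanted
    range marked FILLER, and fields after the range simply omitted."""
    lines = [
        "into table %s" % table,
        "when %s != ' '" % key,
        "fields terminated by '~'",
        "trailing nullcols",
        "(%s position(1)," % key,
    ]
    for i, name in enumerate(fields):
        if i < want_from:
            lines.append("%s filler," % name)     # skip this field
        elif want_from <= i <= want_to:
            lines.append("%s," % name)            # load this field
    lines[-1] = lines[-1].rstrip(",") + ")"       # close the field list
    return "\n".join(lines)
```

Calling it once per target table and concatenating the clauses after a single `load data` / `infile` header would produce the whole multi-table control file.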
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485541 is a reply to message #485540] Wed, 08 December 2010 13:13
rappaj
P.S. Looking at this example I found on the net, in your opinion, would you say it would be easier for me to get the file ~ delimited, or positional? What would be your preference?


This sample below doesn't use fillers when using positional.

LOAD DATA
INFILE /u01/app/oracle/load.dat
REPLACE
INTO TABLE emp
   WHEN empno != ' '
   ( empno POSITION(1:4) INTEGER EXTERNAL,
     ename POSITION(6:15) CHAR
)
INTO TABLE proj
WHEN projno != ' '
   ( projno POSITION(25:27) INTEGER EXTERNAL,
     empno POSITION(1:4) INTEGER EXTERNAL
)
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485548 is a reply to message #485541] Wed, 08 December 2010 13:55
Barbara Boehmer
The ssn position(1) is for when the ssn is the first column in your data file. The position(1) needs to go with the first column, whatever it is.

Whether you use delimited or fixed position depends on your data. If your data is such that you are certain that your delimiter will never be within your data, then a delimited file is easier to create a control file for and you can generate a bunch of filler fields as I did. If your delimiter may be present in your data, then a fixed position file is safest, but it will be a major pain counting the columns and trying to get it right for a row that long. With a row that long, I would almost certainly go for a delimited data file, then after loading, check the bad file, remove any delimiters using an editor like notepad, then reload the fixed bad file.
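If the delimited route is taken, a quick pre-load check can flag the rows headed for the bad file before the load even runs: every clean record should split into the same number of fields (1,165 here, or 1,166 if each field including the last is terminated by a tilde; adjust the expected count to match the feed). A minimal sketch, assuming one record per line:

```python
def find_bad_lines(path, expected_fields, delim="~"):
    """Return (line_number, field_count) for each record whose field
    count differs from expected_fields -- a sign that the delimiter
    appears inside the data or the record was truncated."""
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            # count delimiters; n fields are separated by n-1 delimiters
            n = line.rstrip("\n").count(delim) + 1
            if n != expected_fields:
                bad.append((lineno, n))
    return bad
```

Running this on each delivery would show at a glance whether the tilde ever leaked into the data, which is the main risk Barbara describes with the delimited approach.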
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485698 is a reply to message #485548] Thu, 09 December 2010 09:00
rappaj
Thanks Barbara,

The delimited file would be best then.

Thanks again for all your work and test scenarios for me. That goes for BlackSwan and LittleFoot as well.

Joe
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485716 is a reply to message #485548] Thu, 09 December 2010 10:08
rappaj
Hi,

Am I doing this correctly with the FILLERs in the sample below? Do my WHEN clauses look OK?

For each record
- I want to Load the ssn and columns cola-colj into TABLEAAA
- I want to Load the ssn and columns colk-colp into TABLEBBB
- I want to Load the ssn and columns colq-colt into TABLECCC
- I want to Load the ssn and columns colu-colz into TABLEDDD

load data
infile 'employee.txt'
append
into table tableAAA
when ssn != ' '
fields terminated by '~' 
trailing nullcols 
(
ssn                   position(1),     
cola,
colb,
colc,
cold,
cole,
colf,
colg,
colh,
coli,
colj
INTO TABLE tableBBB
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk,
coll,
colm,
coln,
colo,
colp,
INTO TABLE tableCCC
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq,
colr
cols,
colt,
INTO TABLE tableDDD
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq filler,
colr filler,
cols filler,
colt filler,
COLu,
colv,
colw,
colx,
coly,
colz
)


Thank you

[Updated on: Thu, 09 December 2010 10:42]


Re: Creating a SQL Loader ctl file with over 1000 columns [message #485731 is a reply to message #485716] Thu, 09 December 2010 12:10
Barbara Boehmer
You need to enclose each group of fields after each table within parentheses. So you need a left parenthesis before each ssn, and a right parenthesis after colj under tableaaa, after colp under tablebbb, and after colt under tableccc. You are also missing a comma after colr under tableccc. This assumes that your data matches; I don't have your data, and I don't have your tables either. It would help if you would supply create table statements and sample data. What you should do is get or make some test data, load it, check the log, and work through the errors one at a time. After fixing the parentheses and the comma, here is what I got with some made-up data and tables.

-- employee.txt:
1~cola~colb~colc~cold~cole~colf~colg~colh~coli~colj~colk~coll~colm~coln~colo~colp~colq~colr~cols~colt~colu~colv~colw~colx~coly~colz
2~COLA~COLB~COLC~COLD~COLE~COLF~COLG~COLH~COLI~COLJ~COLK~COLL~COLM~COLN~COLO~COLP~COLQ~COLR~COLS~COLT~COLU~COLV~COLW~COLX~COLY~COLZ


-- test.ctl:
load data
infile 'employee.txt'
append
into table tableAAA
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn                   position(1),
cola,
colb,
colc,
cold,
cole,
colf,
colg,
colh,
coli,
colj)
INTO TABLE tableBBB
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk,
coll,
colm,
coln,
colo,
colp)
INTO TABLE tableCCC
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq,
colr,
cols,
colt)
INTO TABLE tableDDD
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn                   position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq filler,
colr filler,
cols filler,
colt filler,
COLu,
colv,
colw,
colx,
coly,
colz)


-- create tables:
SCOTT@orcl_11gR2> create table tableaaa
  2    (ssn    number,
  3  	cola   varchar2 (5),
  4  	colb   varchar2 (5),
  5  	colc   varchar2 (5),
  6  	cold   varchar2 (5),
  7  	cole   varchar2 (5),
  8  	colf   varchar2 (5),
  9  	colg   varchar2 (5),
 10  	colh   varchar2 (5),
 11  	coli   varchar2 (5),
 12  	colj   varchar2 (5))
 13  /

Table created.

SCOTT@orcl_11gR2> create table tablebbb
  2    (ssn    number,
  3  	colk   varchar2 (5),
  4  	coll   varchar2 (5),
  5  	colm   varchar2 (5),
  6  	coln   varchar2 (5),
  7  	colo   varchar2 (5),
  8  	colp   varchar2 (5))
  9  /

Table created.

SCOTT@orcl_11gR2> create table tableccc
  2    (ssn    number,
  3  	colq   varchar2 (5),
  4  	colr   varchar2 (5),
  5  	cols   varchar2 (5),
  6  	colt   varchar2 (5))
  7  /

Table created.

SCOTT@orcl_11gR2> create table tableddd
  2    (ssn    number,
  3  	colu   varchar2 (5),
  4  	colv   varchar2 (5),
  5  	colw   varchar2 (5),
  6  	colx   varchar2 (5),
  7  	coly   varchar2 (5),
  8  	colz   varchar2 (5))
  9  /

Table created.


-- load:
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log


-- results:
SCOTT@orcl_11gR2> select * from tableaaa
  2  /

       SSN COLA  COLB  COLC  COLD  COLE  COLF  COLG  COLH  COLI  COLJ
---------- ----- ----- ----- ----- ----- ----- ----- ----- ----- -----
         1 cola  colb  colc  cold  cole  colf  colg  colh  coli  colj
         2 COLA  COLB  COLC  COLD  COLE  COLF  COLG  COLH  COLI  COLJ

2 rows selected.

SCOTT@orcl_11gR2> select * from tablebbb
  2  /

       SSN COLK  COLL  COLM  COLN  COLO  COLP
---------- ----- ----- ----- ----- ----- -----
         1 colk  coll  colm  coln  colo  colp
         2 COLK  COLL  COLM  COLN  COLO  COLP

2 rows selected.

SCOTT@orcl_11gR2> select * from tableccc
  2  /

       SSN COLQ  COLR  COLS  COLT
---------- ----- ----- ----- -----
         1 colq  colr  cols  colt
         2 COLQ  COLR  COLS  COLT

2 rows selected.

SCOTT@orcl_11gR2> select * from tableddd
  2  /

       SSN COLU  COLV  COLW  COLX  COLY  COLZ
---------- ----- ----- ----- ----- ----- -----
         1 colu  colv  colw  colx  coly  colz
         2 COLU  COLV  COLW  COLX  COLY  COLZ

2 rows selected.

SCOTT@orcl_11gR2>

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485738 is a reply to message #485731] Thu, 09 December 2010 12:48
rappaj
Oops. I had several mistakes, and I see them now. You are very fast at creating these test cases. So I do need the FILLER fields. I'm going to have a super-huge ctl file when I get finished with this. I'll show you what it looks like when I'm done.

I don't have any data yet, and the tables have not been created yet as well. I think I will be getting a test file next week hopefully. They said the end of the month, but I'm ready to get this thing going already.

Now I will start creating the ctl file since I know what the columns are.

Another question. Over the years, in my ctl files I have just supplied the column name for the numeric columns. For the varchar columns, I did it like the sample below. Do I need to trim, or am I just wasting my time?

...
pay_amount,
LAST_NAME                "ltrim(rtrim(:LAST_NAME))" ,
FIRST_NAME               "ltrim(rtrim(:FIRST_NAME))" , 
badge_number,
...


Thanks
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485740 is a reply to message #485738] Thu, 09 December 2010 12:58
Barbara Boehmer
If there are leading and/or trailing spaces in your data, then you need to trim, otherwise no. " name " is not the same as "name".
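If trimming is needed, one way to shorten those expressions (a sketch reusing the column names from Joe's earlier sample) is the single SQL trim function, which removes both leading and trailing spaces in one call:

```sql
-- trim(x) is equivalent to ltrim(rtrim(x)) when trimming spaces
LAST_NAME     "trim(:LAST_NAME)",
FIRST_NAME    "trim(:FIRST_NAME)",
```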

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485741 is a reply to message #485740] Thu, 09 December 2010 13:06
rappaj
Ok Barbara, thank you for all your help with this.
Re: Creating a SQL Loader ctl file with over 1000 columns [message #485742 is a reply to message #485741] Thu, 09 December 2010 13:17
rappaj
Oh, for the amounts data that I will be getting, they will be putting a minus or plus sign before the number, like this: +12345.67 and -12345.64.

I guess the column on the table has to be varchar2. Then later on I can convert it with to_number to do totals and summaries for reporting. Is that correct?

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485743 is a reply to message #485742] Thu, 09 December 2010 13:49
Barbara Boehmer
Wrong; the + and - signs are recognized as part of the numeric values. When in doubt, test and see, as shown below.

-- test.dat:
+12345.67~-12345.64
-12345.64~+12345.67


-- test.ctl:
load data
infile test.dat
into table test_tab
fields terminated by '~'
trailing nullcols
(col1, col2)


SCOTT@orcl_11gR2> create table test_tab
  2    (col1  number,
  3  	col2  number)
  4  /

Table created.

SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log

SCOTT@orcl_11gR2> select * from test_tab
  2  /

      COL1       COL2
---------- ----------
  12345.67  -12345.64
 -12345.64   12345.67

2 rows selected.

SCOTT@orcl_11gR2>

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485756 is a reply to message #485743] Thu, 09 December 2010 14:46
rappaj
Yes, I see. It can easily be tested. Thanks again, Barbara.
Re: Creating a SQL Loader ctl file with over 1000 columns [message #486459 is a reply to message #485756] Wed, 15 December 2010 12:43
rappaj
Still waiting for the data... but the ctl file is now set up, and hopefully it's correct.

4,244 lines in the ctl file; that has to be a world's record!
Re: Creating a SQL Loader ctl file with over 1000 columns [message #486562 is a reply to message #486459] Thu, 16 December 2010 14:16
rappaj
Hi,

I see that we have ssn position(1) defined that way so it can be loaded into several tables during the load, and because it happens to be the first column of the datafile.

I want to put an additional column called pay_period into all the tables; it's in column 450 of the input data file.

Do I define the column as pay_period position(450), and can I put it directly under ssn?

into table tableBBB
when ssn != ' '
fields terminated by '~' 
trailing nullcols 
(
ssn        position(1),
pay_period position(450),
cola       filler,
colb       filler,
colc,
cold,
cole
...
...


Thanks,
Joe