Creating a SQL Loader ctl file with over 1000 columns (Oracle Database 10g Enterprise Edition Release 10.2.0.4.0) [message #484921]
Fri, 03 December 2010 14:53
rappaj
Hi,
Next week I will be getting an input file which will contain over 1,000 data columns to be loaded into Oracle. Each record is about 6,400 characters in length.
My question is: has anyone ever created a huge ctl file like this to be used for SQL*Loader, with so many columns? I will be sending certain columns (data) to certain tables, so it's not all going into one table; it will be about 6 tables.
Thanks,
Joe

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485201 is a reply to message #484924]
Mon, 06 December 2010 14:22
rappaj
Hi,
I read up on external tables a little and am still not sure which way to go. I found out today that there are 1,165 data columns that will be sent to me at the end of this month, totaling 6,400 bytes.
The txt file will be sent in tilde-delimited format, and certain columns will be placed into one of 5 different tables.
Has anyone ever worked with a control file with around 1,000 columns loading or inserting into multiple tables, and does anyone know how fast it will be to load about 18,000 records like this? I don't think I've ever loaded more than about 60 columns before.
Thank you

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485203 is a reply to message #485202]
Mon, 06 December 2010 14:33
rappaj
Yes, the DBA just told me that 1,000 columns is the max as well. So I guess it's best to use SQL*Loader and split the file into 5 different tables, along with a primary key: in this case, a unique ID number for joining.
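Something like this join is what I have in mind afterwards (the table and column names are made up):

select a.emp_id, a.last_name, b.pay_amount
from taba a, tabb b
where a.emp_id = b.emp_id;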
Have you ever done something like this before with such a large number of columns? I just don't want to set the server on fire with this one every 2 weeks.
Thanks for your input... Checking out your link now...

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485211 is a reply to message #485205]
Mon, 06 December 2010 19:23
Barbara Boehmer
With an external table, your data file must be on the server, and you have to load the whole thing into one external table, then use insert into your_tables ... select ... from your_external_table where ... to load the data into your tables. Obviously, with more than 1,000 columns you cannot do this, since you cannot create an external table with that many columns.
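Just to illustrate the pattern (not viable here because of the column limit; the directory, table, and column names below are made up):

create table ext_emp
(empno varchar2 (5),
ename varchar2 (10))
organization external
(type oracle_loader
default directory my_dir
access parameters
(records delimited by newline
fields terminated by '~')
location ('test.dat'))
/
insert into emp (empno, ename)
select empno, ename from ext_emp where empno is not null
/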
Presumably, since you cannot have a table with more than 1,000 columns, not every column is going into every table. With SQL*Loader, your data file can be on the server or the client, and it can handle more than 1,000 columns, using when conditions to distribute the appropriate columns into the appropriate tables. SQL*Loader is fast and can handle 18,000 records easily.
I have provided a script below to demonstrate how SQL*Loader can be used for something like this. If you copy and paste, save, and run the whole script, changing scott/tiger to your username and password, it will create a test.dat data file of four rows of data with 1,165 columns delimited by ~, with each row 5,884 characters long. It will also create a test.ctl control file that loads rows where the first character of the first column is 'a' into taba and rows where the first character of the first column is 'b' into tabb, with each record terminated by 'x'. This is just an example; your conditions for your when clauses and record terminator may be different. It uses filler fields to skip the columns that are not needed for the specific table. It creates a taba and a tabb and loads the first 3 columns into taba and the last 3 columns into tabb. It drops the tables, so that you are not left with test tables on your system and can re-run the script. It then allows you to view, in spooled files, the results in test.txt, the log file in test.log, the control file in test.ctl, and the data file in test.dat.
Here is the script:
store set saved_settings replace
set echo off feedback off heading off pagesize 0 serveroutput on verify off
spool test.dat
declare
v_clob clob;
begin
v_clob := null;
for i in 1 .. 1165 loop
v_clob := v_clob || 'a' || i || '~';
end loop;
v_clob := v_clob || 'x';
dbms_output.put_line (v_clob);
v_clob := null;
for i in reverse 1 .. 1165 loop
v_clob := v_clob || 'b' || i || '~';
end loop;
v_clob := v_clob || 'x';
dbms_output.put_line (v_clob);
v_clob := null;
for i in reverse 1 .. 1165 loop
v_clob := v_clob || 'a' || i || '~';
end loop;
v_clob := v_clob || 'x';
dbms_output.put_line (v_clob);
v_clob := null;
for i in 1 .. 1165 loop
v_clob := v_clob || 'b' || i || '~';
end loop;
v_clob := v_clob || 'x';
dbms_output.put_line (v_clob);
end;
/
spool off
spool test.ctl
prompt load data
prompt infile test.dat
prompt continueif last != 'x'
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
prompt col2,
prompt col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1),
select 'col' || to_char (1 + rownum) || ' filler,'
from dual
connect by level <= 1161
/
prompt col1163,
prompt col1164,
prompt col1165)
spool off
start saved_settings
spool test.txt
create table taba
(col1 varchar2 (5),
col2 varchar2 (5),
col3 varchar2 (5))
/
create table tabb
(col1163 varchar2 (5),
col1164 varchar2 (5),
col1165 varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat
and here is the result you should get in test.txt:
SCOTT@orcl_11gR2> create table taba
2 (col1 varchar2 (5),
3 col2 varchar2 (5),
4 col3 varchar2 (5))
5 /
Table created.
SCOTT@orcl_11gR2> create table tabb
2 (col1163 varchar2 (5),
3 col1164 varchar2 (5),
4 col1165 varchar2 (5))
5 /
Table created.
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log
SCOTT@orcl_11gR2> select * from taba
2 /
COL1 COL2 COL3
----- ----- -----
a1 a2 a3
a1165 a1164 a1163
2 rows selected.
SCOTT@orcl_11gR2> select * from tabb
2 /
COL11 COL11 COL11
----- ----- -----
b3 b2 b1
b1163 b1164 b1165
2 rows selected.
SCOTT@orcl_11gR2> spool off
your control file will look like this:
load data
infile test.dat
continueif last != 'x'
into table taba
when (1) = 'a'
fields terminated by '~'
trailing nullcols
(col1 position (1),
col2,
col3)
into table tabb
when (1) = 'b'
fields terminated by '~'
trailing nullcols
(col1 filler position (1),
col2 filler,
col3 filler,
col4 filler,
...
col1160 filler,
col1161 filler,
col1162 filler,
col1163,
col1164,
col1165)

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485214 is a reply to message #485211]
Mon, 06 December 2010 20:05
Barbara Boehmer
Here is another script that does not use a record terminator, but instead assumes that your linesize is big enough that each row of data is in one row in your test.dat file.
store set saved_settings replace
set echo off feedback off heading off linesize 6000 pagesize 0 serveroutput on verify off
spool test.dat
declare
v_clob clob;
begin
v_clob := null;
for i in 1 .. 1165 loop
v_clob := v_clob || 'a' || i || '~';
end loop;
dbms_output.put_line (v_clob);
v_clob := null;
for i in reverse 1 .. 1165 loop
v_clob := v_clob || 'b' || i || '~';
end loop;
dbms_output.put_line (v_clob);
v_clob := null;
for i in reverse 1 .. 1165 loop
v_clob := v_clob || 'a' || i || '~';
end loop;
dbms_output.put_line (v_clob);
v_clob := null;
for i in 1 .. 1165 loop
v_clob := v_clob || 'b' || i || '~';
end loop;
dbms_output.put_line (v_clob);
end;
/
spool off
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
prompt col2,
prompt col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1),
select 'col' || to_char (1 + rownum) || ' filler,'
from dual
connect by level <= 1161
/
prompt col1163,
prompt col1164,
prompt col1165)
spool off
start saved_settings
spool test.txt
create table taba
(col1 varchar2 (5),
col2 varchar2 (5),
col3 varchar2 (5))
/
create table tabb
(col1163 varchar2 (5),
col1164 varchar2 (5),
col1165 varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485320 is a reply to message #485250]
Tue, 07 December 2010 08:00
rappaj
Littlefoot: no, not so complex, just a ton of employee information. For many years this data has been going into a mainframe system, but early next year they are pulling the plug on that dinosaur. So the next best thing is to get the data into Oracle, which we have here as well. From there I will create a couple of screens using Oracle Application Express for the users to view.
Barbara, Wow. You did a lot of work, and thanks for the test scripts and for your advice. I'd much rather use SQLLDR since I am familiar with it and am glad that it will work. You're right, the values are very short for each column.
I am going to copy and paste your code to give it a try and will let you know when I am finished with it.
Thank you all so much for your help !!

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485368 is a reply to message #485211]
Tue, 07 December 2010 13:53
rappaj
Barbara,
Nice script. It created the ctl file really fast. However, something is wrong on my end. Can I specify somewhere in the code for the DBMS_OUTPUT buffer to be more than 2000 bytes?
SQL*Plus would hang on me, so I had to X out of it. This was in the test.dat file:
declare
*
ERROR at line 1:
ORA-20000: ORU-10027: buffer overflow, limit of 2000 bytes
ORA-06512: at "SYS.DBMS_OUTPUT", line 32
ORA-06512: at "SYS.DBMS_OUTPUT", line 97
ORA-06512: at "SYS.DBMS_OUTPUT", line 112
ORA-06512: at line 9
Thanks,
Joe

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485479 is a reply to message #485372]
Wed, 08 December 2010 08:28
rappaj
I am not having any luck with either of the new changes. I set the size to a million, and put the DBMS_OUTPUT enable call in several places before that. I still can't get the test.dat file created.
With serveroutput set, this is the error message I get. I didn't see any problems with bind variables or quotes...
ERROR:
ORA-06502: PL/SQL: numeric or value error: host bind array too small
ORA-06512: at line 1
...but it's OK. I know the problem must be on my end and not in the code. I get what Barbara has written and appreciate the effort. If it's an easy fix, again, I'd appreciate the help. If not, it's OK.
This was where I set the output size, plus the next few lines:
store set saved_settings replace
set echo off feedback off heading off pagesize 0 serveroutput on size 1000000 verify off
spool c:\test.dat
declare
v_clob clob;
begin
v_clob := null;
for i in 1 .. 1165 loop
Thank you

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485488 is a reply to message #485479]
Wed, 08 December 2010 09:35
Barbara Boehmer
I think the difference is due to versions. I am using 11g and you are using 10g. I think in 10g dbms_output could not handle a clob. Try the following script instead.
store set saved_settings replace
set echo off feedback off heading off linesize 6000 pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
spool off
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
prompt col2,
prompt col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1),
select 'col' || to_char (1 + rownum) || ' filler,'
from dual
connect by level <= 1161
/
prompt col1163,
prompt col1164,
prompt col1165)
spool off
start saved_settings
spool test.txt
create table taba
(col1 varchar2 (5),
col2 varchar2 (5),
col3 varchar2 (5))
/
create table tabb
(col1163 varchar2 (5),
col1164 varchar2 (5),
col1165 varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485515 is a reply to message #485488]
Wed, 08 December 2010 11:49
Barbara Boehmer
Here is another revision. I added long 10000 and longchunksize 10000 to the set commands, because they affect the size of the clob that you are able to select. I also added a set echo on before spooling test.txt.
store set saved_settings replace
set echo off feedback off heading off linesize 6000 long 10000 longchunksize 10000
set pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
spool off
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
prompt col2,
prompt col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 filler position (1),
select 'col' || to_char (1 + rownum) || ' filler,'
from dual
connect by level <= 1161
/
prompt col1163,
prompt col1164,
prompt col1165)
spool off
start saved_settings
set echo on
spool test.txt
create table taba
(col1 varchar2 (5),
col2 varchar2 (5),
col3 varchar2 (5))
/
create table tabb
(col1163 varchar2 (5),
col1164 varchar2 (5),
col1165 varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485517 is a reply to message #485516]
Wed, 08 December 2010 12:25
rappaj
I have another question about using this ctl file to load into multiple tables.
Let's say, for example, in your TABLEA you have your employee info including the ssn, and in TABLEB other info like the employee address. Later, I would need to join the 2 tables to get them back together. Can I set up the ctl file to also include the ssn in TABLEB when using a delimited ctl file? Or can that only happen with a ctl file using the positional method?

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485522 is a reply to message #485516]
Wed, 08 December 2010 12:35
Barbara Boehmer
I seem to recall that in earlier versions, the column to be checked needed to be in the table, not a filler field, so perhaps that is the problem. I added col1 to tabb in the revised script below. If that doesn't work, then I am running out of ideas.
store set saved_settings replace
set echo off feedback off heading off linesize 6000 long 10000 longchunksize 10000
set pagesize 0 serveroutput on size 1000000 verify off
spool test.dat
variable v_clob clob
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in reverse 1 .. 1165 loop
:v_clob := :v_clob || 'a' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
begin
:v_clob := null;
for i in 1 .. 1165 loop
:v_clob := :v_clob || 'b' || i || '~';
end loop;
end;
/
select :v_clob from dual
/
spool off
spool test.ctl
prompt load data
prompt infile test.dat
prompt into table taba
prompt when (1) = 'a'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
prompt col2,
prompt col3)
prompt into table tabb
prompt when (1) = 'b'
prompt fields terminated by '~'
prompt trailing nullcols
prompt (col1 position (1),
select 'col' || to_char (1 + rownum) || ' filler,'
from dual
connect by level <= 1161
/
prompt col1163,
prompt col1164,
prompt col1165)
spool off
start saved_settings
set echo on
spool test.txt
create table taba
(col1 varchar2 (5),
col2 varchar2 (5),
col3 varchar2 (5))
/
create table tabb
(col1 varchar2 (5),
col1163 varchar2 (5),
col1164 varchar2 (5),
col1165 varchar2 (5))
/
host sqlldr scott/tiger control=test.ctl log=test.log
select * from taba
/
select * from tabb
/
spool off
drop table taba
/
drop table tabb
/
ed test.txt
ed test.log
ed test.ctl
ed test.dat

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485530 is a reply to message #485522]
Wed, 08 December 2010 12:49
rappaj
Barbara,
So close and almost perfect. TABLEB is still empty. But believe me, what you have done so far was excellent. It was a lot of work for you to create a test scenario for me, and I see that it can work loading 1,165 columns split into several tables.
They will be sending me this file towards the end of the month so I have plenty of time to continue testing and playing around with it.
I think we can close this topic now if you want. I appreciate all the help. But if you want me to test any other changes, I'd be glad to.
Thank you

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485541 is a reply to message #485540]
Wed, 08 December 2010 13:13
rappaj
P.S. Looking at this example I found on the net, in your opinion, would you say it would be easier for me to get the file ~ delimited, or positional? What would be your preference?
This sample below doesn't use fillers when using positional.
LOAD DATA
INFILE /u01/app/oracle/load.dat
REPLACE
INTO TABLE emp
WHEN empno != ' '
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR
)
INTO TABLE proj
WHEN projno != ' '
( projno POSITION(25:27) INTEGER EXTERNAL,
empno POSITION(1:4) INTEGER EXTERNAL
)

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485548 is a reply to message #485541]
Wed, 08 December 2010 13:55
Barbara Boehmer
The ssn position(1) is for when the ssn is the first column in your data file. The position(1) needs to go with the first column, whatever it is.
Whether you use delimited or fixed position depends on your data. If you are certain that your delimiter will never appear within your data, then a delimited file is easier to create a control file for, and you can generate a bunch of filler fields as I did. If your delimiter may be present in your data, then a fixed-position file is safest, but it will be a major pain counting the columns and trying to get it right for a row that long. With a row that long, I would almost certainly go for a delimited data file, then after loading, check the bad file, remove any delimiters using an editor like Notepad, then reload the fixed bad file.
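For example (the file names here are just assumptions), after editing the rejected records the reload could be as simple as pointing sqlldr at the old bad file with the same control file:

sqlldr scott/tiger control=test.ctl log=reload.log bad=reload.bad data=test.bad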

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485716 is a reply to message #485548]
Thu, 09 December 2010 10:08
rappaj
Hi,
Am I doing this correctly with the FILLER fields in the sample below? Do my WHEN clauses look OK?
For each record:
- I want to load the ssn and columns cola-colj into TABLEAAA
- I want to load the ssn and columns colk-colp into TABLEBBB
- I want to load the ssn and columns colq-colt into TABLECCC
- I want to load the ssn and columns colu-colz into TABLEDDD
load data
infile 'employee.txt'
append
into table tableAAA
when ssn != ' '
fields terminated by '~'
trailing nullcols
(
ssn position(1),
cola,
colb,
colc,
cold,
cole,
colf,
colg,
colh,
coli,
colj
INTO TABLE tableBBB
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk,
coll,
colm,
coln,
colo,
colp,
INTO TABLE tableCCC
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq,
colr
cols,
colt,
INTO TABLE tableDDD
when ssn != ' '
fields terminated by '~'
trailing nullcols
ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq filler,
colr filler,
cols filler,
colt filler,
COLu,
colv,
colw,
colx,
coly,
colz
)
Thank you

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485731 is a reply to message #485716]
Thu, 09 December 2010 12:10
Barbara Boehmer
You need to enclose each group of fields after each table within parentheses. So, you need a left parenthesis before each ssn, and a right parenthesis after colj under tableaaa, after colp under tablebbb, and after colt under tableccc. You are also missing a comma after colr under tableccc. This is assuming that your data matches; I don't have your data, and I also don't have your tables. It would help if you supplied create table statements and sample data. What you should do is get or make some test data and test it, check the log, and work through the errors one at a time. After fixing the parentheses and the comma, here is what I got with some made-up data and tables.
-- employee.txt:
1~cola~colb~colc~cold~cole~colf~colg~colh~coli~colj~colk~coll~colm~coln~colo~colp~colq~colr~cols~colt~colu~colv~colw~colx~coly~colz
2~COLA~COLB~COLC~COLD~COLE~COLF~COLG~COLH~COLI~COLJ~COLK~COLL~COLM~COLN~COLO~COLP~COLQ~COLR~COLS~COLT~COLU~COLV~COLW~COLX~COLY~COLZ
-- test.ctl:
load data
infile 'employee.txt'
append
into table tableAAA
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn position(1),
cola,
colb,
colc,
cold,
cole,
colf,
colg,
colh,
coli,
colj)
INTO TABLE tableBBB
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk,
coll,
colm,
coln,
colo,
colp)
INTO TABLE tableCCC
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq,
colr,
cols,
colt)
INTO TABLE tableDDD
when ssn != ' '
fields terminated by '~'
trailing nullcols
(ssn position(1),
cola filler,
colb filler,
colc filler,
cold filler,
cole filler,
colf filler,
colg filler,
colh filler,
coli filler,
colj filler,
colk filler,
coll filler,
colm filler,
coln filler,
colo filler,
colp filler,
colq filler,
colr filler,
cols filler,
colt filler,
COLu,
colv,
colw,
colx,
coly,
colz)
-- create tables:
SCOTT@orcl_11gR2> create table tableaaa
2 (ssn number,
3 cola varchar2 (5),
4 colb varchar2 (5),
5 colc varchar2 (5),
6 cold varchar2 (5),
7 cole varchar2 (5),
8 colf varchar2 (5),
9 colg varchar2 (5),
10 colh varchar2 (5),
11 coli varchar2 (5),
12 colj varchar2 (5))
13 /
Table created.
SCOTT@orcl_11gR2> create table tablebbb
2 (ssn number,
3 colk varchar2 (5),
4 coll varchar2 (5),
5 colm varchar2 (5),
6 coln varchar2 (5),
7 colo varchar2 (5),
8 colp varchar2 (5))
9 /
Table created.
SCOTT@orcl_11gR2> create table tableccc
2 (ssn number,
3 colq varchar2 (5),
4 colr varchar2 (5),
5 cols varchar2 (5),
6 colt varchar2 (5))
7 /
Table created.
SCOTT@orcl_11gR2> create table tableddd
2 (ssn number,
3 colu varchar2 (5),
4 colv varchar2 (5),
5 colw varchar2 (5),
6 colx varchar2 (5),
7 coly varchar2 (5),
8 colz varchar2 (5))
9 /
Table created.
-- load:
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log
-- results:
SCOTT@orcl_11gR2> select * from tableaaa
2 /
SSN COLA COLB COLC COLD COLE COLF COLG COLH COLI COLJ
---------- ----- ----- ----- ----- ----- ----- ----- ----- ----- -----
1 cola colb colc cold cole colf colg colh coli colj
2 COLA COLB COLC COLD COLE COLF COLG COLH COLI COLJ
2 rows selected.
SCOTT@orcl_11gR2> select * from tablebbb
2 /
SSN COLK COLL COLM COLN COLO COLP
---------- ----- ----- ----- ----- ----- -----
1 colk coll colm coln colo colp
2 COLK COLL COLM COLN COLO COLP
2 rows selected.
SCOTT@orcl_11gR2> select * from tableccc
2 /
SSN COLQ COLR COLS COLT
---------- ----- ----- ----- -----
1 colq colr cols colt
2 COLQ COLR COLS COLT
2 rows selected.
SCOTT@orcl_11gR2> select * from tableddd
2 /
SSN COLU COLV COLW COLX COLY COLZ
---------- ----- ----- ----- ----- ----- -----
1 colu colv colw colx coly colz
2 COLU COLV COLW COLX COLY COLZ
2 rows selected.
SCOTT@orcl_11gR2>

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485738 is a reply to message #485731]
Thu, 09 December 2010 12:48
rappaj
Oops. I had several mistakes, and I see them now. You are very fast at creating these test cases. So I do need the FILLER fields. I'm going to have a super huge ctl file when I get finished with this. I'll show you what it looks like when I'm done.
I don't have any data yet, and the tables haven't been created yet either. I think I will be getting a test file next week, hopefully. They said the end of the month, but I'm ready to get this thing going already.
Now I will start creating the ctl file since I know what the columns are.
Another question. Over the years, in my ctl files for the numeric columns I just supplied the column name. For the varchar columns, I did it like this below. Do I need to trim, or am I just wasting my time?
...
pay_amount,
LAST_NAME "ltrim(rtrim(:LAST_NAME))" ,
FIRST_NAME "ltrim(rtrim(:FIRST_NAME))" ,
badge_number,
...
Thanks

Re: Creating a SQL Loader ctl file with over 1000 columns [message #485742 is a reply to message #485741]
Thu, 09 December 2010 13:17
rappaj
Oh, for the amounts data that I will be getting, they will be putting a minus or plus before the number, like this: +12345.67 and -12345.64.
I guess the column on the table has to be varchar2. Then later on I can convert it with to_number to do totals and summaries for reporting. Is that correct?
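Something like this is what I mean (the table and column names are made up):

-- assumes every pay_amount string is a valid signed number like +12345.67
select sum(to_number(pay_amount)) as total_pay
from pay_history;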

Re: Creating a SQL Loader ctl file with over 1000 columns [message #486562 is a reply to message #486459]
Thu, 16 December 2010 14:16
rappaj
Hi,
I see that we have ssn position(1) defined that way so it can be loaded into several tables during the load, and because it happens to be the first column of the data file.
I want to put an additional column called pay_period into all the tables, and it's in column 450 of the input data file.
Do I define the column as pay_period position(450) and can I put it directly under ssn?
into table tableBBB
when ssn != ' '
fields terminated by '~'
trailing nullcols
(
ssn position(1),
pay_period position(450),
cola filler,
colb filler,
colc,
cold,
cole
...
...
Thanks,
Joe