running multiple sql loader runs on many datafiles [message #154684]
Tue, 10 January 2006 03:51
hrishy
Messages: 25 Registered: August 2005
Junior Member
Hi
I need to split a huge file into multiple smaller files using the split command. After that I need to process each file in the background with SQL*Loader (sqlldr is Oracle's utility for loading CSV files into the database), check the status of each of these sqlldr runs, and after successful completion join the files back together.
Any ideas on how to run these processes in the background and then check whether they executed successfully?
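A minimal sketch of the background-and-check pattern in plain sh. The `true` commands and the `part00`/`part01` names are stand-ins (the real sqlldr command lines are not shown here):

```shell
#!/bin/sh
# Sketch only: launch each load in the background, remember the PIDs,
# then wait on each one and inspect its exit status.
pids=""
for f in part00 part01; do        # stand-ins for the split files
    true "$f" &                   # stand-in for: sqlldr ... data="$f" &
    pids="$pids $!"
done
fail=0
for p in $pids; do
    if wait "$p"; then
        echo "pid $p finished OK"
    else
        echo "pid $p FAILED"
        fail=1
    fi
done
echo "overall: $fail"
```

`wait PID` returns the exit status of that background job, which is what lets the parent script tell success from failure.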
Re: running multiple sql loader runs on many datafiles [message #154713 is a reply to message #154698]
Tue, 10 January 2006 05:02
hrishy
Messages: 25 Registered: August 2005
Junior Member
Hi
No, we cannot use a direct path load, as we are loading into a very active table whose performance is critical.
I am just wondering if anybody has sample code for this: split the input data file, then use a control file, run these sessions in the background, and check their status. But when I run a job in the background, how do I check its return status to see whether it completed successfully?
And if it fails, then on restart it should only load the failed data file.
regards
Hrishy
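A hypothetical sketch of the restart requirement: move each split file aside only when its load succeeds, so a rerun picks up just the files that failed. `load_one` and the dummy files are placeholders, not part of the thread's script:

```shell
#!/bin/sh
# Sketch: on success, move the split file into loaded/; failed files
# stay behind and are the only ones picked up on the next run.
mkdir -p loaded
: > dept.dat00; : > dept.dat01    # dummy split files for this sketch
load_one() { true "$1"; }         # placeholder for: sqlldr ... data="$1"
for f in dept.dat0?; do
    if load_one "$f"; then
        mv "$f" loaded/
    else
        echo "$f failed; it will be retried on the next run"
    fi
done
```

Because the glob in a rerun only matches files still sitting in the working directory, the already-loaded files are skipped automatically.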
Re: running multiple sql loader runs on many datafiles [message #154738 is a reply to message #154713]
Tue, 10 January 2006 06:38
Mahesh Rajendran
Messages: 10708 Registered: March 2002 Location: oracleDocoVille
Senior Member Account Moderator
You can get as fancy as you want.
--
-- before load
--
oracle@mutation#count mutation scott.dept
Table:scott.dept
COUNT(*)
----------
0
--
-- check the datafile to be loaded. This file has 70 lines
--
oracle@mutation#wc -l dept.dat
70 dept.dat
--
-- This is the script that takes the filename to be loaded as input.
--
oracle@mutation#split_load dept.dat
input file is dept.dat
Loading dept.dat00
SQL*Loader: Release 9.2.0.4.0 - Production on Tue Jan 10 07:34:48 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 19
Loading dept.dat01
SQL*Loader: Release 9.2.0.4.0 - Production on Tue Jan 10 07:34:48 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 20
Loading dept.dat02
SQL*Loader: Release 9.2.0.4.0 - Production on Tue Jan 10 07:34:48 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 20
Loading dept.dat03
SQL*Loader: Release 9.2.0.4.0 - Production on Tue Jan 10 07:34:49 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 11
--
-- After load
--
oracle@mutation#count mutation scott.dept
Table:scott.dept
COUNT(*)
----------
70
--
-- This is the script
--
oracle@mutation#cat split_load
#split your input file identified by $1.
#i am splitting every 20 lines of a 70-line text file.
#output files are prefixed by filename.extension.
csplit -k -f $1 $1 20 {19} > /dev/null 2>&1
#to be on the safe side, i am moving the original file aside for future usage
mv $1 backup_$1.moved
echo "input file is $1"
#now browsing through every split file created and loading those files one by one.
#to make sqlldr accept an input filename, i am using a template controlfile.
#for every load i pass a new value for the input file (the split file)
#and create a new control file and use that file. So the template stays as is.
#during sqlload i am using errors=0, so that even if ONE record fails, the whole
#load / file is aborted. We can come back and load that file later.
ls -lt $1* | awk '{print $9}' | while read split_filename
do
echo "Loading $split_filename"
sed "s/FILENAME/$split_filename/" template.ctl > /tmp/dept.ctl
sqlldr userid=scott/tiger control=/tmp/dept.ctl errors=0
done
--
-- The template i used.
--
oracle@mutation#cat template.ctl
LOAD DATA
infile 'FILENAME'
append
INTO TABLE dept
FIELDS TERMINATED BY ","
trailing nullcols
(
deptno ,
dname,
loc
)
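The sed step in the script above can be tried standalone; this just re-creates the template from the post and renders it for one split file:

```shell
#!/bin/sh
# Re-create the template and substitute one concrete filename,
# exactly as the split_load script does for each split file.
cat > template.ctl <<'EOF'
LOAD DATA
infile 'FILENAME'
append
INTO TABLE dept
FIELDS TERMINATED BY ","
trailing nullcols
(deptno, dname, loc)
EOF
sed 's/FILENAME/dept.dat00/' template.ctl > dept00.ctl
grep infile dept00.ctl            # shows: infile 'dept.dat00'
```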
Re: running multiple sql loader runs on many datafiles [message #154918 is a reply to message #154684]
Wed, 11 January 2006 05:30
hrishy
Messages: 25 Registered: August 2005
Junior Member
Hi
[code]
ls -lt $1* | awk '{print $9}' | while read split_filename
do
echo "Loading $split_filename"
sed "s/FILENAME/$split_filename/" template.ctl > /tmp/dept.ctl
sqlldr userid=scott/tiger control=/tmp/dept.ctl errors=0 &
done
[/code]
I would like to run the loader in the background like this and then monitor whether each job was successful. Any ideas how to do that?
regards
Hrishy
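One caveat with backgrounding inside that loop: `ls ... | while read ...` runs the loop body in a subshell in many shells, so any PIDs collected there are lost when the loop ends. A sketch that avoids the pipeline and pairs each PID with its file (dummy names and `true` stand in for the split files and the sqlldr call):

```shell
#!/bin/sh
# Sketch: use a for loop (no pipeline, no subshell) so the pid:file
# pairs survive the loop; then wait on each pid and report per file.
jobs=""
for split_filename in part00 part01; do   # stand-ins for the split files
    true "$split_filename" &              # stand-in for: sqlldr ... &
    jobs="$jobs $!:$split_filename"
done
failed=""
for j in $jobs; do
    pid=${j%%:*}
    f=${j#*:}
    wait "$pid" || failed="$failed $f"
done
if [ -n "$failed" ]; then
    echo "failed:$failed"
else
    echo "all loads OK"
fi
```

The `failed` list at the end is exactly the set of files a restart would need to reload.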
Re: running multiple sql loader runs on many datafiles [message #154922 is a reply to message #154918]
Wed, 11 January 2006 06:19
Mahesh Rajendran
Messages: 10708 Registered: March 2002 Location: oracleDocoVille
Senior Member Account Moderator
Define what you mean by 'background'.
As stated before, in conventional terms, '&' and nohup will take care of it.
To monitor, all you have to do is make use of the logfile:
sqlldr userid=scott/tiger control=/tmp/dept.ctl log=$split_filename.log_someother_tag_like_timestamp
Write scripts to email those logfiles to you,
or email only if something is BAD.
As said before, you can get as fancy as you want.
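A sketch of that log-scanning idea. The two dummy logs and the patterns matched (ORA- errors, nonzero rejected counts) are assumptions about typical SQL*Loader log contents; adjust them to what your release actually writes:

```shell
#!/bin/sh
# Sketch: scan per-file load logs and flag the suspicious ones.
printf 'Total logical records rejected: 0\n' > part00.log   # dummy clean log
printf 'ORA-01400: cannot insert NULL\n'     > part01.log   # dummy bad log
bad=0
for log in part00.log part01.log; do
    if grep -Eq 'ORA-|rejected: *[1-9]' "$log"; then
        echo "$log: problems found"   # here you could mail the log out
        bad=$((bad + 1))
    else
        echo "$log: clean"
    fi
done
echo "bad logs: $bad"
```

The "email only if something is BAD" part is then just running the mail command inside the branch that found problems.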