Re: SQL*Loader Through Pipes
Billy Verreynne wrote:
> "Daniel A. Morgan" <Daniel.Morgan_at_attws.com> wrote
>
> > Has anyone been successful in running SQL*Loader via DBMS_PIPES
> > or any other means?
>
> To add to what Kyle posted: I almost always use Unix pipes when
> dealing with Oracle loading and unloading (SQL*Loader, Export and
> Import).
>
> Basically you create a named pipe (using mkfifo or mknod).
>
> You start the reader, e.g. SQL*Loader, to read from the pipe.
>
> Next you start the writer, e.g. an uncompress that reads a zipped
> file and writes the uncompressed data to the pipe.
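>
> To make this concrete, here's a minimal sketch of the loading side
> (the control file, file names and connect string below are just
> placeholders - adjust them for your own environment):
> --
> #! /bin/sh
>
> # create the named pipe the loader will read from
> rm -f loader.dat 2> /dev/null
> mkfifo loader.dat
>
> # writer: uncompress the zipped flat file into the pipe
> # as a background job
> uncompress -c loader.dat.Z > loader.dat &
>
> # reader: SQL*Loader reads the pipe as if it were an
> # ordinary data file
> sqlldr userid=scott/tiger control=loader.ctl data=loader.dat \
>        log=loader.log
>
> # clean up
> rm loader.dat
> --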
>
> Here's a very basic script that performs an export and automatically
> compresses the exported data using the Unix compress command.
> --
> #! /bin/sh
>
> # sample Unix script for exporting Oracle data
> # to a compress/zipped file
>
> rm -f export.dmp 2> /dev/null # delete the pipe/file if it exists
> mkfifo export.dmp # create a named pipe
>
> # start the compressor as a background job
> compress < export.dmp > export.dmp.Z &
>
> # we read the PID (process id) of the compressor
> # child process
> COMPRESS_PID=$!
> echo "Job $COMPRESS_PID started"
>
> # now we sleep for 5 seconds
> sleep 5
>
> # then we check if the compressor is still
> # running, just to make sure that it's running
> # okay
> ps -p $COMPRESS_PID > /dev/null
> if [ "$?" != "0" ] # if process does not exist
> then
> echo "Compressor unexpectedly died."
> echo "Export aborted".
> rm export.dmp
> rm export.dmp.Z
> exit 1
> fi
>
> # now we start the export - the output file specified
> # in the parameter file points to the pipe we created,
> # i.e. file=export.dmp
> echo "Starting exporter"
> exp parfile=foobar.par
> echo ""
>
> # export completed - we sleep for 10
> # seconds to give the compressor a chance
> # to complete
> sleep 10
>
> # if the compressor is still running there
> # is a problem - it should have read the eof
> # marker written to the pipe by export
> ps -p $COMPRESS_PID > /dev/null
> if [ "$?" = "0" ]
> then
> echo "Compressor job is still active. It should have terminated by
> now."
> echo "Killing the job..."
> kill -9 $COMPRESS_PID
> echo ""
> echo "WARNING: This export terminated abnormally. Please check the
> export data file and log for errors!"
> else
> echo "Export terminated normally"
> fi
>
> # clean up
> rm export.dmp
> --
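>
> For completeness, the parameter file above (foobar.par) is just an
> ordinary exp parameter file whose file= entry names the pipe. It
> might look something along these lines (the connect string and
> owner here are only examples):
> --
> userid=scott/tiger
> owner=scott
> file=export.dmp
> log=export.log
> --
>
> Importing the compressed dump later works the same way in reverse:
> uncompress writes the dump into the pipe as a background job, and
> imp, with file= pointing at the pipe, reads from it.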
>
> SQL*Loader can be run the exact same way. The most complex one I've
> seen in a production environment:
>
> mainframe file --> ftp --> pipe --> dd (EBCDIC to ASCII) --> pipe --> SQL*Loader --> Oracle
>
> This used two pipes and two background processes. Worked pretty well.
> At the time we were still limited to 2GB file systems and the
> mainframe data file was a 4GB file. This solved the problem with
> limited file system size.
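>
> Roughly, the plumbing for that looked like this (the pipe names, dd
> options and SQL*Loader arguments here are only a sketch, not the
> actual production script):
> --
> mkfifo ebcdic.pipe ascii.pipe
>
> # the ftp transfer writes the mainframe file straight into
> # ebcdic.pipe (the ftp put/get targets the pipe like a normal file)
>
> # background job: dd translates EBCDIC to ASCII from one
> # pipe into the other
> dd if=ebcdic.pipe of=ascii.pipe conv=ascii &
>
> # reader: SQL*Loader loads the translated data from the
> # second pipe
> sqlldr userid=scott/tiger control=load.ctl data=ascii.pipe \
>        log=load.log
> --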
>
> Yet another reason why NT sucks when it comes to hardcore batch
> processing... :-)
>
> --
> Billy
Thanks.
Daniel A. Morgan

Received on Wed Jul 18 2001 - 12:12:25 CDT