c.d.o.server -> Re: utl_file limits?
On Tue, 11 Aug 1998 03:44:03 -0400, "David Chasteen"
<davidc_at_opteamasoft.com> wrote:
>I get an "ORU-10027 buffer overflow, limit of 20,000 bytes" error when
>writing a long file (it gets to 156 kb) from a procedure. It still does
>it after opening, writing, flushing, and closing the file on each
>statement as it loops through creating the line. For example:
>
>    -- declared earlier...
>    table_dml := UTL_FILE.FOPEN('c:\aopt\vss\dev\db\install\dml',
>                                table_nm||'.sql', 'A');
>    -- Write load data command for table into loaddata.sql
>    UTL_FILE.PUT_LINE(table_dml, col_line);
>    UTL_FILE.FFLUSH(table_dml);
>    UTL_FILE.FCLOSE(table_dml);  -- close table DML file
>
>I looked high and low for the "buffer" parameter the message is referring
>to in order to set it higher.
>
>Is there a "buffer" parameter for writing out to a file?
>Is it in the SGA?
>Is it limited to 20,000 bytes or can it be set higher?
>Any ideas on strategy to get around this limit?
Hi David,
the only utl_file limitations that I'm aware of are that you can only write logical lines (i.e. no binary data whatsoever), and that such a line may not be longer than 1023 bytes.
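A minimal sketch of that line-length limit (the directory name here is hypothetical and would need to be listed in the UTL_FILE_DIR init parameter; exceeding the limit raises an exception from PUT_LINE):

```sql
DECLARE
  fh UTL_FILE.FILE_TYPE;
BEGIN
  fh := UTL_FILE.FOPEN('c:\temp', 'demo.txt', 'w');
  UTL_FILE.PUT_LINE(fh, RPAD('x', 1023, 'x'));    -- OK: at the limit
  BEGIN
    UTL_FILE.PUT_LINE(fh, RPAD('x', 2000, 'x'));  -- too long: raises an error
  EXCEPTION
    WHEN OTHERS THEN
      DBMS_OUTPUT.PUT_LINE('line too long for utl_file');
  END;
  UTL_FILE.FCLOSE(fh);
END;
/
```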
I think the ORU-10027 message you are getting comes from some use of DBMS_OUTPUT in your code, which has nothing to do with writing to a file through utl_file.
Although you can use DBMS_OUTPUT.ENABLE(<buffer_size>) to allocate a larger buffer, the maximum value here is 1M, so it's generally a bad idea to use DBMS_OUTPUT for an unknown amount of output data.
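For completeness, a sketch of enlarging that buffer (1,000,000 bytes is the maximum; in SQL*Plus the equivalent is SET SERVEROUTPUT ON SIZE 1000000):

```sql
BEGIN
  -- Grow the session's DBMS_OUTPUT buffer from the 20,000-byte default
  -- up to the 1,000,000-byte maximum.
  DBMS_OUTPUT.ENABLE(1000000);
END;
/
```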
HTH,
Peter
--
Peter Schneider
peter.schneider_at_okay.net
Received on Tue Aug 11 1998 - 16:41:07 CDT