Re: OCI bulk insert failure with different NLS_LANG setting
cjsun_sp_at_yahoo.com (C.J. Sun) wrote in message news:<bd817e2a.0303241430.67def6e7_at_posting.google.com>...
> I found a weird problem in OCI bulk insert. It fails if the client locale
> setting is different from the DB server's. My DB server locale is
> NLS_LANG=AMERICAN_AMERICA.ZHT16BIG5 and the table is very simple:
> SQL> desc foo
>  Name                                      Null?    Type
>  ----------------------------------------- -------- ----------------------------
>  A                                                   VARCHAR2(200)
>  B                                                   VARCHAR2(200)
>  C                                                   NUMBER(9)
>
> If I set the client locale the same as the server,
> i.e. NLS_LANG=AMERICAN_AMERICA.ZHT16BIG5, the bulk insert
> (OCIStmtPrepare(), OCIBindByPos(), OCIStmtExecute()) runs
> successfully.
>
> However, if I set the client locale to NLS_LANG=AMERICAN_AMERICA.UTF8,
> it throws a bind failure:
> OCI Error: ORA-01461: can bind a LONG value only for insert into a
> LONG column
>
> What's more strange is that if the table has only 2 columns, as follows,
> the bulk insert now succeeds with any locale setting. Can anyone help me
> out? Thanks.
>
> SQL> desc foo
>  Name                                      Null?    Type
>  ----------------------------------------- -------- ----------------------------
>  A                                                   VARCHAR2(200)
>  C                                                   NUMBER(9)
Try setting OCIAttrSet() for the property OCI_ATTR_MAXCHAR_SIZE (or
OCI_ATTR_MAXDATA_SIZE) on the bind handle. This could be happening because
the characterset conversion is overflowing an Oracle internal buffer, or
because Oracle thinks the converted size is larger than the allowed
VARCHAR2 size.
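Roughly like this, right after the OCIBindByPos() call for each VARCHAR2
column (a minimal sketch only; stmtp, errhp and buf are placeholder names
for your statement handle, error handle and host array, not taken from your
code):

#include <oci.h>

/* Bind column A and cap the server-side data size, so OCI does not assume
   worst-case characterset expansion and flag the bind as a LONG. */
static sword bind_col_a(OCIStmt *stmtp, OCIError *errhp, char buf[][201])
{
    OCIBind *bndp = NULL;
    sb4 max_bytes = 200;   /* server-side column is VARCHAR2(200) */

    sword rc = OCIBindByPos(stmtp, &bndp, errhp, 1,
                            (void *) buf, (sb4) sizeof(buf[0]), SQLT_STR,
                            NULL, NULL, NULL, 0, NULL, OCI_DEFAULT);
    if (rc != OCI_SUCCESS)
        return rc;

    /* Maximum number of bytes the server will receive for this bind after
       client->server characterset conversion. */
    return OCIAttrSet((void *) bndp, OCI_HTYPE_BIND,
                      (void *) &max_bytes, (ub4) sizeof(max_bytes),
                      OCI_ATTR_MAXDATA_SIZE, errhp);
}

Setting OCI_ATTR_MAXDATA_SIZE to the server-side column size keeps the
converted value from being treated as an oversized (LONG) bind; if you work
in character semantics, OCI_ATTR_MAXCHAR_SIZE can be set the same way.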
Amit
Development Engineer
http://www.roguewave.com/products/sourcepro/db/
Received on Thu Apr 17 2003 - 17:08:36 CDT