Re: Database character set conversion
Date: Thu, 13 Aug 2009 10:46:05 +0200
Message-Id: <0MKuxg-1MbVwi3YY1-000RQW@mrelayeu.kundenserver.de>
Hi,
> to convert to a 16-bit character set
which character set exactly are you thinking of?
As of 10g, no (true) 16-bit character set can be used as the "database
character set".
UTF8 and AL32UTF8 use a variable-length encoding, which leads to a number
of issues.
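A quick illustration of the variable widths (a minimal sketch, assuming an
AL32UTF8 database and an NLS setup that transfers the literals unchanged):

  SELECT LENGTHB('a') AS ascii_bytes,   -- 1 byte
         LENGTHB('ä') AS umlaut_bytes,  -- 2 bytes in UTF8
         LENGTHB('€') AS euro_bytes     -- 3 bytes in UTF8
    FROM dual;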
varchar2(4000 byte) limits the contents to 4000 bytes, independent of how
many characters actually consume them: 2000 characters with a two-byte
encoding, or 1998 two-byte characters plus one three-byte character, etc.
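A minimal sketch of that pitfall (table name is made up, assumes an
AL32UTF8 database; a small column keeps the example short):

  CREATE TABLE t_byte (txt VARCHAR2(10 BYTE));

  INSERT INTO t_byte VALUES ('aaaaaaaaaa'); -- 10 characters, 10 bytes: fits
  INSERT INTO t_byte VALUES ('äääää');      -- 5 characters, 10 bytes: fits
  INSERT INTO t_byte VALUES ('ääääää');     -- 6 characters, 12 bytes:
  -- ORA-12899: value too large for column ... (actual: 12, maximum: 10)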
Typically, such length validation occurs in the application (client) and
counts characters, so even if the client validates the input, you may
still get SQL errors in the db.
Changing the semantics to varchar2(4000 char) does not allow you to store
4000 characters (as one might expect...). The limit is still 4000 bytes
(10g), but the database now validates the character count for you. This
leads to situations where the db refuses to store 4000 characters in a
varchar2(4000 char) field as soon as some of them, single-byte in the
client's legacy encoding, need more than one byte in UTF8.
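Again a sketch (hypothetical table; the PL/SQL block is only there because
plain SQL in 10g cannot build a string longer than 4000 bytes):

  CREATE TABLE t_char (txt VARCHAR2(4000 CHAR));

  -- 4000 ASCII characters = 4000 bytes: fits
  INSERT INTO t_char VALUES (RPAD('x', 4000, 'x'));

  -- 2001 umlauts are only 2001 characters, but 4002 bytes in UTF8:
  DECLARE
    v VARCHAR2(32767);
  BEGIN
    v := RPAD('ä', 2001, 'ä');
    INSERT INTO t_char (txt) VALUES (v);
  END;
  /
  -- fails (ORA-12899 or ORA-01461, depending on version), because the
  -- hard limit is still 4000 bytes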
FYI: I've written an article in German on this issue ( http://www.wlp-systems.de/unterlagen/Oracle-UTF8-busik-070822.pdf ).
Cheers,
Martin
--
http://www.freelists.org/webpage/oracle-l