Re: what characterset to use?
On Aug 23, 2:48 pm, sybra..._at_hccnet.nl wrote:
> On Thu, 23 Aug 2007 07:02:10 -0700, Ben <bal..._at_comcast.net> wrote:
> >I'm not saying it is feasible to have a database set to use US7ASCII
> >as its character set. I'm simply saying that in the scenario that
> >Sybrand listed, 1 database and 1 client both set to US7ASCII, I
> >don't see the issue. UNLESS you introduce a client using a different
> >character set.
>
> Ok, again
>
> Client set to US7ASCII
> Database set to US7ASCII
> You send an eight-bit character.
> Oracle sees a 7-bit client character set, a 7-bit server character set
> --->
> HEY, I DON'T HAVE TO CONVERT ANY CHARACTER.
> What will happen if all of a sudden someone decides to export using a
> 7-bit NLS_LANG and import into an 8-bit database?
>
> Please don't imply I'm making up fairy tales; I'm telling stories for
> grown-ups!!!!
> REAL WORLD HORROR STORIES, with customers getting GROSS!!!
>
> And yes: this explanation is on Metalink!!!
>
> --
I'm not implying anything. I'm trying to understand.
How do you insert an 8-bit character with a 7-bit client into a database with a 7-bit character set? Wouldn't that be a square-peg, round-hole kind of thing? You of course wouldn't get the 8-bit character back out of the 7-bit database.
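Just to check my understanding, here is a sketch of what I'd expect to see (assuming SQL*Plus, a throwaway table, and the client's NLS_LANG set to match the US7ASCII database):

  -- client and database character sets match, so Oracle skips
  -- conversion entirely, and with it any validation
  CREATE TABLE t (c VARCHAR2(10));
  INSERT INTO t VALUES (CHR(233));   -- 0xE9, an 8-bit byte
  SELECT DUMP(c, 16) FROM t;
  -- Typ=1 Len=1: e9   (the raw 8-bit byte is stored as-is)

If that's right, the byte sits in the table unvalidated, which I suppose is exactly why a later export with a 7-bit NLS_LANG into an 8-bit database would garble it.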
> >You don't really have control over what character set all the clients
> >connect with, do you? If you have a client that uses US7ASCII and they
> >select and then update based on the results, you could potentially
> >corrupt all your data, no?
The example in Mr. Kyte's book is what I was referring to in my original question about not being able to avoid corruption. How can you keep someone from setting their NLS_LANG to US7ASCII and updating an 8-bit or multibyte character field? Any time that happens, you would get a replacement character, wouldn't you?
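For example (a sketch, assuming a hypothetical table T keyed by ID in a WE8ISO8859P1 database that already holds an e-acute, while the client's NLS_LANG is still US7ASCII):

  SELECT c FROM t WHERE id = 1;
  -- the e-acute comes back as '?' because 0xE9 has no US7ASCII
  -- equivalent, so the conversion substitutes a replacement character
  -- an application that reads the row and writes the value back runs:
  UPDATE t SET c = '?' WHERE id = 1;
  -- and the original character is gone for good

That round trip is the kind of silent corruption I'm asking about.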