ORA-03113: end-of-file on communication channel in toad [message #414993]
Fri, 24 July 2009 06:09
sundarfaq
Messages: 235 Registered: October 2007 Location: Chennai
Senior Member
Hi,
I am getting "ORA-03113: end-of-file on communication channel" when executing SELECT * FROM bank in Toad. The bank table has 10,513 records. I do not get any error when I run SELECT * FROM details; the details table has 1,000 records.
The end-of-communication error shows up only when I select a large amount of data (i.e. more than about 1,000 records).
Toad version: 9.1.0.62.
The same queries work fine in SQL*Plus and other tools; the problem occurs only in Toad.
How can we resolve this issue?
Thanks,
Michael
Re: ORA-03113: end-of-file on communication channel in toad [message #416175 is a reply to message #415083]
Fri, 31 July 2009 00:13
Kevin Meade
Messages: 2103 Registered: December 1999 Location: Connecticut USA
Senior Member
I have seen many reasons for this error over the years, but three cover 99% of all occurrences:
1) A genuine network hiccup caused a disconnect, or memory got corrupted somewhere in your client. There is nothing you can do for this except log back into your tool, whatever it is. Depending on the specific cause of the error, you may even have to close the program and restart it.
2) Bad data somehow. A column in one of your tables contains a "bad" character. This one can be either easy or hard to diagnose. There is usually one row which, if skipped, makes everything work fine, so it is often possible to find that row. From there you can selectively retrieve columns of that one row until you find the column that causes the failure. Of course, not all is peachy, for once you find the row and column, what do you do with it? Sometimes you can use the DUMP function to see what is hurting you. Other times you can update the column to itself to "fix" the problem. Alternatively, you can set the column to NULL and then update it a second time with the correct value, assuming you know what that value is. Other times you are hosed and can do none of this, so you must delete the row and re-insert it from some other source. Good luck with that one.
3) As a prior poster mentioned, you have exceeded some limit of the tool you are using. As was suggested, this can sometimes be fixed with parameter settings in your tool, but more often than not the limitation is a hard one in the tool itself. A good example is a CLOB containing XML that is simply too big for your tool to handle. There is no easy fix for this one. Often you must either truncate the length of your data, find a different tool, or simply not look at the offending rows. Identify the culprit row by following a process similar to the one above.
For what it is worth, my money is on #2.
Good luck, Kevin