RE: many schemas
Hi Lisa,
My last job had a DSS database with over 400 schemas and over 110,000 objects.
These broke down as roughly 9,000 tables, 14,000 indices, 20,000 views and about
60,000 synonyms; the rest were stored code and other things.
When querying the data dictionary, include OWNER in the WHERE clause, as it is the
only indexed column in most of the DBA views.
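For example (a minimal sketch; the schema name here is made up):

    SELECT object_name, object_type
      FROM dba_objects
     WHERE owner = 'GEO_DATA'          -- hypothetical schema; always filter on OWNER
       AND object_type = 'TABLE';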
The Dictionary Cache must be larger than usual. Pinning objects and caching small tables and indices is also very important. Analyse your heavily hit tables and cache them even if they are large; this may require some changes in init.ora (CACHE_THRESHOLD_SIZE??), as well as drastic increases in the size of the SGA. We cached a 150M table to fix the performance problem mentioned below.
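As a rough sketch of the pinning and caching (table name is made up; DBMS_SHARED_POOL has to be installed with dbmspool.sql first):

    -- pin a heavily used package in the shared pool so it is never aged out
    EXECUTE DBMS_SHARED_POOL.KEEP('SYS.STANDARD', 'P');

    -- mark a heavily hit table so full-scan blocks stay at the MRU end of the buffer cache
    ALTER TABLE geo_data.well_headers CACHE;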
If a large number of your objects are private synonyms, public synonyms will in many
cases clean up your dictionary. Often you can use a public synonym and
control access through grants. Use roles instead of individual grants. Also look for similarly defined views; in many cases these can also be shared.
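Something like the following (a sketch only; object, role and user names are invented):

    -- one public synonym replaces many private ones
    CREATE PUBLIC SYNONYM well_headers FOR geo_data.well_headers;

    -- control access through a role instead of individual grants
    CREATE ROLE geo_read;
    GRANT SELECT ON geo_data.well_headers TO geo_read;
    GRANT geo_read TO scott;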
We noticed a performance hit once we got over 40,000 objects, mainly due to the
grant lookups and owner translations. However, maintenance is much easier if
each schema has its own tablespace (see the sketch below). Since our database consisted of related
geological datasets, it would have been dangerous and insecure to simplify the
schema model. As Jared mentioned, separate databases are very slow, so the only
way to keep performance up is through proper design of your access model.
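A sketch of the one-tablespace-per-schema setup (file path, sizes and names are made up):

    CREATE TABLESPACE geo_data_ts
      DATAFILE '/u02/oradata/dss/geo_data_ts01.dbf' SIZE 500M;

    CREATE USER geo_data IDENTIFIED BY changeme
      DEFAULT TABLESPACE geo_data_ts
      QUOTA UNLIMITED ON geo_data_ts;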
If you like tuning, you are in for a lot of fun.
Dave
On Mon, 15 May 2000 Lisa_Koivu_at_gelco.com wrote:
> Hello -
>
> Has anyone seen a database with a large number of schemas? I have one database
Snip .....
Dave Morgan
Senior Database Administrator
Internet Barter Inc.
Received on Tue May 16 2000 - 10:33:37 CDT