Oracle FAQ | Your Portal to the Oracle Knowledge Grid
Home -> Community -> Mailing Lists -> Oracle-L -> Perf of 10000+ tables in a db
Good (morning | day | evening) everyone,
I have to store Horn clauses in a database to perform some AI/ML processing. The facts can be stored as tables, and the rules are mapped to views. (The rules could be recursive, but that should stay my problem :-).
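To make the mapping concrete, here is a minimal sketch (all table and view names are hypothetical, not from an actual schema): a Horn rule like grandparent(X,Z) :- parent(X,Y), parent(Y,Z) would become a view joining the fact table with itself.

```sql
-- Hypothetical fact table for the predicate parent/2: one row per ground fact.
CREATE TABLE fact_parent (
  arg1 VARCHAR2(30) NOT NULL,
  arg2 VARCHAR2(30) NOT NULL
);

-- Hypothetical rule view: grandparent(X,Z) :- parent(X,Y), parent(Y,Z).
-- The shared variable Y becomes an equality join condition.
CREATE VIEW rule_grandparent (arg1, arg2) AS
  SELECT p1.arg1, p2.arg2
  FROM fact_parent p1
       JOIN fact_parent p2 ON p1.arg2 = p2.arg1;
```

A rule whose body references other rules would simply select from the corresponding views, which is where the view-on-view dependency chains come from.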
This system is used as a data mining/disclosure system, and many fact sets can arise. (This is not a DW!) Storing them in separate tables could lead to many thousands of tables, each filled with 10 to 1000 rows, with fewer than 10 columns per table. The number of views is huge too, say twice the number of tables. These views have dependencies on each other: a view could 'call' three others. The queries will mostly be inner joins over a few tables, selected via equality conditions. A master table for translating facts/rules into table/view names is also used.
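The master translation table could look like this hypothetical sketch (names and column layout are my own illustration, not part of the actual system):

```sql
-- Hypothetical catalog table mapping logical predicates to physical objects.
CREATE TABLE predicate_map (
  predicate_name VARCHAR2(128) PRIMARY KEY,  -- e.g. 'parent', 'grandparent'
  object_name    VARCHAR2(128) NOT NULL,     -- e.g. 'FACT_PARENT'
  object_type    VARCHAR2(5)   NOT NULL      -- facts are tables, rules are views
    CHECK (object_type IN ('TABLE', 'VIEW'))
);
```

The application would look up a predicate here first and then query the named table or view.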
Does anybody have experience with such a funky configuration? SAP experts or similar? How does such a system behave with its catalog data? What about tuning: only full table scans, or automated analyzing and creation of indexes / automated materialization of highly used views? What about the dependencies between views: could the recompiling of views become a bottleneck? Questions and questions...
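On the dependency question, the chains could at least be inspected from Oracle's data dictionary. A sketch, assuming a view named RULE_GRANDPARENT exists (the name is hypothetical; USER_DEPENDENCIES is the standard dictionary view):

```sql
-- Walk the view-on-view dependency chain below one rule view.
SELECT LEVEL, name, referenced_name, referenced_type
FROM   user_dependencies
START WITH name = 'RULE_GRANDPARENT' AND type = 'VIEW'
CONNECT BY PRIOR referenced_name = name
       AND referenced_type = 'VIEW';
```

Running something like this over the deepest rule views would show how long the recompilation cascades can get when a base table changes.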
I need your suggestions.
Thanks, oli.
Oliver Artelt, System and Database Administration
email: oli@cubeoffice.de # web: http://www.cubeoffice.de