Partial Import [message #252389] Wed, 18 July 2007 14:25
schastlivtsev
Messages: 5
Registered: July 2007
Junior Member
Hi all,
We've been perplexed lately by this adamant behavior of the import utility in the following case:
The situation demands a full export of the database schema of a particular user, say A.
Now, this user A has also been created in another database, in some other physical location, perhaps on a different machine altogether. There, A has created the definitions of all or some of the original tables, all of which were exported into A.dmp (say).
Now, when A imports all or part of the data from A.dmp, the old tables which are missing in the target are by default (at least, that's what it seems) created again.
This whole functionality is to be bundled into a complete, generic utility. We are not required to know which tables exist and which do not. What is desirable is for the import utility to behave judiciously: populate the tables whose instances already exist (of course we are using ignore=y) and not create the ones that have not been instantiated.
***
Summarizing: tables whose definitions do not exist in the target should not be created from A.dmp, even though the dump contains the DDL of the exported tables. Is there any way under the sun, perhaps an attribute to set, which could make the import utility choose between its tables on its own?
Right now, as I understand it, import is just a stock disgorging of all data exported without any pre-logic.
Any help, hints, redirections etc please reply soon!
Re: Partial Import [message #252398 is a reply to message #252389] Wed, 18 July 2007 14:38
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
>Right now, as I understand it, import is just a stock disgorging of all data exported without any pre-logic.
You are 100% correct.
You need to be smarter than the tools you are using.
Since your requirements are different from those designed into Oracle's IMPORT utility, you'll have to write your own front end or a replacement tool.
Re: Partial Import [message #252408 is a reply to message #252398] Wed, 18 July 2007 14:58
schastlivtsev
Messages: 5
Registered: July 2007
Junior Member
I've read your replies on many other topics, active member that you are; it leaves the questioner in quite a lurch when confronted by statements like "you need to be smarter than the machine"... is that a sort of signature? Intriguing statement, otherwise.

We tried XML, but since we need to be generic, the column names required by xmlagg, xmlforest and xmlelement are a serious drawback...
But something about import eludes me; there has to be some tweak here or there. It's a very basic requirement, after all!

The front-end thing right now, cumbersome and inefficient though it is, is the JDBC way: querying and storing table names...

My problem is simple: Skip the tables not instantiated, and do the rest.
Re: Partial Import [message #252416 is a reply to message #252389] Wed, 18 July 2007 15:16
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
If it were my task to complete, I'd use Perl to write "dynamic" shell scripts such that:
TABLES=(TAB1,TAB3,TAB7)
would be generated by querying the database via DBD/DBI.
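The same idea can be sketched in Python (rather than Perl): take the list of tables in the dump and the list of tables that actually exist in the target schema, and emit an imp command restricted to their intersection. The two input lists here stand in for real queries (the dump's table of contents and a SELECT table_name FROM user_tables); the username and file name are placeholders.

```python
# Build an imp command whose TABLES list contains only tables that
# already exist in the target schema, so nothing new gets created.
def build_imp_command(dump_tables, existing_tables, user="a", dumpfile="A.dmp"):
    """Return an imp command restricted to tables present in the target,
    or None if no table from the dump exists there."""
    existing = {t.upper() for t in existing_tables}
    wanted = sorted(t.upper() for t in dump_tables if t.upper() in existing)
    if not wanted:
        return None  # nothing to import
    return f"imp {user} file={dumpfile} ignore=y tables=({','.join(wanted)})"

cmd = build_imp_command(["TAB1", "TAB2", "TAB3", "TAB7"],
                        ["TAB1", "TAB3", "TAB7"])
print(cmd)  # imp a file=A.dmp ignore=y tables=(TAB1,TAB3,TAB7)
```

A wrapper like this is the "front end" being suggested: the tool, not the human, decides which tables go on the command line.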

Alternatively, load the data into "work" tables in a different schema and move only the rows whose target tables actually exist.

[Updated on: Wed, 18 July 2007 15:19] by Moderator


Re: Partial Import [message #252501 is a reply to message #252389] Thu, 19 July 2007 01:03
schastlivtsev
Messages: 5
Registered: July 2007
Junior Member
Dude... the utility is going to be created by one person, used by a second person and installed by a third person, and these three people don't know each other. The third person will only be an IT services guy working for the second person, who will unknowingly install our utility like a patch.
Everything has to be discerned by the utility itself:
1) Perform the full export of the source database, or pick up dump files which were created days or weeks ago. Persistence is the word!

2) The third person will open the utility, specify the database or tablespace, and run it.

3) The third person will be drinking coffee and gambolling around with colleagues while the utility works on; the second person is completely unaware of the inner workings and simply looks forward to a successful operation.

We need the following things to be automated:
1) The full export files to be fetched or the export to be performed

2) Running the import utility with ignore=y to populate predefined schemas, leaving the new schemas alone and "skipping the absent table schemas" <-- herein enters the showstopper

The front-end thing you mentioned: how could we go about it?
The Perl script you described asks for table names... which we don't know!
The work-table thing? I'm blank on that!
Directions, man, directions!
Re: Partial Import [message #252528 is a reply to message #252389] Thu, 19 July 2007 02:07
schastlivtsev
Messages: 5
Registered: July 2007
Junior Member
Hey! Got something, but with some strings attached!
The Export Data Pump utility... unfortunately, only from 10g onwards...

It has an attribute, CONTENT. Quoting the documentation:

CONTENT
Default: ALL

Purpose

Enables you to filter what Export unloads: data only, metadata only, or both.

Syntax and Description

CONTENT={ALL | DATA_ONLY | METADATA_ONLY}

ALL unloads both data and metadata. This is the default.

DATA_ONLY unloads only table row data; no database object definitions are unloaded.

METADATA_ONLY unloads only database object definitions; no table row data is unloaded.

Restrictions

The CONTENT=METADATA_ONLY parameter cannot be used in conjunction with the parameter TRANSPORT_TABLESPACES (transportable-tablespace-mode).

Example

The following is an example of using the CONTENT parameter:

> expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp CONTENT=METADATA_ONLY

This command will execute a schema-mode export that will unload only the metadata associated with the hr schema. It defaults to a schema-mode export of the hr schema because no export mode is specified.


Practically, the world still loves Oracle 9.*
And I am pretty confident that, the parameter ignore=y being the same, this could serve my purpose: since there are no table definitions in the dump file, only data, its power to create tables is absolutely decimated... it can only fill in data, not run any DDL... in fact, it even does away with the need for ignore=y!
Wotsay?
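Assuming 10g on both sides, the route described above would be a two-step Data Pump job: export row data only, then import it, so there is no metadata from which missing tables could be recreated. One caution: Data Pump dump files cannot be read by the original imp utility, so the classic 9i imp/exp pair cannot consume an expdp file. A small sketch composing the two command lines (the connect string, DIRECTORY and DUMPFILE names are placeholders):

```python
# Compose the expdp/impdp pair for a data-only round trip. With
# CONTENT=DATA_ONLY on the export, the dump holds no DDL, so the import
# can only populate tables that already exist in the target.
def datapump_commands(connect, schema, directory="dpump_dir1",
                      dumpfile="a_data.dmp"):
    exp = (f"expdp {connect} SCHEMAS={schema} DIRECTORY={directory} "
           f"DUMPFILE={dumpfile} CONTENT=DATA_ONLY")
    imp = (f"impdp {connect} SCHEMAS={schema} DIRECTORY={directory} "
           f"DUMPFILE={dumpfile}")
    return exp, imp

exp_cmd, imp_cmd = datapump_commands("a/pwd", "A")
print(exp_cmd)
print(imp_cmd)
```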
And now the course of the problem changes, all hope notwithstanding: how to use Data Pump Export in versions earlier than Oracle 10g?

Anyhow, it hasn't yet resolved the existing problem... so please keep any directions related to the previous queries coming as well!
Thanks
Ravi.