Increasing parallel process in impdp in runtime [message #551097]
Sun, 15 April 2012 11:21
morad_dba (Member)
Dear all,
I started Data Pump without using the PARALLEL option. I issued the following command:
impdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24
The total dump file size is 200 GB. Now I want to add parallel processes to the job.
Please help me: how can I do it?
Regards,
Morad
Re: Increasing parallel process in impdp in runtime [message #551101 is a reply to message #551097]
Sun, 15 April 2012 11:39
BlackSwan (Senior Member)
bcm@bcm-laptop:~$ impdp help=yes
Import: Release 11.2.0.1.0 - Production on Sun Apr 15 09:37:52 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
USERID must be the first parameter on the command line.
------------------------------------------------------------------------------
The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CONTENT
Specifies data to load.
Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keywords are: SKIP_CONSTRAINT_ERRORS.
DIRECTORY
Directory object to be used for dump, log and sql files.
DUMPFILE
List of dumpfiles to import from [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD
Password key for accessing encrypted data within a dump file.
Not valid for network import jobs.
ESTIMATE
Calculate job estimates.
Valid keywords are: [BLOCKS] and STATISTICS.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Import everything from source [Y].
HELP
Display help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of import job to create.
LOGFILE
Log file name [import.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file.
PARTITION_OPTIONS
Specify how partitions should be transformed.
Valid keywords are: DEPARTITION, MERGE and [NONE].
QUERY
Predicate clause used to import a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine datafile references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_TABLESPACE
Tablespace object are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
SCHEMAS
List of schemas to import.
SKIP_UNUSABLE_INDEXES
Skip indexes that were set to the Index Unusable state.
SOURCE_EDITION
Edition to be used for extracting metadata.
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STREAMS_CONFIGURATION
Enable the loading of Streams metadata
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLES
Identifies a list of tables to import.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to import.
TARGET_EDITION
Edition to be used for loading metadata.
TRANSFORM
Metadata transform to apply to applicable objects.
Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES
List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be loaded.
Only valid in NETWORK_LINK mode import operations.
VERSION
Version of objects to import.
Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
Only valid for NETWORK_LINK and SQLFILE.
------------------------------------------------------------------------------
The following commands are valid while in interactive mode.
Note: abbreviations are allowed.
CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
START_JOB
Start or resume current job.
Valid keywords are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keywords are: IMMEDIATE.
Can you read the above & make the necessary changes now?
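The two pieces that matter in that list are ATTACH and the interactive PARALLEL command: attach to the running job, then raise the worker count. A minimal sketch of the session, assuming the system-generated default job name SYS_IMPORT_SCHEMA_01 (no JOB_NAME was given on the original command line, so the actual name may differ; it can be found in DBA_DATAPUMP_JOBS):

$ impdp t24/t24 attach=SYS_IMPORT_SCHEMA_01

Import> PARALLEL=4
Import> STATUS
Import> CONTINUE_CLIENT

Here 4 is a placeholder degree. PARALLEL sets the new number of active workers, STATUS confirms the degree and worker states, and CONTINUE_CLIENT returns to logging mode with the job still running. The new degree only takes effect as workers pick up new work items, and a degree greater than 1 requires Enterprise Edition.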
Re: Increasing parallel process in impdp in runtime [message #575952 is a reply to message #575947]
Tue, 29 January 2013 13:39
Michel Cadot (Senior Member, Account Moderator)
Quote:
What kind of question is it?
A legitimate question.
The workers are not always working, EVEN if you set a PARALLEL degree.
For instance, creation of indexes or gathering of statistics is ALWAYS done by only ONE worker; the other ones are WAITING FOR WORK.
So the fact that Oracle adds more workers does not mean they are working.
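You can see this for yourself while a job runs. A minimal sketch against the standard dictionary views (run as a privileged user; 11g column names):

SELECT d.job_name, s.sid, s.serial#, s.status, s.event
FROM   dba_datapump_sessions d, v$session s
WHERE  s.saddr = d.saddr;

An idle worker typically shows up waiting on a message event rather than doing I/O, even though DBA_DATAPUMP_JOBS still reports the full DEGREE for the job.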
Regards
Michel
[Updated on: Tue, 29 January 2013 13:40]
Re: Increasing parallel process in impdp in runtime [message #611982 is a reply to message #611911]
Thu, 10 April 2014 09:28
Roachcoach (Senior Member)
dave1234 wrote on Thu, 10 April 2014 08:15:
Thank you for answering the question of how to increase the number of worker threads, even though it was posted a long time ago.
I'm a newbie trying to get import and indexing done faster. The Oracle documentation says the PARALLEL parameter "Enables you to increase or decrease the number of active worker processes and/or PQ slaves for the current job". So I was wondering why 3 of the 4 threads were sitting idle.
From your answer I understand that there need to be multiple dump files to use parallel processing. Now I can ask the DBA to use the PARALLEL parameter when creating the dump file so that I can import faster.
Parallelism with Data Pump is more complicated than you'd expect.
http://download.oracle.com/otndocs/products/database/enterprise_edition/utilities/pdf/datapump11gr2_parallel_1106.pdf
Object size is a factor and, as I discovered to my horror, LOBs don't count. I had a 200 MB "table" with x TB of LOB data... serial expdp. "Have a nice day." Sure, it's sensible in the strictest terms, but it's slightly disappointing they weren't smarter about it.
Also, there are differences between parallel Data Pump slaves and parallel DB processes, and import behaves differently from export.
Somewhat irritatingly, I've only ever found this information in that white paper above.
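For anyone finding this thread later: the practical consequence is that if you control the export, write a multi-file dump set up front so the import has something to parallelize. A hedged sketch using the OP's own file template (the degree and FILESIZE values are placeholders; PARALLEL greater than 1 requires Enterprise Edition):

$ expdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24 parallel=4 filesize=20G logfile=bef_cob_exp.log

The %U wildcard lets Data Pump generate as many dump files as the parallel workers need.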
[Updated on: Thu, 10 April 2014 09:31]