Parallel degree when importing with Datapump

Hello

For several imports that I ran, I observed a constant degree of parallelism of 10 when creating indexes.

Moreover, I have parallel_degree_limit = CPU and parallel_degree_policy = MANUAL.

The first parameter specifies that Oracle sets an upper limit on the DOP.
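
For reference, these settings can be checked like this (a minimal SQL*Plus sketch; the values shown are just the ones described above):

    SQL> show parameter parallel_degree

    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------
    parallel_degree_limit                string      CPU
    parallel_degree_policy               string      MANUAL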

Could this explain the value observed?

Kind regards

Hello

A parallel Data Pump import works like this:

All table and PL/SQL definitions are created serially.

All table data is loaded in parallel - not parallel insert into a single table, but a separate Data Pump worker process per table - so EMP would be loaded by slave1 and DEPT by slave2, at the same time. In your case, 10 separate tables would all be imported at the same time.

Indexes are built with the degree of parallelism you specify.

Constraints are created serially by one and the same process, I think.
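
For illustration, an import along these lines (directory, dump file and schema names are hypothetical) would behave as described above, with up to 10 worker processes:

    impdp system/password directory=dp_dir dumpfile=exp%U.dmp schemas=scott parallel=10 logfile=imp.log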

I wrote a short blog post about this a while ago that may be useful: it contains pictures! :-)

Oracle DBA Blog 2.0: What happens during a parallel datapump import?

Cheers,

Rich

Tags: Database

Similar Questions

  • Doubt about DataPump import

    Hello

    Oracle Version: 10.2.0.1
    Operating system: Linux

    Can we run two datapump processes, both export and import, on the same machine at the same time?

    Thank you and best regards,
    Poorna Prasad.

    Neerav999 wrote:
    NO, you cannot run Data Pump imports like that.
    An import starts with the creation of the master process, the worker processes, the client and the shadow process; no import is possible until these processes are free.

    I am not sure what you mean here. Do you mean that we cannot run two import processes in parallel because those processes will not be free? Please see below:

    E:\>time
    The current time is: 18:24:17.04
    Enter the new time:
    
    E:\>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:25
    
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_02":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR2"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR2"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR2"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR2"."JOBS"                                6.984 KB      19 rows
    . . imported "HR2"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR2"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR2"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_02" successfully completed at 18:24:44
    

    And the second session:

    E:\Documents and Settings\aristadba>time
    The current time is: 18:24:19.37
    Enter the new time:
    
    E:\Documents and Settings\aristadba>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr
    
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:22
    
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:h
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR1"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR1"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR1"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR1"."JOBS"                                6.984 KB      19 rows
    . . imported "HR1"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR1"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR1"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 18:24:44
    

    Do you see the time difference? It's just 2 seconds, and both jobs completed their work!

    Would you be so kind as to explain the statement you gave earlier, since the chances are very good that I did not understand it.

    HTH
    Aman...

  • Will an Oracle datapump import fix block corruption?

    Hi all,
    I have block corruption in a production database. I want to export this DB and import it into a test environment using datapump, in order to do some tests on it.

    However, I'm concerned that impdp might fix the corruption... in which case I would lose my test scenario...
    Will an Oracle datapump import fix the block corruption?

    Thank you

    Oracle does not do a block-by-block import, so corrupt blocks are not carried across; the answer to your question is NO, the corruption cannot be reproduced in this way.

    Regards

  • ERROR when importing with DATAPUMP

    Hello


    Oracle Release version: 10.2.0.3.0
    Server: Linux


    We get the following error when running a datapump import. We exported the dump on the server and then tried to import it on the same server, and we get the error below. Yesterday the datapump import into production also worked very well, but today we get the error.


    We tried the import on different servers too, but we face the same problem.



    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: Worker 1 with process name "DW01" prematurely terminated
    ORA-31672: Worker process DW01 died unexpectedly.


    Please help me.

    Edited by: SIDDABATHUNI on Jun 1, 2009 21:46

    Hello

    When you attach to the job at the prompt, before running CONTINUE_CLIENT, can you issue the STATUS command and check which object it was importing at the time of the crash?

    Then run a new job with the EXCLUDE parameter to exclude that one object.
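
    For example, a parameter file along these lines (file and object names are hypothetical) would skip the offending object:

    # exclude.par - hypothetical parameter file
    DIRECTORY=dp_dir
    DUMPFILE=exp.dmp
    LOGFILE=imp.log
    EXCLUDE=TABLE:"= 'BAD_OBJ'"

    impdp system/password parfile=exclude.par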

    Thank you.

  • Dangers of PARALLEL (degree n) at the table level

    What are all the things I should be aware of when PARALLEL (degree n) is implemented on a very large table, (mainly) to get faster SQL?

    I know it can drive up CPU & I/O usage, etc. What I'm looking for is how the optimizer reacts to this setting... what access paths this setting would make unavailable or unfavourable, etc. etc...

    Thanks in advance...

    user4529833 wrote:
    Thank you everyone. Jonathan, very good section on PQ and AUTO extent sizing. I have seen those direct load reads several times and always wondered why, but now I know why. So the solution here to avoid this is to use uniform extents. Right?

    OK - and then you have to choose your extent size carefully to avoid generating an excessive amount of wasted space (as mentioned by Bert Scalzo).

    On average, if you do a parallel CTAS or insert as select, each slave will end up with half an extent of free space in its last extent, so you are likely to get wasted space of approximately (degree of parallelism * extent_size / 2) in total. For example, at degree 8 with 64MB uniform extents that is roughly 8 * 64MB / 2 = 256MB of wasted space.

    If you choose a very large extent size, or a very high degree of parallelism, this could lead to a lot of wasted space in the object. As I indicated above, however, I haven't checked this recently, so there may have been some improvements in the algorithms Oracle uses to reduce the waste.

    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk

    "Science is more than a body of knowledge; It's a way of thinking"Carl Sagan

  • Using the parallel option with DataPump

    Hello

    Does it make sense to do a DataPump export and import with the PARALLEL option > 1 when I only have a single dump file?

    Thanks and greetings

    Hello

    Thank you for the feedback.

    But as I said, I am concerned about having only a single dump file.

    It does not work and the job aborts. So I can't use this option unless I generate the export into multiple dump files.
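
    For reference, a sketch of generating multiple dump files at export time, so that PARALLEL can help on both sides (directory and file names are hypothetical; %U is the Data Pump substitution variable that numbers the files):

    expdp system/password directory=dp_dir dumpfile=exp%U.dmp parallel=4 logfile=exp.log
    impdp system/password directory=dp_dir dumpfile=exp%U.dmp parallel=4 logfile=imp.log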

    Kind regards

  • DataPump table import, simple question

    Hello,
    As a junior DBA, I'm a bit confused.
    Suppose a user wants to import a table with datapump that he exported from another database with different schemas and tablespaces (he exported it with expdp using tables=XX and I don't know the details)...
    If I don't know the name of the table, should I ask these three questions: 1) schema 2) tablespace 3) Oracle version?
    From the documentation I know about the remapping capabilities of impdp... But are they required when importing a table?
    Thanks in advance

    Hello

    Suppose a user wants to import a table with datapump that he exported from another database with different schemas and tablespaces (he exported it with expdp using tables=XX and I don't know the details)...
    If I don't know the name of the table, should I ask these three questions: 1) schema 2) tablespace 3) Oracle version?

    You can get this information from the dumpfile if you want - just to make sure you get the right information. Run your import command as usual, but add:

    sqlfile=mysql.sql

    You can edit this SQL file to see what is in the dumpfile. It won't display the data, but it will show all the metadata (tables, tablespaces, etc.). The result is a .sql file that contains all the create statements that would have been executed if you had not added the sqlfile parameter.
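
    A minimal sketch of that (directory and file names are hypothetical; nothing is imported, the DDL is just written to the file):

    impdp system/password directory=dp_dir dumpfile=mydump.dmp sqlfile=mysql.sql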

    From the documentation I know about the remapping capabilities of impdp... But are they required when importing a table?
    Thanks in advance

    You don't have to remap anything, but if the dumpfile contains a table scott.emp, then it will be imported as scott.emp. If you want it to go into blake, then you need a remap_schema. If it was in tablespace tbs1 and you want it in tbs2, you need a remap_tablespace.
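
    A sketch using the names above (directory and dump file are hypothetical):

    impdp system/password directory=dp_dir dumpfile=mydump.dmp tables=scott.emp remap_schema=scott:blake remap_tablespace=tbs1:tbs2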

    Suppose an end user wants me to export a specific table using datapump...
    Should he also give me the name of the tablespace where the exported table is stored?

    It would be nice, but as above, you can get the name of the tablespace from the sqlfile output during the import.

    I hope this helps.

    Dean

  • on"PARALLEL"option od datapump!

    Dear friends,

    I'm a bit confused about the PARALLEL keyword option of datapump. Could you please tell me what the real function of this PARALLEL option is?

    - Is it related to the number of CPUs?
    - And if my server has 8 CPUs, then what number should I use for the PARALLEL option to improve performance?


    Please help me clear up my understanding of this subject.

    query dba_directories

    Kind regards
    S.K.

  • Reg: Missing objects & partitions during datapump import

    Hi Techies,

    I need urgent assistance with the case below that I am currently facing.

    I imported a schema using DATAPUMP and it failed with the error "Failed to extend the tablespace"; when prompted, I dynamically added space.

    It's a large schema, with a 120G export dump file.

    The import log showed it completed successfully, but when the AD team worked on it they found some of the objects & partitions missing - around 1000+ partitions.

    Can anyone suggest how to solve this problem without doing the refresh again...

    Thanks in advance...
    Satya

    user10972727 wrote:
    Hi Techies,

    I need urgent assistance with the case below that I am currently facing.
    ... missing some objects & partitions...
    Can anyone suggest how to solve this problem without doing the refresh again...

    Ask your AD team to pin down exactly which objects and which partitions are missing. With this information, you can start a new export of only those missing objects and partitions, then import them, and you will have the full data set without redoing the import you did the first time.

    Can anyone suggest how to solve this problem without doing the refresh again...

    There's no other way to do it that is logically correct. You could possibly play around with database links and add the missing items through some SQL, but what I explained a few lines above is certainly the more normal way to do it in the scenario you described.
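
    A sketch of that approach (object, partition, directory and file names are all hypothetical; the TABLES parameter accepts a table:partition entry, so repeat or extend this for each missing object):

    expdp system/password directory=dp_dir dumpfile=missing.dmp tables=SCOTT.SALES:SALES_P37
    impdp system/password directory=dp_dir dumpfile=missing.dmp table_exists_action=append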


  • Importing only the first 100 rows of selected tables in a Datapump import

    DB version: 10gR2


    I have a datapump dumpfile of a schema that has 120 tables. Out of these 120 tables, there are 3 really big tables:
    1. ERR_LOG
    2. RTE_CONS_DTL
    3. ITEM_ENT_DTL
    During the import, I want only the first 100 rows of the 3 tables mentioned above. I want all the rows of the remaining 117 tables. Is this possible in Datapump Import?

    I think you will have to do the import for the 117 tables first, then for the three tables (using the QUERY option).

    Please see the following for more details:

    http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10825/dp_import.htm

    Edited by: SKU on Apr 20, 2009 02:49
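
    A sketch of the second run (the parameter file and dump file names are hypothetical; QUERY restricts the rows loaded for each named table):

    # query.par - hypothetical parameter file
    DIRECTORY=dp_dir
    DUMPFILE=schema.dmp
    TABLES=ERR_LOG,RTE_CONS_DTL,ITEM_ENT_DTL
    QUERY=ERR_LOG:"WHERE ROWNUM <= 100"
    QUERY=RTE_CONS_DTL:"WHERE ROWNUM <= 100"
    QUERY=ITEM_ENT_DTL:"WHERE ROWNUM <= 100"

    impdp system/password parfile=query.par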

  • IMPDP DataPump import fails with errors ORA-31626, ORA-06512, ORA-31637, ORA-31632, ORA-31635

    Hello


    While attempting an impdp, I get the following sequence of errors:


    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.KUPV$FT", line 885
    ORA-31637: cannot create job SYS_IMPORT_TABLE_01 for user LSU
    ORA-31632: master table "LSU.SYS_IMPORT_TABLE_01" not found, invalid or inaccessible
    ORA-31635: unable to establish job resource synchronization
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.KUPV$FT_INT", line 1986
    ORA-04063: package body "SYS.DBMS_LOCK" has errors
    ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOCK"


    The parameters passed to impdp are below:

    impdp sjm/xxxxxx@flman700 DIRECTORY=temp_dir DUMPFILE=dyman01_exp01.dmp,dyman01_exp02.dmp,dyman01_exp03.dmp,dyman01_exp04.dmp,dyman01_exp05.dmp,dyman01_exp06.dmp LOGFILE=a.log TABLES=CST_ACTVT TABLE_EXISTS_ACTION=APPEND


    Any help on how to proceed further would be appreciated. I cannot find any hung datapump jobs in the target database.

    Source and target database version - 10.2.0.3


    Thank you


    Sery

    Hello

    According to DBA_DEPENDENCIES (for 11.1 anyway), the table dependencies of DBMS_LOCK are DUAL and DBMS_LOCK_ALLOCATED - do these two exist?

    Failing that, I suggest a rerun of catalog and catproc.
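
    A sketch of those checks (standard dictionary views and the stock catalog scripts, run as SYS):

    -- what does DBMS_LOCK depend on, and do those objects exist?
    SELECT referenced_owner, referenced_name, referenced_type
      FROM dba_dependencies
     WHERE name = 'DBMS_LOCK';

    -- if needed, rerun the dictionary creation scripts
    @?/rdbms/admin/catalog.sql
    @?/rdbms/admin/catproc.sql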

    Cheers,

    Rich

  • Performance bottleneck during datapump import

    Hi, we have created a new 12cR1 database on LMOS with Linux x86 64-bit. We are experiencing great slowness: a 10-11GB data import is taking more than 3 hours. Looking at the AWR, it shows high I/O waits.

    The AWR is attached, but I was unable to pin down the problem. Please can someone help me track it down?

    Thank you very much!

    I haven't looked too closely, but I will note this:

    Your top SQL and waits are all related to the import - which is not a bad thing.

    You are mostly on the CPU - that is not a bad thing either, although CPU resources are also needed by the operating system to run scripts, so there could be something on the OS side to observe.

    You have a lot of 'other' I/O waits - I'll take that as meaning the OS is doing a lot of things.  That makes sense for a big load.

    Others may have more insight from the AWR into what Oracle may be doing, but I think what we are not seeing clearly is the resources required by the operating system to archive the redo.  Therefore, if no one else uses the db when you perform this type of load, you might consider: back up, turn archiving off, load, turn archiving back on, back up again.  You can use OS tools to compare performance in your current setup against the noarchivelog configuration, along with the total elapsed time.  Given that the percentage of your database that this data represents compared to the full size will determine how long a full backup takes, that could be the deciding factor in whether you want to do this.
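
    A sketch of that toggle (standard SQL*Plus commands run as SYSDBA; take the backups noted above first):

    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    ALTER DATABASE NOARCHIVELOG;
    ALTER DATABASE OPEN;
    -- ... run the load here ...
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    ALTER DATABASE ARCHIVELOG;
    ALTER DATABASE OPEN;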

  • the parallel degree

    Hello
    10g R2 on Win 2008, I have:
    SQL> select distinct degree from all_tables;
    
    DEGREE
    ----------
       DEFAULT
             1
    What is the meaning of each value: DEFAULT and 1?

    Are these defined when the table is created, or is there an initialization parameter for this?

    Thanks in advance.

    From the documentation:

    Number of threads per instance for scanning the table, or DEFAULT

    http://download.oracle.com/docs/cd/B28359_01/server.111/b28320/statviews_2105.htm#REFRN20286
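
    For illustration (the table name is hypothetical), DEGREE is a table attribute that you can set at creation time or change later, and read back from the dictionary:

    ALTER TABLE emp PARALLEL 4;   -- DEGREE becomes 4
    ALTER TABLE emp NOPARALLEL;   -- DEGREE becomes 1
    ALTER TABLE emp PARALLEL;     -- DEGREE becomes DEFAULT

    SELECT table_name, degree FROM user_tables WHERE table_name = 'EMP';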

    HTH
    Aman...

  • A little doubt about datapump import

    Hello

    Oracle Version: 10.2.0.1
    Operating system: linux

    I have a small doubt here; please help me. That is to say:

    I took a backup of the emp and dept tables; now I need to import only the emp table into another schema, based on the condition:
    select * from emp where deptno in (select deptno from dept where loc='DALLAS')
    Here is the import command I tried, which failed. Please help me with how to do this:
    E:\oracle\dbdump>impdp sample/sample directory=dbdump dumpfile=TABLES.DMP logfile=tales_imp.log remap_schema=scott:sample tables=emp remap_tablespace=users:sample query=\"where deptno in \(select deptno from dept where loc='DALLAS')\"
    
    Import: Release 10.2.0.1.0 - Production on Thursday, 29 October, 2009 17:59:05
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SAMPLE"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SAMPLE"."SYS_IMPORT_TABLE_01":  sample/******** directory=dbdump dumpfile=TABLES.DMP logfile=tales_imp.log remap_schema=scott:sample tables=emp remap_tablespace=users:sample query="where deptno in \(select deptno from dept where loc='DALLAS')"
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    ORA-31693: Table data object "SAMPLE"."EMP" failed to load/unload and is being skipped due to error:
    ORA-00911: invalid character
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/POST_TABLE_ACTION
    ORA-31685: Object type POST_TABLE_ACTION failed due to insufficient privileges. Failing sql is:
    BEGIN
     SYS.DBMS_SNAPSHOT_UTL.SYNC_UP_LOG('SCOTT','EMP');
     END;
    
    Job "SAMPLE"."SYS_IMPORT_TABLE_01" completed with 2 error(s) at 17:59:15

    SIDDABATHUNI wrote:
    Hello

    What I'm looking for here is: I have a dump of the full schema, and now I want to import only a single table based on a specific condition, rather than importing the full dump.

    I get the error even when using a parfile as you suggested.

    Note too, I said that with a parfile there is +no need for the backslashes+...?

    Nicolas.
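
    A sketch of that parfile approach (the file name is hypothetical; inside a parameter file the QUERY clause needs none of the shell escaping that broke the command line above):

    # imp.par - hypothetical parameter file
    DIRECTORY=dbdump
    DUMPFILE=TABLES.DMP
    LOGFILE=tales_imp.log
    REMAP_SCHEMA=scott:sample
    REMAP_TABLESPACE=users:sample
    TABLES=emp
    QUERY=emp:"WHERE deptno IN (SELECT deptno FROM dept WHERE loc = 'DALLAS')"

    impdp sample/sample parfile=imp.par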

  • How to know the degree of optimal parallelism for my database?

    I have an important application on my (Oracle 10.2.0) database and the box has 4 CPUs. None of the tables are partitioned. Should I set the parallel degree myself?

    How to know the degree of optimal parallelism for my database?

    As far as I am concerned, there is no optimum degree of parallelism at the database level. The optimal value depends on the query and the plan in use, and may change over time.

    It is not so difficult to abuse the PQO and end up harming overall database performance. The PQO is a brute-force methodology and should be applied with caution; otherwise, you end up with inconsistent results.

    You can let Oracle manage it, or you can manage it at the statement level through hints. I prefer not to specify the degree of parallelism at the object level. As I said, no two queries are exactly alike, and what is right for one query on a table may not be good for another query on the same table.

    When in doubt, set up the system to let Oracle manage it. If what you are really asking is how many PQO sessions to allocate, then look at your Statspack or AWR reports and judge your system load. Monitor v$px_session and v$pq_slave to see how these views show activity.
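
    A minimal sketch of that monitoring (standard dynamic performance views):

    -- active parallel query sessions, with requested vs. actual degree
    SELECT qcsid, sid, degree, req_degree FROM v$px_session;

    -- slave processes and whether they are busy or idle
    SELECT slave_name, status, sessions FROM v$pq_slave;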

    IMHO - Mark D Powell-
