How to follow the progress of a network Data Pump import?

I'm on Oracle 10.2.0.4 (SunOS), running a network Data Pump import of a list of tables in a schema.

I see that the following views are available to track Data Pump jobs:

DBA_DATAPUMP_JOBS - lists the Data Pump jobs that are running and their status
DBA_DATAPUMP_SESSIONS - lists the user sessions attached to each Data Pump job (these can be joined to V$SESSION)
DBA_RESUMABLE - shows the work being imported and its status
V$SESSION_LONGOPS - shows the total size and elapsed time of Data Pump jobs, if they run long enough
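For example, a one-shot status query joining these views might look like the sketch below (a sketch only; column names are per the 10.2 data dictionary, and the V$SESSION_LONGOPS filter assumes the generated SYS_IMPORT% job names):

-- Which Data Pump jobs are running, and which sessions are attached to them?
SELECT j.owner_name, j.job_name, j.operation, j.state, s.sid, s.serial#
FROM   dba_datapump_jobs j, dba_datapump_sessions d, v$session s
WHERE  d.job_name   = j.job_name
AND    d.owner_name = j.owner_name
AND    s.saddr      = d.saddr;

-- Rough percent-done for long-running Data Pump work.
SELECT sid, serial#, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 1) pct_done
FROM   v$session_longops
WHERE  opname LIKE 'SYS_IMPORT%'
AND    totalwork > 0
AND    sofar <> totalwork;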

What other options are available for monitoring network imports?
Also, is it possible to see which table is being processed during a multi-table network import?

That would be helpful. :^)

When you run a job, if you do not specify a job name, one will be generated for you. In your case, I don't see a job name specified, so it seems one would be generated. The generated name looks like:

SYS_IMPORT_FULL_01 if doing a full import
SYS_IMPORT_SCHEMA_01 if doing a schema import
SYS_IMPORT_TABLE_01 if doing a table import
SYS_IMPORT_TRANSPORTABLE_01 if using transportable tablespaces

The 01 means it was the first job created. If a second job starts while this job is running, it will be 02. The number can also advance if a job fails and is not cleaned up. In your case, you are doing a table import, so the default job name would be something like:

SYS_IMPORT_TABLE_01, and let's say you ran it as the SYSTEM schema.

In this case, you can run this command:

impdp system/password attach=system.sys_import_table_01

This will bring you to the Data Pump prompt, where you can type STATUS, or STATUS=10 to refresh the display every 10 seconds, etc.
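A minimal sketch of such a session (job name and password are illustrative):

impdp system/password attach=system.sys_import_table_01

Import> status=10

The STATUS output lists each worker process and the object it is currently working on, so for a multi-table import you can see which table is being processed. CONTINUE_CLIENT, STOP_JOB and START_JOB are also available at this prompt.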

Dean


Similar Questions

  • Migration from 10g to 12c using Data Pump

    Hi, while I have used Data Pump at the schema level before, I'm relatively new to full database imports.

    We are attempting a full database migration from 10.2.0.4 to 12c using the full-database Data Pump method over a db link.

    The DBA has indicated that we should avoid moving SYSAUX and SYSTEM objects, but during the initial documentation review it appeared that these objects are not exported anyway, given the default TRANSPORTABLE=NEVER. Can anyone confirm this? Yet the import and export logs refer to objects I thought would not be included:

    ...

    23-FEB-15 19:41:11.684: Estimated 3718 TABLE_DATA objects in 77 seconds

    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB

    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists

    23-FEB-15 20:10:33.200: Completed 96 TABLESPACE objects in 1759 seconds

    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE

    23-FEB-15 20:10:33.445: Completed 7 PROFILE objects in 1 seconds

    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER

    23-FEB-15 20:10:33.842: Completed 1 USER objects in 0 seconds

    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists

    23-FEB-15 20:10:52.372: Completed 1140 USER objects in 19 seconds

    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists

    ...

    Thanks for any insight.

    The SYS, CTXSYS, MDSYS and ORDSYS schemas are not exported using exp/expdp.

    Doc ID: Note 228482.1

    I guess the 12c software was already installed and a database created, it seems; so when you imported, you got these "already exists" errors.

    Whenever the software is installed and a database is created, SYSTEM, SYS and SYSAUX will be created by default.
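    If the noise from these pre-existing dictionary objects is a concern, one hedged option is to exclude those object types on import; a sketch (the link name and excluded paths are illustrative, and the valid paths for your version can be checked in DATABASE_EXPORT_OBJECTS):

    impdp system/password full=y network_link=src10g exclude=tablespace exclude=profile logfile=full_imp.log

    ORA-31684 messages for objects you intentionally leave in place are generally harmless.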

  • Differences between using Data Pump and RMAN to back up the database?

    What are the differences between using Data Pump and RMAN to back up the database? What are the disadvantages and benefits?

    Thank you

    Search for database backup in:

    http://docs.Oracle.com/CD/B28359_01/server.111/b28318/backrec.htm#i1007289

    In brief

    RMAN -> physical backup (copies of the physical database files)

    Data Pump -> logical backup (logical data such as tables and procedures)

    Docs for RMAN:

    http://docs.Oracle.com/CD/B28359_01/backup.111/b28270/rcmcncpt.htm#

    Data Pump docs:

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_overview.htm
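    To make the contrast concrete, a minimal sketch of each (file names are illustrative):

    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;

    expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_exp.log

    The RMAN backup can restore and recover the database to a point in time; the Data Pump dump can only reload the logical contents as of the export.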

    Edited by: Sunny kichloo on July 5, 2012 06:55

  • Schema refresh using Data Pump

    Hello

    Database version: 11.2.0.1.0

    Can we refresh a schema within the database without creating a dump file, similar to network mode?

    For example:

    Say I have a database DB1 that contains two schemas, test and production. Can we refresh test with production data without creating a dump file on the database server?

    The whole idea is to perform the export and import of the data in a single step, to reduce time and human intervention.

    Currently, I've followed the steps below:

    (1) export the production data

    (2) drop the test schema

    (3) create the test schema

    (4) import the production data into the test schema

    Thank you

    Sery

    Hello

    It is possible:

    SQL> create public database link impdpt connect to system identified by abc123 using 'DG1';

    Database link created.

    SQL>

    -----

    [oracle@prima admin]$ impdp system/abc123 directory=DATA_PUMP_DIR network_link=impdpt schemas=hr remap_schema=hr:hrtn logfile=test_same.log

    Import: Release 11.2.0.4.0 - Production on Tue Dec 29 02:53:24 2015

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production

    With partitioning, OLAP, Data Mining and Real Application Testing options

    Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01": system/******** directory=DATA_PUMP_DIR network_link=impdpt schemas=hr remap_schema=hr:hrtn logfile=test_same.log

    Estimate in progress using BLOCKS method...

    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA

    Total estimation using BLOCKS method: 448 KB

    Processing object type SCHEMA_EXPORT/USER

    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT

    Processing object type SCHEMA_EXPORT/ROLE_GRANT

    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE

    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE

    Processing object type SCHEMA_EXPORT/TABLE/TABLE

    . . imported "HRTN"."COUNTRIES"      25 rows

    . . imported "HRTN"."DEPARTMENTS"    27 rows

    . . imported "HRTN"."EMPLOYEES"     107 rows

    . . imported "HRTN"."JOBS"           19 rows

    . . imported "HRTN"."JOB_HISTORY"    10 rows

    . . imported "HRTN"."LOCATIONS"      23 rows

    . . imported "HRTN"."REGIONS"         4 rows

    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT

    Processing object type SCHEMA_EXPORT/TABLE/COMMENT

    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE

    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX

    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS

    Processing object type SCHEMA_EXPORT/VIEW/VIEW

    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT

    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER

    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

    Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" completed at Tue Dec 29 02:54:09 2015 elapsed 0 00:00:44

    [oracle@prima admin]$

  • Verifying the Data Pump dump

    I am curious whether, and how, Data Pump checks that the output it generates is indeed a valid dump.

    I've read the Oracle utilities books and I could not find anything indicating whether this check happens as it runs, whether the process simply stops if there are problems, or whether there is a way to check the dump afterwards.

    Does anyone have a better idea of how this works?

    Hello..

    I don't think SQLFILE validates the export dump; it just shows the DDL of the objects in the dump taken by the export. There is no way to validate that the export succeeded except to check the log file; at the end of the log file the message should be: Job "SYS"."SYS_EXPORT_SCHEMA_02" successfully completed at 02:47:57

    If there are problems during the export, the export will be terminated without success.
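    For what it's worth, the SQLFILE check mentioned above looks like this sketch (directory, dump and output file names are illustrative):

    impdp system/password directory=DATA_PUMP_DIR dumpfile=expdat.dmp sqlfile=check_ddl.sql

    If impdp can read the master table out of the dump and write the DDL to check_ddl.sql, the dump file is at least structurally readable.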

    Anand

  • I want to learn more about Data Pump and transportable tablespaces

    Please suggest easy tutorials, as I want to know how to import and export between Oracle 10g and 11g.
    Thank you

    Hello
    Please check this Oracle tutorial:
    http://www.exforsys.com/tutorials/Oracle-10G/Oracle-10G-using-data-pump-import.html
    About transportable tablespaces, you may consult:
    http://www.rampant-books.com/art_otn_transportable_tablespace_tricks.htm
    Kind regards
    Mohamed
    Oracle DBA

  • Best practices for the Data Pump export/import process?

    We are trying to copy an existing schema to another newly created schema. The Data Pump schema export succeeded.

    However, we encountered errors when importing the dump file into the new schema, remapping the schema and tablespaces, etc.
    Most of the errors occur in PL/SQL... For example, we have views like the one below in the original schema:
    "
    CREATE VIEW oldschema.myview AS
    SELECT col1, col2, col3
    FROM oldschema.mytable
    WHERE col1 = 10
    .....
    "
    Quite a few functions, procedures, packages and triggers contain "oldschema.mytable" in their DML (insert, select, update), for example.

    Getting the following errors in the import log:
    ORA-39082: Object type ALTER_FUNCTION: "TEST"."MYFUNC" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE: "TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW: "TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY: "TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER: "TEST"."MY_TRIGGER" created with compilation warnings

    Many of the actual errors/invalid objects in the new schema are due to:
    ORA-00942: table or view does not exist

    My questions are:
    1. What can we do to correct these errors?
    2. Is there a better way to do the import under such conditions?
    3. Update the PL/SQL and recompile in the new schema? Or update in the original schema first, then export?

    Your help will be greatly appreciated!

    Thank you!

    @?/rdbms/admin/utlrp.sql

    will compile the objects across all schemas in the database. In your case, you are remapping from one schema to another, and utlrp will not be able to compile the objects (they still reference the old schema).

    The SQLFILE option of impdp lets you generate the DDL from the export dump; you can then change the schema name globally in that file and run the script in SQL*Plus. This should resolve most of your errors. If you still see errors, then proceed to utlrp.sql.
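    A minimal sketch of that approach (directory, file names and the sed pattern are illustrative, and assume quoted schema references in the generated DDL):

    impdp test/password directory=DP_DIR dumpfile=exp_schema.dmp sqlfile=schema_ddl.sql

    sed 's/"OLDSCHEMA"\./"TEST"./g' schema_ddl.sql > schema_ddl_test.sql

    sqlplus test/password @schema_ddl_test.sql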

    -André

  • Importing a table into a different schema using Data Pump

    Hi all
    I am trying to export a table from the MIPS_MDM schema and import it into the MIPS_QUIRINO schema in a different database. I get the below error.

    expdp MIPS_MDM/MIPS_MDM@ext01mdm tables=QUARANTINE directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=qunat.log

    To import
    impdp MIPS_QUIRINO/MIPS_QUIRINO@mps01dev tables=QUARANTINE directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=impd.log

    Please can someone tell me the exact syntax to import into the different schema.

    Thank you.

    http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/dp_import.htm#BEHFIEIH
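    In short: when the importing user is not the exporting user, add REMAP_SCHEMA. A hedged sketch (it also assumes MIPS_QUIRINO has the IMP_FULL_DATABASE role, or that a DBA runs the import):

    impdp MIPS_QUIRINO/MIPS_QUIRINO@mps01dev tables=MIPS_MDM.QUARANTINE remap_schema=MIPS_MDM:MIPS_QUIRINO directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=impd.log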

  • How to extract the year from a date

    Hi all

    I couldn't extract the year from a date.
    In my table it has been stored in varchar format, like 'jan 11 2005'.
    If I want to display the year of the column,
    I tried with year(to_date('jan 11 2005', 'mm dd yyyy'))
    but I get an error. So how should I extract the year from the date column?

    thanks

    Hello

    Use RIGHT(DATE_COLUMN,4)
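    Since RIGHT() is not a native Oracle SQL function, here is a hedged Oracle sketch of the same idea (column and table names are illustrative; the TO_DATE format must match the stored strings):

    SELECT SUBSTR(date_column, -4) FROM mytable;

    SELECT TO_CHAR(TO_DATE(date_column, 'Mon DD YYYY'), 'YYYY') FROM mytable;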

    Thank you
    Saichand.v

  • Retrieve the month part of a date

    Hi people,

    How can I get the month part of a date?

    Say, for example, I want to get the month part (April = 4) of 01/04/2009.

    Thanks in advance!

    Hello

    You can get all parts of a DATE using TO_CHAR.
    For example, on April 1, 2009

    SELECT  TO_CHAR (SYSDATE, 'Month')
    FROM    dual;
    

    produces this result

    April
    

    (if you use English) and

    SELECT  TO_CHAR (SYSDATE, 'MM')
    FROM    dual;
    

    produces this output (VARCHAR2)

    04
    
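    If a NUMBER rather than a VARCHAR2 is wanted, EXTRACT also works; a small sketch:

    SELECT  EXTRACT (MONTH FROM SYSDATE)
    FROM    dual;
    

    which returns 4 as a number for an April date.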
  • Moving all Materialized View logs and Materialized Views at the schema level using Data Pump

    Hi Experts,

    Please help me with how I can exp/imp only the materialized views and MV logs (as there are some MVs) from the full schema to another database. I want to exclude everything else.

    Regards
    -Samar-

    Using DBMS_METADATA. Create the following SQL script:

    SET FEEDBACK OFF
    SET SERVEROUTPUT ON FORMAT WORD_WRAPPED
    SET TERMOUT OFF
    SPOOL C:\TEMP\MVIEW.SQL
    DECLARE
        CURSOR V_MLOG_CUR
          IS
            SELECT  DBMS_METADATA.GET_DDL('MATERIALIZED_VIEW_LOG',LOG_TABLE) DDL
              FROM  USER_MVIEW_LOGS;
        CURSOR V_MVIEW_CUR
          IS
            SELECT  DBMS_METADATA.GET_DDL('MATERIALIZED_VIEW',MVIEW_NAME) DDL
              FROM  USER_MVIEWS;
    BEGIN
        DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',TRUE);
        FOR V_REC IN V_MLOG_CUR LOOP
          DBMS_OUTPUT.PUT_LINE(V_REC.DDL);
        END LOOP;
        FOR V_REC IN V_MVIEW_CUR LOOP
          DBMS_OUTPUT.PUT_LINE(V_REC.DDL);
        END LOOP;
    END;
    /
    SPOOL OFF
    

    In my case the script is saved as C:\TEMP\MVIEW_GEN.SQL. Now I will create a mview log and a mview in schema SCOTT and run the script above:

    SQL> CREATE MATERIALIZED VIEW LOG ON EMP
      2  /
    
    Materialized view log created.
    
    SQL> CREATE MATERIALIZED VIEW EMP_MV
      2  AS SELECT * FROM EMP
      3  /
    
    Materialized view created.
    
    SQL> @C:\TEMP\MVIEW_GEN
    SQL> 
    

    Running the C:\TEMP\MVIEW_GEN.SQL script generated a C:\TEMP\MVIEW.SQL file:

      CREATE MATERIALIZED VIEW LOG ON "SCOTT"."EMP"
     PCTFREE 10 PCTUSED 30 INITRANS
    1 MAXTRANS 255 LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1
    MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL
    DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE "USERS" 
    
    WITH PRIMARY KEY EXCLUDING NEW VALUES;
    
      CREATE MATERIALIZED VIEW "SCOTT"."EMP_MV" ("EMPNO", "ENAME", "JOB", "MGR",
    "HIREDATE", "SAL", "COMM", "DEPTNO")
      ORGANIZATION HEAP PCTFREE 10 PCTUSED 40
    INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576
    MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
    BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE
    "USERS"
      BUILD IMMEDIATE
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 
    
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE
    DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE "USERS"
      REFRESH FORCE ON
    DEMAND
      WITH PRIMARY KEY USING DEFAULT LOCAL ROLLBACK SEGMENT
      USING ENFORCED
    CONSTRAINTS DISABLE QUERY REWRITE
      AS SELECT "EMP"."EMPNO"
    "EMPNO","EMP"."ENAME" "ENAME","EMP"."JOB" "JOB","EMP"."MGR"
    "MGR","EMP"."HIREDATE" "HIREDATE","EMP"."SAL" "SAL","EMP"."COMM"
    "COMM","EMP"."DEPTNO" "DEPTNO" FROM "EMP" "EMP";
                                   
    

    Now you can run this on the target database. You may need to adjust the tablespace and storage clauses. Or you can add more DBMS_METADATA.SET_TRANSFORM_PARAM calls to C:\TEMP\MVIEW_GEN.SQL to force DBMS_METADATA not to include the tablespace and/or the storage clauses.
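    An alternative worth checking, as a hedged sketch: Data Pump's INCLUDE parameter can restrict a schema export to particular object types. Whether a materialized view log path is exposed depends on the version, so verify against SCHEMA_EXPORT_OBJECTS first (user and file names are illustrative):

    SELECT object_path, comments FROM schema_export_objects WHERE object_path LIKE '%MATERIALIZED%';

    expdp samar/password schemas=samar include=MATERIALIZED_VIEW directory=DATA_PUMP_DIR dumpfile=mv_only.dmp logfile=mv_only.log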

    SY.

  • Using FLASHBACK_SCN with Data Pump

    Hey,

    I am trying to export a schema, as below:

    expdp \"/ as sysdba\" DIRECTORY=data_pump_dir1 DUMPFILE=xxx.dmp LOGFILE=data_pump_dir1:exp_xxxx.log SCHEMAS=xxx FLASHBACK_SCN=xxxxx


    So first, I need to get the current SCN of the database.

    SQL> select flashback_on, current_scn from v$database;

    FLASHBACK_ON       CURRENT_SCN
    ------------------ -----------
    YES                 7.3776E+10

    SQL> select to_char(7.3776E+10, '9999999999999999999') from dual;

    TO_CHAR(7.3776E+10,'
    --------------------
             73776000000



    or

    SQL> SELECT to_char(DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER, '9999999999999999999') FROM dual;

    TO_CHAR(DBMS_FLASHBA
    --------------------
             73775942383



    I assume they are both the current SCN number for the database... so I do not understand why these two numbers are different (73776000000 and 73775942383)?

    Which number should I use to export a dump of the current database?


    Thank you very much!!!

    Edited by: 995137 on May 30, 2013 08:25

    Hello

    You can use either one; the difference comes from the time elapsed between the two queries. You can check with:

    column scn format 9999999999999;
    select current_scn scn from v$database
    union
    select dbms_flashback.get_system_change_number scn from dual;
    
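    As an aside, the 7.3776E+10 display is just SQL*Plus' default numeric width rounding the value for display, not the stored SCN; that is why converting the displayed literal gave 73776000000 instead of the true SCN. Widening the column (as with the COLUMN command above, or SET NUMWIDTH) shows the full digits:

    SET NUMWIDTH 15
    SELECT current_scn FROM v$database;
    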

    HTH

  • Following circles with an overlay

    Hello

    I have some messy data from a camera. It's a pattern of concentric circles that changes over time: each ring splits into two, and the resulting components move away from each other. I want to track a set of these rings.

    Could someone point me in the right direction, or does anyone have advice for a starting strategy? I thought I could have the user select a set of rings with a ring overlay, and then somehow have the overlay 'follow' the data as it moves (easy or hard?).

    I removed the IMAQ acquisition components and took a series of images, loading them in a loop to try to show the effect as an animation (VI and associated data attached).

    Any help would be appreciated.

    I looked at this example: http://forums.ni.com/t5/LabVIEW/Fit-2D-data-in-to-Circle/td-p/668858/page/2?view=by_date_ascending
    which may be of some use if I manipulate the data, but is it the right way to go?

    I have also tried to use Machine Vision -> Analytic Geometry -> 'IMAQ Fit Circle 2', but there is no example VI or tutorial for this control (or for most of the other analytic geometry tools),

    so I am not sure exactly how it works.

    Thank you

    I'd probably look at the line profile.  If it is noisy or has spots, you might average several radial lines (up, down, left, right) to get more consistent data.

    Ridge detection would probably find the clearest part of each circle and could locate the maxima.  You could process these to find what you are looking for.  Two very close circles would probably be identified as a single peak.  Looking at the data would be the first step.

    You can use overlays to draw the located circles on the original image.  This would help you verify that you tracked them correctly.

    Bruce

  • ORA-39097: Data Pump job encountered unexpected error -39076

    Hi all,

    Today I tried to take a Data Pump export of my test database (a specific table). The version is 10.2.0.4 on Solaris 10 (64-bit), and I got the following error message:

    Job "SYSTEM"."SYS_EXPORT_TABLE_23" completed at 09:51:36
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or unreachable
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: failed to send the email message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in MAIN
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or unreachable
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: failed to send the email message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist

    I hope the export dumpfile is valid, but I don't know why I got this error message. Has anyone faced this kind of problem? Please give me tips.

    Thank you

    Shan

    Once you see this:

    Job "SYSTEM"."SYS_EXPORT_TABLE_23" completed at 09:51:36

    the Data Pump job is done with the dumpfile. Some cleanup is needed afterwards, and it looks like something in that cleanup failed. I don't know what it was, but your dumpfile should be good. An easy way to test is to run impdp with SQLFILE. This does everything an import would do, but instead of creating objects, it writes the DDL to a SQL file.

    impdp user/password sqlfile=my_test.sql directory=your_dir dumpfile=your_dump.dmp ...

    If that works, then your dumpfile should be fine. The last action of export is to write the Data Pump master table to the dumpfile. The first thing import does is read that table. So if you can read it in (which sqlfile impdp does), your dump is good.

    Dean

  • Using Data Pump for Migration

    Hi all

    Version of database - 11.2.0.3

    RHEL 6

    Size of the DB - 150 GB

    I have to run a database migration from one server to another (AIX to Linux). We will use the Data Pump option, migrating from source to target using the expdp schemas option (5 schemas will be exported and imported on the target machine). But the target won't go live right away: after the migration, the development team will do some work on the target machine, which will take 2 days to complete, and during those 2 days the source database will keep running as production.

    Now I have a requirement that, after the development team completes their work, I have to move those 2 days of changes from source to target, after which the target will act as production.

    I want to know what options are available in Data Pump that I can use to do this.

    Kind regards

    No business will want to update to something whose data is no longer representative of live.

    Sounds like a normal upgrade, but you test it first on a copy of live: make sure the process works, then rehearse it once more until you are comfortable, against your latest set of timely production data.

    Dean's suggestion is fine, but rather than dropping pieces and importing, personally I tend to keep things simple and do it all in one go (a full schema datapump if possible). That way you know what you put across is live, inclusive of all sequences and objects (sequences could have been incremented, so you must drop and re-create them). Otherwise you are splitting an upgrade into stages: more steps to trace and more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.
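    A hedged sketch of that full-schema refresh over a database link (link, schema and file names are illustrative; TABLE_EXISTS_ACTION=REPLACE assumes you want pre-existing tables rebuilt from source):

    impdp system/password network_link=SRC_PROD schemas=APP1,APP2 table_exists_action=replace logfile=refresh_imp.log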

    Also, you do not know what changes have been made to upgrade the new environment... so can you roll those back, etc.? Worth looking at. Most migrations would be a db copy via RMAN / cross-endian transport, and you must also make sure you inherit all the system grants for the schemas, not just the schema-level ones.
