Verifying a Data Pump dump

I am curious whether, and how, Data Pump checks that the output it generates is indeed a valid dump.

I've read the Oracle Utilities documentation and could not find anything indicating whether this happens automatically (i.e. whether the process simply stops when there is a problem) or whether there is a way to check the dump afterwards.

Does anyone have a better idea of how this works?

Hello..

I don't think SQLFILE validates an export dump; it just shows the DDL of the objects in the dump taken by the export. There is no way to validate whether the export completed successfully other than checking the log file; at the end of the log file the message should be: Job "SYS"."SYS_EXPORT_SCHEMA_02" successfully completed at 02:47:57

If there are problems during the export, the job will terminate without success.
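A minimal sketch of both checks, with placeholder file names (the success message is the one quoted above):

    # the export log should end with a success message
    tail -2 exp_schema.log
    # Job "SYS"."SYS_EXPORT_SCHEMA_02" successfully completed at 02:47:57

    # optionally exercise the dump without loading anything: impdp reads the master table
    # from the dumpfile and writes the DDL it would run into a script
    impdp system/password directory=DATA_PUMP_DIR dumpfile=schema.dmp sqlfile=verify.sql

If the SQLFILE run completes, the master table in the dump could be read back, which is a reasonable sanity check that the file is usable.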

Anand

Tags: Database

Similar Questions

  • Migration from 10g to 12c using Data Pump

    Hi, while I have used Data Pump at the schema level before, I'm relatively new to full database imports.

    We are attempting a full database migration from 10.2.0.4 to 12c using the full database Data Pump method over a database link.

    The DBA indicated that we should avoid moving SYSAUX and SYSTEM objects, but during the documentation review it appeared that these objects are not exported anyway, since TRANSPORTABLE=NEVER is the default. Can anyone confirm this? The import was done, and the log refers to objects I thought would not be included:

    ...

    23-FEB-15 19:41:11.684: Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200: Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445: Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842: Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372: Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists

    ...

    Would appreciate any insight.

    The SYS, CTXSYS, MDSYS and ORDSYS schemas are not exported by exp/expdp.

    Doc ID: Note 228482.1

    I guess the 12c software was already installed and a database already created, it seems, so when you imported you got these "already exists" messages.

    Every time the software is installed and a database is created, the SYSTEM, SYS and SYSAUX objects are created by default.
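    As an illustration only (a rough sketch with placeholder names, not the poster's actual command), a full network-mode import driven from the 12c side could look like the line below; the ORA-31684 "already exists" messages for default tablespaces, users and roles are expected and can normally be ignored:

    impdp system/password full=y network_link=src10g directory=DATA_PUMP_DIR logfile=full_imp.log table_exists_action=skip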

  • Differences between using Data Pump to back up the database and using RMAN?

    What are the differences between using Data Pump to back up the database and using RMAN? What are the disadvantages and benefits of each?

    Thank you

    Search for database backup in

    http://docs.Oracle.com/CD/B28359_01/server.111/b28318/backrec.htm#i1007289

    In brief

    RMAN -> physical backup (copies of the physical database files)

    DataPump -> logical backup (logical data such as tables, procedures)
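    As a rough illustration of the difference (placeholder credentials, assuming a configured directory object):

    # physical backup with RMAN: copies datafiles, control file and archived logs
    rman target /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;

    # logical backup with Data Pump: extracts object definitions and row data into a dump file
    expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_exp.log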

    Docs for RMAN-

    http://docs.Oracle.com/CD/B28359_01/backup.111/b28270/rcmcncpt.htm#

    Datapump docs

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_overview.htm

    Published by: Sunny kichloo on July 5, 2012 06:55

  • Following the progress of a network Data Pump import?

    I'm on Oracle 10.2.0.4 (SunOS) and am running a network Data Pump import of a list of tables from one schema.

    I see that the following views are available to track Data Pump jobs:

    DBA_DATAPUMP_JOBS - lists the running Data Pump jobs and their status
    DBA_DATAPUMP_SESSIONS - lists the user sessions attached to each Data Pump job (can be joined to V$SESSION)
    DBA_RESUMABLE - shows the work being imported and its status
    V$SESSION_LONGOPS - shows the total size and elapsed time of Data Pump jobs, if they run long enough

    What other options are available for monitoring network imports?
    Also, is it possible to see which table is currently being processed during a multi-table network import?

    That would have helped. :^)

    When you run a job, if you do not specify a job name, one is generated for you. In your case I don't see a job name specified, so it seems one would be generated. The generated name looks like:

    SYS_IMPORT_FULL_01 if doing a full import
    SYS_IMPORT_SCHEMA_01 if doing a schema import
    SYS_IMPORT_TABLE_01 if doing a table import
    SYS_IMPORT_TRANSPORTABLE_01 if using transportable tablespaces

    01 is for the first job created. If there is a second job while this one is running, it will be 02. The number can also be bumped if a job fails and is not cleaned up. In your case, you are doing a table import, so the default job name would be something like:

    SYS_IMPORT_TABLE_01, and let's say you ran this as the SYSTEM schema.

    In this case, you can run this command:

    impdp system/password attach=system.sys_import_table_01

    This will bring you to the Data Pump interactive prompt, where you can type STATUS or STATUS 10, etc.
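    A rough sketch of what that monitoring can look like (names assumed from the views listed in the question):

    -- which Data Pump jobs exist and their state
    SELECT owner_name, job_name, state FROM dba_datapump_jobs;

    -- progress of long-running Data Pump operations
    SELECT sl.opname, sl.sofar, sl.totalwork
      FROM v$session_longops sl
     WHERE sl.opname LIKE 'SYS_IMPORT%' AND sl.sofar <> sl.totalwork;

    -- then attach to the job and ask for its status interactively
    impdp system/password attach=system.sys_import_table_01
    Import> STATUS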

    Dean

  • I want to learn more about Data Pump and transportable tablespaces

    Please recommend easy tutorials, as I want to know how to import and export between Oracle 10 and 11.
    Thank you

    Hello
    Please check this oracle tutorial:
    http://www.exforsys.com/tutorials/Oracle-10G/Oracle-10G-using-data-pump-import.html
    About transportable tablespaces, you may consult:
    http://www.rampant-books.com/art_otn_transportable_tablespace_tricks.htm
    Kind regards
    Mohamed
    Oracle DBA
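    As a starting point, a schema-level export/import between 11g and 10g could look like the sketch below (placeholder credentials; the VERSION parameter makes an 11g dump readable by a 10.2 impdp):

    # on the 11g source
    expdp scott/tiger schemas=scott directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_exp.log version=10.2

    # on the 10g target, after copying scott.dmp into its DATA_PUMP_DIR
    impdp scott/tiger schemas=scott directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_imp.log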

  • Best practices for the Data Pump import process?

    We are trying to copy an existing schema to another, newly created schema. The Data Pump schema export succeeded.

    However, we encountered errors when importing the dump file into the new schema. We remapped the schema and tablespaces, etc.
    Most of the errors occur in PL/SQL... For example, we have views like the one below in the original schema:
    "
    CREATE the VIEW * oldschema.myview * AS
    SELECT col1, col2, col3
    OF * oldschema.mytable *.
    WHERE coll1 = 10
    .....
    "
    Quite a few functions, procedures, packages and triggers reference "oldschema.mytable" in their DML (insert, select, update), for example.

    We get the following errors in the import log:
    ORA-39082: Object type ALTER_FUNCTION: "TEST"."MYFUNC" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE: "TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW: "TEST"."BIRD" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY: "TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER: "TEST"."MON_TRIGGER" created with compilation warnings

    Many of the actual errors / invalid objects in the new schema are due to:
    ORA-00942: table or view does not exist

    My questions are:
    1. What can we do to correct these errors?
    2. Is there a better way to do the import under these conditions?
    3. Should we update the PL/SQL and recompile in the new schema, or update in the original schema first and then export?

    Your help will be greatly appreciated!

    Thank you!

    @?/rdbms/admin/utlrp.sql

    will compile the invalid objects in the database across schemas. In your case, you are remapping from one schema to another, so utlrp will not be able to compile those objects.

    The impdp SQLFILE option lets you generate the DDL from the export dump, change the schema name globally, and run the script in SQL*Plus. This should resolve most of your errors. If you still see errors after that, then run utlrp.sql.
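    A rough sketch of that workflow, with placeholder names (adjust schema names and file names to your case):

    # 1. extract the DDL from the dump instead of running the import
    impdp test/password directory=datapump_dir dumpfile=export.dmp sqlfile=schema_ddl.sql

    # 2. change the schema references globally (here with sed; any editor will do)
    sed 's/"OLDSCHEMA"\./"TEST"./g' schema_ddl.sql > schema_ddl_fixed.sql

    # 3. run the fixed script, then recompile whatever is still invalid
    sqlplus test/password @schema_ddl_fixed.sql
    sqlplus / as sysdba @?/rdbms/admin/utlrp.sql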

    -André

  • Refreshing a schema using Data Pump

    Hello

    Database version: 11.2.0.1.0

    Can we refresh a schema in the database without creating a dump file, similar to network mode?

    for example

    Say I have a database DB1 containing two schemas, test and production. Can we refresh test with production data without creating a dump file on the database server?

    The whole idea is to perform the export and import of data in a single step, to reduce time and human intervention.

    Currently, I've followed the steps below:

    (1) Export the production data

    (2) Drop the test schema

    (3) Create the test schema

    (4) Import the production data into the test schema

    Thank you

    Sery

    Hello

    It is possible:

    SQL> create public database link impdpt connect to system identified by abc123 using 'DG1';

    Database link created.

    SQL >

    -----

    [oracle@prima admin]$ impdp system/abc123 directory=DATA_PUMP_DIR network_link=impdpt schemas=hr remap_schema=hr:hrtn logfile=test_same.log

    Import: Release 11.2.0.4.0 - Production on Tue Dec 29 02:53:24 2015

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    Connected to: Oracle Database 11 g Enterprise Edition Release 11.2.0.4.0 - 64 bit Production

    With partitioning, OLAP, Data Mining and Real Application Testing options

    Start "SYSTEM". "" SYS_IMPORT_SCHEMA_01 ": System / * Directory = network_link = impdpt schemas = hr remap_schema hr:hrtn logfile = test_same.log = DATA_PUMP_DIR

    Current estimation using BLOCKS method...

    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA

    Total estimation using BLOCKS method: 448 KB

    Processing object type SCHEMA_EXPORT/USER

    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT

    Processing object type SCHEMA_EXPORT/ROLE_GRANT

    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE

    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

    Object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE of treatment

    Object type SCHEMA_EXPORT/TABLE/TABLE processing

    . . imported "HRTN. "" COUNTRY "25 lines

    . . imported "HRTN. "' DEPARTMENTS ' 27 lines

    . . imported "HRTN. "' EMPLOYEES ' 107 lines

    . . imported "HRTN. "JOBS"19 ranks. "

    . . imported "HRTN. "" JOB_HISTORY "10 lines

    . . imported "HRTN. "" LOCATIONS "23 lines

    . . imported "HRTN. "The REGIONS"4 lines.

    Processing object type SCHEMA_EXPORT/TABLE/SCHOLARSHIP/OWNER_GRANT/OBJECT_GRANT

    Object type SCHEMA_EXPORT/TABLE/COMMENT of treatment

    Object type SCHEMA_EXPORT/PROCEDURE/treatment PROCEDURE

    Object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE processing

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX

    Object type SCHEMA_EXPORT/TABLE/CONSTRAINT/treatment

    Object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS of treatment

    Object type SCHEMA_EXPORT/VIEW/VIEW processing

    Object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT of treatment

    Object type SCHEMA_EXPORT/TABLE/TRIGGER processing

    Object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS treatment

    Work 'SYSTEM '. "' SYS_IMPORT_SCHEMA_01 ' completed Fri Dec 29 02:54:09 2015 elapsed 0 00:00:44

    [oracle@prima admin] $

  • Using the Data Pump FLASHBACK_SCN parameter

    Hey,

    I am trying to export a schema, as below:

    expdp \"/ as sysdba\" DIRECTORY=data_pump_dir1 DUMPFILE=xxx.dmp LOGFILE=data_pump_dir1:exp_xxxx.log FLASHBACK_SCN=xxxxx SCHEMAS=xxx


    So first, I need to get the current SCN of the database.

    SQL> select flashback_on, current_scn from v$database;

    FLASHBACK_ON       CURRENT_SCN
    ------------------ -----------
    YES                 7.3776E+10

    SQL> select to_char(7.3776E+10, '9999999999999999999') from dual;

    TO_CHAR(7.3776E+10,'
    --------------------
    73776000000



    or

    SQL> SELECT to_char(DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER, '9999999999999999999') FROM dual;

    TO_CHAR(DBMS_FLASHBA
    --------------------
    73775942383



    Both are supposed to be the current SCN of the database... so I do not understand why these two numbers are different (73776000000 and 73775942383)?

    Which number should I use to export a dump of the current database?


    Thank you very much!!!

    Published by: 995137 on May 30, 2013 08:25

    Hello

    You can use either one; the difference is due to the time gap between the two queries. You can check with:

    column scn format 9999999999999;
    select current_scn scn from v$database
    union
    select dbms_flashback.get_system_change_number scn from dual;
    

    HTH
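    As an illustration of how the captured SCN might then be used (a sketch reusing the placeholder names from the question):

    SQL> select to_char(current_scn) from v$database;
    -- e.g. 73775942383

    expdp \"/ as sysdba\" DIRECTORY=data_pump_dir1 DUMPFILE=xxx.dmp LOGFILE=data_pump_dir1:exp_xxxx.log SCHEMAS=xxx FLASHBACK_SCN=73775942383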

  • Moving all Materialized View logs and Materialized Views at the schema level using Data Pump

    Hi Experts,

    Please help me with how I can exp/imp only the materialized views and MV logs (there are quite a few MVs) from the full schema to another database. I want to exclude everything else.

    Concerning
    -Samar-

    Using DBMS_METADATA. Create the following SQL script:

    SET FEEDBACK OFF
    SET SERVEROUTPUT ON FORMAT WORD_WRAPPED
    SET TERMOUT OFF
    SPOOL C:\TEMP\MVIEW.SQL
    DECLARE
        CURSOR V_MLOG_CUR
          IS
            SELECT  DBMS_METADATA.GET_DDL('MATERIALIZED_VIEW_LOG',LOG_TABLE) DDL
              FROM  USER_MVIEW_LOGS;
        CURSOR V_MVIEW_CUR
          IS
            SELECT  DBMS_METADATA.GET_DDL('MATERIALIZED_VIEW',MVIEW_NAME) DDL
              FROM  USER_MVIEWS;
    BEGIN
        DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',TRUE);
        FOR V_REC IN V_MLOG_CUR LOOP
          DBMS_OUTPUT.PUT_LINE(V_REC.DDL);
        END LOOP;
        FOR V_REC IN V_MVIEW_CUR LOOP
          DBMS_OUTPUT.PUT_LINE(V_REC.DDL);
        END LOOP;
    END;
    /
    SPOOL OFF
    

    In my case the script is saved as C:\TEMP\MVIEW_GEN.SQL. Now I will create an mview log and an mview in the SCOTT schema and run the script above:

    SQL> CREATE MATERIALIZED VIEW LOG ON EMP
      2  /
    
    Materialized view log created.
    
    SQL> CREATE MATERIALIZED VIEW EMP_MV
      2  AS SELECT * FROM EMP
      3  /
    
    Materialized view created.
    
    SQL> @C:\TEMP\MVIEW_GEN
    SQL> 
    

    Running the C:\TEMP\MVIEW_GEN.SQL script generated a C:\TEMP\MVIEW.SQL file:

      CREATE MATERIALIZED VIEW LOG ON "SCOTT"."EMP"
     PCTFREE 10 PCTUSED 30 INITRANS
    1 MAXTRANS 255 LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1
    MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL
    DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE "USERS" 
    
    WITH PRIMARY KEY EXCLUDING NEW VALUES;
    
      CREATE MATERIALIZED VIEW "SCOTT"."EMP_MV" ("EMPNO", "ENAME", "JOB", "MGR",
    "HIREDATE", "SAL", "COMM", "DEPTNO")
      ORGANIZATION HEAP PCTFREE 10 PCTUSED 40
    INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576
    MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
    BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE
    "USERS"
      BUILD IMMEDIATE
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 
    
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE
    DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE "USERS"
      REFRESH FORCE ON
    DEMAND
      WITH PRIMARY KEY USING DEFAULT LOCAL ROLLBACK SEGMENT
      USING ENFORCED
    CONSTRAINTS DISABLE QUERY REWRITE
      AS SELECT "EMP"."EMPNO"
    "EMPNO","EMP"."ENAME" "ENAME","EMP"."JOB" "JOB","EMP"."MGR"
    "MGR","EMP"."HIREDATE" "HIREDATE","EMP"."SAL" "SAL","EMP"."COMM"
    "COMM","EMP"."DEPTNO" "DEPTNO" FROM "EMP" "EMP";
                                   
    

    Now you can run this on the target database. You may need to adjust the tablespace and storage clauses, or you can add more DBMS_METADATA.SET_TRANSFORM_PARAM calls to C:\TEMP\MVIEW_GEN.SQL to force DBMS_METADATA not to include the tablespace and/or storage clauses.
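    For example, the following calls (added alongside the existing SQLTERMINATOR call in the BEGIN block) are one way to suppress those clauses; SEGMENT_ATTRIBUTES covers both tablespace and storage, while the other two are more selective:

    DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SEGMENT_ATTRIBUTES',FALSE);
    -- or, individually:
    DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'STORAGE',FALSE);
    DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'TABLESPACE',FALSE);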

    SY.

  • Importing a table into a different schema using Data Pump

    Hi all
    I am trying to export a table from the MIPS_MDM schema and import it into the MIPS_QUIRINO schema in a different database. I get the error below.

    expdp MIPS_MDM/MIPS_MDM@ext01mdm tables=QUARANTINE directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=qunat.log

    To import
    impdp MIPS_QUIRINO/MIPS_QUIRINO@mps01dev tables=QUARANTINE directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=impd.log

    Please can someone tell me the exact syntax to import into the different schema?

    Thank you.

    http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/dp_import.htm#BEHFIEIH
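    One way this is typically done (a sketch based on the commands above, not verified against the linked documentation section) is with REMAP_SCHEMA, with the TABLES clause naming the table under its source schema:

    impdp MIPS_QUIRINO/MIPS_QUIRINO@mps01dev tables=MIPS_MDM.QUARANTINE remap_schema=MIPS_MDM:MIPS_QUIRINO directory=DP_EXP_DIR dumpfile=quarantine.dmp logfile=impd.log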

  • Using Data Pump for a migration

    Hi all

    Version of database - 11.2.0.3

    RHEL 6

    Size of the DB - 150 GB

    I have to run a database migration from one server to another (AIX to Linux). We will use the Data Pump option and migrate from source to target using the expdp SCHEMAS option (5 schemas will be exported and imported on the target machine). But the target won't go live immediately; after the migration the development team will do work on the target machine which will take 2 days to complete, and during those 2 days the source database will keep running as production.

    Now I have a requirement that, after the development team completes their work, I have to apply the 2 days of changes from source to target, after which the target will act as production.

    I want to know what options are available in Data Pump that I can use to do this.

    Kind regards

    No business will want to go live with data that is no longer representative of production.

    Sounds like a normal upgrade, but you are just testing it on a copy of live - make sure the process works, then replay it once you are comfortable, against your latest set of up-to-date production data.

    Dean's suggestion is fine, but rather than dropping pieces and importing, personally I tend to keep things simple and do it all in one go (a full schema datapump if possible). That way you know exactly what you are putting live, inclusive of all sequences and objects (sequences could have been incremented, so you must drop and re-create them). Otherwise you are splitting an upgrade into stages, with more steps to trace and more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.

    Also - do you know what changes were made to upgrade the new environment, and how would you roll those back, etc.? Worth looking at. Most such migrations would be done via an RMAN database copy / cross-endian transport, and you must also make sure you bring across all system grants for the schemas, not just the schema-level ones.

  • ORA-39097: Data Pump job encountered unexpected error -39076

    Hello world

    Today I tried to take a Data Pump export of my test database (a specific table). The version is 10.2.0.4 on Solaris 10 (64-bit), and I got the following error messages:

    Job "SYSTEM"."SYS_EXPORT_TABLE_23" completed at 09:51:36
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: failed to send e-mail message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in MAIN
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: failed to send e-mail message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist

    I hope the export dumpfile is valid, but I don't know why I get these errors. Has anyone faced this kind of problem? Please give me some tips.

    Thank you

    Shan

    Once you see this:

    Job "SYSTEM"."SYS_EXPORT_TABLE_23" completed at 09:51:36

    the Data Pump job is done with the dumpfile. All that remains is some cleanup, and it looks like something in that cleanup failed. I don't know what it was, but your dumpfile should be good. An easy way to test is to run impdp with SQLFILE. This does everything the import would do, but instead of creating objects, it writes the DDL to a sql file.

    impdp user/password sqlfile=my_test.sql directory=your_dir dumpfile=your_dump.dmp ...

    If it works, then your dumpfile should be fine. The last thing export does is write the Data Pump master table to the dumpfile, and the first thing import does is read that table. So if you can read it back in (which impdp with SQLFILE does), your dump is good.

    Dean

  • Exporting the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a single CD/DVD to the vendor of our application system (who needs it to analyse a few problems we are having).

    Now, when I checked online, a full export is available, but I am not able to understand how it works, as we have never used the Data Pump utility; we use the normal export method. In addition, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use the parallel full DB export option to split the files so they fit on DVDs? Is that possible?

    Please correct me if I am wrong and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server saves files in the directory object that you specify on the command line. So what you want to do is:

    1. From your operating system, find an existing directory or create a new one. In your case, C:/Dump is as good a place as any.

    2. Connect to SQL*Plus and create the directory object. Just use the path. I use Linux, so my directory looks like /scratch/xxx/yyy.
    If you use Windows, the path to your directory would look like C:/Dump.

    3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or PUBLIC, just like
    any other object (see the sketch below).
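    A rough sketch of the whole sequence (placeholder names and sizes); the FILESIZE parameter together with a %U substitution variable splits the dump into multiple fixed-size pieces, which addresses the DVD question:

    SQL> CREATE OR REPLACE DIRECTORY dump_dir AS 'C:/Dump';
    SQL> GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;

    expdp system/password full=y directory=dump_dir dumpfile=full_%U.dmp filesize=4G logfile=full_exp.log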

    If this helps, or if it has answered your question, please mark the posts with the appropriate tag.

    Thank you

    Dean

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump exports and imports, perhaps something I should already know.

    I need to purge a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table with its indexes and constraints, etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. Truncate the table

    2. Disable or drop the indexes

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows to keep.

    My question

    Will my indexes and constraints also be imported when I want to import only a subset of my exported table?

    or

    If I drop the table instead of truncating it, will I be able to import my table and indexes even if I use the QUERY (subquery) functionality as part of my import statement?

    Does my table have to exist in the database before doing the import when using the Data Pump QUERY functionality,

    or will Data Pump import handle it as usual, i.e. create the table, indexes, grants and statistics, etc.?

    Thank you for your comments.

    Concerning

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where <rows you want to keep>;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in both the source and target schemas. Here are the commands I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1','TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Script to import all the tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Script that fails when importing only some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export is good, but I get the following error when I try to import only table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I try to import all the exported tables without the INCLUDE clause.

    Is it possible to import only some of the tables from the export file, rather than all of them?

    Thanks for the help!

    942572 wrote:

    It works fine to import all the exported tables as scott when I do NOT use the INCLUDE clause.

    I get the error only when I try to import some tables with the INCLUDE clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=yes

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
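    In other words, with CONTENT=DATA_ONLY the filter has to target the data rather than the table definition. A sketch of two ways it might look (unverified against your dump, based on the commands in the question):

    # filter on the TABLE_DATA object path
    impdp scott/tiger@db12 remap_schema=source:target include=TABLE_DATA:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    # or simply switch to table mode with the TABLES parameter
    impdp scott/tiger@db12 tables=source.TB_TEST1 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate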
