Data Pump: export/import tables in different schemas

Hi all

I am using Oracle 11.2 and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in both the source and target schemas. Here are the commands I use:


Working export script:

expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1','TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

Script to import all the tables:

impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

Script that errors when importing only some tables:

impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

The export works, but I get the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and the import works fine when I import all the exported tables without the INCLUDE clause.

Is it possible to import only some of the tables in the export file, rather than all of them?

Thanks for the help!

942572 wrote:

Importing all the tables exported by scott works fine when I do NOT use the INCLUDE clause.

I get the error only when I try to import some tables with the INCLUDE clause.

Can I import only some tables from the export dump file? Thank you!

You are using INCLUDE incorrectly!

Run the following yourself:

impdp help=yes

INCLUDE

Include specific object types, e.g. INCLUDE=TABLE_DATA.
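For reference, here is a sketch of the table-limited import written with a parameter file, which sidesteps the operating-system quoting problems visible in the commands above (the file name test_imp.par is illustrative; the other values match the original commands):

# test_imp.par -- hypothetical parameter file; inside it the INCLUDE value needs no shell escaping
remap_schema=source:target
directory=datapump_dir
dumpfile=test.dump
logfile=test_imp.log
content=data_only
table_exists_action=truncate
include=TABLE:"IN ('TB_TEST1')"

impdp scott/tiger@db12 parfile=test_imp.par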


Similar Questions

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports; perhaps something I should already know.

    I need to purge a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table along with its indexes, constraints, etc.

    The table has no relationship to any other table; it consists of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. Truncate the table.

    2. Disable or drop the indexes.

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows to keep.

    My question:

    Will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    If I drop the table instead of just truncating it, will I be able to import my table and its indexes, even if I use the QUERY (subquery) functionality as part of my import statement?

    When using the Data Pump QUERY functionality, must my table already exist in the database before doing the import,

    or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants, statistics, etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where ...;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I have TOAD 9.1 installed on Windows XP and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.


    I would like to know what is the process for this.

    How different is it from using the command-line Data Pump? With TOAD, where do we create the Data Pump DIRECTORY object?

    Can I do it on the local computer?

    Basically, I would like to know the import/export process using TOAD with no direct access to the UNIX machine.

    Thanks in advance.

    user13517642 wrote:
    Basically, I would like to know the import/export process using TOAD with no direct access to the UNIX machine.

    I don't think you can do it with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below:

    For export:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves the data, and writes it to a dump file on the connected system.

    For import:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means the impdp client initiates the import request, typically against the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data directly, and writes it into the target database. No dump file is involved.
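    For example, a network-mode import might look like the sketch below; the link name source_db_link, the schema, the directory object, and the path are all hypothetical, and the database link must already exist in the target database. The DIRECTORY object asked about above is likewise created in the database itself, from any client:

    -- hypothetical names; run as a suitably privileged user
    CREATE DIRECTORY datapump_dir AS '/u01/app/oracle/dpdump';
    GRANT READ, WRITE ON DIRECTORY datapump_dir TO scott;

    impdp scott/tiger@target_db NETWORK_LINK=source_db_link SCHEMAS=hr DIRECTORY=datapump_dir LOGFILE=net_imp.log

    With NETWORK_LINK on import, no dump file is written; the directory object is used only for the log file.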

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My Oracle video tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/

  • Data pump - export without data

    To export the database without data, the old exp tool had the ROWS parameter, set to N. How do I export the database schema without data using Data Pump?

    You can see it by checking the Data Pump export help on your command line, like this:

    C:\Documents and Settings\nupneja>expdp -help
    
    Export: Release 10.2.0.1.0 - Production on Friday, 09 April, 2010 18:06:09
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
    
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
    
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    
    USERID must be the first parameter on the command line.
    
    Keyword               Description (Default)
    ------------------------------------------------------------------------------
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    CONTENT               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    
    Command               Description
    ------------------------------------------------------------------------------
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    
    C:\Documents and Settings\nupneja>
    

    Setting the CONTENT parameter to METADATA_ONLY will export only the structure of the schema, skipping the rows.
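    For example, a sketch of a metadata-only schema export using that parameter (directory and file names illustrative):

    expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY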

  • Excluding materialized views from a Data Pump export

    Hello

    I am using Oracle 11g version 11.1 RAC on Linux.

    I am trying to exclude materialized views from a schema-mode Data Pump export. I used EXCLUDE=MATERIALIZED_VIEW in the
    parameter file. The materialized view is not exported, but the table associated with the view is still exported. Does anyone know how to exclude those as well?

    Thank you

    Richard

    You will need to specify the tables in the exclude parameter.

    EXCLUDE=MATERIALIZED_VIEW will not exclude the associated master tables.

    In the parameter file, specify both:
    EXCLUDE=MATERIALIZED_VIEW
    EXCLUDE=TABLE:"IN ('EMP','DEPARTMENT')"

  • 11gR2 Data Pump: import tables from one schema into another schema

    I am studying Oracle. I exported the HR schema to the hrexport.dmp file. When I import tables from this file, I get errors. I used Enterprise Manager:
    1. Connected as the SYSTEM user as NORMAL.
    2. Selected the import type: tables.
    3. Selected the dump file to import from.
    4. Selected the tables to import.
    5. At the next step, I tried to insert a row into the schema remapping table and change the Destination Schema cell, but the list shows only one schema name, HR! Why?


    ORA-39166: Object SYSTEM.EMPLOYEES was not found.
    ORA-39166: Object SYSTEM.DEPARTMENTS was not found.
    ORA-39166: Object SYSTEM.LOCATIONS was not found.

    The tables are not owned by SYSTEM. Try:

    impdp system/oracle remap_schema=hr:inventory tables=hr.employees,hr.departments,hr.locations ........
    
  • How to choose the access method (direct path or external tables) for Data Pump export?

    I have a slow Data Pump export, and I have a few suggested settings that might improve its speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?

    DECLARE
      PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN (OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
    BEGIN
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.METADATA_FILTER (PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
      DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
      DBMS_DATAPUMP.SET_PARALLEL (PUMP_HANDLE, LEVEL => 4);
      <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
      DBMS_DATAPUMP.START_JOB (PUMP_HANDLE);
      DBMS_DATAPUMP.DETACH (PUMP_HANDLE);
    END;

    The <<THIS_LINE_FAILS>> line throws an exception:

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;

    ORA-06512: at line 10

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* an INTEGER does not seem to work either */

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a similar message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);

    Replacing <<THIS_LINE_FAILS>> with this call fails as well, with a rather different message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'SETTINGS', VALUE => 'DISABLE_APPEND_HINT');

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39207: NULL is not a valid value for the parameter SETTINGS;

    Hello,

    you have ACCESS_METHOD where it should be DATA_ACCESS_METHOD. Just give it a try.

    Cheers,

    Rich
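    Applied to the block above, the failing line would become something like this sketch (per the DBMS_DATAPUMP documentation, DATA_ACCESS_METHOD accepts 'AUTOMATIC', 'DIRECT_PATH', or 'EXTERNAL_TABLE'):

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'DATA_ACCESS_METHOD', VALUE => 'DIRECT_PATH');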

  • Best practices for the Data Pump export or import process?

    We are trying to copy an existing schema to another, newly created schema. The Data Pump export of the schema succeeded.

    However, we hit errors when importing the dump file into the new schema. We remapped the schema and tablespaces, etc.
    Most of the errors occur in PL/SQL. For example, we have views like the one below in the original schema:
    "
    CREATE the VIEW * oldschema.myview * AS
    SELECT col1, col2, col3
    OF * oldschema.mytable *.
    WHERE coll1 = 10
    .....
    "
    Quite a few functions, procedures, packages, and triggers reference oldschema.mytable in DML (INSERT, SELECT, UPDATE), for example.

    We get the following errors in the import log:
    ORA-39082: Object type ALTER_FUNCTION:"TEST"."MYFUNC" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW:"TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER:"TEST"."MY_TRIGGER" created with compilation warnings

    Many of the actual errors/invalid objects in the new schema are due to:
    ORA-00942: table or view does not exist

    My questions are:
    1. What can we do to correct these errors?
    2. Is there a better way to do the import under such conditions?
    3. Should we update the PL/SQL and recompile in the new schema? Or update in the original schema first, and then export?

    Your help will be greatly appreciated!

    Thank you!

    @?/rdbms/admin/utlrp.sql

    will recompile invalid objects in the database across all schemas. In your case, you are remapping from one schema to another, and utlrp will not be able to compile those objects.

    The impdp SQLFILE option lets you generate the DDL from the export dump, change the schema name globally, and run the resulting script in SQL*Plus. This should resolve most of your errors. If you still see errors after that, then run utlrp.sql.
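    A sketch of that first step, with hypothetical directory and file names; SQLFILE writes the DDL to a script instead of performing the import:

    impdp system DIRECTORY=datapump_dir DUMPFILE=schema_exp.dmp SQLFILE=schema_ddl.sql

    You can then edit schema_ddl.sql to change the schema name globally and run it in SQL*Plus.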

    -André

  • Unable to export specific tables in the schema.

    I created a schema DUSER, the data has been imported, and now I want to export some tables from this schema.
    The tables are: partyhdr, partyaddressdtl, partycontactdtl, partytdsdtl, partytdsexcludedtl, partycurrencydtl.


    D:\> EXP DUSER/LOG@ORCL FILE=20121221PARTY0228PM.DMP TABLES=PARTYHDR,PARTYADDRESSDTL,PARTYCONTACTDTL,PARTYTDSDTL,PARTYTDSEXCLUDEDTL,PARTYCURRENCYDTL;

    Export: Release 10.1.0.2.0 - Production on Fri dec 21 14:30:21 2012
    Copyright (c) 1982, 2004, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.1.0.2.0 - Production
    Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    About to export specified tables by conventional means...
    . . exporting table PARTYHDR 19387 rows exported
    . . exporting table PARTYADDRESSDTL 20747 rows exported
    . . exporting table PARTYCONTACTDTL 226 rows exported
    . . exporting table PARTYTDSDTL 53 rows exported
    . . exporting table PARTYTDSEXCLUDEDTL 2 rows exported
    EXP-00011: DUSER.PARTYCURRENCYDTL; does not exist
    Export completed successfully with warnings.

    When I check, the table PARTYCURRENCYDTL exists in the schema AND IS SHOWING RECORDS.

    SELECT OBJECT_TYPE FROM DBA_OBJECTS WHERE OWNER = 'DUSER' AND OBJECT_NAME = 'PARTYCURRENCYDTL';

    OBJECT_TYPE
    -------------------
    TABLE

    I can't understand why this particular table does not export.

    Kindly help.

    Please use the command below without the semicolon; exp treats the trailing semicolon as part of the last table name, which is why it reports DUSER.PARTYCURRENCYDTL; as not existing:

    EXP DUSER/LOG@ORCL FILE=20121221PARTY0228PM.DMP TABLES=PARTYHDR,PARTYADDRESSDTL,PARTYCONTACTDTL,PARTYTDSDTL,PARTYTDSEXCLUDEDTL,PARTYCURRENCYDTL

  • Selective columns in a Data Pump export

    Dear Experts,

    I'm using Oracle 11g. Using Data Pump, is it possible to export only certain columns of a table?

    Thanks in advance.

    spur230 wrote:
    Using Data Pump, is it possible to export only certain columns of a table?

    It is not possible in a Data Pump export to select only certain table columns. However, you can try the following:

    (1) CREATE TABLE export_selective AS SELECT c1, c2 FROM source_table; (you can create it in the source DB)

    (2) expdp the export_selective table

    (3) impdp export_selective into the target database

    Alternatively:
    A database link lets you create that table with the selected columns using CTAS.
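    A sketch of both routes, with hypothetical table, directory, and link names:

    -- route 1: build the narrow copy, then export and import it
    CREATE TABLE export_selective AS SELECT c1, c2 FROM source_table;
    -- $ expdp scott/tiger DIRECTORY=datapump_dir DUMPFILE=sel.dmp TABLES=export_selective
    -- $ impdp scott/tiger@target DIRECTORY=datapump_dir DUMPFILE=sel.dmp

    -- route 2: pull the columns straight over a database link, no dump file involved
    CREATE TABLE export_selective AS SELECT c1, c2 FROM source_table@source_db_link;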

  • How to join two tables in different Oracle schemas using a subquery

    I'm trying to join two tables in different Oracle schemas using a subquery. I can extract data from each of the tables without problem. However, when I combine the SELECT statements using a subquery I get the Oracle error 'ORA-00936: missing expression'. Given that each SELECT statement runs on its own without error, I don't understand what's missing. The result set I'm trying to get matches the LINE_ID of PDTABLE_12_1 in the DD_12809 schema with the MAT_DESCRIPTION of table PDTABLE_201 in the RA_12809 schema.

    The query is as follows:

    SQL = "SELECT [DD_12809]. [PDTABLE_12_1]. LINE_ID OF [DD_12809]. [PDTABLE_12_1] JOIN "_".
    + "(SELECT [RA_12809]. [PDTABLE_201]. MAT_DESCRIPTION "_".
    "FROM [RA_12809]. [PDTABLE_201]) AS FAB "_".
    + 'ON [DD_12809]. [PDTABLE_12_1]. PIPING_MATER_CLASS = FAB. PIPING_MATER_CLASS ".

    The format of the query was copied from a SQL programming manual.

    I also tried running the query using a RIGHT JOIN on the two tables, but got the same result. Any ideas would be helpful. Thank you!


    The format for joining two tables in Oracle looks like it does in any other database:

    SELECT a.col1, a.col2, ..., b.col1, b.col2, ...
    FROM   schema1.table1 a
           JOIN schema2.table2 b ON a.col = b.col AND ...
    WHERE  a.other_col = 'FRED'
    AND    ....;
    

    Do not use bracketed identifiers in Oracle. The account that connects to the database must have SELECT privileges on both tables.
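    For illustration, the original query rewritten in Oracle syntax might look like the sketch below. Note that the subquery must also select PIPING_MATER_CLASS, otherwise the ON clause has nothing to join on, which was a second problem in the original:

    SELECT t.LINE_ID, FAB.MAT_DESCRIPTION
    FROM   DD_12809.PDTABLE_12_1 t
           JOIN (SELECT MAT_DESCRIPTION, PIPING_MATER_CLASS
                 FROM   RA_12809.PDTABLE_201) FAB
             ON t.PIPING_MATER_CLASS = FAB.PIPING_MATER_CLASS;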

  • Export / Import Tables

    Hi guys,

    I am currently using Oracle 9i on Sun Solaris.

    I want to know if it is possible to import tables from a dump under a different table name in the database.

    Thanks in advance.

    You must rename the existing table in the database before you import, using the RENAME command:

    RENAME ORIGINAL_TABLE TO TEMP;

    IMPORT ORIGINAL_TABLE;

    RENAME ORIGINAL_TABLE TO TEMP1;

    RENAME TEMP TO ORIGINAL_TABLE;

    If your application is OLTP and you need to do the import while the database is online, then try importing ORIGINAL_TABLE into a different schema.
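    A sketch of that alternative with the classic imp tool on 9i (file and user names hypothetical); FROMUSER/TOUSER redirects the table into the other schema:

    imp system/manager FILE=exp.dmp FROMUSER=scott TOUSER=scott_stage TABLES=original_table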

    Regards
    Srinivas

  • Creating a table from another table in a different schema throws an error with dynamic SQL

    Hello

    Using 11.2.0.3 and having the following issue.

    I am trying to create a table in one schema (a) using data from another schema (b).

    If run standalone in SQL*Plus, the CREATE works OK, but the same SQL in a dynamic SQL block fails saying the table or view does not exist.

    The same SQL in dynamic SQL:

    v_sql := 'create table new_table as select * from schemab.table_name where ...';

    execute immediate v_sql;

    Other tables work fine.

    Any ideas? I don't want to grant access on all the schemab tables to schema a if I can avoid it.

    Thank you

    Are you running the EXECUTE IMMEDIATE in an anonymous block or a stored procedure? If it is a stored procedure, then as others have said, the owner of the procedure must have SELECT privileges granted directly on schemab.table_name.

    Another possibility, which would apply to either a stored procedure or an anonymous block, is that your code is something like:

    v_sql := 'create table new_table as select * from schemab.table_name where ...';

    execute immediate v_sql;

    select count(*) into l_count from new_table;

    which will fail at compile time because new_table does not yet exist.

    When you create objects using dynamic SQL, you must also use dynamic SQL to reference them in the same block of code.
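    A sketch of the fix, reusing the names from the snippet above (l_count assumed declared as NUMBER):

    v_sql := 'create table new_table as select * from schemab.table_name where ...';
    execute immediate v_sql;
    execute immediate 'select count(*) from new_table' into l_count;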

    John

  • Foreign key referencing a table in a different schema

    I am creating a foreign key from a table in schema A to a table in schema B.

    I do that with: GRANT REFERENCES (column_name) ON table_name TO username;


    Is there any drawback/disadvantage in creating foreign keys that reference tables in another schema?

    Not as such, no.

    Generally, in this kind of situation you want to double-check that the two tables really belong in different schemas. It should be relatively rare for a child table to belong to a different schema than its parent. It happens, but it should be the exception, not the rule.
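    A minimal sketch of the cross-schema setup (table and column names hypothetical; the GRANT is issued by the owner of the parent table):

    -- as user B, owner of the parent table:
    GRANT REFERENCES (dept_id) ON b.departments TO a;

    -- as user A:
    ALTER TABLE a.employees
      ADD CONSTRAINT fk_emp_dept FOREIGN KEY (dept_id)
      REFERENCES b.departments (dept_id);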

    Justin

  • Data pump export

    Hello

    I use

    expdp system/ar8mswin1256@nt11g schemas=dbo_mobile_webresults_test dumpfile=31082015.dmp

    and I am facing this error:

    UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00

    I think it's a version problem.

    I found that the database on the server I am connected to is 11.2.0.1.0 - 64-bit,

    and my client is 11.1.0.7.0

    I tried it on another pc and it worked.

    Thank you very much
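    As a quick check along those lines, the client banner and the server's v$version show the two releases (a sketch):

    expdp help=y
    -- the banner line shows the client release, e.g. "Export: Release 11.1.0.7.0 ..."

    SELECT banner FROM v$version;
    -- shows the server release, e.g. "... 11.2.0.1.0 - 64bit Production"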
