Data Pump Export/Import

Hello Forum,

I have a question regarding Data Pump imports and exports; perhaps something I should already know.

I need to clean out a table that has about 200 million rows; I need to get rid of about three quarters of the data.

My intention is to use Data Pump to export the table along with its indexes, constraints, etc.

The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

My plan is:

1. truncate the table

2. disable or drop the indexes

3. leave the constraints in place?

4. use Data Pump to import the rows to keep.

My question

will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

or

If I drop the table after truncating it, will I be able to import my table and indexes even if I use the QUERY subset functionality as part of my import statement?

When using the QUERY subset functionality of Data Pump, must my table exist in the database before doing the import,

or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants, statistics, etc.?
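(For reference, the kind of QUERY-based import meant here would look something like the sketch below; the table, directory, and predicate names are hypothetical, and putting QUERY in a parameter file avoids shell quoting issues:)

    impdp scott/tiger parfile=keep_rows.par

    # keep_rows.par
    TABLES=big_table
    DIRECTORY=dp_dir
    DUMPFILE=big_table.dmp
    QUERY=big_table:"WHERE created_date >= DATE '2015-01-01'"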

Thank you for your comments.

Regards,

Your approach is inefficient.

What you need to do is:

create table bar as select * from foo where <predicate for the rows to keep>;

truncate table foo;

insert /*+ APPEND */ into foo select * from bar;

Rebuild the indexes on the table.

Done.

This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.
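For concreteness, a minimal sketch of that sequence (all object names and the WHERE predicate are hypothetical):

    -- stage the quarter of rows to keep (names hypothetical)
    create table big_table_keep as
        select * from big_table
        where created_date >= date '2015-01-01';

    truncate table big_table;

    -- direct-path insert back into the original table
    insert /*+ APPEND */ into big_table
        select * from big_table_keep;
    commit;

    -- rebuild indexes dropped or marked unusable earlier, then clean up
    alter index big_table_ix1 rebuild;
    drop table big_table_keep;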

----------

Sybrand Bakker

Senior Oracle DBA

Tags: Database

Similar Questions

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:"IN ('TB_TEST1','TB_ABC')" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Working script to import all tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Failing script to import only some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:"IN ('TB_TEST1')" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export works, but I get the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and the import works fine when I import all the exported tables without the INCLUDE clause.

    Is it possible to import some, but not all, of the tables in the export file?

    Thanks for the help!

    942572 wrote:

    Importing all the tables exported by scott works fine when I do NOT use the INCLUDE clause.

    I get the error only when I try to import some tables with the INCLUDE clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    do the below yourself:

    impdp help=yes

    and look at the INCLUDE entry:

    INCLUDE               Include specific object types.
                          For example, INCLUDE=TABLE_DATA.
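    If the goal is to pull just one table out of the dump, the TABLES parameter may be the simpler route than an INCLUDE filter (a sketch, reusing the dump file and directory from the commands above):

    impdp scott/tiger@db12 remap_schema=source:target tables=source.TB_TEST1 directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate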

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I installed TOAD 9.1 on Windows XP, and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would use TOAD to do the job.


    I would like to know what the process for this is.

    How different is it from using command-line Data Pump? When using TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the process for import/export using TOAD with no direct access to the UNIX machine.

    Thanks in advance.


    I don't think you can do it with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below:

    For export:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves data from it, and writes the data to a dump file on the connected system.

    To import:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means that the impdp client initiates the import request, typically to the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data, and writes it directly back to the target database. There is no dump file involved.
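    As an end-to-end sketch (the link, user, and table names are hypothetical):

    -- on the target database: create a link to the source
    CREATE DATABASE LINK src_link CONNECT TO scott IDENTIFIED BY tiger USING 'SOURCE_DB';

    impdp scott/tiger@target_db network_link=src_link tables=scott.emp directory=datapump_dir logfile=net_imp.log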

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My video tutorials of Oracle - http://kamranagayev.wordpress.com/oracle-video-tutorials/

  • Data pump - export without data

    In the old exp tool, exporting the database without data was done by setting the ROWS parameter to N. How do I export the database schema without data using Data Pump technology?

    You can see it by checking the Data Pump export help on your command line, like this:

    C:\Documents and Settings\nupneja>expdp -help
    
    Export: Release 10.2.0.1.0 - Production on Friday, 09 April, 2010 18:06:09
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
    
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
    
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    
    USERID must be the first parameter on the command line.
    
    Keyword               Description (Default)
    ------------------------------------------------------------------------------
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    CONTENT               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    
    Command               Description
    ------------------------------------------------------------------------------
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=<number of workers>.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    
    C:\Documents and Settings\nupneja>
    

    Setting the CONTENT parameter to METADATA_ONLY will export only the structure of the schema, skipping the rows.
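    For example, a metadata-only schema export would look like this (a sketch, reusing the directory and schema from the help example above):

    expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY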

  • Excluding materialized views from a data pump export

    Hello

    I use the Oracle11g 11.1 RAC on Linux version

    I am trying to exclude materialized views from a schema-mode Data Pump export. I used EXCLUDE=MATERIALIZED_VIEW in the
    parameter file. The materialized view is not exported, but the table associated with the view is still exported. Does anyone know how to exclude those as well?

    Thank you

    Richard

    You will need to specify the tables in the exclude parameter.

    EXCLUDE=MATERIALIZED_VIEW alone will not exclude the associated master tables.

    In the EXCLUDE parameter, specify the master tables as well:
    EXCLUDE=MATERIALIZED_VIEW
    EXCLUDE=TABLE:"IN ('EMP','DEPARTMENT')"
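    Put together as a parameter file, that might look like this sketch (schema, file, and table names hypothetical; the listed tables must be the materialized views' master tables):

    # exclude_mv.par
    SCHEMAS=myschema
    DIRECTORY=dmpdir
    DUMPFILE=myschema_nomv.dmp
    EXCLUDE=MATERIALIZED_VIEW
    EXCLUDE=TABLE:"IN ('EMP','DEPARTMENT')"

    expdp richard/pwd parfile=exclude_mv.par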

  • Selective column data pump export

    Dear Experts,

    I'm using Oracle 11g. Using Data Pump, is it possible to export only some columns of a table?

    Thanks in advance.


    It is not possible for a Data Pump export to take only selected columns of a table. However, you can try the following:

    (1) create table export_selective as select c1, c2 from source_table; (you can create it in the source DB)

    (2) expdp the export_selective table

    (3) impdp export_selective into the target database

    Alternatively:
    a database link lets you create this table with the selected columns directly in the target, using CTAS over the link.
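    A sketch of both routes (table, column, and link names hypothetical):

    -- route 1: stage the columns in the source DB, then export the staging table
    create table export_selective as select c1, c2 from source_table;

    expdp scott/tiger tables=export_selective directory=dmpdir dumpfile=sel.dmp

    -- route 2: CTAS over a database link; no dump file involved
    create table target_copy as select c1, c2 from source_table@source_link;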

  • Data pump export

    Hello

    I use

    expdp system/ar8mswin1256@nt11g schemas=dbo_mobile_webresults_test dumpfile=31082015.dmp

    and I face this error:

    UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00

    I think it's a version problem.

    I found that the database on the server I am connected to is 11.2.0.1.0 - 64-bit,

    and my client is 11.1.0.7.0
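    (For reference, a quick way to check the two versions; a sketch:)

    # client side (OS prompt): the banner printed by any Data Pump client shows its version
    expdp help=yes

    -- server side (SQL prompt): the database version
    SELECT banner FROM v$version;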

    I tried it on another PC and it worked.

    Thank you very much

  • Best practices for the data pump export/import process?

    We are trying to copy an existing schema to another, newly created schema. The Data Pump export of the schema succeeded.

    However, we ran into errors when importing the dump file into the new schema, having remapped the schema, tablespaces, etc.
    Most of the errors occur in PL/SQL... For example, we have views like the one below in the original schema:

    CREATE VIEW oldschema.myview AS
    SELECT col1, col2, col3
    FROM oldschema.mytable
    WHERE col1 = 10
    .....

    Quite a few functions, procedures, packages and triggers also contain "oldschema.mytable" in their DML (insert, select, update), for example.

    We get the following errors in the import log:
    ORA-39082: object type ALTER_FUNCTION: "TEST"."MYFUNC" created with compilation warnings
    ORA-39082: object type ALTER_PROCEDURE: "TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: object type VIEW: "TEST"."BIRD" created with compilation warnings
    ORA-39082: object type PACKAGE_BODY: "TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: object type TRIGGER: "TEST"."MON_TRIGGER" created with compilation warnings

    Many of the actual errors/invalid objects in the new schema are due to:
    ORA-00942: table or view does not exist

    My questions are:
    1. What can we do to correct these errors?
    2. Is there a better way to do the import under such conditions?
    3. Should we update the PL/SQL and recompile in the new schema? Or update it in the original schema first, and then export?

    Your help will be greatly appreciated!

    Thank you!

    @?/rdbms/admin/utlrp.sql

    will compile the objects in the database across schemas. In your case, you are remapping objects from one schema to another, and utlrp will not be able to compile them.

    The SQLFILE option of impdp lets you generate the DDL from the export dump; change the schema name globally in that script and run it in SQL*Plus. This should resolve most of your errors. If you still see errors after that, then proceed to utlrp.sql.
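    A sketch of that sequence (directory, dump file, and script names hypothetical):

    impdp test/pwd directory=dp_dir dumpfile=oldschema.dmp sqlfile=recreate_ddl.sql

    -- edit recreate_ddl.sql, replacing "OLDSCHEMA". with "TEST". throughout, then:
    SQL> @recreate_ddl.sql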

    -André

  • How to choose the access method (direct path or external tables) for Data Pump export?

    I have a slow Data Pump export, and I have a few suggestions for settings that might improve its speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?

    DECLARE
        PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN(OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
    BEGIN
        DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
        DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
        DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
        DBMS_DATAPUMP.METADATA_FILTER(PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
        DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
        DBMS_DATAPUMP.SET_PARALLEL(PUMP_HANDLE, LEVEL => 4);
        <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
        DBMS_DATAPUMP.START_JOB(PUMP_HANDLE);
        DBMS_DATAPUMP.DETACH(PUMP_HANDLE);
    END;

    The <<THIS_LINE_FAILS>> line throws an exception:

    ORA-20020: error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;
    ORA-06512: at line 10

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* an INTEGER does not seem to work either */

    Replacing <<THIS_LINE_FAILS>> with this call also fails with a similar message:

    DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a quite different message:

    DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'SETTINGS', VALUE => 'DISABLE_APPEND_HINT');

    ORA-20020: error: ORA-39001: invalid argument value. ORA-39207: NULL value is not valid for parameter SETTINGS;

    Hello

    you used ACCESS_METHOD; the parameter is named DATA_ACCESS_METHOD. Just give it a try.

    cheers,

    rich
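    That is, the failing line becomes (a sketch based on the reply above; this parameter also accepts 'EXTERNAL_TABLE' and 'AUTOMATIC'):

    DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'DATA_ACCESS_METHOD', VALUE => 'DIRECT_PATH');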

  • 11gR2 Data Pump: import tables from one schema into another schema

    I am studying Oracle. I exported the HR schema to the hrexport.dmp file. When I import tables from this file, I run into trouble. I used Enterprise Manager:
    1. connected as the SYSTEM user as NORMAL
    2. selected the import type: tables
    3. selected the file to import the data from
    4. selected the tables to import
    5. in the next step, I try to insert a row into the schema remapping table and change the Destination Schema cell, but the only schema name in the list is HR! Why?

    Edited by: alvahtin on 10.03.2013 06:11

    ORA-39166: Object SYSTEM.EMPLOYEES was not found.
    ORA-39166: Object SYSTEM.DEPARTMENTS was not found.
    ORA-39166: Object SYSTEM.LOCATIONS was not found.

    The tables are not owned by SYSTEM. Try:

    impdp system/oracle remap_schema=hr:inventory tables=hr.employees,hr.departments,hr.locations ........
    
  • Data pump export log error

    In the Data Pump export log, I can see the error below:
    =================
    . . exported "SYSMAN." "" MGMT_METRIC_DEPENDENCY_DEF "13 7KO lines
    . . exported "SYSMAN." "' MGMT_CREDENTIAL_TYPES ' 6,796 KB 4 rows
    . . exported "SYSMAN." "' MGMT_JOB_TYPE_DISPLAY_INFO ' 31 lines 7,085 KB
    {color: #0000ff} ORA-31693: table data object 'OE '. "' WAREHOUSES ' could not load/unload and being ignored because of the error:
    ORA-06564: TEST_DIR object does not exist
    {color}. . exported 'SH '. "" CHANNELS "6,695 KB 5 rows
    . . exported "SYSMAN." "' MGMT_METADATA_SETS ' 6,757 KB 18 rows
    . . exported "SYSMAN." "' MGMT_HA_INFO_ECM ' 6,523 KB 1 lines
    =================
    The remaining exports seem to be fine; I do not know why this error is attached to this object.
    Does anyone have an idea about it? I searched Metalink but did not find much relevant info.

    Database version: 10.2.0.1

    Abu,

    Is it possible to run the same expdp from a command-line session and see if it fails for the same reason?

    Regards
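    (If it does fail the same way, a quick check on the directory object that ORA-06564 names may help; a sketch:)

    SELECT directory_name, directory_path
    FROM   dba_directories
    WHERE  directory_name = 'TEST_DIR';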

  • I want to store Data Pump export files on a remote machine

    Hello

    I have two database servers running, A and B. I want to run a cron job on A that takes a dump and stores it in a directory on B every night.

    Is this possible?

    Thank you.

    Another option would be to use the NETWORK_LINK parameter. Let's say you want the export files for A to be created on server B: create a database link from B to A, and then run expdp on B using the network link pointing to A.

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref142
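    A sketch of that setup (the link, user, and directory names are hypothetical; the directory object must exist on B):

    -- on B: a link back to A
    CREATE DATABASE LINK a_link CONNECT TO scott IDENTIFIED BY tiger USING 'A';

    expdp scott/tiger@B network_link=a_link schemas=scott directory=dump_dir dumpfile=a_nightly.dmp logfile=a_nightly.log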

    HTH
    Srini

  • data pump export and restricted mode

    Hi all
    Is a Data Pump export consistent by default? If I do an export on the production database to refresh dev from production, do I need to start the production database in restricted mode? There is no need, right?
    Please let me know.

    Hello

    [If I do an export on the production database to refresh dev from production, do I need to start the production database in restricted mode? There is no need, right?]

    Answer: no, there is no need to start the production database in restricted mode.
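    (A related note: if a read-consistent, point-in-time export is wanted, the FLASHBACK_TIME or FLASHBACK_SCN parameters from the help text earlier in this thread can request it; a sketch:)

    expdp system/manager schemas=scott directory=dmpdir dumpfile=scott_cons.dmp flashback_time=systimestamp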

  • DataPump - validating record counts between export and import

    All,

    Wondering whether there is any technique to validate the RECORD COUNTS between a Data Pump export and import, when using only the DATA_ONLY export option.

    (1) export DATA_ONLY using Data Pump. (No DB objects or metadata)

    (2) import DATA_ONLY using Data Pump.

    After the import, we want to verify that the number of records migrated for each table matches between the export and the import.

    Debating between the following 2 options:

    (1) compare using the log files from the export and the import (Unix: format and compare the files)

    (2) generate SELECT COUNT(*) FROM ... SQL from the export and import log files, and then compare those results.

    Please share if someone does something similar. Appreciate your comments.

    Thanks in advance.

    Hello

    > Wondering whether there is a technique to validate the RECORD COUNTS between a Data Pump export and import, when using only the DATA_ONLY export option.

    There is no need for that whole process; just look for errors in the export/import log files... If you don't get any errors, it means your counts are fine... That is enough to validate the records.

    HTH
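    That said, if you do want to compare counts mechanically, option (1) above can be sketched from the standard ". . exported ... N rows" log lines (file names hypothetical):

    # extract "table rowcount" pairs from each log and diff them
    grep '^\. \. exported' exp.log | awk '{print $4, $(NF-1)}' | sort > exp_counts.txt
    grep '^\. \. imported' imp.log | awk '{print $4, $(NF-1)}' | sort > imp_counts.txt
    diff exp_counts.txt imp_counts.txt && echo "row counts match"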

  • Data Pump 11g Network Import

    I need to perform a network-mode Data Pump import from a 10.2.0.4 database on my old server (HP-UX 11.11) to my new 11.2.0.3 database on a new server (HP-UX 11.31). What I would REALLY like to do is import directly from my physical standby database (running in READ ONLY mode while I do the import), rather than having to quiesce my production database for a couple of hours while I do the import from there.

    What I want to know is whether a network-mode Data Pump import running on the new 11.2.0.3 server creates a Data Pump extract job in the old database as part of the direct network-link import. If so, I won't be able to use the physical standby as the source of my import, because Data Pump will not be able to create the master table in the old database. I can't find anything in the Oracle documentation about using a physical standby as a source. I know that I can't do a regular Data Pump export of this database, but I would really like to know if anyone has experience doing this.

    Any comments would be greatly appreciated.

    Bad news, Harry - it worked for me on a standard database open in READ ONLY mode. Not sure what is different between your environment and mine, but there must be something... The read-only database is a 10.2.0.4 database running on an HP PA-RISC box under HP-UX 11.11. The target database runs 11.2.0.3 on an HP Itanium box under HP-UX 11.31. The user I connect to in the target database has IMP_FULL_DATABASE privs, and it is the user id used for the DB_LINK and also the same user id on the source database (which, of course, has the required privs as well!). My parfile looks like this:

    TABLES = AC_LIAB_ %
    NETWORK_LINK = ARCH_LINK
    DIRECTORY = DATA_PUMP_DIR
    JOB_NAME = AMIBASE_IMPDP_ARCHDB
    LOGFILE = DATA_PUMP_DIR:base_impdp_archdb.log
    REMAP_TABLESPACE = ARCHIVE_BEFORE2003:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2003:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2004:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2005:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2006:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2007:H_DATA
    REMAP_TABLESPACE = ARCHIVE_2008:H_DATA
    REMAP_TABLESPACE = ARCHIVE_INDEXES:H_INDEXES
    REUSE_DATAFILES = NO
    SKIP_UNUSABLE_INDEXES = Y
    TABLE_EXISTS_ACTION = REPLACE
