Error in Data Pump export log

In the Data Pump export log, I could see the error below:
=================
. . exported "SYSMAN." "" MGMT_METRIC_DEPENDENCY_DEF "13 7KO lines
. . exported "SYSMAN." "' MGMT_CREDENTIAL_TYPES ' 6,796 KB 4 rows
. . exported "SYSMAN." "' MGMT_JOB_TYPE_DISPLAY_INFO ' 31 lines 7,085 KB
{color: #0000ff} ORA-31693: table data object 'OE '. "' WAREHOUSES ' could not load/unload and being ignored because of the error:
ORA-06564: TEST_DIR object does not exist
{color}. . exported 'SH '. "" CHANNELS "6,695 KB 5 rows
. . exported "SYSMAN." "' MGMT_METADATA_SETS ' 6,757 KB 18 rows
. . exported "SYSMAN." "' MGMT_HA_INFO_ECM ' 6,523 KB 1 lines
=================
The remaining exports seem to be fine; I do not know why this error occurred for this one object.
Does anyone have an idea about it? I searched Metalink but did not find much relevant info.

Database version: 10.2.0.1

Abu,

Is it possible for you to run the same expdp from a command-line session and see whether it fails for the same reason?
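
It might also be worth checking whether the TEST_DIR directory object named in ORA-06564 actually exists and is visible to the exporting user; a minimal sketch (the path below is only a placeholder):

select owner, directory_name, directory_path
from   dba_directories
where  directory_name = 'TEST_DIR';

-- if it is missing, re-create it (adjust the path) and grant access to whichever user runs the export
create or replace directory TEST_DIR as '/u01/app/oracle/dirs/test_dir';
grant read, write on directory TEST_DIR to system;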

Regards

Tags: Database

Similar Questions

  • I want to store Data Pump export files on a remote machine

    Hello

    I have two database servers running, A and B. I want to run a cron job on A that takes a dump and stores it in a directory on B every night.

    Is this possible?

    Thank you.

    Another option would be to use the NETWORK_LINK parameter. Let's say you want the export files of A to be created on the server hosting B: create a database link from B to A, and then run expdp on B using the network link pointing to A.

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref142
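
    A minimal sketch of that approach, run on server B (the link, directory and schema names here are only placeholders):

    -- on B: a directory for the dump files and a link pointing at A
    CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
    CREATE DATABASE LINK a_link CONNECT TO scott IDENTIFIED BY tiger USING 'DB_A';

    -- then, from the command line on B:
    expdp scott/tiger DIRECTORY=dpump_dir NETWORK_LINK=a_link SCHEMAS=scott DUMPFILE=from_a.dmp LOGFILE=from_a.log

    The dump file is then written by the database on B into dpump_dir, so nothing needs to be copied across afterwards.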

    HTH
    Srini

  • Using a query with a Data Pump table export

    I am trying to perform a Data Pump export of a table using a query in a parfile, and I'm getting strange behavior. The database version is 10.2.0.4.3 and the operating system is AIX 5.3. The query looks like this:

    QUERY = "POSDECLARATIONQUEUE:where SESSIONID in (select"B.SESSIONID"from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C, where"B.SESSIONID"="C.ID"and"C.ACCOUNTID"= 'A.ID' and 'Inform' = '10252')" "

    This works but returns 0 rows. If I run the query against the instance in a SQL*Plus session, as below, I also get 0 rows returned.

    Select * from POSDECLARATIONQUEUE where SESSIONID in (select "B.SESSIONID" from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where "B.SESSIONID" = "C.ID" AND "C.ACCOUNTID" = 'A.ID' and 'Inform' = '10252');

    If I take all of the quotes off the columns in the query and run it in SQL*Plus, I get over 2000 rows returned.

    SQL> select count(*) from POSDECLARATIONQUEUE where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and inform = 10252);

    COUNT (*)
    ----------
    2098

    If I remove the quotes in the parfile query, then I get the following error from the Data Pump export:

    UDE-00014: invalid value for parameter, 'schemas'.

    The SCHEMAS option is not specified in the parfile, and the TABLES option specifies only the table POSDECLARATIONQUEUE.

    Can someone help with this? I can't seem to get the syntax to work within Data Pump.

    Kind regards.
    Graeme.

    Edited by: user12219844 on April 14, 2010 03:34

    It seems that your query may be a little wrong:

    That's what you have:

    QUERY = "POSDECLARATIONQUEUE:where SESSIONID in (select"B.SESSIONID"from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C, where"B.SESSIONID"="C.ID"and"C.ACCOUNTID"= 'A.ID' and 'Inform' = '10252')" "

    That's what I would have thought it should look like:

    QUERY=POSDECLARATIONQUEUE:"where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and inform = 10252)"

    You want double quotes around the full query, and you don't need quotes around each side of the = comparisons. The quotes make those values be treated as strings, so

    "B.SESSIONID" = "C.ID"

    says that the string B.SESSIONID equals the string C.ID.

    In the SQL query you used, it was

    B.SESSIONID = C.ID

    which says that the value stored in B.SESSIONID is equal to the value stored in C.ID.

    Which is what you want.
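
    Put another way, a minimal parfile sketch using that corrected QUERY (the directory and dump file names are only placeholders):

    DIRECTORY=dmpdir
    DUMPFILE=posdeclarationqueue.dmp
    TABLES=POSDECLARATIONQUEUE
    QUERY=POSDECLARATIONQUEUE:"where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and INFORM = 10252)"

    Keeping the QUERY inside a parfile also avoids the extra shell escaping that would be needed on the command line.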

    Dean

  • Selective columns in a Data Pump export

    Dear Experts,

    I'm using Oracle 11g. Using Data Pump, is it possible to choose only some columns of a table to export?

    Thanks in advance.

    spur230 wrote:
    Dear Experts,

    I'm using Oracle 11g. Using Data Pump, is it possible to choose only some columns of a table to export?

    Thanks in advance.

    It is not possible in a Data Pump export to select only some table columns. However, you can try the following (a rough sketch of the steps follows below):

    (1) Create table export_selective as select c1, c2 from the source table (you can create it in the source DB).

    (2) expdp the export_selective table.

    (3) impdp export_selective into the target table.

    Alternatively:
    A database link can be used to get this table created with only the selected columns on the target side.
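
    A rough sketch of steps 1-3 (all object, directory and column names here are placeholders):

    -- (1) in the source database: a staging table with just the wanted columns
    CREATE TABLE export_selective AS SELECT c1, c2 FROM source_table;

    -- (2) export only that table
    expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=export_selective.dmp TABLES=export_selective

    -- (3) import it on the target, remapping the schema as needed, then copy the rows into the real target table
    impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=export_selective.dmp TABLES=export_selective REMAP_SCHEMA=scott:target_user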

  • Excluding materialized views from a Data Pump export

    Hello

    I am using Oracle 11g, version 11.1, RAC on Linux.

    I am trying to exclude materialized views from a schema Data Pump export. I used EXCLUDE=MATERIALIZED_VIEW in the
    parameter file. The materialized view is not exported, but the table associated with the view is still exported. Does anyone know how to exclude those as well?

    Thank you

    Richard

    You will need to specify the tables in the exclude parameter.

    EXCLUDE=MATERIALIZED_VIEW will not exclude the associated master tables.

    In a single EXCLUDE parameter, specify:
    EXCLUDE=MATERIALIZED_VIEW,TABLE:"IN ('EMP','DEPT')"
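
    In a parfile (which also avoids command-line escaping), that might look like the sketch below; the schema and table names are only examples and must match the materialized views' container tables:

    SCHEMAS=my_schema
    DIRECTORY=dmpdir
    DUMPFILE=no_mviews.dmp
    EXCLUDE=MATERIALIZED_VIEW
    EXCLUDE=TABLE:"IN ('EMP','DEPT')"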

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I installed TOAD 9.1 on Windows XP, and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.


    I would like to know what the process is for this.

    How different is it from using command-line Data Pump? When using TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the export/import process using TOAD with no direct access to the UNIX machine.

    Thanks in advance.


    I don't think you can do it with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below:

    For export:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves data from it, and writes the data to a dump file on the connected system.

    For import:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means that the impdp client initiates the import request, typically to the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data, and writes it directly into the target database. There is no dump file involved.
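
    A minimal sketch of such a network import, run against the target database (link, directory and schema names are placeholders):

    impdp scott/tiger@target_db NETWORK_LINK=source_db_link SCHEMAS=scott DIRECTORY=dpump_dir LOGFILE=net_imp.log

    Since no dump file is produced, the DIRECTORY object here is only needed for the log file.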

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My video tutorials of Oracle - http://kamranagayev.wordpress.com/oracle-video-tutorials/

  • Data pump - export without data

    To export a database without data, the old exp tool had the parameter ROWS=N. How do I export a database schema without data using Data Pump?

    You can see it by checking the expdp help on your command line, like this:

    C:\Documents and Settings\nupneja>expdp -help
    
    Export: Release 10.2.0.1.0 - Production on Friday, 09 April, 2010 18:06:09
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
    
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
    
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    
    USERID must be the first parameter on the command line.
    
    Keyword               Description (Default)
    ------------------------------------------------------------------------------
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    *CONTENT*               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    
    Command               Description
    ------------------------------------------------------------------------------
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=<number of workers>.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    
    C:\Documents and Settings\nupneja>
    

    Setting the CONTENT parameter to METADATA_ONLY will export only the structure of the schema and skip the rows.
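
    For example, a minimal sketch of a structure-only schema export (directory, dump file and schema names are placeholders):

    expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY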

  • EXPORT ONLY THE TABLES IN A SCHEMA USING DATA PUMP

    Hi folks,
    Good day. I'd appreciate it if I could get a Data Pump command to export only the TABLE objects in a specific schema.
    The server is a 4-node RAC with 16 CPUs per node.
    Thanks in advance

    If all you want is table definitions, you can use something like:

    expdp username/password directory=mon_repertoire dumpfile=my_dump.dmp tables=schema1.table1,schema1.table2,... content=metadata_only include=table

    This will export only table definitions. If you also want the data, then remove content=metadata_only; if you want the dependent objects, such as indexes, table_statistics, etc., then remove include=table.

    Dean

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN (\'TB_TEST1\', \'TB_ABC\')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Script to import all the tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Script that fails when importing only some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN (\'TB_TEST1\')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export is good, but I got the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I try to import all exported tables without the include clause.

    Is it possible to import only some of the tables in the export file rather than all of them?

    Thanks for the help!

    942572 wrote:

    Importing all the tables exported through scott works fine when I do NOT use the 'include' clause.

    I get the error only when I try to import some tables with the 'include' clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=yes

    INCLUDE

    Include the specific object types.

    For example, INCLUDE=TABLE_DATA.
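
    As an alternative to INCLUDE, here is a sketch of importing just the one table from your schema-mode dump using the TABLES parameter instead (names taken from your own commands):

    impdp scott/tiger@db12 remap_schema=source:target tables=source.TB_TEST1 directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate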

  • Data pump export

    Hello

    I use

    expdp system/ar8mswin1256@nt11g dumpfile=31082015.dmp schemas=dbo_mobile_webresults_test

    and I face this error:

    UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00

    I think it's a version problem.

    I found that the database on the server I am connected to is 11.2.0.1.0 - 64 bit,

    and my client is 11.1.0.7.0.

    I tried it on another PC and it worked.

    Thank you very much

  • Using Data Pump for Migration

    Hi all

    Database version - 11.2.0.3

    RHEL 6

    Size of the DB - 150 GB

    I have to run a database migration from one server to another (AIX to Linux). We will use the Data Pump option and migrate from source to target using the expdp SCHEMAS option (5 schemas will be exported and imported on the target machine). But the target won't go live right away: after this migration the development team will do work on the target machine, which will take them 2 days to complete, and during these 2 days the source database will keep running as production.

    Now, I have a requirement that after the development team completes their work, I have to move the 2 days of changes from source to target, after which the target will act as production.

    I want to know what options are available in Data Pump that I can use to do this.

    Kind regards

    No business will cut over to something whose data is no longer representative of the live system.

    Sounds like a normal upgrade, but you just test it on a copy of the live system - make sure the process works & then, once you are comfortable, replay it against your latest set of up-to-date production data.

    Dean's suggestion is fine, but rather than dropping pieces and re-importing, personally I tend to keep things simple and do it all in one go (a full schema datapump if possible). That way you know exactly what you cut over with, inclusive of all sequences and objects (sequences could have been incremented, so you must drop and re-create them). Otherwise you are splitting an upgrade into stages, with more steps to trace & more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.
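
    For what it's worth, a hedged parfile sketch of such a single, consistent schema-level export (schema names, file names and the timestamp are placeholders; FLASHBACK_TIME keeps all tables consistent to one point in time):

    SCHEMAS=schema1,schema2,schema3,schema4,schema5
    DIRECTORY=dmpdir
    DUMPFILE=migration_%U.dmp
    LOGFILE=migration_exp.log
    PARALLEL=4
    FLASHBACK_TIME="TO_TIMESTAMP('2015-08-31 22:00:00','YYYY-MM-DD HH24:MI:SS')"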

    Also - do you know the changes that have been made to upgrade the new environment... can you roll them back, etc.? Useful to look at. Most such migrations would be done as a DB copy via RMAN / cross-endian transport, and you must also make sure that you bring across all the system grants for the schemas, not only the schema-level ones.

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump exports and imports, perhaps something I should already know.

    I need to trim a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table and its indexes and constraints etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. Truncate the table.

    2. Disable or drop the indexes.

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows to keep.

    My question

    Will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    If I drop the table after truncating it, will I be able to import my table and indexes, even if I use the QUERY (subquery) functionality as part of my import statement?

    When using the QUERY functionality in Data Pump, must my table exist in the database before doing the import,

    or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants and statistics etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where <rows you want to keep>;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • REST API - Contact date field value exported in numeric format

    Hi people,

    I used the REST API below to retrieve Contacts with views and a custom filter.

    In the JSON response, I got the date value in number format, such as 'C_Lead_Score_Date': '1434456862'.


    https://secure.Eloqua.com/API/rest/1.0/data/contact/view/{identifier}/contacts/filter/{id}


    How can I convert the number to a proper date value? Or is there a code-level setting to retrieve the value in the correct date format?


    The data type of the field is Date.

    The date is returned as a Unix timestamp.

    You can use an online converter to convert it to a normal value, for example an online Unix time conversion tool.

    1434456862 = Tuesday, June 16, 2015 12:14:22 GMT
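
    If you would rather do the conversion on the database side, a minimal Oracle SQL sketch (assuming the value is seconds since the Unix epoch, UTC):

    SELECT TIMESTAMP '1970-01-01 00:00:00 UTC'
           + NUMTODSINTERVAL(1434456862, 'SECOND') AS lead_score_date
    FROM   dual;
    -- 1434456862 -> 2015-06-16 12:14:22 UTC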

  • How to choose the access method (direct path or external tables) for Data Pump export?

    I have a slow Data Pump export, and I have a few suggestions for settings that might improve the speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?

    DECLARE
      PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN (OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
    BEGIN
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.METADATA_FILTER (PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
      DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
      DBMS_DATAPUMP.SET_PARALLEL (PUMP_HANDLE, LEVEL => 4);
      <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
      DBMS_DATAPUMP.START_JOB (PUMP_HANDLE);
      DBMS_DATAPUMP.DETACH (PUMP_HANDLE);
    END;

    The <<THIS_LINE_FAILS>> line throws an exception:

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;

    ORA-06512: at line 10

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* INTEGER does not seem to work either */

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a similar message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a quite different message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'Settings', VALUE => 'DISABLE_APPEND_HINT');

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39207: Value NULL is not valid for parameter Settings;

    Hello

    You have ACCESS_METHOD; use DATA_ACCESS_METHOD instead. Just give it a try.
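
    In other words, the failing line would become something like the sketch below (availability of this parameter depends on the database version):

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'DATA_ACCESS_METHOD', VALUE => 'DIRECT_PATH');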

    Cheers,

    Rich

  • During a planned migration, which method of database migration is the best?  Data Pump or RMAN?

    I ask this because I recently had to migrate a database to a new ODA, and our head application developer was upset because I used Data Pump to move the database to the new platform and it took a few post-import scripts to set permissions back to what they were before the move.  He suggests using RMAN, because it will restore the database exactly as it was previously without having to restore the permissions.  I would like to know which methods you are using and which one is better?

    Thanks in advance ^_^

    He suggests using RMAN

    Yes, absolutely, and not just for the reason you mention: is downtime a problem for you? Re: Tuning RMAN - duplicate from active database with size 1.7 TB
