data pump question

Hello:
I have a general question about schema export.
When I run the utility to export a schema with 10 tables,
does it save a snapshot of the whole schema before exporting,
or
does it just export each table as it stands at the moment of the export?
If the second is the case,
how can I maintain the consistency of the data during the export?

The second. If you want to keep the data consistent across tables, you must use FLASHBACK_TIME or FLASHBACK_SCN (expdp) or CONSISTENT=Y (exp).
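
For illustration, a consistent schema-level export might look like this (the user, schema, directory and file names are placeholders):

expdp system/*** schemas=scott directory=DATA_PUMP_DIR dumpfile=scott_consistent.dmp logfile=scott_exp.log flashback_time=systimestamp

All tables are then exported as of the same point in time, at the cost of needing enough undo to serve the read-consistent images.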

Nicolas.

Tags: Database

Similar Questions

  • ODI LKM Oracle to Oracle (Data Pump) question

    Hi all

    I have a weird problem in ODI.

    I am joining per_all_people_f and fnd_user to load w_user_ds using Oracle Data Integrator. The LKM used is LKM Oracle to Oracle (Data Pump).

    When I run the interface, I am getting the below error:

    ODI-1227: task USER_DATA_SET (load) failed on the source connection ORACLE EBS.

    Caused by: java.sql.SQLSyntaxErrorException: ORA-00923: KEYWORD not found where expected

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)

    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)

    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)

    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)

    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)

    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)

    at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1115)

    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1488)

    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3769)

    at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3954)

    at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1539)

    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    The generated code is

    create table X780021
    (
      C1_FIRST_NAME,
      C2_MID_NAME,
      C3_LAST_NAME,
      C4_FULL_NAME,
      C5_NAME_SUFFIX,
      C6_SEX_MF_CODE,
      C7_SEX_MF_NAME,
      C8_COUNTRY_NAME,
      C9_LOGIN,
      C10_CREATED_BY_ID,
      C11_CHANGED_BY_ID,
      C12_CREATED_ON_DT,
      C13_CHANGED_ON_DT,
      C14_AUX1_CHANGED_ON_DT,
      C15_SRC_EFF_TO_DT,
      C16_INTEGRATION_ID,
      C17_EFFECTIVE_START_DATE
    )
    ORGANIZATION EXTERNAL
    (
      TYPE oracle_datapump
      DEFAULT DIRECTORY DAT_DIR
      LOCATION ('X780021.exp')
    )
    PARALLEL
    AS SELECT
      ALL_PEOPLE_F.FIRST_NAME,
      ALL_PEOPLE_F.MIDDLE_NAMES,
      ALL_PEOPLE_F.LAST_NAME,
      ALL_PEOPLE_F.FULL_NAME,
      ALL_PEOPLE_F.SUFFIX,
      ALL_PEOPLE_F.SEX,
      ALL_PEOPLE_F.SEX,
      ALL_PEOPLE_F.NATIONALITY,
      USER.USER_NAME,
      ALL_PEOPLE_F.CREATED_BY,
      ALL_PEOPLE_F.LAST_UPDATED_BY,
      ALL_PEOPLE_F.CREATION_DATE,
      ALL_PEOPLE_F.LAST_UPDATE_DATE,
      ALL_PEOPLE_F.CREATION_DATE,
      ALL_PEOPLE_F.EFFECTIVE_END_DATE,
      USER.USER_ID,
      ALL_PEOPLE_F.EFFECTIVE_START_DATE
    from APPS.FND_USER USER, APPS.PER_ALL_PEOPLE_F ALL_PEOPLE_F
    where (1 = 1)
    and (ALL_PEOPLE_F.PERSON_ID = USER.EMPLOYEE_ID)

    I don't see what the problem is here.

    Can someone help me?

    Thank you and best regards,

    Krishna Prasad

    I found the problem: it's the way ODI generated the alias for the FND_USER table. By default it produces USER as the alias, which is an Oracle keyword. We just needed to rename the alias to something else, and it worked.
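
    For illustration, with the alias renamed (FND_U here is a hypothetical choice), the offending clauses would parse cleanly:

    from APPS.FND_USER FND_U, APPS.PER_ALL_PEOPLE_F ALL_PEOPLE_F
    where ALL_PEOPLE_F.PERSON_ID = FND_U.EMPLOYEE_ID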

  • a question about data pump

    Hello

    I'm running a full database export using data pump. (full = y)


    To ensure data integrity, I have locked certain schemas before starting the export.
    I want to make sure that these locked schemas will still be exported (not ignored), right?

    Please help confirm.

    Thank you very much.

    db version: 10.2.0.3 on Linux 5

    Published by: 995137 on April 23, 2013 15:30

    Hello,
    Whether a schema is locked or unlocked makes no difference to Data Pump - it extracts them anyway with a full export. The log file should list all the tables that are exported, so you should see them there.

    Kind regards
    Harry
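
    As a quick verification (the schema and file names are placeholders): lock a schema, run the full export, and search the log.

    ALTER USER scott ACCOUNT LOCK;

    expdp system/*** full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_exp.log

    Then grep full_exp.log for the locked schema's tables; they should be listed like every other exported table.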

  • Question on exporting a schema through Oracle Data Pump with Database Vault enabled

    Hello

    I installed and configured Database Vault on an Oracle 11gR2 (11.2.0.3) database to protect a specific schema (SCHEMA_NAME) via a realm. I followed the following doc:
    http://www.Oracle.com/technetwork/database/security/TWP-databasevault-DBA-BestPractices-199882.PDF
    to ensure that the SYS and SYSTEM users have sufficient rights to complete an Oracle Data Pump schema export operation.

    I.e. I gave sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user ('system', 'SCHEMA_NAME');

    execute dvsys.dbms_macadm.authorize_datapump_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user ('system', 'SCHEMA_NAME');

    I also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition and Indextype; sys and system were authorized as OWNER of this realm.

    However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line appears in the export log:

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081

    The export completes, but with these errors.

    Any help, pointers, suggestions, etc. on pretty much anything will be very welcome at this stage.

    Thank you

    I have moved the thread to the Database - Security forum. If the document does not help, please open an SR with Support.

    HTH
    Srini

  • Data Pump - expdp / impdp utility question

    Hi All,

    As a basic Data Pump learning exercise, I am trying to export the scott schema and then load the dump file into a test_t schema in the same database.

    -1. created the directory object as sysdba:
    CREATE DIRECTORY dpump_dir1 AS 'C:\output_dir';

    -2. created the directory on the operating system as c:\output_dir

    -3. ran expdp:
    expdp system/***@orcl schemas=scott directory=dpump_dir1 job_name=hr dumpfile=scott_orcl_nov5.dmp parallel=4

    -4. created a schema test_t and granted dba to it.

    -5. ran impdp:
    impdp system/***@orcl schemas=test_t directory=dpump_dir1 job_name=hr dumpfile=scott_orcl_nov5.dmp parallel=8

    It fails here with ORA-39165: schema test_t was not found. However, the schema test_t exists.

    So I don't know why it gives this error. It seems that the scott schema in the expdp dump file cannot be loaded into any other schema, only into a schema named scott... Is this right? If yes, how can I load all the objects of the scott schema into another schema, say test_t? It would be helpful if you could show the respective expdp and impdp commands, please.

    Thank you very much
    KS

    Hello

    You must specify REMAP_SCHEMA when you import:

    impdp system/***@orcl directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp remap_schema=scott:test_t

    Cheers

  • Question about data pump and directory/sub-directories

    I have a process that makes these calls to Data Pump:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    It fails with the "/test" subdirectory. If I remove it, as in these calls, it works:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    I even did a 'chmod 777' test.

    Any idea what I am doing wrong?

    I'm on 11.0.1.6

    This is the error stack I get:
    ORA-39002: invalid operation
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3292
    ORA-06512: at "ORA_ADMIN.DATAPUMP_UTIL", line 46
    ORA-06512: at line 1

    Chris wrote:
    > It fails with the "/test" subdirectory. ... Any idea what I am doing wrong?

    Invalid syntax.

    DATA_PUMP_DIR is an Oracle DIRECTORY object, not an OS path, so you cannot append a "/test" subdirectory to its name.
    If you want a different location, then CREATE a DIRECTORY such as DATA_PUMP_DIR_TEST that points there.
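
    A minimal sketch of that fix, assuming an OS path for the subdirectory and the grantee name used here:

    CREATE DIRECTORY data_pump_dir_test AS '/u01/app/oracle/dpdump/test';
    GRANT READ, WRITE ON DIRECTORY data_pump_dir_test TO ora_admin;

    -- then reference the new directory object by name:
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR_TEST', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);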

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports; perhaps something I should already know.

    I need to empty a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table along with its indexes, constraints etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is:

    1. truncate the table

    2. disable or drop the indexes

    3. leave the constraints in place?

    4. use Data Pump to import the rows to keep.

    My questions:

    will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    if I drop the table after truncating it, will I be able to import my table and indexes, even if I use the subquery (QUERY) functionality as part of my import statement?

    Does the table have to exist in the database before doing the import when using the QUERY functionality,

    or will the Data Pump import behave as usual, i.e. create the table, indexes, grants and statistics etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where ...;  -- the rows to keep

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.
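
    For completeness, the subset export the question asked about is done with the QUERY parameter. A sketch, using a parameter file to avoid shell quoting issues (the predicate is a placeholder):

    tables=bar
    directory=DATA_PUMP_DIR
    dumpfile=bar_subset.dmp
    query=bar:"WHERE <predicate for the rows to keep>"

    expdp scott/*** parfile=bar_subset.par

    On import, if the table was dropped, impdp recreates it together with its indexes, grants and statistics from the dump; if it still exists, TABLE_EXISTS_ACTION (SKIP, APPEND, TRUNCATE or REPLACE) controls what happens.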

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Moving database from 1 server to another via Data Pump - some questions

    Hello

    I am planning to move my database from one Windows server to another. Part of the requirement is also to upgrade this database from 10g to 11.2.0.3, so I'm combining the 2 tasks using the export/import (via Data Pump) upgrade method.

    Regarding the export/import (which will be a full database Data Pump export), my plan is:

    create an empty 11.2.0.3 target database on the server (same number of control files, redo logs etc.) ready for an import of the source database
    Q1. This will create the SYSTEM and UNDO tablespaces - I presume the Data Pump export doesn't include these tablespaces anyway?

    For the export, I intend to simulate CONSISTENT=Y using FLASHBACK_TIME=SYSTIMESTAMP
    Q2. Which flashback features must be active on the source database to use this? Should I have Flashback Database enabled on the source (as opposed to other flashback features)?

    My target is a virtual server with a single virtual processor
    Q3. Is there any point using PARALLEL in the import parameter file (normally I would set this to the number of processors - however this virtual server actually has only one virtual processor)?

    For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server

    Q4. If the import fails before the end, what is the best course of action? For example, do I just drop the data tablespaces and redo a complete import?

    Thank you
    Jim

    Jim,

    I'll take a pass on your questions:

    > create an empty 11.2.0.3 target database on the server (same number of control files, and redo logs etc.) ready for an import of the source database
    > Q1. This will create the SYSTEM and UNDO tablespaces - I presume the datapump export does not include these tablespaces anyway?

    The SYSTEM tablespace is created when you create a database, but Data Pump will export it and try to import it; that will fail with "tablespace already exists". I am fairly sure the UNDO tablespace will also be exported and imported. If they are already there, the import will just report that they exist.

    > For the export, I intend to simulate CONSISTENT=Y using FLASHBACK_TIME=SYSTIMESTAMP
    > Q2. Which flashback features must be active on the source database to use this? Should I have Flashback Database enabled on the source (as opposed to other flashback features)?

    I don't know for sure about this one. I thought you just need enough undo, but I hope others will chime in.

    > My target is a virtual server with a single virtual processor
    > Q3. Is there any point using PARALLEL in the import parameter file (normally I would set this to the number of processors - however this virtual server actually has only one virtual processor)?

    We usually recommend 2 times the number of processors, so PARALLEL=2 should be OK.

    > For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server
    > Q4. If the import fails before the end, what is the best course of action? That is, do I just drop the data tablespaces and redo a full import?

    It depends on what the failure is. Most failures will not stop the job, but if one does, most jobs can simply be restarted. To restart a job, you just need to know the job name, which is printed as the export/import starts, or which you named in your Data Pump command. To restart, do:

    impdp user/password attach=job_name

    If you did not name the job, the job name will be something like

    user.SYS_IMPORT_FULL_01
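
    For illustration, restarting a stopped import might look like this (the job name is assumed):

    impdp system/*** attach=SYS_IMPORT_FULL_01
    Import> start_job
    Import> continue_client

    START_JOB resumes the job, and CONTINUE_CLIENT switches back to logging mode so you can watch its progress.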

    Hope that helps - and good luck with your migration.

    Dean

  • Export the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a single CD/DVD to the vendor of our application system (to analyze a few problems that we are having).

    Now, when I checked online, a full export is available, but I am not able to understand how it works, as we have never used this Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use a parallel full DB export to split the files so they fit on DVDs? Is this possible?

    Please correct me if I am wrong, and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server writes the files to the directory object that you specify on the command line. So what you want to do is:

    1. On your operating system, find an existing directory or create a new one. In your case, C:\Dump is as good a place as any.

    2. Connect to sqlplus and create the directory object, using just the path. I use Linux, so my directory looks like /scratch/xxx/yyy;
    if you use Windows, the path to your directory would look like C:\Dump.

    3. Don't forget to grant access on this directory. You can grant access to a single user, a group of users, or public - just like
    any other object. (A sketch of the steps follows below.)
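
    Putting the steps together, and splitting the dump into DVD-sized pieces with the FILESIZE parameter (the names and the 4G size are placeholders):

    CREATE DIRECTORY dump_dir AS 'C:\Dump';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO system;

    expdp system/*** full=y directory=dump_dir dumpfile=full_%U.dmp filesize=4G logfile=full_exp.log

    The %U in the dumpfile name generates a numbered file set (full_01.dmp, full_02.dmp, ...), each capped at FILESIZE.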

    If this helps or answers your question, please mark the post accordingly.

    Thank you

    Dean

  • Selecting tables on import using the Data Pump API

    Hello
    Sorry for the trivial question. I export data using the Data Pump API with mode "TABLE",
    so all the tables are exported into one .dmp file.

    So my question is: how do I import only a few tables using the Data Pump API? How do I set the "TABLES" property, as on the command line interface?
    Can I use the DATA_FILTER procedures? If so, how?

    Really thanks in advance

    Kind regards

    Kahlil

    Hello

    You should use the METADATA_FILTER procedure for this,
    for example:

    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')'
                );

    Regards
    Anurag
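
    A fuller sketch of a table-mode import filtered to specific tables; the dump file name, directory name and handle usage are illustrative assumptions:

    DECLARE
      h     NUMBER;
      state VARCHAR2(30);
    BEGIN
      -- open a table-mode import job
      h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'TABLE');
      -- point it at the dump file written by the export
      dbms_datapump.add_file(h, 'exp_tables.dmp', 'DATA_PUMP_DIR',
                             filetype => dbms_datapump.ku$_file_type_dump_file);
      -- restrict the job to the tables we want
      dbms_datapump.metadata_filter(h, 'NAME_EXPR', 'IN (''TABLE1'', ''TABLE2'')');
      dbms_datapump.start_job(h);
      dbms_datapump.wait_for_job(h, state);
      dbms_datapump.detach(h);
    END;
    /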
    
  • Data pump export

    Hello

    I use

    expdp system/ar8mswin1256@nt11g schemas=dbo_mobile_webresults_test dumpfile=31082015.dmp

    and I face this error:

    UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00

    I think it's a version problem.

    I found that the database on the server I am connected to is 11.2.0.1.0 - 64bit,

    and my client is 11.1.0.7.0.

    I tried it on another PC and it worked.

    Thank you very much

  • Using Data Pump for Migration

    Hi all

    Version of database - 11.2.0.3

    RHEL 6

    Size of the DB - 150 GB

    I have to run a database migration from one server to another (AIX to Linux). We will use the Data Pump option and will migrate from source to target using the expdp SCHEMAS option (5 schemas will be exported and imported on the target machine). But the target won't go live immediately: after the migration, the development team will do some work on the target machine, which will take them 2 days to complete, and during these 2 days the source database will keep running as production.

    Now I have a requirement that, after the development team completes their work, I have to apply the 2 days of changes from source to target, after which the target will act as production.

    I want to know what options are available in Data Pump can I use to do this.

    Kind regards

    No business will want to go live on something whose data is no longer representative of live.

    This sounds like a normal upgrade, but tested on a copy of live: make sure the process works, then, once you are comfortable, replay it against your latest set of up-to-date production data.

    Dean's suggestion is fine, but rather than dropping pieces and re-importing, personally I tend to keep things simple and redo it all (a full schema datapump if possible). That way you know exactly what you carried over, inclusive of all sequences and objects (sequences could have been incremented, so you must drop/re-create them). Otherwise you are splitting an upgrade into stages: more steps to trace and more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.

    Also - do you know the changes that have been made to upgrade the new environment, and can you roll them back, etc.? Worth looking at. Most such migrations would be a db copy via RMAN / cross-endian transport, and you must also make sure that you inherit all the system grants for the schemas, not just the summary level.

  • migration from 10g to 12c using data pump

    Hi - while I have used data pump at the schema level before, I'm relatively new to full database imports.

    We are attempting a full database migration from 10.2.0.4 to 12c using the full database data pump method over a db link.

    The DBA has indicated to avoid moving SYSAUX and SYSTEM objects, but during the documentation review it initially appeared that these objects are not exported anyway, given TRANSPORTABLE=NEVER. Can anyone confirm this? The import/export log does refer to objects I thought would not be included:

    ...

    23-FEB-15 19:41:11.684: Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200: Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445: Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842: Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372: Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists

    Appreciate any insight.

    The SYS, CTXSYS, MDSYS and ORDSYS schemas are not exported using exp/expdp.

    Doc ID: Note 228482.1

    I guess the 12c software was already installed and a database created, it seems - so when you imported, you got these "already exists" errors.

    Whenever a database is created from the installed software, the defaults - SYSTEM, SYS, SYSAUX - will be created.
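
    For reference, a full import over a database link (the method described above) is driven entirely from the target side, along these lines - the connect string and db link name are placeholders:

    impdp system/***@target12c full=y network_link=src10g directory=DATA_PUMP_DIR logfile=full_imp.log

    With NETWORK_LINK there is no intermediate dump file; the directory object is still needed for the log file.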

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import data in tables between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN (\'TB_TEST1\', \'TB_ABC\')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Import script that works for all the tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Import script that errors for some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN (\'TB_TEST1\')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export is good, but I get the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I try to import all the tables from the export without the INCLUDE clause.

    Is it possible to import some tables, but not all tables, from the export file?

    Thanks for the help!

    942572 wrote:

    > It is fine importing all the tables exported from scott when I do NOT have the INCLUDE clause.
    > I get the error only when I try to import some tables with the INCLUDE clause.
    > Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    See for yourself - run:

    impdp help=yes

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
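
    For what it's worth, a sketch of an INCLUDE filter on import that names specific tables, placed in a parameter file to sidestep shell escaping (the parfile name is a placeholder):

    remap_schema=source:target
    directory=datapump_dir
    dumpfile=test.dump
    content=data_only
    include=TABLE:"IN ('TB_TEST1')"

    impdp scott/tiger@db12 parfile=imp_tb_test1.par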

  • Differences between Data Pump and legacy import and export

    Hi all

    I work as a junior DBA in my organization, and I saw that my organization still uses the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to change the existing process to Oracle Data Pump; I have a meeting with them to make my points and convince them to adopt the Oracle Data Pump utility.

    I am not very convincing on my own and can't work out all the differences myself; I really need the strong points versus import and export. It would be really appreciated if someone could give strong arguments for Oracle Data Pump over legacy import and export.

    Thank you

    Cabbage

    Hello

    As other people have already said, the main advantage of datapump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with parallel).

    It is also more flexible (much more) - it will even create users with schema-level exports, which imp could never do for you (and it was always very annoying that it couldn't).

    It is restartable

    It has a PL/SQL API

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch)

    There is even a 'legacy' mode in 11.2 where most of your old exp parameter files will still work - just change exp to expdp and imp to impdp.

    The main obstacle to the transition to datapump seems to be all the "what do you mean I have to create a directory for it to work", "where is my dumpfile", "why can't it be on my local machine" complaints. These are minor things to get past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with parallel, and show them the runtimes so they can compare...
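
    A sketch of such a demo (user and file names are placeholders; time both runs):

    exp system/*** full=y file=full.dmp log=full_exp.log

    expdp system/*** full=y directory=DATA_PUMP_DIR dumpfile=full_%U.dmp parallel=4 logfile=full_expdp.log

    The %U file template matters with PARALLEL, since each worker needs a file it can write to.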

    Cheers,

    Rich

Maybe you are looking for

  • Unknown devices on Satellite M70-340

    Hey,.My problem started when I updated my Toshiba Satellite model: M70-340 with Windows Xp professional. Most of the drivers have been detected by windows, except for those:(1) mass storage controller(2) network controller(3) PCI modem I was wonderin

  • Impossible to find a button generate in Microsoft Visual 2010 Ultimate

    Simply, I can't find a run/build/compile/run button in visual studio 2010 ultimate. I really like it and I want to use in the future.

  • I can't connect to my local account because of corrupted files.

    I can't access my account because of a corrupted local file Ideas: You are: programs Error messages Recent changes to your computer What you have already tried to solve the problem

  • eSata problem

    Hello I have a problem with eSata on my model of Lenovo IdeaPad G560 0679 running Windows 7 Home Premium 64 DK (in Danish). When the connection and the transfer of data through a drive external hard of the eSata (eSata/usb2 Samsung Story Station Plus

  • Power adapter 3000 c200

    Hello Grandson in a bit of a pickle (it needs for school) so he turned to me and I don't have an answer, I hope one of you fine people can help It has a Lenovo 3000 c200 and the power adapter is broken and needs to be replaced. Someone has offered to