Reimport using datapump

I would like to refresh my test environment by importing data from a Data Pump DMP file. I plan to use TABLE_EXISTS_ACTION=REPLACE to refresh all the data in the tables, but my question is: what will happen to the procedures, functions, indexes, and other Oracle objects? Are they replaced automatically during the import, or is that a manual task?

What will happen to procedures, functions, indexes, and other Oracle objects? Are they replaced automatically when importing?

No.

Or is it a manual task?

Yes (for example, you could simply drop the schema before importing)
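For illustration, a minimal sketch of the drop-and-reimport approach (the schema, directory, and file names here are placeholders, not from this thread). TABLE_EXISTS_ACTION only governs tables, so dropping the schema first is what guarantees that procedures, functions, and indexes are refreshed as well:

    -- As a privileged user, remove the test schema and everything it owns
    DROP USER testuser CASCADE;

    -- Recreate the schema and all of its objects from the dump file;
    -- a schema-mode dump taken by a DBA includes the CREATE USER statement
    impdp system directory=DATA_PUMP_DIR dumpfile=testenv.dmp logfile=testenv_imp.log schemas=TESTUSER

With the schema gone, nothing pre-exists, so TABLE_EXISTS_ACTION no longer matters.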

Tags: Database

Similar Questions

  • Any specific disadvantage of using DataPump for upgrade?

    Hello

    Here is my config

    SERVER A

    Source o/s: Win 2003

    Source DB: 10.2.0.4 (32 bit)

    DB size: 100 GB

    SERVER B

    Target operating system: Win 2008 R2 sp1

    Target DB: 11.2.0.3 (64-bit)

    I have to upgrade the 10g database on Server A by installing 11g on Server B. I have never used the Database Upgrade Assistant, RMAN, or any similar utility to do a 10g to 11g upgrade in the past.

    Here's my question...

    (a) I was intending to use datapump to perform this upgrade (downtime is not a problem), because I know how to use datapump. Do you guys see any potential problems with this approach? OR

    (b) based on your experience, do you suggest I avoid option (a) because of potential problems and use other Oracle-suggested methods such as the upgrade wizards, etc.?

    I'm open to both options; it's just that, as I'm not an expert at this point in time, I hesitate a little to go with option (b).

    The DB is also going from 32-bit to 64-bit. Not sure if that would be a deal breaker.

    Note: The upgrade is supposed to happen on the 2nd server. Can someone provide some high-level pointers?

    -Learner

    If the downtime is not a problem, datapump is certainly an option. How big is the database? The steps are documented:

    http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm#i262220

    HTH
    Srini

  • Upgrade Oracle database 10g to 11g using Datapump

    Hi all

    I'm planning to upgrade our Oracle 10g database to 11g using datapump in a Windows environment. I used SQL Developer to perform the upgrade in a TEST environment: I took a full export of the 10g DB and then imported it into the 11g DB, but I faced a lot of errors during the import. I have attached the log file and await your help.

    [Log file | https://dl.dropboxusercontent.com/u/14131772/IMPORT.LOG]

    user13289313 wrote:
    Regarding the error ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_STEP_LIST" already exists - no problem with that one.

    But what of the errors below.

    ORA-39083: Object type PROCACT_SYSTEM failed to create with error:
    ORA-20000: Incompatible version of Workspace Manager installed
    Failing sql is:
    BEGIN
    declare ver varchar2(100); dummy integer; vdummy varchar2(30); compile_exception EXCEPTION; PRAGMA EXCEPTION_INIT(compile_exception, -06550); invalid_table EXCEPTION; PRAGMA EXCEPTION_INIT(invalid_table, -00942); procedure createErrorProc is begin execute immediate 'create or replace function system.wm$_check_install return boolean is begin return true;
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PROCOBJ
    ORA-39083: Object type PROCOBJ failed to create with error:

    Please see this MOS Doc:

    DataPump Import (IMPDP) Errors ORA-39083 ORA-20000 When Incompatible Version of the Workspace Manager is Installed [ID 730373.1]

    ORA-39325: TABLE_EXISTS_ACTION cannot be applied to "SYSMAN"."MGMT_METRIC_COLLECTIONS".

    ORA-31693: Table data object "SYSMAN"."MGMT_JOB_CRED_PARAMS" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    ORA-39779: type "SYSMAN"."MGMT_JOB_VECTOR_PARAMS" not found or conversion to latest version not possible

    ORA-31693: Table data object "OLAPSYS"."CWM$CLASSIFICATION" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout

    Do you use the OLAP features? If not, this can be ignored.

    Information On Installed Database Components and Schemas [ID 472937.1]


    ORA-39112: Dependent object type TRIGGER:"SYSMAN"."MGMT_SEVERITY_UPDATES" skipped, base object type TABLE:"SYSMAN"."MGMT_SEVERITY" creation failed

    ORA-39148: unable to import data into pre-existing queue table "SYSMAN"."AQ$_MGMT_NOTIFY_QTABLE_G". Table_exists_action of APPEND being ignored for this table.

    Ignore all the SYSMAN errors
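    If you would rather avoid these errors entirely than ignore them, one option (a sketch; the schema list here is illustrative, not from the log) is to exclude those Oracle-managed schemas in the import parfile:

        # Skip schemas that the target database already maintains itself
        EXCLUDE=SCHEMA:"IN ('SYSMAN', 'OLAPSYS')"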

    HTH
    Srini

  • Getting ORA-39000: bad dump file specification when using datapump

    Hello
    I am using datapump to export a schema (metadata only). However, I would like the dump file to be named after the date and time at which the export was taken.

    When I use the following command, the job runs perfectly:
    expdp system@*** dumpfile=expdp_`date +%m%d%Y_%H%M%S`.dmp logfile=expdp_`date +%m%d%Y_%H%M%S`.log directory=EXP_DP SCHEMAS=MARTMGR CONTENT=METADATA_ONLY


    However, I want to run the export using a parfile, but with the parfile below I encounter the following errors.

    USERID=system@***
    DIRECTORY=EXP_DP
    SCHEMAS=TEST
    DUMPFILE=expdp_`date +%m%d%Y_%H%M%S`.dmp
    LOGFILE=MARTMGR.log
    CONTENT=METADATA_ONLY

    expdp parfile=martmgr.par

    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning option
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-39157: error appending extension to file "expdp_`date +%m%d%Y_%H%M%S`"
    ORA-07225: sldext: translation error, unable to expand file name.
    Additional information: 7217



    How do I add the date and time to the dump file name when using a parfile?

    Thank you
    Rohit

    You can do this in a script:

    setenv cur_date `date +%m%y%d`

    then, when you pass the parameters:

    dumpfile=dumpfile_${cur_date}.dmp
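    To make that concrete, here is a minimal bash sketch (the parfile name, schema, and directory are placeholders): the timestamp is expanded in the shell, since Data Pump cannot run `date` from inside a parfile, and the parfile is generated by the script:

        #!/bin/bash
        # Expand the timestamp in the shell before Data Pump ever sees it
        cur_date=$(date +%m%d%Y_%H%M%S)
        # Write a parfile whose dump/log file names embed the timestamp
        {
          echo "USERID=system@***"
          echo "DIRECTORY=EXP_DP"
          echo "SCHEMAS=MARTMGR"
          echo "CONTENT=METADATA_ONLY"
          echo "DUMPFILE=expdp_${cur_date}.dmp"
          echo "LOGFILE=expdp_${cur_date}.log"
        } > martmgr.par
        expdp parfile=martmgr.par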

  • Upgrade using datapump 10.2.0.4 to 11.2.0.3

    Hello Experts,
    I have some doubts about upgrading an Oracle database from 10.2.0.4 to 11.2.0.3 using datapump.
    We would actually be comfortable upgrading the production database manually, which takes less time than using datapump, but our customer insists we do the upgrade using datapump to improve the performance of the database.

    My questions are:

    (1) Is it safe to delete the old database once the export is complete, considering we don't have much space on the server to accommodate both databases and the dump file? We do take a cold backup of the entire file system to tape.

    (2) Are there things (objects, etc.) that will not be imported and that we must take into consideration?

    (3) What about DBMS_SCHEDULER jobs and dba jobs - will they be imported successfully, or is there anything to do after the upgrade?

    (4) I actually tried this method on my local test box and the import completed with about 240 errors.

    Finally, can someone please point me to the exact doc steps for an upgrade using this method?

    Thanks in advance
    Asif

    866995 wrote:
    Hello Srini

    Thanks for your suggestion. In fact, we always use sysdba and not system; I don't think there will be any problem, but as you suggest, I will try with system as well.

    As indicated in the documentation, you should not use sysdba for export/import. Oracle has a very good reason for making this statement.

    I have a few questions for you:
    (1) Currently, there is a lot of fragmentation in our database. Is the datapump export/import upgrade method the correct way to deal with it on the production server?

    Yes - please see this link - http://docs.oracle.com/cd/E11882_01/server.112/e23633/preup.htm#BABFHFIJ

    (2) What does Oracle recommend as best practice for a database upgrade? From my knowledge of the docs, it's DBUA?

    Yes - this is the recommended method - http://docs.oracle.com/cd/E11882_01/server.112/e23633/preup.htm#i694345

    (3) What about DBMS Scheduler jobs? Do we need to recreate them after the upgrade?

    Scheduler jobs will be exported/imported - http://docs.oracle.com/cd/E11882_01/server.112/e25494/schedadmin003.htm#i1007297
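    As a quick sanity check after the import, you could compare job counts on source and target with something like this (a sketch, not from the thread):

        -- Count scheduler jobs per owner; run on both databases and compare
        SELECT owner, COUNT(*) AS job_count
        FROM dba_scheduler_jobs
        GROUP BY owner;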

    Thanks again
    Asif

    HTH
    Srini

  • using datapump transportable tablespaces when both databases have the same tablespace name

    I am trying to export some CLOB data from one database into a data warehouse using datapump with the transportable tablespaces option.
    My problem is that the two databases have the same tablespace name (USERS) and data file name (users01.dbf).
    From a previous post I learned about the 'remap_tablespace=old_tablespace:new_tablespace' option, which would seem to let me get around this, but I don't know what happens with the data files, since the data file name already exists.
    Is there another option, say with impdp, that I could use to tell it to use a different data file name, or do I just rename the data files used by the tablespace in the source database beforehand?

    You must first create the schema. When tables and indexes are created using impdp, they will be created in the right tablespaces. The schema's default tablespace is overridden because the create table statement specifies the tablespace:

    create table foo ... tablespace orig_tbs1

    If you add remap_tablespace=orig_tbs1:new_tbs1, then:

    create table foo ... tablespace new_tbs1

    If you want to restore the default tablespace after the impdp job has run, I guess you should be able to do it with an alter statement.
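    For illustration, a minimal sketch of that remap (the directory, dump file, and tablespace names are placeholders):

        impdp system directory=DATA_PUMP_DIR dumpfile=clob_data.dmp logfile=clob_imp.log remap_tablespace=USERS:USERS_DW

    and the alter statement to set a schema's default tablespace afterwards:

        ALTER USER app_owner DEFAULT TABLESPACE users;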

    Dean

  • Will a full import using datapump overwrite the target database's data dictionary?

    Hello

    I have an 11g database of 127 GB. I did a full export using expdp as the system user. I will import the resulting dump file (which is 33 GB) into the target 12c database.

    When I do the full import, the 12c database's data dictionary is updated with the new data. But what about the data dictionary content it already contains? Will that also change?

    Thanks in advance

    Hello

    In addition to the other replies:

    To start, you need to know some basic things:

    The data dictionary tables are owned by SYS, and most of these tables are created when the database is created.

    Thus, different Oracle database versions can have fewer or more data dictionary tables, with different structures. So if these SYS base tables were exported and imported between different Oracle versions, database functionality could be damaged, because the tables would not correspond to the database version.

    See the Ref:

    SYS, Owner of the Data Dictionary

    The Oracle database user SYS owns all the base tables and user-accessible views of the data dictionary. No Oracle database user should ever change (UPDATE, DELETE, or INSERT) any rows or schema objects contained in the SYS schema, because such activity can compromise data integrity. The security administrator must keep strict control of this central account.

    Source: http://docs.oracle.com/cd/B28359_01/server.111/b28318/datadict.htm

    Therefore, the export utilities cannot export the SYS dictionary base tables, and this is flagged as a note in the documentation:

    Data Pump Export Modes

    Note:

    Several system schemas cannot be exported because they are not user schemas; they contain Oracle-managed data and metadata. Examples of system schemas that are not exported include SYS, MDSYS, and ORDSYS.

    Source: https://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#SUTIL826

    That's why the import cannot modify/alter/drop/create data dictionary base tables. If you cannot export them, you cannot import them.

    The import just adds new non-SYS objects/data to the database, so new rows are added to the dictionary base tables (for new users, new tables, PL/SQL code, etc.).

    I hope that this might answer your question.

    Kind regards

    Juan M

  • error when importing a schema using datapump

    Hi, I get the following errors when importing a schema from prod to a dev database... can someone help, thanks!

    impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=abcdprod.DMP LOGFILE=abcdprod.log REMAP_SCHEMA=abcdprod:abcddev

    ORA-39002: invalid operation
    ORA-31694: master table "SYSTEM"."SYS_IMPORT_FULL_01" failed to load/unload
    ORA-31644: unable to position to block number 170452 in dump file '/ots/oracle/echo/wxyz/datapump/abcdprod.DMP'

    Please check the sizes of the DMP file on source and target. Try copying the DMP file again, then restart the import. If you get the same error, the source DMP file is most likely corrupted and you will need to run the expdp again.
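    A quick way to compare the copies, as a sketch (the paths are placeholders for wherever the file sits on each host):

        # Run on both hosts: sizes and checksums must match exactly
        ls -l /path/to/datapump/abcdprod.DMP
        md5sum /path/to/datapump/abcdprod.DMP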

    HTH
    Srini

  • Upgrading database from 10.2.0.1 to 11.1.0.7 using Datapump

    Some information:
    Solaris 10 SPARC 64-bit
    Oracle database 10.2.0.1
    Upgrading to the same OS on another server

    I installed Oracle 11.1.0.7 on the new server and started the new database and listener, and that's it.
    The new database has the same tablespaces that exist on the old database: system, undo, sysaux, users, example. That is why you see the "already exists" errors below. The error output shown is only a portion of the beginning, middle, and end of the full output.

    Problem:

    We are upgrading one of our test databases using the Datapump export/import method. We have chosen this method to certify our own specification. I know there are many ways to do an upgrade, but please stick to this one.

    Here is the export script:
    expdp system/password full=Y directory=CSAPPDB_DIR parallel=3 dumpfile=CSAPPDB_%U.dmp version=10.2.0.1.0 logfile=CSAPPDB.log
    And here is the import script:
    impdp system/password full=Y directory=CSAPPDB_DIR parallel=3 dumpfile=CSAPPDB_%U.dmp version=10.2.0.1.0 logfile=CSAPPDB.log
    For your information, I posted about this problem before, when the export finished without error and the import seemed to start seamlessly, and I thought I understood it that time. I hastily marked the previous thread as 'answered', so please disregard it and forgive me.

    Now, here is the error output I get from the datapump import. Does anyone see where I went wrong and can give me some direction?

    Thank you



    ORA-31684: Object type SYNONYM:"PUBLIC"."LOB$TEMP" already exists
    Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
    ORA-31684: Object type SYNONYM:"SYSTEM"."SYSCATALOG" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."CATALOG" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."TAB" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."COL" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."TABQUOTAS" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."SYSFILES" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."PUBLICSYN" already exists
    ORA-31684: Object type SYNONYM:"SYSTEM"."PRODUCT_USER_PROFILE" already exists
    ORA-31684: Object type SYNONYM:"BI"."CHANNELS" already exists
    ORA-31684: Object type SYNONYM:"BI"."COUNTRIES" already exists
    ORA-31684: Object type SYNONYM:"BI"."TIMES" already exists
    ORA-31684: Object type SYNONYM:"BI"."COSTS" already exists
    ORA-31684: Object type SYNONYM:"BI"."CUSTOMERS" already exists
    ORA-31684: Object type SYNONYM:"BI"."PRODUCTS" already exists
    ORA-31684: Object type SYNONYM:"BI"."PROMOTIONS" already exists
    ORA-31684: Object type SYNONYM:"BI"."SALES" already exists


    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_TARGET_LIST_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_PURGE_CRITERION_LIST" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_PARAMSRC_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."SMP_EMD_COL_DEF_ARRAY_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CRED_TYPE_COL_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_DELTA_ENTRIES" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_DELTA_QUERIES" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_DELTA_RECORDER" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."ECM_POLICY_RULE" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_STEP_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CRED_TYPE_COL_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_TARGET_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CLUSTER_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_HOST_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_ENTERPRISE_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CONTAINER_CRED_RECORD" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_DELTA_ENTRY_RECORDER" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_HISTORY_ENTRY_RECORDER" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_STEP_LIST" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_TARGET_CRED_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CLUSTER_CRED_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_HOST_CRED_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_ENTERPRISE_CRED_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_JOB_CRED_ARRAY" already exists
    ORA-31684: Object type TYPE:"SYSMAN"."MGMT_CONTAINER_CRED_ARRAY" already exists
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39083: Object type PROCACT_SYSTEM failed to create with error:
    ORA-20000: Incompatible version of Workspace Manager installed
    Failing sql is:
    BEGIN
    declare ver varchar2(100); dummy integer; compile_exception EXCEPTION; PRAGMA EXCEPTION_INIT(compile_exception, -06550); invalid_table EXCEPTION; PRAGMA EXCEPTION_INIT(invalid_table, -00942); procedure createErrorProc is begin execute immediate 'create or replace function system.wm$_check_install return boolean is begin return true; end;'; end; begin
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM





    ...RIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."TESTCYCL" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."TA_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."TO_ALERT" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."TK_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."TOKENIZATION" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."TR_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."TRAN_RULES" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."US_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."USERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."VCCR_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."VC_CROS_REF" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."VCDS_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."VC_DESSTEPS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."VCSP_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."VC_STEP_PARAMS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."VCTS_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."VC_TEST" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."TVCI_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."TEST_VC_INFO" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."VRCT_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."VER_CTRL" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."LN_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."LINK" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."SV_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."SG_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_GROUP" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."STG_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_TO_GROUP" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."WO_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."WSDL_OPERATION" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."OP_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."WSDL_OPERATION_PARAMS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."SC_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_CHANGE" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."STR_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_TO_REQUIREMENT" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."GTR_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_GROUP_TO_REQUIREMENT" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."STT_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_TO_TEST" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"QC10MIGRATION_GCSS710_SEQ_DB"."GTT_PRIMARYKEY" skipped, base object type TABLE:"QC10MIGRATION_GCSS710_SEQ_DB"."SERVICE_GROUP_TO_TEST" creation failed
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS



    ORA-39083: Object type JOB failed to create with error:
    ORA-00001: unique constraint (SYS.I_JOB_JOB) violated
    Failing sql is:
    BEGIN SYS.DBMS_IJOB.SUBMIT( JOB=>1, LUSER=>'SYSMAN', PUSER=>'SYSMAN', CUSER=>'SYSMAN', NEXT_DATE=>TO_DATE('2011-01-21 14:36:07', 'YYYY-MM-DD:HH24:MI:SS'), INTERVAL=>'sysdate + 1 / (24 * 60)', BROKEN=>FALSE, WHAT=>'EMD_MAINTENANCE.EXECUTE_EM_DBMS_JOB_PROCS();', NLSENV=>'NLS_LANGUAGE=''AMERICAN'' NLS_TERRITORY=''AMERICA'' NLS_CURRENCY=''$'' NLS_ISO_CURRENCY=''AMERICA'' NLS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 11888 error(s) at 16:51:41

    Albert - I have done DP exports from 10 to 11 where 11 is a fresh database installation... it's not hard. You only have to make sure that the tablespaces of your source exist in the new database, with the same names and sizes. It will take time to build if you have many indexes.
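    For example, a minimal sketch of pre-creating a matching tablespace on the target before the import (the name, path, and size are placeholders):

        -- Create the tablespace with the same name the source objects expect
        CREATE TABLESPACE app_data
          DATAFILE '/u01/oradata/NEWDB/app_data01.dbf' SIZE 10G
          AUTOEXTEND ON NEXT 1G;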

  • can we use an original export file for an import using datapump in 10g?

    Hi all

    I'm trying to find the answer to a question. Let's say both my DBs are on 10g, and on database A I used the original exp utility to export the database/schema/table... Can I use that same export file (created by the original exp utility) on database B and import it using the impdp utility, or can't I?

    Hello

    No, not possible.
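    As a reminder of the pairing (the file and connection names are placeholders): a dump written by classic exp can only be read by classic imp, and a Data Pump dump only by impdp:

        # Classic pair: the file written by exp must be read by imp
        exp userid=scott/tiger file=scott.dmp log=scott_exp.log
        imp userid=scott/tiger@dbB file=scott.dmp log=scott_imp.log full=y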

  • database of apex moving to the instance of production using datapump

    Currently, the apex database resides on a Windows server, and we would like to move it to our production server (Unix). I don't want to create a new database for apex on the Unix server, but would rather import the apex tablespaces and users into the production instance. I tried exporting only the apex schemas and then importing them into a 'TEST' instance on the Unix side, but that doesn't seem to work, since I get errors stating that the apex users do not exist in the target database. Has anyone done this before who could explain how they did it?
    My apex database is at 10.2.0.1 and my production database at 10.2.0.3.

    user483999 wrote:
    I'm only moving the database; the apex applications will still reside on the current app server... After doing the import, I intend to change the DAD to point to the new database, and then test it...

    The statement above confuses the hell out of me.

    Apex lives in a database. It needs access to the objects in that database.
    Apex applications do not reside on an application server. They are in the database.
    Apex can't see objects in a different database.

    So if all you're moving is the database, while the application server still points to the old database, then no change you make in the new database will be visible in the apex application.

  • Why do we create a directory when using Datapump?

    Hi all
    Why do we create a directory when exporting and importing a table using Datapump?



    Thank you
    Rafi.

    That's how the Data Pump architecture is implemented. The directory object points to a location on the server side. There is no other way in Oracle to point to or manage an o/s path/directory than the directory object. Because you cannot use an absolute path in Data Pump, the only option left to you is to use the directory object, and in that directory (on the o/s) all the necessary files are created.
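    A minimal sketch (the path, directory name, and user are placeholders):

        -- Create a directory object pointing at a server-side path
        CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
        -- Let a user read and write dump/log files through it
        GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

    The client then refers to the object only by name:

        expdp scott/password directory=dp_dir tables=emp dumpfile=emp.dmp logfile=emp.log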

    HTH
    Aman...

  • moving a schema to a different database (datapump or?)

    Friends and Experts,

    DB: 11gR2

    OS: Linux

    (Sorry for the long post, but I didn't want to leave out any information)

    I am moving a schema from host-1 to host-2; the schema size is 400 GB.

    I had planned to use datapump, since that is the method I'm most comfortable with: export everything, including statistics.

    300 GB of that is one partitioned table (including indexes).

    170 GB of segments of type like '%TABLE%'.

    The schema has tables, partitioned tables, LOB segments, LOB indexes, indexes, and index partitions.

    The export was then killed by a snapshot too old error after writing about 250 GB of dump file.

    Host-1 has only 4 CPUs, so I can't really use more than 2 parallel processes in the export parameter file; I tried using 4 and the host load reached 10 within a few minutes. The export was killed on the spot to avoid taking the host down.

    Host-2 is faster, so I don't expect delays during the import.

    We have no license for GoldenGate, but Advanced Compression is licensed.

    Problem:

    I started the export, but it was so slow that it generated only 10 GB/hr. I let it run as a test and it failed after 14 hours with a snapshot too old error.

    Not many processes run on the host or against the schema; the main problem I see is that the host/disk is slow, and I don't see any way to move this huge schema to the other database. In the worst case I can lock the schema before the maintenance window, for 15 hours of export and another 10 hours or so of import, but I still don't think the export would finish.

    Questions:

    1. What can be done here to move the schema using datapump?

    2. Any other safe method to move the schema? I know this can be done with transportable tablespaces, but I have never done that, and this is a production schema, so I don't want to take the risk.

    3. Can anyone with a similar project share their experience?

    4. Any other advice/methods/suggestions?

    Export parameter file:

    DIRECTORY=DATA_PUMP_REFRESH
    DUMPFILE=EXP_01.DBF, EXP_02.DBF
    LOGFILE=EXP_USER1.LOG
    PARALLEL=2
    SCHEMAS=USER1
    CONTENT=ALL

    Added the parallel parameter and the segment sizes above.

    Set your undo retention to at least the duration of your longest-running query (and ensure that undo can actually grow to that much space).

    Your 'senior' is correct in saying "we will not do this for the first time in production", but one could say that about anything. If anything, transportable tablespace will probably be the fastest and easiest option.

    Yes, you will still need enough UNDO for a network-link data pump job, but if disk writes are the cause of the speed problem, the network-link approach *may* be faster, which also means you need less undo.
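    For illustration, a minimal network-link sketch (the link, directory, and TNS names are placeholders); no intermediate dump file is written, since rows flow straight from source to target:

        -- On the target database: create a link back to the source
        CREATE DATABASE LINK src_link CONNECT TO user1 IDENTIFIED BY password USING 'HOST1_TNS';

        impdp system directory=DATA_PUMP_REFRESH network_link=src_link schemas=USER1 logfile=imp_user1.log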

  • How to take a partial dump using EXP/IMP in Oracle, only for the master tables

    Hi all

    select * from v$version;
    
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
    PL/SQL Release 10.2.0.1.0 - Production
    "CORE    10.2.0.1.0    Production"
    TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    

    I have about 500 huge master data tables in my pre-production database. I have a test environment with the same structure and the old master data; this test environment already holds an old copy of the production master tables. I want to take a dump file from the pre-production environment with data from last week only. The old master table data is not needed, since that data is already available in my test environment. I also don't need to take all of the pre-production tables; only the master tables, with last week's data.

    How can I take partial data from the pre-production master tables? And how do I import only the new records into the test database?

    I use the EXP and IMP commands, but I don't see an option to take partial data. Please advise.

    Hello

    For the first part - just the master tables you want - use datapump with a query to extract only those rows - see the example below (you're on v10, so it is possible):

    Oracle DBA Blog 2.0: expdp dynamic list of tables

    However, you would need to be able to get the list of master tables with a single select statement - is that possible?

    For the second part - are you able to write a query against each master table that shows you the changed rows? If you cannot write a query to do this, then you won't be able to use datapump to extract only the changed rows.

    Normally I would just extract all the master tables completely and refresh everything...
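    For illustration, a minimal parfile sketch using the QUERY clause (the table names, date column, and directory are placeholders; it assumes every master table carries a last-modified timestamp):

        DIRECTORY=DATA_PUMP_DIR
        DUMPFILE=masters_delta.dmp
        LOGFILE=masters_delta.log
        TABLES=MASTER_TAB1, MASTER_TAB2
        QUERY="WHERE last_modified >= SYSDATE - 7"

    Run it with expdp parfile=masters_delta.par; with no table qualifier, the same WHERE clause is applied to every listed table.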

    Cheers,

    Rich

  • downgrading from 11gR2 using dbua

    Hello

    is it still possible to downgrade from 11.2.0.4 to 10.2.0.5 after setting the compatible parameter to 11.2.0.4?

    There seem to be restrictions after that adjustment.

    You cannot downgrade using the downgrade scripts once you have raised the compatible parameter.

    Probably the easiest is to restore a backup... if you have one, or use datapump.
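    You can confirm the current setting before deciding (SQL*Plus):

        -- If this already shows 11.2.0.4, a script-based downgrade is off the table
        show parameter compatible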
