Data Pump - expdp / impdp utility question

Hi All,

As a basic Data Pump learning exercise, I am trying to export the scott schema and then load the dump file into a test_t schema in the same database.

1. Created the directory object as sysdba:
CREATE DIRECTORY dpump_dir1 AS 'C:\output_dir';

2. Created the directory on the operating system as C:\output_dir

3. Ran expdp:
expdp system/***@orcl schemas=scott directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp job_name=hr parallel=4

4. Created a schema test_t and granted DBA to it.

5. Ran impdp:
impdp system/***@orcl schemas=test_t directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp job_name=hr parallel=8

It fails here with ORA-39165: Schema TEST_T was not found. However, the test_t schema does exist.

So I don't know why it gives this error. It seems that the scott schema in the expdp dump file cannot be loaded into any other schema, only into a schema named scott... Is this correct? If yes, how can I load all objects from the scott schema into another schema, say test_t? It would help if you could show the respective expdp and impdp commands, please.

Thank you very much
KS

Hello

You must specify remap_schema when you import

impdp system/***@orcl directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp remap_schema=scott:test_t
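
Putting the two together, a minimal sketch of the full pair of commands (the log file names are illustrative; the other values are taken from your post):

expdp system/***@orcl schemas=scott directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp logfile=scott_exp.log job_name=hr parallel=4
impdp system/***@orcl directory=dpump_dir1 dumpfile=scott_orcl_nov5.dmp logfile=scott_imp.log remap_schema=scott:test_t parallel=4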

Cheers

Tags: Database

Similar Questions

  • Export a schema through Oracle Data Pump with Database Vault enabled

    Hello

    I installed and configured Database Vault on an Oracle 11gR2 (11.2.0.3) database to protect a specific schema (SCHEMA_NAME) via a realm. I followed this doc:
    http://www.Oracle.com/technetwork/database/security/TWP-databasevault-DBA-BestPractices-199882.PDF
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.

    I.e. I gave sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user ('system', 'SCHEMA_NAME');

    execute dvsys.dbms_macadm.authorize_datapump_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user ('system', 'SCHEMA_NAME');

    I also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition and Indextype, and sys and system were authorized as OWNER of this realm.

    However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line appears in the export log:

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081

    The export completes, but with these errors.

    Any help, pointers, suggestions, or anything at all would be very welcome at this stage.

    Thank you

    I have moved the thread to the 'Database - Security' forum. If the document does not help, pl open an SR with Support.

    HTH
    Srini

  • Export the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, but it is not possible to send a 10 GB dump on a CD/DVD to the vendor of our application system (who needs it to analyze a few problems we are having).

    When I checked online, a full export is available, but I am not able to understand how it works, as we have never used this Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use the parallel option of a full database export to split the files so they fit on DVDs? Is that possible?

    Please correct me if I am wrong and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server saves files in the directory object that you specify on the command line. So what you want to do is:

    1. From your operating system, find an existing directory or create a new one. In your case, C:\Dump is as good a place as any.

    2. Connect to sqlplus and create the directory object. Just use the path. I use Linux, so my directory looks like /scratch/xxx/yyy.
    If you use Windows, the path to your directory would look like C:\Dump.

    3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or PUBLIC, just like
    any other object.
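
    For example, a minimal sketch, assuming a directory C:\Dump and a user named scott (both placeholders); on the DVD question, the FILESIZE parameter together with a %U dump file template is one way to split a full export into DVD-sized pieces:

    CREATE DIRECTORY dump_dir AS 'C:\Dump';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;

    expdp system/*** full=y directory=dump_dir dumpfile=full_%U.dmp filesize=4GB logfile=full_exp.log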

    If this helps, or if it has answered your question, please mark the posts with the appropriate tag.

    Thank you

    Dean

  • ODI LKM Oracle for Oracle Data Pump question

    Hi all

    I have a weird problem with ODI.

    I am joining per_all_people_f and fnd_user to load w_user_ds using Oracle Data Integrator. The LKM used is LKM Oracle for Oracle Data Pump.

    When I run the interface, I get the error below:

    ODI-1227: Task USER_DATA_SET (load) fails on the source connection ORACLE EBS.

    Caused by: java.sql.SQLSyntaxErrorException: ORA-00923: FROM keyword not found where expected

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)

    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)

    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)

    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)

    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)

    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)

    at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1115)

    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1488)

    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3769)

    at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3954)

    at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1539)

    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    The generated code is

    create table X780021
    (
      C1_FIRST_NAME,
      C2_MID_NAME,
      C3_LAST_NAME,
      C4_FULL_NAME,
      C5_NAME_SUFFIX,
      C6_SEX_MF_CODE,
      C7_SEX_MF_NAME,
      C8_COUNTRY_NAME,
      C9_LOGIN,
      C10_CREATED_BY_ID,
      C11_CHANGED_BY_ID,
      C12_CREATED_ON_DT,
      C13_CHANGED_ON_DT,
      C14_AUX1_CHANGED_ON_DT,
      C15_SRC_EFF_TO_DT,
      C16_INTEGRATION_ID,
      C17_EFFECTIVE_START_DATE
    )
    ORGANIZATION EXTERNAL
    (
      TYPE oracle_datapump
      DEFAULT DIRECTORY dat_dir
      LOCATION ('X780021.exp')
    )
    PARALLEL
    AS SELECT
      ALL_PEOPLE_F.FIRST_NAME,
      ALL_PEOPLE_F.MIDDLE_NAMES,
      ALL_PEOPLE_F.LAST_NAME,
      ALL_PEOPLE_F.FULL_NAME,
      ALL_PEOPLE_F.SUFFIX,
      ALL_PEOPLE_F.SEX,
      ALL_PEOPLE_F.SEX,
      ALL_PEOPLE_F.NATIONALITY,
      USER.USER_NAME,
      ALL_PEOPLE_F.CREATED_BY,
      ALL_PEOPLE_F.LAST_UPDATED_BY,
      ALL_PEOPLE_F.CREATION_DATE,
      ALL_PEOPLE_F.LAST_UPDATE_DATE,
      ALL_PEOPLE_F.CREATION_DATE,
      ALL_PEOPLE_F.EFFECTIVE_END_DATE,
      USER.USER_ID,
      ALL_PEOPLE_F.EFFECTIVE_START_DATE
    from APPS.FND_USER USER, APPS.PER_ALL_PEOPLE_F ALL_PEOPLE_F
    where (1 = 1)
    and (ALL_PEOPLE_F.PERSON_ID = USER.EMPLOYEE_ID)

    I don't see what is the problem here.

    Can someone help me?

    Thank you and best regards,

    Krishna Prasad

    I found the problem; it is with the way ODI generates the alias for the FND_USER table. By default it produces USER as the alias, which is an Oracle keyword. We just needed to rename it to something else, and it worked.
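
    A minimal sketch of the corrected join, assuming the alias is renamed to FU (the alias name is just an illustration):

    from APPS.FND_USER FU, APPS.PER_ALL_PEOPLE_F ALL_PEOPLE_F
    where (1 = 1)
    and (ALL_PEOPLE_F.PERSON_ID = FU.EMPLOYEE_ID)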

  • a question about data pump

    Hello

    I'm running a full database export using data pump. (full = y)


    To ensure data integrity, I have locked certain schemas before starting the export.
    I want to make sure that these locked schemas will still be exported (and not skipped), right?

    Please help confirm.

    Thank you very much.

    db version: 10.2.0.3 in Linux 5

    Published by: 995137 on April 23, 2013 15:30

    Hello
    Whether a schema is locked or unlocked makes no difference to datapump - it extracts them anyway with a full export. The log file should list all the tables that are exported, so you should see them there.

    Kind regards
    Harry

  • Question about data pump and directory/sub-directories

    I have a process that makes these calls to Data Pump:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    It fails with the subdirectory "/test". If I remove it, as in these calls, it works:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    I even did a test with 'chmod 777'.

    No idea what I am doing wrong?

    I'm on 11.0.1.6

    This is the error stack I get:
    ORA-39002: invalid operation
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3292
    ORA-06512: at "ORA_ADMIN.DATAPUMP_UTIL", line 46
    ORA-06512: at line 1

    Chris wrote:
    I have a process that makes these calls to Data Pump:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    It fails with the subdirectory "/test". If I remove it, as in these calls, it works:
    dbms_datapump.add_file(dph, file_name || '.log', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file(dph, file_name || '.dmp', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    I even did a test with 'chmod 777'.

    No idea what I am doing wrong?

    invalid syntax

    DATA_PUMP_DIR

    The above is an Oracle DIRECTORY object; you cannot append a subdirectory path to it.
    If you want a different location, then CREATE a DIRECTORY, e.g. DATA_PUMP_DIR_TEST, that points to it.
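
    For example, a sketch assuming the test subdirectory lives under the default dump area and the calling schema is ORA_ADMIN (the OS path and the grantee are placeholders):

    CREATE DIRECTORY data_pump_dir_test AS '/u01/app/oracle/admin/orcl/dpdump/test';
    GRANT READ, WRITE ON DIRECTORY data_pump_dir_test TO ora_admin;

    Then pass 'DATA_PUMP_DIR_TEST' as the directory argument to dbms_datapump.add_file instead of 'DATA_PUMP_DIR/test'.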

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump exports and imports, perhaps something I should already know.

    I need to empty a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table and its indexes, constraints, etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. Truncate the table

    2. Disable or drop the indexes

    3. Leave the constraints in place?

    4. Use Data Pump to import the rows to keep.

    My question

    will my indexes and constraints be imported as well, given that I want to import only a subset of my exported table?

    or

    if I drop the table after truncating it, will I be able to import my table and indexes, even if I use the QUERY functionality as part of my import statement?

    Does my table need to exist in the database before doing the import when using the Data Pump QUERY functionality,

    or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants, statistics, etc.?

    Thank you for your comments.

    Concerning

    Your approach is inefficient.

    What you need to do is to

    create table foo as select * from bar where ...;  (the rows you want to keep)

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.
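
    Put together, a rough sketch of that sequence (the filter and index names are placeholders):

    create table foo as select * from bar where keep_flag = 'Y';   -- the quarter of the rows you want to keep
    truncate table bar;
    insert /*+ APPEND */ into bar select * from foo;
    commit;
    alter index bar_idx rebuild;   -- repeat for each index on bar
    drop table foo;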

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Differences between Data Pump and the legacy import and export utilities

    Hi all

    I work as a junior DBA in my organization, and I have seen that my organization still uses the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to change the existing practice to Oracle Data Pump; I have a meeting with them to present my points and convince them to adopt the Oracle Data Pump utility.

    My own convincing power is weak and I don't want to be caught out; I cannot work this out by myself and really need a list of strong differences versus import and export. It would be really appreciated if someone could give strong points for Oracle Data Pump over legacy import and export.

    Thank you

    Cabbage

    Hello

    As other people have already said, the main advantage of datapump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with parallel).

    It is also much more flexible - it will even create users from schema-level exports, which imp cannot do for you (and it was always very annoying that it could not).

    It is reusable

    It has a PL/SQL API

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch)

    There is even a 'legacy' mode at 11.2 where most of your old exp parameter files will still work with it - just change exp to expdp and imp to impdp.

    The main obstacle to the transition to datapump seems to be all the "what do you mean I have to create a directory for it to work" and "where is my dumpfile, why can't it be on my local machine". These are minor things, well worth getting past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with parallel and show them the runtimes so they can compare...
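
    For instance, a sketch of such a demo pair (connect strings, directory and file names are placeholders):

    exp system/***@bigdb full=y file=full_legacy.dmp log=full_legacy.log
    expdp system/***@bigdb full=y directory=dpump_dir dumpfile=full_dp_%U.dmp logfile=full_dp.log parallel=4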

    Cheers,

    Rich

  • Data Pump: export/import tables in different schemas

    Hi all

    I am using Oracle 11.2 and I would like to use Data Pump to export/import data in tables between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1', 'TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Script to import all the tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Script that errors when importing only some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export is good, but I get the following error when I try to import the table TB_TEST1 only: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I try to import all exported tables without the include clause.

    Is it possible to import only some tables, rather than all tables, from the export file?

    Thanks for the help!

    942572 wrote:

    Importing all the tables exported through scott works fine when I do NOT have the 'include' clause.

    I get the error only when I try to import some tables with the 'include' clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=y

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
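
    One way around operating-system quoting problems with INCLUDE is to put the filter in a parameter file, where no extra escaping is needed. A sketch, assuming a parameter file named test_imp.par (the file name is just an illustration; the other values come from your commands):

    directory=datapump_dir
    dumpfile=test.dump
    logfile=test_imp.log
    content=data_only
    remap_schema=source:target
    include=TABLE:"IN ('TB_TEST1')"

    impdp scott/tiger@db12 parfile=test_imp.par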

  • Data Pump import error

    Hi all

    I get the errors below when trying to use Data Pump to import a table from a dump file (taken from a pre-production database) into a different database (prod_db) of the same version. I used expdp to generate the dump file.

    Errors received:
    ORA-39083
    ORA-00959
    ORA-39112

    Any suggestions or advice would be appreciated.

    Thank you


    Import: Release 10.2.0.1.0 - 64 bit Production

    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64 bit Production
    With partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace "OXFORD_DATA_01" does not exist
    Failing sql is:
    CREATE TABLE "xxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE ...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_IDX1" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_PK" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33


    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Create the tablespace beforehand, or use the REMAP_TABLESPACE clause for the import.
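
    A sketch of the REMAP_TABLESPACE option, assuming you want the table to land in the USERS tablespace (substitute whichever tablespace actually exists in the target):

    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:USERS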

  • ORA-39139: Data Pump

    While trying to export certain schemas of mine, it ended up showing...
    'completed with 1 error(s)'

    Looking at the export log file that I found:
    ORA-39139: Data Pump does not support XMLSchema objects. TABLE_DATA:"BETUSR"."NSC_SERVICES" will be skipped.

    What does this mean?

    Before 11.x, expdp does not support XMLSchemas or columns based on an XMLSchema.
    To work around this, you can use the 'old' exp utility.
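
    For example, a sketch of a legacy export of just that table (the dump and log file names are placeholders):

    exp system/*** tables=BETUSR.NSC_SERVICES file=nsc_services.dmp log=nsc_services.log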

    Take a look at this note:
    How to move XML schema based xmltype data from 10g / 11g to another schema [ID 1318012.1]

  • Moving a database from one server to another via Data Pump - some queries

    Hello

    I am planning to move my database from one Windows server to another. Part of the requirement is also to upgrade this database from 10g to 11.2.0.3, so I am combining the 2 tasks by upgrading with the export/import method (via Data Pump).

    Regarding the export/import (which will be a full database Data Pump export), my plan is:

    create an empty 11.2.0.3 target database on the server (same number of control files, and redo logs etc.) ready for an import of the source database
    Q1. This will create the SYSTEM and UNDO tablespaces - I presume the datapump export doesn't include these tablespaces anyway?

    For export, I intend to simulate CONSISTENT = Y using FLASHBACK_TIME = SYSTIMESTAMP
    Q2. Which flashback features must be enabled on the source database to use this - e.g. should I have Flashback Database enabled on the source database (as opposed to other flashback features)?

    My target is a virtual server with a single virtual processor
    Q3. Is there any point in using PARALLEL in the import parameter file (normally I would set this to the number of processors - however in the case of this virtual server, there is actually only one virtual processor)?

    For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server

    Q4. If the import fails before completing, what is the best course of action? For example, do I just drop the data tablespaces and redo a complete import?

    Thank you
    Jim

    Jim,

    I'll take a shot at your questions:

    Create an empty 11.2.0.3 target database on the server (same number of control files, redo logs etc.) ready for an import of the source database
    Q1. This will create the SYSTEM and UNDO tablespaces - I presume the datapump export does not include these tablespaces anyway?

    The system tablespace is created when you create a database, but Data Pump will export it and try to import it. That will fail with 'tablespace exists'. I am pretty sure the undo tablespace will also be exported and imported. If they are already there, the import will just report that they already exist.

    For export, I intend to simulate CONSISTENT = Y using FLASHBACK_TIME = SYSTIMESTAMP
    Q2. Which flashback features must be enabled on the source database to use this - e.g. should I have Flashback Database enabled on the source database (as opposed to other flashback features)?

    I don't know for sure about this. I thought you just need enough undo, but I hope others will chime in.

    My target is a virtual server with a single virtual processor
    Q3. Is there any point in using PARALLEL in the import parameter file (normally I would set this to the number of processors - however in the case of this virtual server, there is actually only one virtual processor)?

    We usually recommend 2 times the number of processors, so parallel=2 should be OK.

    For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server

    Q4. If the import fails before completing, what is the best course of action? That is, do I just drop the data tablespaces and redo a full import?

    It depends on what the failure is. Most failures will not stop the job, but if one does, then most of these jobs can simply be restarted. To restart a job, you just need to know the name of the job, which is printed when you start the export/import, or which you set yourself in your Data Pump command. To restart, do:

    impdp user/password attach=job_name

    If you do not name the job, the job name will be something like

    User.sys_import_full_01
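
    Once attached, you get the interactive Data Pump prompt, and a stopped job can be resumed from there. A sketch, using the default job name shown above:

    impdp user/password attach=SYS_IMPORT_FULL_01
    Import> start_job
    Import> continue_client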

    Hope that helps - and good luck with your migration.

    Dean

  • Selective column export with Data Pump

    Dear Experts,

    I'm using Oracle 11g. With Data Pump, is it possible to choose only some columns of a table to export?

    Thanks in advance.

    spur230 wrote:
    Dear Experts,

    I'm using Oracle 11g. With Data Pump, is it possible to choose only some columns of a table to export?

    Thanks in advance.

    It is not possible with Data Pump to export only selected columns of a table. However, you can try the following (a sketch of these steps follows below):

    (1) create table export_selective as select c1, c2 from source_table; (you can create it in the source db)

    (2) expdp the export_selective table

    (3) impdp export_selective into the target database

    Alternatively:
    A database link can be used to create such a table with only the selected columns, using CTAS (create table as select) over the dblink.
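
    A sketch of the three steps above, with illustrative table, directory and file names:

    create table export_selective as select c1, c2 from source_table;
    expdp scott/tiger tables=export_selective directory=dpump_dir dumpfile=export_selective.dmp logfile=exp_sel.log
    impdp scott/tiger tables=export_selective directory=dpump_dir dumpfile=export_selective.dmp logfile=imp_sel.log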

  • 10g to 11gR2 upgrade using Data Pump Import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2. That's why I was going to combine the 2 in one move.

    1. Take a full Data Pump export of the source 10g database
    2. Create a new empty 11g database in the target environment
    3. Import the dump file into the target database

    However, I have a couple of queries in my mind about this approach:

    Q1. What happens with the SYS and SYSTEM objects and the SYSTEM and SYSAUX tablespaces when importing? Given that I have in fact already created a new dictionary in the empty target database - will importing SYS or SYSTEM simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE on SYS and SYSTEM (and is EXCLUDE better on the export or the import side)?

    Q3. What happens if there are things like scheduled jobs, etc. on the source system - since these would be stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Pl ensure that you do not run the expdp and impdp commands with SYSDBA privileges. See the first "Note" sections here:

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.Oracle.com/CD/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned, seeded schemas (for example SYS etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790

    HTH
    Srini

  • Data Pump and the Apex_Admin users and developers

    Hello

    I use Data Pump to back up the schema that I use. It works very well. Now I would like to use Data Pump to export and import the users listed in the Apex_Admin application under Manage Workspaces / Manage Developers and Users.
    Where are the users stored? How do I build the EXPDP / IMPDP statements?

    Thanks for your help.

    Hello

    APEX users are stored in the APEX_xxxxxx or FLOWS_xxxxxx schema, which is where all your application and workspace metadata lives. The schema name depends on your version of APEX.
    Maybe you should be using APEXExport. Check out this blog post of John's:
    http://Jes.blogs.shellprompt.NET/2006/12/12/backing-up-your-applications/

    Kind regards
    Jari

    http://dbswh.webhop.NET/dbswh/f?p=blog:Home:0
