Full expdp and impdp: one db to another

Hello! Nice day!

I would like to ask for help with my problem.

I would like to create a full database export and import it into a different database. The two databases are on separate computers.
I am trying to use the expdp and impdp tools for this task. However, I ran into some problems during the import.

Here are the details of my problems:

When I try to impdp the dump file, it seems that no data or metadata gets imported for the user.

Here are the exact commands that I used for the export and import tasks:

Export (Server #1)

expdp user01/*** directory=ora3_dir full=y dumpfile=db_full%U.dmp filesize=2G parallel=4 logfile=db_full.log

Import (Server #2)

impdp user01/*** directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log sqlfile=db_full.sql estimate=blocks parallel=4

Here is the log that was generated during the impdp runs:

;;;
Import: Release 10.2.0.1.0 - 64bit Production on Friday, 27 November, 2009 17:41:07

Copyright (c) 2003, 2005, Oracle. All rights reserved.
;;;
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Master table "PGDS"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
Starting "PGDS"."SYS_SQL_FILE_FULL_01": PGDS/*** directory=ora3_dir dumpfile=ssmpdb_full%U.dmp full=y logfile=ssmpdb_full.log sqlfile=ssmpdb_full.sql
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
Processing object type DATABASE_EXPORT/SCHEMA/TABLESPACE_QUOTA
Processing object type DATABASE_EXPORT/RESOURCE_COST
Processing object type DATABASE_EXPORT/SCHEMA/DB_LINK
Processing object type DATABASE_EXPORT/TRUSTED_DB_LINK
Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/SEQUENCE
Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/DIRECTORY/DIRECTORY
Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/CROSS_SCHEMA/OBJECT_GRANT
Processing object type DATABASE_EXPORT/CONTEXT
Processing object type DATABASE_EXPORT/SCHEMA/PUBLIC_SYNONYM/SYNONYM
Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
Processing object type DATABASE_EXPORT/SCHEMA/TYPE/TYPE_SPEC
Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/PRE_TABLE_ACTION
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/CROSS_SCHEMA/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/COMMENT
Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/PACKAGE_SPEC
Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/FUNCTION
Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/PROCEDURE
Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/ALTER_FUNCTION
Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
Processing object type DATABASE_EXPORT/SCHEMA/VIEW/VIEW
Processing object type DATABASE_EXPORT/SCHEMA/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/VIEW/GRANT/CROSS_SCHEMA/OBJECT_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/VIEW/COMMENT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE_BODIES/PACKAGE/PACKAGE_BODY
Processing object type DATABASE_EXPORT/SCHEMA/TYPE/TYPE_BODY
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_TABLE_ACTION
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TRIGGER
Processing object type DATABASE_EXPORT/SCHEMA/VIEW/TRIGGER
Processing object type DATABASE_EXPORT/SCHEMA/JOB
Processing object type DATABASE_EXPORT/SCHEMA/DIMENSION
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCACT_INSTANCE
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
Job "PGDS"."SYS_SQL_FILE_FULL_01" successfully completed at 17:43:09

Thank you in advance.

The good news is that your dump file seems fine. It has metadata and data.

I looked through your impdp command and found your problem. You have added the sqlfile parameter. This tells Data Pump to create a file that can be run from SQL*Plus. It does not actually create objects. It also excludes data, because data could get pretty ugly in a sqlfile.

Here's your impdp command:

impdp user01/*** directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log sqlfile=db_full.sql...

Just remove the

sqlfile=db_full.sql

After you run your first job, you will have a file named db_full.sql that has all the create statements inside. Once you remove the sqlfile parameter, your import will work.
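For reference, the corrected import would then be simply (a sketch based on your original command, password masked):

impdp user01/*** directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log parallel=4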

Dean

Tags: Database

Similar Questions

  • need advice on expdp and impdp

    I have a database on a Windows server running Oracle 10g.
    I installed a new server with Oracle 11g on GNU/Linux.
    Now I would like to clone the data from the old server to the new server.
    I think expdp and impdp is the best option.

    If so, what is the maximum amount of data it can cope with?
    Is it better to do it schema by schema, or as a full database?

    Help me out, guys.

    If you have SQL Developer, there is an option on the Tools menu to copy the database. It will not be an easy one.
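
    If you stay with Data Pump instead, a minimal full clone would look something like the sketch below (assuming a directory object DPUMP_DIR already exists on each server and you copy the dump file between them; all names here are illustrative):

    # on the 10g source server
    expdp system/*** full=y directory=DPUMP_DIR dumpfile=clone_full.dmp logfile=clone_exp.log

    # copy clone_full.dmp to the 11g server's directory, then:
    impdp system/*** full=y directory=DPUMP_DIR dumpfile=clone_full.dmp logfile=clone_imp.log

    Importing a 10g dump into 11g works directly; going the other way you would need the VERSION parameter on the export.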

  • Help with EXPDP and IMPDP

    Hey guys
    I'm fairly new to using expdp and impdp and I was wondering if anyone knew a solution to my problem. I am currently using expdp in parallel mode, which creates several dump files. Now, when I use impdp, I use the following command:

    impdp stglsr/stglsr@tintdb tables=CTDA0FIL_CDC CONTENT=ALL directory=LSR_DQA_REPORTS dumpfile=CTDA0FIL_CDC_01.dmp,CTDA0FIL_CDC_02.dmp logfile=CTDA0FIL_CDC_IMP.log

    As you can see, I'm passing in each of the files to be imported. Since this will be run by someone else and I can't guarantee the export will always produce the same number of files, is it possible to just import all the files associated with a particular export, rather than specifying the files to be imported individually?


    -Below is an example of my export:

    expdp stglsr/stglsr@tintdb parfile=ctca0fil_export.par


    -Par file:

    TABLES = CTCA0FIL_CDC
    DUMPFILE=CTCA0FIL_CDC_%U.dmp
    CONTENT = ALL
    DIRECTORY = LSR_DQA_REPORTS
    LOGFILE = CTCA0FIL_CDC.log
    PARALLEL = 4
    QUERY = CTCA0FIL_CDC:"where ETL_STATUS = 'E'"
    JOB_NAME = CTCA0FIL_EXP_JOB

    If anyone can help I would be very grateful.

    Thank you
    Simon

    [email protected] wrote:
    Here's the import statement I'm now using:

    impdp stglsr/stglsr@tintdb tables=CTDA0FIL_CDC CONTENT=ALL directory=LSR_DQA_REPORTS dumpfile=CTDA0FIL_CDC_U%.dmp logfile=CTDA0FIL_CDC_IMP.log

    But when I start using it, it says:

    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-39124: dump file name 'CTDA0FIL_CDC_U%.dmp' contains invalid substitution variable

    Thank you
    Simon

    It's %U, not U%.
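
    With the substitution variable written correctly, the single dumpfile specification matches however many files the parallel export produced. Based on the commands above, that would be:

    impdp stglsr/stglsr@tintdb tables=CTDA0FIL_CDC CONTENT=ALL directory=LSR_DQA_REPORTS dumpfile=CTDA0FIL_CDC_%U.dmp logfile=CTDA0FIL_CDC_IMP.log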

  • Backup and recovery with EXPDP and IMPDP

    Hello

    I have a problem of beginner!

    My environment is 10.2.0.4 & 11.2.0.2.

    I wish
    to back up a small database with EXPDP (FULL=Y)
    and to recover it with IMPDP (FULL=Y). ( *I don't want to use RMAN in this case* )
    My questions:
    + What else do I have to back up (e.g. password file, control files, init.ora...) to be able to recover
      successfully with IMPDP?
    + What are the steps to recover with IMPDP?
    I tried looking for a howto on the Internet, but in vain so far!

    Can any expert answer me? or show me a place to get a good HOWTO?

    Thanks and greetings

    hqt200475

    So that's a little different from the initial, general issue.

    For this situation, yes: create a new instance, then import the schema containing the RMAN catalog data from your last export. The RMAN catalog is usually low enough volume that this import will have no problem (we will assume the export was not taken while RMAN backups were running).

    Since your environment already relies on RMAN - it is the RMAN catalog you are backing up - you should familiarize yourself with RMAN anyway. Setting up RMAN backup and restore for your RMAN catalog database sounds like the perfect learning exercise as you improve on your predecessor's procedures.

    Backup and Recovery Quick Start Guide for 10.2: http://download.oracle.com/docs/cd/B19306_01/backup.102/b14193/toc.htm

    Backup and Recovery Guide for 11.2: http://download.oracle.com/docs/cd/E11882_01/backup.112/e10642/toc.htm
    A section of that guide of direct interest (protecting the Recovery Catalog): http://download.oracle.com/docs/cd/E11882_01/backup.112/e10642/rcmcatdb.htm#CHDEBDJG

  • Question about expdp and impdp of a number of tables in a schema, 11.2.0.3

    Hello Experts-

    I have 145 tables in a certain schema that I want to save; is it possible to export and then import the same tables into another schema?

    1. Do I use REMAP_SCHEMA for this purpose?
    2. Must I list all 145 tables? Or can I use '%', since all the tables I want to export & import start with a certain word (RECORDS_)?

    Thanks in advance

    Hi 885068,

    I have 145 tables in a schema that I want to save; is it possible to export & import the same tables into another schema?

    Yes, it is possible.

    Do I use REMAP_SCHEMA for this purpose?

    Yes, that is the purpose of REMAP_SCHEMA: importing metadata and data from one schema into another.

    Must I list all 145 tables? Or can I use '%', since all the tables I want to export & import begin with a certain word (RECORDS_)?

    It's always better when you have total control over what you're doing. I prefer to list all the tables; of course that's not easy, but you can use a SQL statement to generate all the TABLES= parameters you need, like the following:

    SELECT 'TABLES='||table_name
    FROM all_tables
    WHERE owner = ''          -- the owning schema goes here
      and table_name like 'RECORDS_%'
    ;
    

    On import you need not specify the tables, because the dump contains only the tables you exported, but it is very advisable that you specify them anyway.
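
    Alternatively, an INCLUDE filter can match the common prefix without listing each table. A sketch, assuming the source schema is SRC_USER and using parameter-file quoting (shell escaping differs); file names are illustrative:

    SCHEMAS=SRC_USER
    INCLUDE=TABLE:"LIKE 'RECORDS_%'"
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=records.dmp
    LOGFILE=records_exp.log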

    And if the tables in the target schema must be created with names different from the originals, you must use the REMAP_TABLE parameter.

    HTH,

    Juan M

  • Not a DBA, trying a simple operation with expdp and impdp

    Hi all

    My Oracle database: 10g

    I'm not a DBA, but I'm stuck on what looks like a simple operation. Here is the context:

    I exported 2 tables from the HR schema using expdp; here is the command:

    C:\> expdp hr/hr@ORCL tables=EMPLOYEES,DEPARTMENTS directory=ORACLE_BKP dumpfile=hr_e_d.dmp logfile=hr_e_d.log

    So far, no problem. Now I try to use this dump file 'hr_e_d.dmp' to put these two tables into a different schema, pkg_utils, and here is the command that produces the error:

    C:\> impdp pkg_utils/pkg_utils@ORCL schemas=HR directory=ORACLE_BKP1 dumpfile=hr_e_d.dmp

    Import: Release 10.2.0.1.0 - Production on Sunday, September 8, 2013 14:51:04

    Copyright (c) 2003, 2005, Oracle.  All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options

    ORA-31655: no data or metadata objects selected for job
    ORA-39039: Schema expression "('HR')" contains no valid schemas.

    Master table "PKG_UTILS"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
    Starting "PKG_UTILS"."SYS_IMPORT_SCHEMA_01": pkg_utils/***@ORCL schemas=HR directory=ORACLE_BKP1 dumpfile=hr_e_d.dmp
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Job "PKG_UTILS"."SYS_IMPORT_SCHEMA_01" completed at 14:51:05

    PS: the requirement is only to get the EMPLOYEES and DEPARTMENTS tables of the HR schema into the pkg_utils schema

    Concerning

    Rahul

    C:\> impdp pkg_utils/pkg_utils@ORCL schemas=HR directory=ORACLE_BKP1 dumpfile=hr_e_d.dmp

    Instead of schemas=HR, try remap_schema=hr:pkg_utils
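
    Putting it together, the working import would look something like this (based on the commands above; the log file name is illustrative):

    C:\> impdp pkg_utils/pkg_utils@ORCL remap_schema=hr:pkg_utils directory=ORACLE_BKP1 dumpfile=hr_e_d.dmp logfile=hr_e_d_imp.log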

    See Data Pump Import

  • necessary rights for expdp and impdp

    Hello friends,

    I am using Oracle 10g 10.2.0.4 on Sun OS. I want to use Data Pump with the remap_schema option from A to B, possibly over a network_link.

    What grants and privileges do the database users need in schemas A and B?

    What type of OS user can perform impdp/expdp? Finally, are any special rights needed if I want to use OEM?

    Thanks for any comment and suggestion.

    Here is a good starting point:

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_overview.htm

    You need the EXP_FULL_DATABASE role to export other schemas, and IMP_FULL_DATABASE to import other schemas (including with REMAP_SCHEMA).

    Data Pump authenticates as the database user running it, so whether you run it from a client workstation, from OEM, or on the server (okay, for the server you will need to be part of the DBA group), Data Pump will use your Oracle user's permissions.

    Data Pump creates its files on the database server, so if you want to put those files in another directory, you must create an Oracle directory object.
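
    As a concrete sketch (assuming user A_USER exports and user B_USER runs the remapped import; the path is illustrative):

    -- run as a DBA
    GRANT EXP_FULL_DATABASE TO a_user;
    GRANT IMP_FULL_DATABASE TO b_user;
    CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/oracle/dpump';
    GRANT READ, WRITE ON DIRECTORY dp_dir TO a_user, b_user;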

  • expdp and impdp question

    Hi guys

    Actually, I have an export taken with the expdp utility from an Oracle 11g database.

    It is important to mention that the source database has an 8K db_block_size and the target a db_block_size of 16K, so I want to ask if there is any issue related to this.

    Thanks in advance.

    Concerning

    Hello

    Yes, the only data issues you may face while doing the impdp are due to character set conversion: in some tables you may encounter a replacement symbol wherever Oracle cannot perform the conversion.

    Thank you

    Himanshu

  • Expdp and Impdp

    Hi all...

    I am trying to use the Data Pump export/import utilities for the first time, on a test schema... then I will promote it to the application database.
    Before this I used the simple exp/imp commands...

    I have made a new schema 'TARGET_SCHEMA' and given it the read, write and execute grants on DATA_PUMP_DIR.
    I want to take a dump of my SOURCE_SCHEMA and import it into my TARGET_SCHEMA.


    My expdp command:
    expdp SOURCE_SCHEMA/password directory=DATA_PUMP_DIR dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_EXP.log
    (ran successfully and got the dump file)

    My impdp command:
    impdp TARGET_SCHEMA/password directory=DATA_PUMP_DIR dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_IMP.log
    (failed)

    I'm sure I've made some mistakes in my impdp command... maybe concerning the credentials... Please help...

    I have made a new schema 'TARGET_SCHEMA' and given it the read, write and execute grants on DATA_PUMP_DIR.
    I want to take a dump of my SOURCE_SCHEMA and import it into my TARGET_SCHEMA.

    My expdp command:
    expdp SOURCE_SCHEMA/password directory=DATA_PUMP_DIR dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_EXP.log
    (ran successfully and got the dump file)

    You can use it like this:

    expdp system/*** directory=DATA_PUMP_DIR schemas=SOURCE_SCHEMA dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_EXP.log

    My impdp command:
    impdp TARGET_SCHEMA/password directory=DATA_PUMP_DIR dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_IMP.log
    (failed)

    impdp system/*** directory=DATA_PUMP_DIR dumpfile=DUMP_NAME.dmp logfile=LOG_NAME_IMP.log remap_schema=SOURCE_SCHEMA:TARGET_SCHEMA

    Thank you

  • Using expdp and impdp for Streams data instantiation

    Hi all


    I need to configure table-level Streams for 5 tables between 2 databases (version 10.2.0.5). I need to export those tables from the source to the target and want to use expdp/impdp. To maintain instantiation with the classic utilities, exp 'object_consistent=y' and imp 'streams_instantiation=y' are specified. I would like to use expdp instead: what must be specified in place of 'object_consistent=y', and likewise, what is the replacement for 'streams_instantiation=y' in impdp?

    If anyone has an example of, or the steps for, a table-level Streams setup, that would be an excellent help to start with.

    Thanks in advance for your support and help.
    Shashi

    Once you have instantiated the tables on the source site, the data can be exported and imported on the target site.
    It is easy to instantiate the target from the source by following the table SCNs:

    set lines 190 pages 66
    
    -- Prepare the table for streams capture. Once this command is executed any DML against the table
    -- will be captured, regardless of the streams configuration state on the source and apply sides. If no capture process
    -- or propagation process is yet active, Streams will simply wait until everything is ready, mining logs starting
    -- from this SCN.
    
    execute DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(table_name => 'OWNER.TABLE');
    
    -- You can see on source DB the SCN
    
    SELECT
        TABLE_OWNER,TABLE_NAME,
        SCN,
        to_char(TIMESTAMP,'DD-MM-YYYY HH24:MI:SS') ti
    FROM
        dba_capture_prepared_tables
    ORDER BY table_owner;
    
    -- export the data here and configure the apply process.
    -- once the apply process is configured, before starting the apply process,
    -- run on  target DB to check the SCN status. Compare the following with query on source.
    
    COL objt HEADING 'Object| Type' FORMAT A9
    COL own_obj HEADING 'Object Name' FORMAT A45
    COLUMN lnk HEADING 'Using|Dblink' format A18
    COL SOURCE_DATABASE HEADING 'Source Database' FORMAT A38
    COL INSTANTIATION_SCN HEADING 'Instantiation| SCN' FORMAT 999999999999 justify c
    
    SELECT
          distinct source_database,
          source_object_owner||'.'||source_object_name own_obj,
          source_object_type objt,
          instantiation_scn,
          apply_database_link lnk
    FROM
         DBA_APPLY_INSTANTIATED_OBJECTS
    ORDER BY 1,2;
    
    -- if SCN are missing then
    
    execute DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
             source_object_name=> 'OWNER.TABLE',
             source_database_name => '' ,
             instantiation_scn => nn );   --  where nn is taken from the query above on capture site for the relevant table.
    
     
    
  • Full expdp, then trying to import a single schema

    Hi all


    I have a complete expdp backup & I want to import a single schema.

    When I try to import the single schema, it starts a full import.

    Export:

    expdp praful/praful directory=dump dumpfile=full.dmp logfile=full.log full=y

    Import:

    impdp praful/praful directory=dump dumpfile=full.dmp logfile=imp_praful.log remap_schema=PRAFUL:TEST


    Kindly give a suggestion for importing a single schema from a full dump.

    Kind regards
    Praful

    Hello

    Don't forget to include the SCHEMAS parameter - it restricts what gets imported, whereas remap_schema only supplies the remapping details so the data lands somewhere other than the schema you are currently connected as.
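
    In other words, something like this should pull just the one schema out of the full dump (based on the commands above):

    impdp praful/praful directory=dump dumpfile=full.dmp logfile=imp_praful.log schemas=PRAFUL remap_schema=PRAFUL:TEST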

    -Pavan Kumar N

  • Form fields: copying and pasting from one PDF to another

    Copying and pasting form fields from one PDF to another while in form-editing mode has stopped working. I have done it a hundred times, and then the second document suddenly stopped accepting my objects from the clipboard. What happened?

    I answered my own question by trial and error.

    I was working with Excel to create columns that the wizard would automatically detect. I then transferred these columns into a MASTER spreadsheet.

    I had unknowingly tried to copy/paste text fields with the same name as the associated checkbox. It doesn't give a warning pop-up or anything like that, so I couldn't understand why it didn't work. Hope this helps someone who has the same frustration I had.

    Thank you

  • How do I fail over the SCAN VIP and SCAN listener from one node to another?

    Environment:

    OS: HP - UX B.11.31 U ia64

    RDBMS: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production

    It is a 2-node RAC.

    Questions:

    1. How do I move the SCAN VIP and SCAN LISTENER running on node 1 to node 2?
    2. What is the relationship between the standard LISTENER and the SCAN LISTENER?
    3. Why do we need the LISTENER when we have the SCAN LISTENER?
    4. When I tried SRVCTL STOP LISTENER, I thought the SCAN LISTENER and SCAN IP would be switched over, but they weren't?
    5. Also, please clarify whether I should be using SRVCTL RELOCATE SCAN -i 1 -n Node1.
       Actually I am moving the SCAN listeners so that, while I apply patch 7 on node 1, no incoming connection attempt can spawn
       a process and thus hold open files in $ORACLE_HOME (which would prevent the patch from being applied)

    Please clarify my queries.

    Thanks, Sivaprasad.S

    Your questions:

    1. The failover occurs after a failure; you can't do it manually. Restart a node and you will see it happen.

    2. SCAN listeners redirect connection requests to the node listeners.

    3. They have different functions. SCAN listeners redirect; node listeners spawn and hand off sessions.

    4. srvctl stop listener has no effect on a SCAN listener or SCAN VIP.

    5. The instructions that come with the patches don't tell you to do that.

  • expdp and impdp between TWO different tables

    DB version: 10g and 11g

    I have two tables with the same structure, the same column names and widths; the only difference is the name:

    orders and order_history

    I have an export of orders for a date range and now want to import it into order_history

    How do I do it? If possible, can you please provide me with the syntax?


    Thank you

    REMAP_TABLE:

    http://docs.Oracle.com/CD/E11882_01/server.112/e22490/dp_import.htm#BABIGHCC
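
    For example, something like this (a sketch: the dump file name, directory and schema are illustrative, and order_history is assumed to already exist, so the rows are appended; REMAP_TABLE requires 11g):

    impdp scott/*** directory=DATA_PUMP_DIR dumpfile=orders.dmp remap_table=orders:order_history content=data_only table_exists_action=append logfile=orders_imp.log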

  • How to replicate all records from one table to another table without using expdp/impdp

    Hi. I have two databases; in one database I have a table called T1, and in the other database I have a second table, T2. My first table has records, and I need to transfer all the records to the second table, T2, without using expdp and impdp, every 5 minutes... What do I do?

    The best solution for this scenario is to use Oracle GoldenGate.

    However, it requires a license, and you must pay for it.

    If this is not possible, you can create a scheduler job that uses a DB link to replicate the records to the target database, but it will truncate the target table and then INSERT AS SELECT the data of the entire table every time the job runs (because you can't track only the records that have been changed or modified). See the sketch below.

    In addition, read here about replicating data using materialized views.
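
    A minimal sketch of that scheduler-job approach, assuming it runs on the target database, that a database link named SRC_DB points back at the source, and that the tables are named as in the question:

    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'REPLICATE_T1_TO_T2',
        job_type        => 'PLSQL_BLOCK',
        job_action      => q'[BEGIN
                                -- full refresh: empty the copy, then pull everything over the DB link
                                EXECUTE IMMEDIATE 'TRUNCATE TABLE t2';
                                INSERT INTO t2 SELECT * FROM t1@src_db;
                                COMMIT;
                              END;]',
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=5',
        enabled         => TRUE);
    END;
    /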
