DataPump Import

Hi all

We are on Windows Server 2008 R2 with a 2-node RAC, v11.2.0.3. My client wants the data from the production database kept in sync with the staging database. I chose to use Data Pump export/import. The process I follow is below:

(1) export the production database XDB with FULL=Y

(2) disable all constraints in the staging XDB

(3) save all grants and privileges of the staging XDB (a sketch of steps (2) and (3) follows the import command)

(4) run the import with the following command:

impdp 'sys/XXXX@XDB as sysdba' CONTENT=all DIRECTORY=XDB_DATAPUMP DUMPFILE=exp-XDB.dmp LOGFILE=imp-XDB.log PARALLEL=4 CLUSTER=N ESTIMATE=STATISTICS TABLE_EXISTS_ACTION=REPLACE
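
A minimal SQL*Plus sketch of steps (2) and (3), assuming the application schemas are known; the schema list and spool file names are illustrative, not from the original post, and this variant disables only the referential constraints:

{code}
-- (2) generate ALTER TABLE ... DISABLE CONSTRAINT statements for all
-- enabled referential constraints in the application schemas
SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF LONG 100000
SPOOL disable_constraints.sql
SELECT 'ALTER TABLE "' || owner || '"."' || table_name ||
       '" DISABLE CONSTRAINT "' || constraint_name || '";'
FROM   dba_constraints
WHERE  constraint_type = 'R'
AND    status = 'ENABLED'
AND    owner IN ('APPUSER1', 'APPUSER2');  -- hypothetical schema list
SPOOL OFF
@disable_constraints.sql

-- (3) save the existing object grants as DDL with DBMS_METADATA
-- (GET_GRANTED_DDL raises an error for a user that holds no grants)
SPOOL saved_grants.sql
SELECT dbms_metadata.get_granted_ddl('OBJECT_GRANT', username)
FROM   dba_users
WHERE  username IN ('APPUSER1', 'APPUSER2');  -- hypothetical schema list
SPOOL OFF
{code}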

My question is

(1) do I have to analyze all the indexes after the import, even if I use ESTIMATE=STATISTICS?

(2) are there any main steps I need to do before or after the import?

Please give me some guidance. Help, please.

I would expect

RMAN > DUPLICATE DATABASE

to be faster
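
For comparison, a minimal RMAN active-duplication sketch (the connection strings and the auxiliary instance name XDBSTG are illustrative; assumes the 11g FROM ACTIVE DATABASE option and a prepared auxiliary instance):

{code}
rman TARGET sys/XXXX@XDB AUXILIARY sys/XXXX@XDBSTG

RMAN> DUPLICATE TARGET DATABASE TO XDBSTG
        FROM ACTIVE DATABASE
        NOFILENAMECHECK;
{code}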

Tags: Database

Similar Questions

  • Import with datapump when the datapump export was executed as user SYS

    Hi all

    all I have is a dumpfile and a datapump export log file. The export was executed as user SYS:

    Export: Release 11.2.0.1.0 - Production on Wed Dec 3 12:02:22 2014

    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    ;;;
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
    Starting "SYS"."SYS_EXPORT_FULL_01": "sys/***@database AS SYSDBA" directory=data_pump_dir dumpfile=db_full.dmp logfile=db_full.log full=y
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 52.84 GB

    Now I have to import (with datapump) users USER01 and USER02. But I don't know the names of the source database tablespaces.

    I want to keep the user names (USER01/USER02). That's why I created these users in the target database.

    The questions I have are:

    - should I start the datapump import as user SYS?

    - what settings should I use to import users USER01 and USER02 and their data into the target database? Since I do not know the names of the tablespaces in the source database, the REMAP_TABLESPACE parameter will not help.

    Any help will be appreciated

    J.

    Hi J.

    The questions I have are:

    -should I start the import by datapump as user SYS?

    No, you need to import with a user that has the "imp_full_database" role.

    - what settings should I use to import users USER01 and USER02 and the data into the target database? Since I don't know the names of the tablespaces in the source database, the REMAP_TABLESPACE parameter will not help.

    Well, one idea is to generate a schema import SQL file and see in the DDL which tablespaces it will try to create the objects in.

    impdp '/ as sysdba' directory=<directory> dumpfile=<dumpfile> schemas=USER01,USER02 sqlfile=<sqlfile>.sql
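
    For example, with placeholder names filled in (db_full.dmp and preview_ddl.sql are illustrative; data_pump_dir is the directory shown in the export log above):

    {code}
    impdp '/ as sysdba' directory=data_pump_dir dumpfile=db_full.dmp schemas=USER01,USER02 sqlfile=preview_ddl.sql
    {code}

    Then search preview_ddl.sql for the TABLESPACE clauses of the CREATE TABLE and CREATE INDEX statements; those are the source tablespace names you can feed to REMAP_TABLESPACE.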

    For more information, take a look at the import documentation

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_import.htm

    Hope this helps,

    Kind regards

    Anatoli A.

  • How to import with datapump into another schema and different tablespaces?

    Hi, I need to import with datapump into another schema, with the data in tablespace 'x' and the indexes in tablespace 'y'

    example:

    My schema: prueba

    I need to import the schema "prueba", but the data must be in the tablespace called "Tb1" and its indexes should be in the tablespace called "Tb2".

    So, how can I do?

    Any help will be appreciated.

    As noted above, do two imports: use EXCLUDE to limit each import and map the objects to the appropriate tablespace (you do need to know what tablespace they were in, however). Fleshed-out commands are sketched after the two outlines below.

    import 1 - exclude=index remap_tablespace=source:tb1

    import 2 - exclude=table remap_tablespace=source:tb2
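
    Fleshed out, the two passes might look like this (the directory, dumpfile, target schema and source tablespace names are illustrative placeholders; the source objects are assumed to live in a single tablespace called USERS):

    {code}
    impdp system/password directory=dpump_dir dumpfile=prueba.dmp schemas=prueba remap_schema=prueba:prueba2 exclude=index remap_tablespace=USERS:TB1

    impdp system/password directory=dpump_dir dumpfile=prueba.dmp schemas=prueba remap_schema=prueba:prueba2 exclude=table remap_tablespace=USERS:TB2
    {code}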

  • Sequence behavior after import via DataPump

    Hi friends,

    I'm running Oracle DB 11.2.0.3 on Windows 2008 R2 SP1 servers, and I am facing strange sequence behavior after importing a schema via Data Pump.

    The export is done in this way:

    EXPDP userid/password dumpfile= logfile= directory= reuse_dumpfiles=y (nothing unusual)

    The import is done this way:

    IMPDP userid/password dumpfile= logfile= directory= remap_tablespace=(old_one:new_one) remap_schema=(old_ones:new_ones, and so on...)

    The import works fine. There are no errors, and the sequences are imported without warnings.

    The strange behavior: it seems the sequences were "reset". When we call a sequence, NEXTVAL is lower than values already stored in the database, and we get a lot of ORA-00001 errors. The sequence should know its current value. I don't have this problem when using exp/imp, only with DataPump.

    So when we create an order which should receive the value 100, for example, because we already have 99 orders in the system, Oracle generates a value lower than 99 or even the value one (1).

    We then wrote a script to check the CURRVAL of the sequences on the source schema and recreate the sequences with the correct start value on the newly imported schema.

    Did anyone face this problem before?

    Any suggestions?

    TKS a lot

    Hello

    You should definitely make the export consistent - it is not the default in datapump (although in previous versions you might think it was, because of the misleading informational messages it used to write).

    You can either use flashback_time=systimestamp or flashback_scn=xxxxx (where you work out what SCN to use), or, as you are on 11.2, you can even use consistent=y, as Oracle reintroduced it to ease upgrades from exp for people.
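
    A sketch of a consistent schema export (the directory, dumpfile and schema names are placeholders), either with the flashback parameter or with the reintroduced legacy parameter:

    {code}
    expdp userid/password directory=dpump_dir dumpfile=schema.dmp logfile=schema.log schemas=myschema flashback_time=systimestamp

    expdp userid/password directory=dpump_dir dumpfile=schema.dmp logfile=schema.log schemas=myschema consistent=y
    {code}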

    That might solve the problem, but if the number is reset to 1 in some cases, it may be a different problem.

    Cheers,

    Harry

  • Import/Export DataPump for refreshing an ERP test from production

    Hi all

    We have Oracle Applications 11i running on AIX 5.3 ML8 in production and test. Production is standalone with a 10.2.0.4 database, and test is RAC with database version 10.2.0.4.

    Now the question: we intend to refresh Test from production every night. We plan to use Datapump Export/Import for that. I just wanted to know from you guys if anybody has had a bad experience using this utility with ERP.

    Thank you and best regards,
    Vijay

    Vijay,

    I am hesitant about using export/import because someone told me that with an ERP-type database we should not use Export/Import, although he was not able to specify the exact reason. Is there any problem using Import/Export for the ERP database refresh?

    Using import/export is supported with the Oracle Apps database (for full database exp/imp and certain schemas such as custom ones). For the APPS schema, I believe it is not supported due to object dependencies and integrity constraints. However, you can open an SR to confirm this with Oracle Support.

    Kind regards
    Hussein

  • Trying to import tables from a datapump file that makes use of transportable mode



    Hi, using impdp on oracle 11.2.0.3, I have a dumpfile that contains an export of tables that makes use of transportable tablespace mode.



    Tried to import just 3 of the tables concerned from the file into another database using the API, but it does not work.


    Error

    ORA-39002: invalid operation

    ORA-39061: import mode FULL conflicts with export mode TRANSPORTABLE


    {code}

    USERID=archive/MDbip25
    DIRECTORY=TERMSPRD_EXTRACTS
    DUMPFILE=archiveexppre.964.dmp
    LOGFILE=por_200813.log
    PARALLEL=16
    TABLES=ZPX_RTRN_CDN_STG_BAK,ZPX_RTRN_STG_BAK,ZPX_STRN_STG_BAK
    REMAP_TABLESPACE=BI_ARCHIVE_DATA:BI_ARCHIVE_LARGE_DATA
    REMAP_TABLESPACE=BI_ARCHIVE_IDX:BI_ARCHIVE_LARGE_IDX

    {code}

    Any ideas?

    A transportable export must be imported using a transportable import.  Using FULL=Y is not a valid option.  You might be able to pass in an INCLUDE expression for the tables that you want, but the job must still be transportable.  Your import command should look like:

    impdp user/password directory=dpump_dir dumpfile=your_dumpfile.dmp transport_datafiles=/path1/pathx/dir1/dir5/filename.dbf include=table:"IN ('a','b','c')"

    I see you are using the API, but this would be the equivalent command line for the import.

    I hope this helps.

    Dean

  • Could not import with datapump

    Hello

    Oracle Version: 10.2.0.1
    Linux version: RHEL4

    When I try to import the dump file, the import fails. The dump is on SAN storage on the virtual machine.
    [oracle@VM1-3 dbdump]$ impdp prod_91110/prod_91110 directory=dbdump dumpfile=prod_91102.dmp  remap_schema=prod_91102:prod_91110 remap_tablespace=nds_indx2:prod_91110,nds_data:prod_91110,ndsprod:prod_91110,nds_indx:prod_91110,prod_91102:prod_91110 parallel=3 job_name=prod_91110
    
    Import: Release 10.2.0.1.0 - Production on Tuesday, 10 November, 2009 23:17:21
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 475
    ORA-29283: invalid file operation
    Edited by: SIDDABATHUNI on November 10, 2009 04:25

    SIDDABATHUNI wrote:
    Hello

    Oracle Version: 10.2.0.1
    Linux version: RHEL4

    When I try to import the dump file, the import fails. The dump is on SAN storage on the virtual machine.

    [oracle@VM1-3 dbdump]$ impdp vsprod_91110/vsprod_91110 directory=dbdump dumpfile=vsprod_91102.dmp  remap_schema=vsprod_91102:vsprod_91110 remap_tablespace=qfundvs_indx2:vsprod_91110,qfundvs_data:vsprod_91110,qfundvsprod:vsprod_91110,qfundvs_indx:vsprod_91110,vsprod_91102:vsprod_91110 parallel=3 job_name=vsprod_91110
    
    Import: Release 10.2.0.1.0 - Production on Tuesday, 10 November, 2009 23:17:21
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 475
    ORA-29283: invalid file operation
    

    Have you tried searching for your error code on Google before posting here?

    http://www.Google.AZ/search?hl=ru&source=HP&q=ora-39070&LR=&AQ=f&OQ=

  • can we use an original export file to import using datapump in 10g?

    Hi all

    I'm trying to find an answer to a question. Let's say both my DBs are on 10g, and on database A I used the original exp utility to export a database/schema/table... can I use the same export file (created by the original exp utility) on database B and import it using the impdp utility, or can't I?

    Hello

    No, not possible.

  • Import/export DataPump object grants given by another user

    Hello

    I searched through the forum and the documentation but couldn't find an answer. I have the following case:

    (1) I have a user USER1 who initially has no grants on tables belonging to user SCOTT.
    (2) user SCOTT grants the SELECT right on his table EMP to USER1:
    GRANT SELECT ON emp TO user1;

    (3) I export the USER1 schema (schemas=USER1). In addition I also perform a full database export (full=y)
    (4) I drop the user USER1 and then import it back

    After the import, the USER1 user no longer has the SELECT right on SCOTT.EMP (no matter whether I imported from the schema-mode or the full-mode export dump file). Is it possible to import the user so that it has the exact same privileges it had at export time (including the ones given to it by other users)? I know that privileges granted on objects owned by SYS are not exported, but what about other non-system users?

    Thanks in advance for any answers.

    Kind regards
    Swear

    Swear,

    Grants are imported when the objects they belong to are imported, not when the schema the grants were given to is imported. So, given that scott made the grant on scott.emp to user1, this grant will be imported when scott.emp is imported. Grants also only get exported when the object is exported. Because scott.emp was not exported, this grant was not exported when only the USER1 schema was exported.

    I don't know of a single-step import that will get you what you want. If it's just the grants you are looking for, you can do it in 2 steps, and it has to be from a full export dumpfile. The export can be trimmed down a bit so it is a condensed dumpfile. Here's what you do:

    1. From the source database:
    a. do this if you want a condensed dumpfile:
    expdp system/manager directory=dpump_dir dumpfile=full_grant.dmp full=y include=grant
    b. do this if you want a complete dumpfile:
    expdp system/manager directory=dpump_dir dumpfile=full.dmp full=y

    2. From the source or the target database:
    a. if you just want the source grants, follow these steps:
    impdp system/manager directory=dpump_dir dumpfile=full_grant.dmp sqlfile=grant_only.sql
    b. if you have a complete dumpfile and only want to see the grants, then:
    impdp system/manager directory=dpump_dir dumpfile=full.dmp include=grant sqlfile=grant_only.sql
    c. if you have only a complete dumpfile and you want to see everything, then:
    impdp system/manager directory=dpump_dir dumpfile=full.dmp sqlfile=full.sql

    Now you can edit one of these .sql files to find the commands you want. Your best bet would be to look at grant_only.sql and search for "to user1". This will give you all the grants given to USER1. These can then be run from sqlplus.
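
    To verify the result after running the extracted grants, a quick check against DBA_TAB_PRIVS (a sketch; the grantee name is from this example):

    {code}
    SELECT grantor, owner, table_name, privilege
    FROM   dba_tab_privs
    WHERE  grantee = 'USER1';
    {code}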

    A shortcut, if you have database links configured, would be something like:
    impdp system/manager directory=dpump_dir network_link=link_from_target_to_source include=grant sqlfile=grant_only.sql
    Then, edit the file as shown above.

    I hope this helps.

    Dean

  • Parallel degree when importing with Datapump

    Hello

    For several imports that I ran, I observed a constant degree of parallelism = 10 when creating indexes.

    Moreover, I have parallel_degree_limit=CPU and parallel_degree_policy=MANUAL.

    The first parameter specifies that Oracle sets an upper limit on the DOP.

    Does this explain the observed value?

    Kind regards

    Hello

    Parallelism goes like this:

    all the tables and plsql definitions are created serially

    All table data is loaded in parallel - not a parallel insert into a single table, but a separate datapump worker process per table - so EMP would be loaded by slave1 and DEPT by slave2, at the same time. In your case 10 separate tables would all be imported at the same time.

    The indexes are built with the parallel degree you specify (a post-import check is sketched below)

    Constraints are done serially by one and the same process, I think
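
    If the concern is what degree the imported indexes ended up with, you can check and reset it after the import; a sketch, with an illustrative schema and index name:

    {code}
    SELECT owner, index_name, degree
    FROM   dba_indexes
    WHERE  owner = 'MYSCHEMA';

    ALTER INDEX myschema.my_index NOPARALLEL;
    {code}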

    I wrote a short blog post about this a while back that may be useful: it contains pictures! :-)

    Oracle DBA Blog 2.0: What happens during an import parallel datapump?

    Cheers,

    Rich

  • DataPump import from a mapped drive

    Aloha,

    I would like to ask about a problem with a datapump import. I am importing a datapump dump of significant size, and I had to create a mapped drive just for temporary storage. My problem is that the import fails when my dump (datapump) resides on a mapped drive. The errors are listed below.

    ORA-39002: invalid operation
    ORA-39070: unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS." UTL_FILE", line 536
    ORA-29283: invalid file operation

    Thanks for your time,

    Hades

    Hello
    When using data pump import, the oracle OS user must have read/write access to the file under the directory path; only then will it work as you expect.
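
    A minimal sketch of the usual fix (the path and grantee are illustrative): point the directory object at a local path the oracle OS user can read and write, and grant access on it:

    {code}
    CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
    GRANT READ, WRITE ON DIRECTORY dpump_dir TO import_user;
    {code}

    On Windows, a mapped drive letter is usually not visible to the service account the database runs under, so a UNC path or a local copy of the dump tends to be safer.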

  • DataPump export - how to import specified users?

    Hi gurus,

    I have a big problem. I exported all users of an Oracle instance using a datapump command, and right now I have a problem with importing into the new instance.

    Is it possible to import only specified users (not the full instance) from the datapump dump file?

    You can use the schemas & remap_schema options

    If you want to import with the same name:

    impdp directory=directory_name dumpfile=dumpfile_name.dmp logfile=logfile_name.log schemas=schema_name

    If you want to import with a different name:

    impdp directory=directory_name dumpfile=dumpfile_name.dmp logfile=logfile_name.log schemas=schema_name remap_schema=export_schema:import_schema

    You may also try importing into a different tablespace using remap_tablespace.

    Edited by: Riou on July 9, 2010 01:12

  • ERROR importing DATAPUMP

    Hello


    Oracle Release version: 10.2.0.3.0
    Server: Linux


    We get the following error when importing with datapump. We exported the dump on the server itself and then tried to import it on the same server, and we get the following error. Yesterday the datapump import from production also worked fine, but today we get the error.


    We tried the import on different servers too, but we face the same problem.



    ORA-39014: one or more workers have left prematurely.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31672: Worker process DW01 died unexpectedly.


    Please help me.

    Edited by: SIDDABATHUNI on June 1, 2009 21:46

    Hello

    When you attached to the job at the prompt, before running CONTINUE_CLIENT,
    can you issue the STATUS command and check which object it was importing during the crash?

    Then run a new job with the EXCLUDE parameter to exclude that one object, as sketched below.
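
    For example (the job, directory, dumpfile and object names are placeholders), attach to the job, check the current object, then restart excluding it:

    {code}
    impdp user/password attach=SYS_IMPORT_FULL_01

    Import> status
    Import> kill_job

    impdp user/password directory=dpump_dir dumpfile=prod.dmp logfile=retry.log exclude=table:"= 'BAD_TABLE'"
    {code}

    On most shells the quotes in the EXCLUDE clause need escaping, so putting the parameter in a parfile is usually easier.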

    Thank you.

  • Impdp - data not imported into just one tablespace

    Windows 32-bit, Oracle 11g (11.2.0.1.0)

    Hello

    I did an import with datapump. All the data from the tablespaces of my old DB is perfectly transferred to my new DB, except for one which stays empty.

    I used: expdp name/password@oldDB DUMPFILE=EXP_DB_FULL.DMP LOGFILE=EXP_DB_FULL.LOG

    Then I created on my new DB all the tablespaces that existed on the old DB.

    I ran the datapump import with: impdp system/password@newDB DIRECTORY=dpump_dir1 DUMPFILE=EXP_DB_FULL.DMP LOGFILE=EXP_SIVOA_FULL.LOG FULL=Y

    I have no error message, but when I look at the tablespaces in Enterprise Manager, one of them is not full. I created the tablespace with the same name and the same data file name.

    On my old DB this tablespace is 13 MB, and after the import on my new DB it is only 1.9 MB. The other tablespaces I created are perfectly complete; they have almost the same size as they had on the old DB.

    I do not understand why this tablespace is not filled. What could be the reason data is not imported into just this one tablespace, without any errors?

    I do not know if I have given you all the information on my problem, so tell me if I forgot something.

    Thanks for any information

    It simply means that the object is larger in the old DB because it had free space inside.

    When you imported, the free space inside the object was removed, so the tablespace usage is smaller because the object is smaller.

    All the data is still in the object, and you can count the rows to confirm.
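
    A quick way to confirm - compare row counts and actual segment sizes rather than datafile sizes (a sketch; the schema and tablespace names are illustrative):

    {code}
    SELECT COUNT(*) FROM myschema.my_table;

    SELECT segment_name, ROUND(bytes/1024/1024, 1) AS mb
    FROM   dba_segments
    WHERE  tablespace_name = 'MY_TBS'
    ORDER  BY bytes DESC;
    {code}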

  • Difference between TTS and TDB in 12c and 11g for importing a non-CDB into a PDB

    According to http://www.Oracle.com/technetwork/database/Enterprise-Edition/full-transportable-WP-12C-1973971.PDF , I can use full transportable export and import on 12c and 11.2.0.3+ to transport a non-CDB database into a PDB.

    - Is that a very similar concept to TTS in 11g? Application and user tablespaces must be read-only during the export, and source and target must have the same database character set and the same national character set. To import the tablespaces into an existing database, I need to copy the data files and import the metadata with datapump. Where I have to deal with a different endianness between source and target, I use RMAN on the source or the target to convert the data files. EXPDP in 11g uses TRANSPORT_TABLESPACES.

    On 12c and 11.2.0.3+, I can use Datapump EXPDP with FULL=Y TRANSPORTABLE=ALWAYS. As far as I understand it, what distinguishes this from the previous TTS is how EXPDP treats Oracle-supplied database objects. Is this correct?
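
    For reference, a full transportable export from an 11.2.0.3+ source looks roughly like this (the directory and dumpfile are placeholders; VERSION=12 is needed when the source is 11.2.0.3/11.2.0.4, per the white paper cited above):

    {code}
    expdp system/password directory=dpump_dir dumpfile=full_tts.dmp logfile=full_tts.log full=y transportable=always version=12
    {code}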

    According to the Oracle 12c New Features student guide, in order to move a non-CDB into a PDB, I should first perform, for example, a TTS or TDB export and then perform a TTS or TDB import into the PDB using the export dump file metadata and the tablespace data files.

    I'm confused. As far as I know, TDB requires RMAN CONVERT, which creates a special SQL script to recreate the database. Can I do TDB with Datapump?

    Trying to find some examples, I found http://Oracle-base.com/articles/12C/upgrading-to-12C.php#transport-database . It also mentions Transport Database, but actually seems to do TTS using datapump.

    Could someone please shed some light on the different concepts and what is what? I have already spent time going through the literature. Thank you!

    I came to the following conclusions, so far:

    Full Transportable Export/Import, available in 11.2.0.3 and later versions, and TTS (Transportable Tablespaces) can be used to transport a non-CDB database directly into a PDB or CDB, without first having to upgrade the source database to 12c. TTS, however, may require a separate export/import to transfer schema objects stored in the SYSTEM or SYSAUX tablespaces.

    TDB (Transportable Database) transports a complete database, is usable only to transfer a non-CDB to a CDB, and the source database must first be upgraded to 12c.
