Upgrade of database using Data Pump

Hello

I'm moving my database from a Windows 2003 server to a Windows 2007 server. At the same time I'm taking the opportunity to upgrade this database from 10g to 11gR2 (11.2.0.3).
So I'm using the export/import method of upgrade (via Data Pump, not the old exp/imp).

I have successfully exported the source database and created the empty shell database ready to take the import. However, I have a couple of queries.

Q1. With regard to all the SYSTEM objects in the source database - how will they import, given that the target (new) database already has a SYSTEM tablespace?
I guess I need to use the TABLE_EXISTS_ACTION option for the import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

Q2. I intend to slightly modify the directory structure on the new database server - would it be preferable to pre-create the tablespaces, or to leave that to the import but use the REMAP_DATAFILE option? What is everyone's experience as to the best way to go? Again, if I pre-create the tablespaces, how do I tell the import to skip creating the tablespaces?

Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was wondering how this stacks up against the method using an explicit export dump file?

Thank you
Jim

Jim,

Q1. With regard to all the SYSTEM objects in the source database - how will they import, given that the target (new) database already has a SYSTEM tablespace?
I guess I need to use the TABLE_EXISTS_ACTION option for the import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

If all you have is the basic database and nothing else created, then you can do the full=y import. In fact, that is probably what you want. The SYSTEM tablespace will already be there, so when Data Pump attempts to create it, the CREATE statement will simply fail; anything else that already exists will fail to create in the same way. In most cases your system tables will already be there, and that's OK too. If you do a schema-mode import instead, you will miss out on some of the other things.
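
As a rough illustration of that full import (not from the reply - the credentials, directory and file names are placeholders), the command might look like this; TABLE_EXISTS_ACTION=SKIP is actually the default and is shown only for clarity:

    impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full_10g.dmp LOGFILE=imp_full.log FULL=y TABLE_EXISTS_ACTION=SKIP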

Q2. I intend to slightly modify the directory structure on the new database server - would it be preferable to pre-create the tablespaces, or to leave that to the import but use the REMAP_DATAFILE option? What is everyone's experience as to the best way to go? Again, if I pre-create the tablespaces, how do I tell the import to skip creating the tablespaces?

If the directory structure is different (which it usually is), then it is easier this way: you can run impdp with SQLFILE and INCLUDE=TABLESPACE. This will write all of the CREATE TABLESPACE statements to a text file, and you can edit that file to change whatever you want to change. You can then tell Data Pump to skip the creation of the tablespaces by using EXCLUDE=TABLESPACE.
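
A sketch of that two-step approach (file and directory names are placeholders, not from the reply):

    # 1. generate the tablespace DDL into a script - nothing is imported at this step
    impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full_10g.dmp SQLFILE=tablespaces.sql INCLUDE=TABLESPACE
    # 2. edit tablespaces.sql for the new paths, run it in SQL*Plus, then import without tablespaces
    impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full_10g.dmp LOGFILE=imp_full.log FULL=y EXCLUDE=TABLESPACE

The alternative the question mentions - letting the import create the tablespaces but rewriting the paths with REMAP_DATAFILE="'D:\old\users01.dbf':'E:\new\users01.dbf'" - also works; editing the SQLFILE output just gives you more control over sizes and options.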

Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was just wondering how this stacks up against the method using an explicit export dump file?

The only con might be if you have a slow network. That will make the import slower, but since you would have to copy the dump file over that same network anyway, you would see the same basic traffic either way. The benefit is that you don't need the additional disk space. Here's how I look at it:

1. You need XX GB for the source database.
2. You need YY GB for the source dump file.
3. You need YY GB for the copy of the dump file on the target.
4. You need XX GB for the target database.

Going over the network gets rid of the YY * 2 GB needed for the dump files.
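
Purely as an illustration (not part of the reply), the network-mode import could look something like this, assuming a database link named src_10g has been created on the new 11.2 database pointing back at the source; the other names are placeholders:

    impdp system/password DIRECTORY=DATA_PUMP_DIR NETWORK_LINK=src_10g FULL=y LOGFILE=net_imp_full.log

No dump file is written in this mode; the directory object is only needed for the log file.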

Dean

Tags: Database

Similar Questions

  • Export the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a CD/DVD to the vendor of our application system (to analyze a few problems that we have).

    Now, when I checked online, a full export is available, but I am not able to understand how it works, as we have never used this Data Pump utility; we use the normal export method. In addition, will Data Pump reduce the size of the dump file so that it can fit on a DVD, or can we use the parallel full DB export to split the files so they fit on DVDs - is that possible?

    Please correct me if I am wrong and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server saves files in the directory object that you specify on the command line. So what you want to do is:

    1. From your operating system, find an existing directory or create a new one. In your case, C:/Dump is as good a place as any.

    2. Connect to SQL*Plus and create the directory object. Just use the path. I use Linux, so my directory looks like /scratch/xxx/yyy;
    if you use Windows, the path to your directory would look like C:/Dump.

    3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or to PUBLIC - just like
    any other object.
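
    A quick sketch of steps 2 and 3 (the directory path, object name and grantee are only examples, not from the reply):

        -- in SQL*Plus, as a DBA user
        CREATE DIRECTORY dump_dir AS 'C:/Dump';
        GRANT READ, WRITE ON DIRECTORY dump_dir TO system;

    Then, from the OS prompt, a full export; the FILESIZE parameter (an extra, not mentioned in the reply) splits the dump into DVD-sized pieces:

        expdp system/password DIRECTORY=dump_dir DUMPFILE=full_%U.dmp FILESIZE=4G FULL=y LOGFILE=full_exp.log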

    If this helps, or if it has answered your question, please mark the posts with the appropriate tag.

    Thank you

    Dean

  • Selecting tables when importing using the Data Pump API

    Hello
    Sorry for the trivial question, I export the data using Data Pump API, with the mode "TABLE".
    If all the tables will be exported in a .dmp file.

    So, my question is: how do I import only a few tables using the Data Pump API? How do I set the "TABLES" property, as with the command-line interface?
    Can I use the DATA_FILTER procedures? If so, how?

    Really thanks in advance

    Kind regards

    Kahlil

    Hello

    You should use the METADATA_FILTER procedure for that.
    For example:

    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')'
                );
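
    For context, a minimal sketch (not from the reply) of how such a filter sits inside a complete DBMS_DATAPUMP import job; the directory, dump file and table names are placeholders:

        declare
          h1 number;
          js varchar2(30);
        begin
          -- open a table-mode import job and point it at the dump file
          h1 := dbms_datapump.open(operation => 'IMPORT', job_mode => 'TABLE');
          dbms_datapump.add_file(handle    => h1,
                                 filename  => 'exp_tables.dmp',
                                 directory => 'DATA_PUMP_DIR');
          -- only bring in the two tables of interest
          dbms_datapump.metadata_filter(handle => h1,
                                        name   => 'NAME_EXPR',
                                        value  => 'IN (''TABLE1'', ''TABLE2'')');
          dbms_datapump.start_job(h1);
          dbms_datapump.wait_for_job(h1, js);
          dbms_datapump.detach(h1);
        end;
        /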
    
    Regards
    Anurag                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        
    
  • 10g to 11gR2 upgrade using Data Pump Import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2. That's why I was going to combine the two into one move:

    1. Take a full Data Pump export of the source 10g database
    2. Create a new empty 11g database in the target environment
    3. Import the dump file into the target database

    However, I have a couple of queries running through my mind about this approach:

    Q1. What happens with the SYS and SYSTEM objects and the SYSTEM and SYSAUX tablespaces when importing? Given that I have in fact already created a new dictionary in the empty target database, will importing SYS or SYSTEM objects simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE for SYS and SYSTEM (and is EXCLUDE better on the export or the import side)?

    Q3. What happens with things like scheduled jobs, etc. on the source system - since these are stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Pl ensure that you do not use SYSDBA privileges to run the expdp and impdp commands - see the first "Note" sections here:

    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790

    HTH
    Srini

  • How to exclude statistics using Data Pump API?

    How do I exclude all statistics when exporting data using the Oracle Data Pump API (DBMS_DATAPUMP package)?

    You would call the metadata filter API as follows:

    dbms_datapump.metadata_filter(
      handle => your_handle_here,
      name   => 'EXCLUDE_PATH_LIST',
      value  => 'STATISTICS');
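
    For comparison (not part of the reply), the command-line equivalent is the EXCLUDE parameter on expdp; names here are placeholders:

        expdp hr/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=hr.dmp EXCLUDE=STATISTICS LOGFILE=hr_exp.log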

    I hope this helps.

    Dean

  • Upgrade of database using Export/Import

    Hello
    I'm upgrading a 9i database to a 10g database.
    1> I created a 10g database, for ex. ABC10, and ran the catalog.sql and catproc.sql scripts successfully
    2> I took a full export of the 9i database, for ex. ABC, using the 9i exp utility
    3> I created the tablespaces and users in my new 10g database ABC10 exactly as they are in the 9i database

    Now I want to import into the ABC10 database.

    This is my imp command (using the 10g imp utility):

    imp file=xxx.dmp full=y log=zzz.log ignore=y feedback=10000

    Will this create problems or give any errors?

    Please let me know the correct steps

    Thank you

    Will this create problems or give any errors?

    You should be all set. It might try to import SYSTEM objects and give errors, which you can ignore.

    Check the "Clone a platform independent Db" note - 273140.1 - on Metalink.

    Published by: André on January 15, 2009 10:30

  • Question about upgrading a database using expdp/impdp

    Hi all.

    Now that we have the ability to use virtualized environments, I'll perform some tests on upgrading a production environment, currently on 10g (10.2.0.4 64-bit), to 11g Rel2 (11.2.0.4) on 64-bit Linux.

    Although my preferred method would be DBUA, I know you can use Oracle Data Pump for the same purpose. Before the test, I would like to clarify something:

    Will the Oracle Enterprise Manager repository be excluded from the export file? I mean, I would need a full export of the database while excluding the following schemas (those that come pre-created in the original database):

    USERNAME
    ------------------------------
    DBSNMP
    SYSMAN
    MGMT_VIEW
    SYSTEM
    FLOWS_FILES
    MDSYS
    ORDSYS
    EXFSYS
    SCOTT
    WMSYS
    ORACLE_OCM
    APPQOSSYS
    XS$NULL
    APEX_030200
    OWBSYS_AUDIT
    MDDATA
    ORDDATA
    CTXSYS
    ANONYMOUS
    OUTLN
    DIP
    APEX_PUBLIC_USER
    XDB

    So technically this won't be a full export, but that's what I really need. For example, I don't want the APEX or FLOWS_FILES schemas, nor DBSNMP and SYSMAN, etc.

    In short, will using this syntax solve my problem: EXCLUDE=SCHEMA:"IN ('USER1', 'USER2')"?

    Best regards, Luis.

    A full export does not, by default, export system schemas that contain Oracle-managed metadata and data. Examples of system schemas that are not exported by default include SYS, ORDSYS, and MDSYS.

    http://docs.oracle.com/database/121/SUTIL/GUID-BA07401C-6261-4B07-AD2C-06CD0A6E0BE9.htm#SUTIL851

    This means that you don't need to worry about excluding those system schemas.

    I think the OEM repository schema is exported, because it is just another schema.
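
    For the remaining schemas the poster does want to skip, a hedged sketch of the EXCLUDE syntax asked about (schema names are only examples); putting it in a parameter file avoids shell quoting problems. Contents of exp_full.par:

        DIRECTORY=DATA_PUMP_DIR
        DUMPFILE=full_excl.dmp
        LOGFILE=full_excl.log
        FULL=y
        EXCLUDE=SCHEMA:"IN ('SCOTT', 'APEX_030200')"

    Then run: expdp system/password PARFILE=exp_full.par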

  • 12c multitenant, clone database using Data Guard?

    Hello

    11.2.0.3

    We have a database that uses Data Guard (physical standby). Looking to the future, the cloning capability of 12c multitenant is really very nice.

    Has anyone already tried cloning from a physical standby database? Cloning the production database is one of the real benefits we see today with 12c, but cloning from the primary requires downtime.

    How do you guys clone your production in 12c? It would be nice if we could clone from the standby.

    Regards


    It is:

    create pluggable database standby_2 from standby;

    Pluggable database created.

    Take a look at the link in my previous post.

    Best regards

    mseberg

  • Flashback Database in a Data Guard environment

    11.2.0.3/RHEL 5.8

    I have come across several documents that talk about configuring FLASHBACK DATABASE in Data Guard environments. We have several
    physical standby DBs (single instance & RAC) running in our shop. I would like to know two or three major (common) uses of FLASHBACK DATABASE in Data Guard setups.

    I understand one of the uses mentioned in the URL below, i.e. recovery from a logical error.

    http://uhesse.com/2010/08/06/using-flashback-in-a-data-guard-environment/

    I would like to know what the other major/common uses of Flashback Database are in a Data Guard environment.

    A few other uses:

    (1) Flashback allows you to test your DR. You can activate your standby, test network/application connectivity and functionality at your recovery site, and when done flash the database back to being a physical standby. You must, however, ensure that this is allowed in your environment. In some places I have worked it would be a big no-no, because they have zero data loss requirements. However, some companies will allow it as long as the standby is brought back into place within a certain time.

    (2) In the case where you do a failover for some reason, but what had been the primary site later becomes available again, you can flash back what had been your primary to make it the standby, rather than re-instantiating the database from scratch.
    For example: you have a power failure at your main site, so you perform a failover and your standby becomes the primary. Once your main site is back online, you could convert your previous primary into a standby by doing a full backup/restore (or whatever method you choose) to recreate the standby from scratch. However, you also have the option of using flashback on that database and then converting it to a standby, which is potentially faster than re-instantiating the standby.
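
    A rough outline of that flashback route, as commonly documented for turning a failed primary back into a physical standby (treat it as a sketch, not the poster's exact procedure; the SCN comes from the new primary):

        -- on the new primary: note the SCN at which it became primary
        SELECT standby_became_primary_scn FROM v$database;

        -- on the old primary, once it is reachable again
        SHUTDOWN IMMEDIATE;
        STARTUP MOUNT;
        FLASHBACK DATABASE TO SCN <standby_became_primary_scn>;
        ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
        SHUTDOWN IMMEDIATE;
        STARTUP MOUNT;
        -- restart redo apply so it catches up with the new primary
        ALTER DATABASE RECOVER MANAGED STANDBY DATABASE USING CURRENT LOGFILE DISCONNECT;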

  • Migration using Data Pump from Oracle 10g to Oracle 11g

    Hi all

    1)
    At the moment I am using Oracle 11g. I plan to import data from Oracle 10g. I would like to know if it's possible to import data that has been exported by Data Pump on Oracle 10g.

    Can I somehow convert expdp output from Oracle 10g to Oracle 11g format?





    2)
    The next question: if I use expdp to create a dump of the complete database, can I use that *.dmp to import selected users, or can only the complete database be restored?

    Yes, you can import a 10g dump into an 11g database.

    Maybe you should take the time to read the section on Data Pump in the Oracle® [Database Utilities|http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#i1007324] manual.
    : p
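
    As for question 2 (not addressed in the reply): a full export dump can be used for a selective import, for example with the SCHEMAS parameter on impdp; all names here are placeholders:

        impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full10g.dmp SCHEMAS=hr,scott LOGFILE=imp_selected.log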

  • Physical move of databases using Data Guard

    Hi all

    We need to move our equipment to a new physical location.

    Our production database (11.2.0.4) is replicated to another machine using Data Guard (MaxAvailability mode).

    The plan was to move the DR machine first, on Monday, and then the production database during the following weekend.

    Do you have an idea how to 'pause' the replication while the move takes place, for both servers?

    We do not want to do a failover, because production does not need to be available over the weekend.

    Thanks in advance,

    Renaud

    Do you have an idea how to 'pause' the replication while the move takes place, for both servers?

    There is no such 'pause' method as such, but you can defer log shipping on the primary in the meantime.

    ALTER SYSTEM SET log_archive_dest_state_2 = defer;  # assuming log_archive_dest_2 on the primary is the destination that ships redo to the standby

    But you must make sure that the archived logs on the primary are not lost or deleted while the destination is deferred, since they are not yet available on the standby.

    Once you re-enable the deferred destination, the standby should automatically catch up with the primary.
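
    For completeness, a small sketch of the pair of commands (destination number 2 is just an example):

        -- before the move, on the primary
        ALTER SYSTEM SET log_archive_dest_state_2 = DEFER;
        -- after both servers are back up
        ALTER SYSTEM SET log_archive_dest_state_2 = ENABLE;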

    -Jonathan Rolland

  • How to update a database using a form in the workspace

    Hello

    I designed a form that has several fields. I created a table inside the 'adobe' schema (installed through turnkey). I want to create a process so that, after all the required fields are filled in and the user in the workspace clicks Complete or Submit on the form, the corresponding columns in the DB are updated.

    What services do I need? I know that I need the JDBC service, and I guess some XML utilities for this. Can anyone advise me on this?

    Thank you

    Sidonie.

    Hi Francine,

    I got it to work. Here is what was "wrong": the xml variable formData had the xsd schema as its active reference, and for some reason that wasn't working (I think the XPath expression was wrong, because "store data as xdp" was not checked). So, to be sure, I set your NewForm1 form as the active reference, so that "store data as xdp" is automatically checked:

    Then I reworked the SQL statement according to this new XPath (remember to check "use parameterized query"):

    And it works well for me!

    Hopefully it will be OK on your side.

    Kind regards

    Thomas

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports, something I should perhaps already know.

    I need to clear out a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table with its indexes and constraints etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is:

    1. Truncate the table

    2. Disable or drop the indexes

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows I want to keep.

    My question

    Will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    If I drop the table after truncating it, will I be able to import my table and indexes, even if I use the subquery (QUERY) functionality as part of my import statement?

    When using the Data Pump QUERY functionality, must my table already exist in the database before doing the import,

    or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants and statistics etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where ...;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • a question about data pump

    Hello

    I'm running a full database export using Data Pump (full=y).


    To ensure data integrity, I locked certain schemas before starting the export.
    I want to make sure that these locked schemas will still be exported (and not be skipped), right?

    Please help confirm.

    Thank you very much.

    DB version: 10.2.0.3 on Linux 5

    Published by: 995137 on April 23, 2013 15:30

    Hello
    Whether a schema is locked or unlocked makes no difference to Data Pump - it extracts them anyway in a full export. The log file should list all the tables that are exported, so you should see them there.

    Kind regards
    Harry

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I have TOAD 9.1 installed on Windows XP and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.


    I would like to know what the process is for this.

    How different is it from using command-line Data Pump? With TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the import/export process using TOAD with no direct access to the UNIX machine.

    Thanks in advance.

    user13517642 wrote:
    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I have TOAD 9.1 installed on Windows XP and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.

    I would like to know what the process is for this.

    How different is it from using command-line Data Pump? With TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the import/export process using TOAD with no direct access to the UNIX machine.

    Thanks in advance.

    I don't think you can do it with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below.

    For export:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves the data, and writes the data to a dump file on that connected system.

    For import:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means the impdp client initiates the import request, typically against the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data directly, and writes it into the target database. There is no dump file involved.
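
    A small sketch of the export variant (the link name, credentials, schema and file names are made up for illustration):

        -- on the database where the dump file should be written
        CREATE DATABASE LINK src_db CONNECT TO system IDENTIFIED BY password USING 'SRCDB';

    Then, from that machine's OS prompt:

        expdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=remote_scott.dmp LOGFILE=remote_scott.log NETWORK_LINK=src_db SCHEMAS=scott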

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My Oracle video tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/
