Migration using Data Pump from Oracle 10g to Oracle 11g

Hi all

1)
At the moment I am using Oracle 11g. I have a plan to import data from Oracle 10g. I would like to know if it's possible to import data that has been exported by Data Pump on Oracle 10g?

Can I somehow convert the expdp output of Oracle 10g into Oracle 11g format?

2)
The next question is: if I use expdp to create a dump of the complete database, can I use the *.dmp file to import only selected users? Or can only the complete database be restored?

Yes, you can import a 10g dump into an 11g database - Data Pump dump files are upward compatible, so no conversion of the 10g export is needed.

Maybe you should take the time to read the section on Data Pump in the fine Oracle® [Database Utilities | http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#i1007324] Manual.
:p
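
For question 2: yes, a full dump can be imported selectively - impdp can pull individual schemas out of a full export with the SCHEMAS parameter. A minimal sketch, assuming a full dump file full10g.dmp in a directory object dump_dir (both names hypothetical):

    # Import only the SCOTT and HR schemas from a full database export
    impdp system DIRECTORY=dump_dir DUMPFILE=full10g.dmp SCHEMAS=scott,hr LOGFILE=imp_selected.log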

Tags: Database

Similar Questions

  • How can we export data into a csv file from oracle forms

    Hello

    How can we export data into a csv file from oracle forms?

    For example, I have a block called A. Whatever data is displayed in the block, when I click on a button, the data displayed in the block must be exported to the csv file.

    My application is running on the unix operating system.

    Please help on this.

    First of all, what is your Forms version (for example 11.1.2.2.0, not just 11g)?  Next, who will use the .csv file?  If it is a user on their client computer - TEXT_IO, CLIENT_TEXT_IO or WebUtil are the standard packages used to export data to a file from Oracle Forms.

    How much data will be exported?  If you export only a couple of hundred rows, exporting from Forms will be OK.  If you export more rows than that (300+ rows) then the export will be extremely slow for your users.  Keep in mind that Forms is not designed to perform data exports - there are better tools available for this...

    Craig...
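
    As an illustration of the TEXT_IO approach Craig mentions, here is a minimal WHEN-BUTTON-PRESSED sketch. It assumes a block named A with two hypothetical items COL1 and COL2; with WebUtil you would swap TEXT_IO for CLIENT_TEXT_IO to write the file on the client machine:

    DECLARE
      out_file TEXT_IO.FILE_TYPE;
    BEGIN
      -- Open the target file for writing (the path is hypothetical)
      out_file := TEXT_IO.FOPEN('/tmp/block_a.csv', 'w');
      GO_BLOCK('A');
      FIRST_RECORD;
      LOOP
        -- Write one CSV line per record in the block
        TEXT_IO.PUT_LINE(out_file, :A.COL1 || ',' || :A.COL2);
        EXIT WHEN :SYSTEM.LAST_RECORD = 'TRUE';
        NEXT_RECORD;
      END LOOP;
      TEXT_IO.FCLOSE(out_file);
    END;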

  • How to exclude statistics using Data Pump API?

    How do I exclude all statistics when exporting data using the Oracle Data Pump API (the DBMS_DATAPUMP package)?

    You would call the metadata filter API as follows:

    dbms_datapump.metadata_filter(
      handle => your_handle_here,
      name   => 'EXCLUDE_PATH_LIST',
      value  => '''STATISTICS''');

    I hope this helps.

    Dean
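
    For context, a minimal self-contained export job built around that filter; the schema, file and directory names are hypothetical:

    DECLARE
      h NUMBER;
    BEGIN
      -- Open a schema-mode export job
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott.dmp',
                             directory => 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                    value => 'IN (''SCOTT'')');
      -- Exclude all statistics from the export
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'EXCLUDE_PATH_LIST',
                                    value => '''STATISTICS''');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);
    END;
    /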

  • Export the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a CD/DVD to the vendor of our application system (to analyze a few problems we are having).

    Now, when I checked online, a full export is available, but I am not able to understand how it works, as we have never used the Data Pump utility; we use the normal export method. In addition, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use the utility's parallel full-DB export to split the files so they fit onto DVDs? Is that possible?

    Please correct me if I am wrong and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server saves the files in the directory object that you specify on the command line. So what you want to do is:

    1. From your operating system, find an existing directory or create a new one. In your case, C:\Dump is as good a place as any.

    2. Connect to sqlplus and create the directory object, using just the path. I use Linux, so my directory looks like /scratch/xxx/yyy.
    If you use Windows, the path to your directory would look like C:\Dump.

    3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or PUBLIC - just like any other object.

    If this helps, or if it answers your question, please mark the posts with the appropriate tag.

    Thank you

    Dean
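
    A minimal sketch of Dean's steps, plus the FILESIZE parameter to split the dump into DVD-sized pieces (user, directory and file names are hypothetical):

    -- As a DBA: create the directory object and grant access
    CREATE OR REPLACE DIRECTORY dump_dir AS 'C:\Dump';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;

    Then, from the operating system prompt:

    expdp system FULL=Y DIRECTORY=dump_dir DUMPFILE=full_%U.dmp FILESIZE=4G LOGFILE=full_exp.log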

  • Selecting tables on import using the Data Pump API

    Hello
    Sorry for the trivial question: I export data using the Data Pump API, with the mode "TABLE", so all the tables are exported into a .dmp file.

    So, my question is: how do I import only a few tables using the Data Pump API? How do I set something like the "TABLES" parameter of the command-line interface? Can I use the DATA_FILTER procedures? If so, how?

    Many thanks in advance

    Kind regards

    Kahlil

    Hello

    You should use the metadata_filter procedure for this, for example:

    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')'
                );

    Regards
    Anurag
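
    A fuller sketch of a table-mode import job using this filter (the file and directory names are hypothetical):

    DECLARE
      h NUMBER;
    BEGIN
      -- Open a table-mode import job that reads an existing dump file
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'exp.dmp',
                             directory => 'DATA_PUMP_DIR');
      -- Restrict the job to the two tables of interest
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR',
                                    value => 'IN (''TABLE1'', ''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);
    END;
    /
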
  • Can we remove an RDF model using the Jena adapter for Oracle?

    Hello

    I use the Jena adapter for Oracle API from Java. I want to remove a model from the database whenever needed. Is it possible to delete the model from the database?

    GraphOracleSem gos = new GraphOracleSem(oracle, modelName);
    ModelOracleSem mos = new ModelOracleSem(gos);
    mos.removeAll();

    This code removes the model's data but does not remove the model itself. Can we delete the whole model? Any help?

    Thank you
    KKS

    Hello

    In the latest version of the Jena adapter, there is a dropSemanticModel API in the oracle.spatial.rdf.client.jena.OracleUtils class.
    It is a static method, and the parameters are straightforward.

    Cheers,

    Zhe Wu
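
    A sketch of the call, reusing the oracle connection and modelName from the snippet above. The parameter list here is an assumption, not the documented signature - check the OracleUtils Javadoc:

    // Hypothetical invocation; verify the exact signature in the Javadoc
    OracleUtils.dropSemanticModel(oracle, modelName);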

  • Using Data Pump for migration

    Hi all

    Database version - 11.2.0.3

    RHEL 6

    Size of the DB - 150 GB

    I have to run a database migration from one server to another (AIX to Linux). We will use the Data Pump option and migrate from source to target using the expdp SCHEMAS option (5 schemas will be exported and imported on the target machine). But the target won't go live straight away: after the migration, the development team will do some work on the target machine, which will take 2 days to complete, and during those 2 days the source database will keep running as production.

    Now, I have a requirement that, after the development team completes its work, I have to carry the 2 days of changes from source over to target, after which target will act as production.

    I want to know what options are available in Data Pump that I can use to do this.

    Kind regards

    No business will go live on something whose data is no longer representative of the live system.

    Sounds like a normal upgrade, but you just test it on a copy of live - make sure the process works & then, once you are comfortable with it, replay it against your latest set of up-to-date production data.

    Dean's suggestion is fine, but rather than dropping pieces and importing, personally I tend to keep things simple and do it all in one go (a full schema datapump if possible). That way you know what you have brought over, inclusive of all sequences and objects (sequences could have been incremented in the meantime, so you must drop and re-create them). Otherwise you are splitting an upgrade into stages: more steps to trace & more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.

    Also - do you know what changes have been made to upgrade the new environment... can you roll those back etc.? Worth looking at. Most such migrations would be done as a database copy via RMAN / endian conversion with transportable tablespaces, as you must also make sure that you bring across all the schemas' system grants, not only what exists at schema level.
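
    For the full-schema route, a minimal parameter-file sketch (the schema names and timestamp are hypothetical); FLASHBACK_TIME keeps the export consistent to a single point in time while the source keeps running as production:

    SCHEMAS=app1,app2,app3,app4,app5
    DIRECTORY=dump_dir
    DUMPFILE=schemas_%U.dmp
    PARALLEL=2
    FLASHBACK_TIME="TO_TIMESTAMP('01-06-2014 12:00:00','DD-MM-YYYY HH24:MI:SS')"
    LOGFILE=schemas_exp.log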

  • 10g to 11gR2 upgrade using Data Pump Import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, so I was going to combine the two in one move.

    1. take a full Data Pump export of the 10g source database
    2. create a new, empty 11g database on the target environment
    3. import the dump file into the target database

    However, I have a couple of queries running through my mind about this approach -

    Q1. What happens with the SYS, SYSTEM and SYSAUX objects when importing? Given that I have in fact already created a new dictionary on the empty target database - will importing SYS or SYSTEM objects simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE on SYS and SYSTEM (and is EXCLUDE better on the export or the import side)?

    Q3. What happens if there are things like scheduled jobs, etc. on the source system? Since these are stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Please ensure that you do not use SYSDBA privileges to run the expdp and impdp commands. See the first "Note" sections here:

    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned there, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790

    HTH
    Srini
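
    A minimal sketch of steps 1 and 3 (directory and file names are hypothetical):

    # On the 10g source
    expdp system FULL=Y DIRECTORY=dump_dir DUMPFILE=full10g.dmp LOGFILE=full10g_exp.log

    # On the empty 11gR2 target, after copying the dump file across
    impdp system FULL=Y DIRECTORY=dump_dir DUMPFILE=full10g.dmp LOGFILE=full10g_imp.log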

  • Database upgrade using Data Pump

    Hello

    I'm moving my database from a Windows 2003 server to a Windows 2007 server. At the same time I am upgrading this database from 10g to 11gR2 (11.2.0.3),
    so I am using the export/import upgrade method (via Data Pump, not the old exp/imp).

    I have successfully exported the source database and created the empty shell database, ready to take the import. However, I have a couple of queries.

    Q1. With regard to all the SYSTEM objects in the source database: how will they import, given that the new target database already has a SYSTEM tablespace?
    I guess I need to use the TABLE_EXISTS_ACTION option on import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

    Q2. I intend to slightly modify the directory structure on the new database server - would it be preferable to pre-create the tablespaces, or to leave that to the import and use the REMAP_DATAFILE option? What is everyone's experience of the best way to go? Again, if I pre-create the tablespaces, how do I tell the import to skip creating them?

    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was just wondering what the downsides of this method are versus using an explicit export dump file?

    Thank you
    Jim

    Jim,

    Q1. With regard to all the SYSTEM objects in the source database: how will they import, given that the new target database already has a SYSTEM tablespace?
    I guess I need to use the TABLE_EXISTS_ACTION option on import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

    If all you have is the base database and nothing else created, then you can do a FULL=Y import. In fact, that is probably what you want. The SYSTEM tablespace will already be there, so when Data Pump attempts to create it, just that CREATE statement will fail; everything else carries on. In most cases your system tables will already be there too, and that's OK as well. If you import schema by schema instead, you will miss out on some of the other things.

    Q2. I intend to slightly modify the directory structure on the new database server - would it be preferable to pre-create the tablespaces, or to leave that to the import and use the REMAP_DATAFILE option? What is everyone's experience of the best way to go? Again, if I pre-create the tablespaces, how do I tell the import to skip creating them?

    If the directory structure is different (which it usually is), then there is an easy way. You can run impdp with SQLFILE and INCLUDE=TABLESPACE. This will give you all the CREATE TABLESPACE commands in a text file, which you can edit to change whatever you want. You can then tell Data Pump to skip tablespace creation using EXCLUDE=TABLESPACE - see the sketch at the end of this reply.

    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was just wondering what the downsides of this method are versus using an explicit export dump file?

    The only con might be a slow network. That will make the import slower, but since you would otherwise copy the dump file over the same network, you will see the same basic traffic anyway. The benefit is that you don't need the additional disk space. Here's how I look at it:

    1. you need XX GB for the source database
    2. you need YY GB for the source dump file
    3. you need YY GB for the target dump file that you copy over
    4. you need XX GB for the target database.

    With a network import, you get rid of the YY * 2 GB for the dump files.

    Dean
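
    A minimal sketch of both techniques Dean describes (file, directory and database link names are hypothetical):

    # Extract just the CREATE TABLESPACE DDL from the dump into an editable script
    impdp system DIRECTORY=dump_dir DUMPFILE=full10g.dmp SQLFILE=tablespaces.sql INCLUDE=TABLESPACE

    # Import straight over a database link, skipping tablespace creation
    impdp system NETWORK_LINK=src10g FULL=Y EXCLUDE=TABLESPACE LOGFILE=net_imp.log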

  • End of support date for Oracle BPM Enterprise standalone 10gR3

    I'm running "Oracle BPM Enterprise standalone 10.3.1.0.0" and I can't find an "end of support" date for the product. I found a Metalink note that pointed to a "BPM interoperability matrix", here:

    http://www.oracle.com/technetwork/middleware/bpm/documentation/obpm-config-matrix-085247.html#bpmes10301031

    which shows me the interoperability information, but no end-of-support date. Any ideas on where else I could look?

    Thank you

    You can find this if you search for "Oracle lifetime support policy" on Oracle's web site. At the top of the results page, click "Oracle Lifetime Support Policies" --> download "lifetime-support-middleware-069163.pdf" and go to page 31 of that document.

    Dan

  • Use cases for Oracle JET

    Dear team,

    I am getting my hands wet with Oracle JET and find it interesting. However, can you help me with the use cases where it can be used effectively? Can you give us some specific use cases where Oracle JET is the ideal solution (if possible, with justification for why it is the perfect fit)?

    Best regards

    Hari

    In one sentence, you could say that the JET use case is when you want a JavaScript/HTML5/REST architecture for your UI layer, and you want to have:

    Support for accessibility and internationalization of your user interface

    Ability to view the data in a variety of ways

    Ability to implement best practices of Oracle Alta UI

    For us at Oracle, this is true for many of our own products, for example.

  • Migration from SQL Server 2000 to Oracle 9i

    Hello

    I am migrating from SQL Server 2000 to Oracle 9i.

    When I capture Microsoft SQL Server, it gives an error:
    oracle.dbtools.metadata.persistence.PersistenceException: ORA-04098: trigger 'MIGRATION.MD_PROJECTS_TRG' is invalid and failed re-validation
    I try again and it starts, but it does not stop until I click cancel or close.

    What is wrong?

    Thank you

    Hello
    You have hit a known issue using a repository created in 9.2.
    To show that this is the case, do the following -

    SQL> alter trigger MD_PROJECTS_TRG compile;

    Warning: Trigger altered with compilation errors.

    SQL> show errors
    Errors for TRIGGER MD_PROJECTS_TRG:

    LINE/COL ERROR
    -------- -----------------------------------------------------------------
    3/9 PL/SQL: statement ignored
    3/20 PLS-00905: object MIGREP.MD_META is invalid

    Compiling md_meta -

    SQL> alter package md_meta compile;

    Warning: Package altered with compilation errors.

    SQL> show errors
    Errors for PACKAGE MD_META:

    LINE/COL ERROR
    -------- -----------------------------------------------------------------
    0/0 PLS-00908: The stored format of MD_META is not supported by this release

    21/4 PLS-00114: identifier 'PUTBAIFZKA3IHSJ5AC4ZXWYAWG41KN' too long
    21/4 PLS-00707: unsupported construct or internal error [2702]
    SQL>
    ==

    If you get this, then the only alternative is to create the SQL*Developer repository in a 10.2 database.

    Kind regards
    Mike

  • How to check my usage (data size) of Oracle XE?

    The maximum is 4G, right? How do I know how much is left, from sqlplus? Thank you very much!

    Also, if I use up the space, can I back up my DB and empty it for later use?

    Here are a few dictionary queries to find the space used and who owns the blocks...

    select a.tablespace_name ts, a.file_id,
    sum(b.bytes)/count(*) bytes,
    sum(b.bytes)/count(*) - sum(a.bytes) used,
    sum(a.bytes) free,
    nvl(100-(sum(nvl(a.bytes,0))/(sum(nvl(b.bytes,0))/count(*)))*100,0) pct_used
    from sys.dba_free_space a, sys.dba_data_files b
    where a.tablespace_name = b.tablespace_name and a.file_id = b.file_id
     and a.tablespace_name not in ('SYSTEM', 'SYSAUX','UNDOTBS')
    group by a.tablespace_name, a.file_id
    order by 1
    

    SYSTEM, SYSAUX and undo are not supposed to count towards the 4G. To display totals by schema...

    select OWNER, TABLESPACE_NAME, sum( BYTES ) from dba_segments
     where tablespace_name not in ('SYSTEM', 'SYSAUX','UNDOTBS')
    group by OWNER, TABLESPACE_NAME;
    

    If you are bumping into the 4G limit and you decide one of these owners can be trashed, that will certainly free up space.

    drop user [username] cascade;
    

    Check the export and datapump utilities docs: saving one user's data is a matter of doing an OWNER=username dump, or simply shut down the database and make a backup of the data files in some safe place.
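
    For instance, a minimal sketch of both variants (user, directory and file names are hypothetical):

    # Classic export of a single owner
    exp system FILE=scott.dmp OWNER=scott LOG=scott_exp.log

    # Data Pump equivalent
    expdp system DIRECTORY=dump_dir DUMPFILE=scott.dmp SCHEMAS=scott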

  • Using Data Pump to get a copy of all the CREATE TABLESPACE DDL in a database

    I looked through the documentation for a way to export this.

    I thought I could get the DDL using the parameter file below and then an impdp with SQLFILE=yyy.sql:

    cat expddl.sql

    DIRECTORY=dpdata1
    JOB_NAME=EXP_JOB
    CONTENT=METADATA_ONLY
    FULL=YES
    DUMPFILE=export_${ORACLE_SID}.dmp
    LOGFILE=export_${ORACLE_SID}.log


    I would have thought you could just do an INCLUDE on the tablespace, but it doesn't like that.


    10.2.0.3

    Hello..

    I have not tried to get the tablespace DDL with expdp; instead I use DBMS_METADATA. This is the query:

    set pagesize 0
    set long 90000
    select DBMS_METADATA.GET_DDL ('TABLESPACE',tablespace_name) from dba_tablespaces;
    

    HTH
    Anand
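
    Going back to the original plan: once the metadata-only dump exists, the tablespace DDL can be pulled out of it with SQLFILE. A sketch, assuming the dump was written for a hypothetical SID of ORCL:

    impdp system DIRECTORY=dpdata1 DUMPFILE=export_ORCL.dmp SQLFILE=yyy.sql INCLUDE=TABLESPACE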

  • OIM 9.1.0.2 - WebLogic JDBC Multi Data Sources for Oracle RAC

    Does OIM 9.1.0.2 BP07 support WebLogic JDBC Multi Data Sources (Services > JDBC > Multi Data Sources) for Oracle RAC, instead of inserting the "Oracle RAC JDBC URL" into the xlDS and xlXADS JDBC Data Sources (Services > JDBC > Data Sources > xlDS | xlXADS > Connection Pool > URL)?
    If so, are there any other changes that must be made in OIM, or do we simply change the data sources?

    Yes, it is supported. You install directly against a single instance of the RAC cluster. Then you update the config.xml file and the JDBC resources in your WebLogic server with the full RAC address. It is documented for installation against RAC: http://docs.oracle.com/cd/E14049_01/doc.9101/e14047/database.htm#insertedID2

    -Kevin
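
    For reference, the "full RAC address" mentioned above is a long-form thin JDBC URL along these lines (host and service names are hypothetical):

    jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS_LIST=(LOAD_BALANCE=ON)
      (ADDRESS=(PROTOCOL=TCP)(HOST=rac-node1)(PORT=1521))
      (ADDRESS=(PROTOCOL=TCP)(HOST=rac-node2)(PORT=1521)))
      (CONNECT_DATA=(SERVICE_NAME=oimdb)))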
