Export the whole database (10 GB) using the Data Pump export utility

Hello

I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because we need to send the dump to the vendor of our application (so they can analyze a few problems we are having), and a 10 GB dump will not fit on a CD/DVD.

When I checked online I found that a full export is available, but I could not understand how it works, as we have never used the Data Pump utility; we use the traditional export method. Also, can Data Pump reduce the size of the dump file so it fits on a DVD, or can we use a parallel full database export to split the dump into multiple files that would each fit on a DVD? Is that possible?

Please correct me if I am wrong and kindly help.

Thank you for your help in advance.

Pravin,

The server saves files in the directory object that you specify on the command line. So what you want to do is:

1. From your operating system, find an existing directory or create a new one. In your case, C:\Dump is as good a place as any.

2. Connect with SQL*Plus and create the directory object, using that path. I use Linux, so my directory looks like /scratch/xxx/yyy.
If you use Windows, the path to your directory would look like C:\Dump.

3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or PUBLIC - just like
any other object.
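
Putting it together, a minimal sketch (the directory name dp_dir, the path C:\Dump, the user scott, and the 4 GB piece size are examples, not values from your system; the FILESIZE parameter makes expdp split the dump into multiple files, numbered via the %U template, so each piece fits on a DVD):

    -- In SQL*Plus, as a user with the CREATE ANY DIRECTORY privilege:
    CREATE DIRECTORY dp_dir AS 'C:\Dump';
    GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

    -- Then, from the operating system prompt:
    expdp system FULL=y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp FILESIZE=4GB LOGFILE=full_exp.log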

If this helps, or if it answers your question, please mark the messages with the appropriate tag.

Thank you

Dean

Tags: Database

Similar Questions

  • Selecting tables on import using the Data Pump API

    Hello
    Sorry for the trivial question. I export data using the Data Pump API in "TABLE" mode,
    so all of the tables are exported into a .dmp file.

    So, my question is: how do I import only a few tables using the Data Pump API? How do I set the equivalent of the "TABLES" parameter of the command-line interface?
    Can I use the DATA_FILTER procedures? If so, how?

    Many thanks in advance

    Kind regards

    Kahlil

    Hello

    You should use the METADATA_FILTER procedure for this.
    for example:

    dbms_datapump.metadata_filter
                (handle => handle1
                ,name   => 'NAME_EXPR'
                ,value  => 'IN (''TABLE1'', ''TABLE2'')'
                );
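
    For completeness, here is a minimal sketch of a whole table-mode import job through the API applying such a filter; the dump file name, directory object, and table names are examples only:

        DECLARE
          h     NUMBER;
          state VARCHAR2(30);
        BEGIN
          -- Open a table-mode import job
          h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'TABLE');
          -- Point the job at the dump file (names are examples)
          dbms_datapump.add_file(handle => h, filename => 'full.dmp',
                                 directory => 'DP_DIR');
          -- Restrict the job to the two tables of interest
          dbms_datapump.metadata_filter(handle => h, name => 'NAME_EXPR',
                                        value => 'IN (''TABLE1'', ''TABLE2'')');
          -- Run the job and wait for it to finish
          dbms_datapump.start_job(h);
          dbms_datapump.wait_for_job(h, state);
        END;
        /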
    
    Regards
    Anurag
    
  • How to exclude statistics using the Data Pump API?

    How do I exclude all statistics when exporting data using the Oracle Data Pump API (the DBMS_DATAPUMP package)?

    You would call the metadata filter API as follows:

    dbms_datapump.metadata_filter(
        handle => your_handle_here,
        name   => 'EXCLUDE_PATH_LIST',
        value  => '''STATISTICS''');  -- the list items are quoted strings, hence the doubled quotes
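
    For reference, the equivalent with the command-line client would be something like this (a sketch; the user, directory, and file names are examples):

        expdp scott DIRECTORY=dp_dir DUMPFILE=scott.dmp EXCLUDE=STATISTICS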

    I hope this helps.

    Dean

  • Upgrading a database using Data Pump

    Hello

    I'm moving my database from a Windows 2003 server to a Windows 2007 server. At the same time I am upgrading this database from 10g to 11gR2 (11.2.0.3),
    so I am using the export/import method of upgrade (via Data Pump, not the old exp/imp).

    I have successfully exported from the source database and created the empty shell database, ready to take the import. However, I have a couple of questions.

    Q1. Regarding all the objects in the source database's SYSTEM tablespace: how will they be imported, given that the target database already has a SYSTEM tablespace?
    I guess I need to use the TABLE_EXISTS_ACTION option for the import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

    Q2. I intend to slightly modify the directory structure on the new database server - would it be preferable to pre-create the tablespaces, or to leave that to the import but use the REMAP_DATAFILE option? What is everyone's experience of the best way to go? And if I do pre-create the tablespaces, how do I tell the import to skip creating them?

    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was wondering how this method stacks up against using an explicit export dump file?

    Thank you
    Jim

    Jim,

    Q1. Regarding all the objects in the source database's SYSTEM tablespace: how will they be imported, given that the target database already has a SYSTEM tablespace?
    I guess I need to use the TABLE_EXISTS_ACTION option for the import. However, should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best?

    If all you have is the base database with nothing else created, then you can do FULL=y - in fact, that is probably what you want. The SYSTEM tablespace will already be there, so when Data Pump attempts to create it, the CREATE statement will simply fail; that error can be ignored. In most cases your system tables will already be there too, and that's OK. If you do a schema-mode import instead, you will miss out on some of the other objects.

    Q2. I intend to slightly modify the directory structure on the new database server - would it be better to pre-create the tablespaces, or to leave that to the import but use the REMAP_DATAFILE option? What is everyone's experience of the best way to go? And if I do pre-create the tablespaces, how do I tell the import to skip creating them?

    If the directory structures are different (which they usually are), then here is an easy way. You can run impdp with the SQLFILE option and INCLUDE=TABLESPACE. This will write all of the CREATE TABLESPACE statements to a text file, which you can then edit to change whatever you want. You can then tell Data Pump to skip tablespace creation using EXCLUDE=TABLESPACE.
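
    A minimal sketch of that sequence (the directory, dump file, and SQL file names are examples):

        impdp system DIRECTORY=dp_dir DUMPFILE=full.dmp SQLFILE=tbs.sql INCLUDE=TABLESPACE

    Edit tbs.sql, run it to pre-create the tablespaces, then:

        impdp system FULL=y DIRECTORY=dp_dir DUMPFILE=full.dmp EXCLUDE=TABLESPACE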

    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, a copy of the dump file to the new server, and then an import, I could use a network link for the import. I was just wondering how this compares with using an explicit export dump file?

    The only con might be if you have a slow network. That will make it slower, but since you would have to copy the dump file over the same network anyway, you would see the same basic traffic either way. The benefit is that you don't need the additional disk space. Here's how I look at it.

    1. you need XX GB for the source database
    2. you need YY GB for the source dump file
    3. you need YY GB for the copy of the dump file on the target
    4. you need XX GB for the target database.

    Going over the network gets rid of the YY * 2 GB for the dump files.
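
    A minimal sketch of the network-mode import (the link, user, and file names are examples; the database link is created in the target database and points at the source):

        -- in the target database:
        CREATE DATABASE LINK source_db CONNECT TO system IDENTIFIED BY password USING 'SRC10G';

        -- then, from the OS prompt on the target server (no dump file is written;
        -- the DIRECTORY is only needed for the log file):
        impdp system FULL=y NETWORK_LINK=source_db DIRECTORY=dp_dir LOGFILE=net_imp.log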

    Dean

  • 10g to 11gR2 upgrade using Data Pump import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, which is why I was going to combine the two in one move.

    1. take a full Data Pump export of the source 10g database
    2. create a new, empty 11g database in the target environment
    3. import the dump file into the target database

    However, I have a couple of questions running through my mind about this approach.

    Q1. What happens with the SYS and SYSTEM objects, and the SYSTEM and SYSAUX tablespaces, when importing? Given that I have in fact already created a new data dictionary on the empty target database, will importing SYS or SYSTEM objects simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE for SYS and SYSTEM (and is it better to EXCLUDE on the export side or the import side)?

    Q3. What happens with things like scheduled jobs, etc. on the source system? Since these are stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Please ensure that you do not use SYSDBA privileges to run the expdp and impdp commands - see the first "Note" sections here:

    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790

    HTH
    Srini

  • Migration using Data Pump from Oracle 10g to Oracle 11g

    Hi all

    1)
    At the moment I am using Oracle 11g, and I plan to import data from Oracle 10g. I would like to know if it is possible to import data that was exported with Data Pump on Oracle 10g.

    Can I somehow convert expdp output from the Oracle 10g format to the Oracle 11g format?

    2)
    The next question: if I use expdp to create a dump of the complete database, can I use the *.dmp file to import only selected users? Or can only the complete database be restored?

    Yes, you can import a 10g dump into an 11g database.
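
    As for the second question: yes, a full dump can be used for a selective import. A minimal sketch (the schema, directory, and file names are examples):

        impdp system DIRECTORY=dp_dir DUMPFILE=full.dmp SCHEMAS=scott,hr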

    Maybe you should take the time to read the section on Data Pump in the Oracle® [Database Utilities|http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#i1007324] manual.
    : p

  • Do I still need full backups after setting up Data Guard?

    In our system there is a physical standby database in a Data Guard configuration. Is it still necessary to take full and incremental backups?

    Jackliusr wrote:
    In our system there is a physical standby database in a Data Guard configuration. Is it still necessary to take full and incremental backups?

    It is preferable to have a full backup every day.
    You may have to fail over to your standby at some point, in which case the primary database is no longer available. Are you sure the standby system is as stable as the primary and can give the same performance?
    The standby site may also be far away, intended only for disasters.
    Consider another case: suppose your standby falls 4-5 days behind the primary due to some problem, and at the same time your production database crashes - you then risk losing 4-5 days of data. So it is still recommended to take a full backup of the primary database.

    If you check the health of the standby database daily and can verify your data by opening it properly, and you decide against RMAN backups, that is your choice - but it is highly recommended to have RMAN backups.

    BTW, you can take the full RMAN backup from the standby, if you want to avoid using resources on the primary.
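
    A minimal sketch of taking the backup from the standby (the connect string is an example; this assumes the standby is mounted or open read-only and reachable by RMAN):

        rman target sys@standby_db
        RMAN> BACKUP DATABASE PLUS ARCHIVELOG;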

  • When I close Thunderbird, is it still running in the background and still using data?

    I'm with an internet provider that has a limited data allowance per month. If I close Thunderbird when I'm done using it, is it still running in the background and using data, or is it closed so that it does not retrieve messages or use data?

    When you close Thunderbird, it cannot check or retrieve new messages.
    Not sure what you mean by "still running in the background".
    When you close Thunderbird, it is closed. Period.

  • Specifying segments for EDI X12 using the data generator

    Hello

    I have created the .ecs file for EDI X12 and now want to create a data file that specifies only certain segments. How can I do this?

    Regards
    Priya.

    You will need to manually edit the file and save it with 'UTF-8' encoding.

  • Reinstalling the whole system using the restore CD

    Product: Toshiba Tecra A4 laptop.

    As the system files were corrupted, I tried to recover the entire system. After inserting the restore CD and copying the files, I removed the CD and restarted the computer. It gets to the screen where I have to give the computer a name, and there it freezes - no reaction at all. I have tried repeatedly, waiting 20 to 30 minutes each time. There is no response at all. How can I fix this problem?

    Hello

    From this distance it is not easy to say why this is happening; I really can't imagine what the reason could be. Is there some external device connected to your unit?

  • migration from 10g to 12c using Data Pump

    Hi - while I have used Data Pump at the schema level before, I am relatively new to full database imports.

    We are attempting a full database migration from 10.2.0.4 to 12c using the full database Data Pump method over a database link.

    The DBA has advised us to avoid moving SYSAUX and SYSTEM objects, but during the initial documentation review it appeared that these objects are not exported anyway, given TRANSPORTABLE=NEVER on the target system. Can anyone confirm this? Yet the import and export log refers to the objects I thought would be skipped:

    ...

    23-FEB-15 19:41:11.684: Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200: Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445: Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842: Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372: Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists

    ...

    Would appreciate any insight.

    The SYS, CTXSYS, MDSYS and ORDSYS schemas are not exported by exp/expdp.

    Doc ID: Note 228482.1

    I guess the 12c software has already been installed and, it seems, a database created - so when you import, you get these "already exists" errors.

    Whenever the software is installed and a database is created, the SYSTEM, SYS and SYSAUX structures are created by default.

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports - perhaps something I should already know.

    I need to trim a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table along with its indexes, constraints, etc.

    The table has no relationships to any other table; it consists of approximately 8 columns with NOT NULL constraints.

    My plan is:

    1. truncate the table

    2. disable or drop the indexes

    3. leave the constraints in place(?)

    4. use Data Pump to import only the rows I want to keep.

    My questions:

    will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    if I drop the table after truncating it, will I still be able to import my table and indexes, even if I use the subquery (QUERY) functionality as part of my import statement?

    When using the QUERY functionality in Data Pump, must my table already exist in the database before doing the import,

    or will Data Pump import handle it as usual, i.e. create the table, indexes, grants, statistics, etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where <condition selecting the rows to keep>;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole exercise with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Differences between Data Pump and the legacy import and export utilities

    Hi all

    I work as a junior DBA in my organization, and I have noticed that we still use the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to replace the existing process with Oracle Data Pump, and I have a meeting with them to present my points and convince them to adopt the Data Pump utility.

    My powers of persuasion are rather weak, though, and I don't want to be caught out; I really need a strong list of differences versus import and export. It would be much appreciated if someone could set out Data Pump's strong points over the legacy import and export utilities.

    Thank you

    Cabbage

    Hello

    As other people have already said, the main advantage of Data Pump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with PARALLEL).

    It is also much more flexible - it will even create users when importing a schema-level export, which imp could never do for you (and that was always very annoying).

    It is restartable

    It has a PL/SQL API

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch)

    There is even a 'legacy' mode in 11.2 where most of your old exp parameter files will still work - just change exp to expdp and imp to impdp.

    The main obstacle to the transition to Data Pump seems to be of the kind "what do you mean I have to create a directory object for it to work?" and "where is my dump file, why can't it be on my local machine?". These are minor things to get past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with PARALLEL, and show them the runtimes so they can compare...
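
    Something along these lines (a sketch; the user, directory, and file names are examples):

        exp system FULL=y FILE=full_legacy.dmp LOG=exp_legacy.log
        expdp system FULL=y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=expdp.log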

    Cheers,

    Rich

  • Data Pump and the users and developers in Apex_Admin

    Hello

    I use Data Pump to back up the schema I work in, and it works very well. Now I would like to use Data Pump to export and import the users listed in the Apex_Admin application under Manage Workspaces / Manage Developers and Users.
    Where are these users stored? How do I build the EXPDP/IMPDP statements?

    Thanks for your help.

    Hello

    The APEX users are stored in the APEX_xxxxxx or FLOWS_xxxxxx schema, which is where all your application and workspace metadata lives. The schema name depends on your APEX version.
    Maybe you should use APEXExport. Check out this blog post of John's:
    http://jes.blogs.shellprompt.net/2006/12/12/backing-up-your-applications/

    Kind regards
    Jari

    http://dbswh.webhop.net/dbswh/f?p=blog:Home:0

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I have installed TOAD 9.1 on Windows XP, and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.


    I would like to know the process for this.

    How different is it from using command-line Data Pump? When using TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the process of import/export using TOAD without direct access to the UNIX machine.

    Thanks in advance.

    user13517642 wrote:
    Basically, I would like to know the process of import/export using TOAD without direct access to the UNIX machine.

    I don't think you can do this with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below:

    For export:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves the data, and writes it to a dump file on the connected system.

    For import:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means that the impdp client initiates the import request, typically against the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data directly, and writes it into the target database. There is no dump file involved.
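
    A minimal sketch of such a network-mode import (the link, user, and table names are examples; the directory object is only needed for the log file):

        -- in the target database:
        CREATE DATABASE LINK src_link CONNECT TO scott IDENTIFIED BY password USING 'SRCDB';

        -- then from any machine with the Oracle client installed:
        impdp scott TABLES=emp,dept NETWORK_LINK=src_link DIRECTORY=dp_dir LOGFILE=net_imp.log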

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My Oracle video tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/
