Oracle 10g Export Data Pump

Hi, I wrote the following statement to export some objects in my schema. My question is: will the indexes and table data be exported only for the tables that I export, or for all tables in the schema? I want to export indexes and table data only for the exported tables. Also, if I want to import everything that has been exported, what should the statement be?

expdp schema/password DIRECTORY=data_pump_dir DUMPFILE=kenya.dmp SCHEMAS=schema
INCLUDE=TABLE:"LIKE 'TB_KEN_%'"
INCLUDE=INDEX
INCLUDE=TABLE_DATA
INCLUDE=VIEW
INCLUDE=PACKAGE_BODY
INCLUDE=PACKAGE_SPEC
INCLUDE=PROCEDURE

When I posted this response, I did not have access to a database. I have just tried it, and I got only the data and the indexes associated with the tables that were specified on the export command.

So YES, it will export only the data and indexes for the specified tables. It will not export all the table data and all the indexes for the whole schema.
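For the import half of the question, a minimal sketch (assuming the dump file above and a data_pump_dir directory object on the importing database; impdp loads whatever the dump contains, so no INCLUDE list is needed unless you want to restrict it further):

impdp system/password DIRECTORY=data_pump_dir DUMPFILE=kenya.dmp SCHEMAS=schema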

Dean

Published by: Dean WINS on August 25, 2009 07:46

Tags: Database

Similar Questions

  • Data Pump export of a scheduler job does not load

    I have a Data Pump dump file from a 10.2.0.2.0 database, and the import into 11.2.0.3.0 fails with this error:
    ORA-39083: Object type PROCOBJ failed to create with error:
    ORA-06502: PL/SQL: numeric or value error: character to number conversion error
    Failing sql is:
    {code}
    BEGIN 
    dbms_scheduler.create_job('"MY_JOB_NAME"',
    job_type=>'STORED_PROCEDURE', job_action=>
    'MY_SCHEMA.MY_PROCEDURE'
    , number_of_arguments=>0,
    start_date=>'31-JUL-12 02.05.13.782000 PM EUROPE/BERLIN', repeat_interval=> 
    'FREQ=WEEKLY;BYDAY=FRI;BYHOUR=7;BYMINUTE=0;BYSECOND=0'
    , end_date=>NULL,
    job_class=>'"DEFAULT_JOB_CL
    {code}
    
    I extracted the SQL code from the dump file, and it looks like this:
    {code}
    BEGIN 
    dbms_scheduler.create_job('"MY_JOB"',
    job_type=>'STORED_PROCEDURE', job_action=>
    'MY_SCHEMA.MY_PROCEDURE'
    , number_of_arguments=>0,
    start_date=>'31-JUL-12 02.05.13.782000 PM EUROPE/BERLIN', repeat_interval=> 
    'FREQ=WEEKLY;BYDAY=FRI;BYHOUR=7;BYMINUTE=0;BYSECOND=0'
    , end_date=>NULL,
    job_class=>'"DEFAULT_JOB_CLASS"', enabled=>FALSE, auto_drop=>FALSE,comments=>
    'bla bla comment'
    );
    dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH');
    dbms_scheduler.enable('"MY_JOB"');
    COMMIT; 
    END; 
    / 
    {code}
    
    
    
    After the job is defined, the second statement fails:
    
    {code}
    SQL> exec dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH');
    BEGIN dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH'); END;
    
    *
    ERROR at line 1:
    ORA-06502: PL/SQL: numeric or value error: character to number conversion error
    ORA-06512: at "SYS.DBMS_SCHEDULER", line 2851
    ORA-06512: at line 1
    {code}
    
    
    On the source database I see:
    
    {code}
    SQL> select logging_level from dba_scheduler_jobs where job_name = 'MY_JOB';
    
    LOGG
    ----
    FULL
    {code}
    
    In the docs I see these valid LOGGING_LEVELs:
    http://docs.oracle.com/cd/E14072_01/server.112/e10595/scheduse008.htm#CHDFDFAB
    
    DBMS_SCHEDULER.LOGGING_OFF
    DBMS_SCHEDULER.LOGGING_FAILED_RUNS
    DBMS_SCHEDULER.LOGGING_RUNS
    DBMS_SCHEDULER.LOGGING_FULL
    
    So please help me. I can't find anything useful on MOS. What is Data Pump exporting there that it cannot import again? Maybe I have overlooked a known bug?

    Finally I found the bug myself:

    MOS: Impdp of PROCOBJ objects fails with ORA-39083 and ORA-06502 [ID 780174.1]

    https://support.Oracle.com/epmos/faces/DocContentDisplay?ID=780174.1
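    For anyone hitting the same thing, one possible workaround sketch (my assumption, not taken from the MOS note): re-apply the attribute using the DBMS_SCHEDULER package constant instead of the string literal that Data Pump generated, which avoids the character-to-number conversion:

    {code}
    -- Hypothetical workaround: pass the PLS_INTEGER constant, not the string 'HIGH'
    BEGIN
      DBMS_SCHEDULER.SET_ATTRIBUTE(
        name      => '"MY_JOB"',
        attribute => 'logging_level',
        value     => DBMS_SCHEDULER.LOGGING_FULL);
      DBMS_SCHEDULER.ENABLE('"MY_JOB"');
    END;
    /
    {code}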

  • Kill an Oracle 10g Data Pump job

    I have a Data Pump import job that loops forever, probably because I started it by mistake as SYS instead of as the particular schema owner. It correctly creates objects in the right schema, because of the SCHEMAS setting, but it won't finish.
    How can I kill it? I can see it in select * from dba_datapump_jobs; but not in dba_jobs.
    Can I stop it without having to restart the database?

    Impdp comes with a help=y clause. When you issue it, you will see the ATTACH clause and the KILL_JOB command.
    The online documentation at http://tahiti.oracle.com for your unstated version provides more information.

    Reading the documentation yourself before asking a volunteer to digest it on your behalf is strongly recommended.
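    A sketch of that sequence (the owner and job name here are illustrative; take the real ones from dba_datapump_jobs):

    impdp system/password attach=SYS.SYS_IMPORT_SCHEMA_01
    Import> kill_job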

    --------
    Sybrand Bakker
    Senior Oracle DBA

  • Masking data with Oracle 10g Data Pump

    Dear all,

    I have to mask certain confidential data while importing into specific tables. I am currently using the Oracle version mentioned below:

    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod

    In 11g, there is an option in datapump called REMAP_DATA. Is there an equivalent feature in version 10g?

    Kind regards

    S.balraj

    I thought that 'no' was pretty clear.

    DIY - either copy the table, mask the sensitive data in the copy and then expdp/impdp the copy, or mask the data after the import.
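    A sketch of the copy-then-mask route (table and column names are made up for illustration):

    create table customers_masked as select * from customers;
    -- scramble the confidential column in the copy before exporting it
    update customers_masked set ssn = 'XXX-XX-' || substr(ssn, -4);
    commit;
    -- then expdp the masked copy, impdp it, and rename it on the target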

  • Default Data Pump export include options

    Hi members,
    Can anyone tell me whether a Data Pump schema export exports system grants, object grants, and privileges of the schemas by default? Is there any place in the documentation where I can find a complete list of what is included by default? Thanks in advance.

    If you exported using a user with the DBA role, it should include grants and privileges. Try the SQLFILE option to generate the DDL and see whether those GRANTs and privileges are included.
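    A sketch of that check, with illustrative file names (SQLFILE writes the DDL to a script without importing anything):

    impdp system/password directory=data_pump_dir dumpfile=schema.dmp sqlfile=schema_ddl.sql
    grep -i grant schema_ddl.sql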

  • I want to learn more about Data Pump and transportable tablespaces

    Please suggest easy tutorials, as I want to know how to import and export between Oracle 10g and 11g.
    Thank you

    Hello
    Please check this oracle tutorial:
    http://www.exforsys.com/tutorials/Oracle-10G/Oracle-10G-using-data-pump-import.html
    About transportable tablespaces, you may consult:
    http://www.rampant-books.com/art_otn_transportable_tablespace_tricks.htm
    Kind regards
    Mohamed
    Oracle DBA

  • Importing a few schemas from an Oracle 10g database into Oracle 11g

    Hello

    I have the task of importing some schemas from an Oracle 10g (10.2.0.4) database into an Oracle 11gR2 (11.2.0.2) database. For now, I have only the Oracle 10g export and import utilities (the 11g server is not somewhere I can run the import myself).

    Can I export the 10g schemas using the 10g export utility and then import them into the 11g database using the 10g import utility? I ran a test of this and found that it works, but I would also like to have advice from experts on this subject.

    Another question: if the source schema is on 11g, say, can the 10g export/import utilities still work in that case? I mean exporting the schema from db1 (which is 11g) using the 10g export utility and importing into database db2 (which is also 11g) using 10g import. I ask because I don't have access to the 11g server and am trying to find out what I can do with the 10g utilities.

    Thank you

    Set up a database link from the 11gR2 database to the 10gR2 database.
    Now, using the 11gR2 impdp client utility, use:
    impdp network_link=

    Done.
    Your "objection" that you have no access to 11gR2 does not count, as you can install an 11gR2 client.

    Also, in general, do not use a lower-version imp to import into a higher-version database.
    exp and imp are also deprecated in 10g and above.
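    A sketch of those two steps, with illustrative names (the link is created in the 11gR2 target database and points at the 10gR2 source):

    create database link src10g connect to system identified by password using 'SRC10G_TNS';

    impdp system/password@target11g network_link=src10g schemas=scott directory=data_pump_dir logfile=net_imp.log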

    -------------
    Sybrand Bakker
    Senior Oracle DBA

  • Schema export through Oracle Data Pump with Database Vault enabled

    Hello

    I installed and configured Database Vault on Oracle 11gR2 (11.2.0.3) to protect a specific schema (SCHEMA_NAME) via a realm. I followed the following doc:
    http://www.Oracle.com/technetwork/database/security/TWP-databasevault-DBA-BestPractices-199882.PDF
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.

    I.e. I gave sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user ('system', 'SCHEMA_NAME');

    execute dvsys.dbms_macadm.authorize_datapump_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user ('system', 'SCHEMA_NAME');

    I also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition, and Indextype; sys and system were authorized as OWNER of this realm.

    However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line appears in the export log:

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081

    The export completes, but with these errors.

    Any help, pointers, suggestions, etc. will be very welcome at this stage.

    Thank you

    I have moved the forum thread to "Database - Security". If the document does not help, please open an SR with Support.

    HTH
    Srini

  • 10g to 11gR2 upgrade using Data Pump Import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, so I was going to combine the two in one move.

    1. Take a full Data Pump export of the source 10g database.
    2. Create a new empty 11g database in the target environment.
    3. Import the dump file into the target database.

    However, I have a couple of questions in my mind about this approach:

    Q1. What happens with the SYS and SYSTEM objects in the SYSTEM and SYSAUX tablespaces when importing? Given that I have in fact already created a new dictionary in the empty target database, will importing SYS or SYSTEM objects simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE for SYS and SYSTEM (and is EXCLUDE better on the export side or the import side)?

    Q3. What happens if there are things like scheduled jobs, etc. on the source system? Since these would be stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Please ensure that you do not use SYSDBA privileges to run the expdp and impdp commands. See the first "Note" sections here:

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.Oracle.com/CD/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790
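    For reference, a minimal sketch of the full export/import pair described above (illustrative names; the Upgrade Guide has the complete procedure):

    expdp system/password full=y directory=data_pump_dir dumpfile=full10g.dmp logfile=full10g_exp.log
    impdp system/password full=y directory=data_pump_dir dumpfile=full10g.dmp logfile=full10g_imp.log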

    HTH
    Srini

  • Oracle 11gR2 Standard Edition Data Pump

    Hello

    I understand that Data Pump is fully supported in Oracle 11gR2 Standard Edition, and that the one feature not included is "parallel". Looking at http://download.oracle.com/docs/cd/B28359_01/license.111/b28287/editions.htm, I noticed that for SE, Flashback Database is "N".

    That being the case, does anyone know if I can use expdp with the FLASHBACK_TIME or FLASHBACK_SCN parameter on Oracle 11gR2 SE? We are an EE shop here, so I will download/install SE and test it in-house, but if someone has already tried it and knows the answer, it will save me a lot of time.

    Thanks in advance.

    Yes, you can use flashback_time and flashback_scn in SE. It is similar to using a flashback query, which is supported in SE and SE One (SE1).

    Here is an example of flashback_time, but on 10gR2 (I don't have 11gR2 running at the moment):

    Connected to: Oracle Database 10g Release 10.2.0.4.0 - Production
    Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  system/******** dumpfile=dptest_fb.dmp schemas=sample include=table:" = 'CLASS'" flashback_time='2010-03-02:11:30:00'
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SAMPLE"."CLASS"                          87.64 KB     696 rows
    Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    
  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports; perhaps something I should already know.

    I need to empty out a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table with its indexes and constraints, etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is:

    1. Truncate the table.

    2. Disable or drop the indexes.

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows to keep.

    My questions:

    Will my indexes and constraints be imported too, when I want to import only a subset of my exported table?

    or

    If I drop the table after truncating it, will I be able to import my table and indexes, even if I use a subquery as part of my import statement's QUERY functionality?

    Must my table already exist in the database before doing the import when using the QUERY subquery functionality of Data Pump,

    or will the Data Pump import proceed as usual, i.e. create the table, indexes, grants, statistics, etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where ... ;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Data Pump: export/import tables in different schemas

    Hi all

    I am using Oracle 11.2, and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1','TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Script to import all the tables:

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Script that gives an error when importing only some tables:

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export is good, but I get the following error when I try to import the table TB_TEST1 only: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I try to import all exported tables without the include clause.

    Is it possible to import some tables, but not all tables, from the export file?

    Thanks for the help!

    942572 wrote:

    Importing all the tables exported by scott works when I do NOT use the "include" clause.

    I get the error only when I try to import some tables with the "include" clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=yes

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
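    For a data-only load of a single table, it is often simpler to name the table with the TABLES parameter rather than an INCLUDE filter; a sketch reusing the names from the question:

    impdp scott/tiger@db12 tables=source.TB_TEST1 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate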

  • Differences between Data Pump and the legacy import and export

    Hi all

    I work as a junior DBA in my organization, and I saw that my organization still uses the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to replace the existing setup with Oracle Data Pump. I have a meeting with them to present my points and convince them to adopt the Oracle Data Pump utility.

    My powers of persuasion are weak, though, and I don't want to fall short. I really need strong points of difference versus import and export; it would be really appreciated if someone could set out strong arguments for Oracle Data Pump over the legacy import and export.

    Thank you

    Cabbage

    Hello

    As other people have already said, the main advantage of Data Pump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with parallel).

    It is also more flexible (much more so) - it will even create users with schema-level exports, which imp cannot do for you (and it was always very annoying that it couldn't).

    It is restartable.

    It has a PL/SQL API.

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch).

    There is even a "legacy" mode in 11.2 where most of your old exp parameter files will still work with it - just change exp to expdp and imp to impdp.

    The main obstacles to the transition to Data Pump seem to be of the sort "what do you mean I have to create a directory for it to work?", "whose is my dumpfile and where is it?", and "why can't it be on my local machine?". These are minor things to get past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with parallel, and show them the runtimes so they can compare...
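    A sketch of such a demo, with illustrative names (%U lets each parallel worker write its own file):

    exp system/password owner=bigschema file=big_legacy.dmp log=big_legacy.log
    expdp system/password schemas=bigschema directory=data_pump_dir dumpfile=big_%U.dmp parallel=4 logfile=big_dp.log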

    Cheers,

    Rich

  • Selective column export with Data Pump

    Dear Experts,

    I am using Oracle 11g. Using Data Pump, is it possible to choose only certain columns of a table to export?

    Thanks in advance.

    spur230 wrote:
    Dear Experts,

    I am using Oracle 11g. Using Data Pump, is it possible to choose only certain columns of a table to export?

    Thanks in advance.

    It is not possible in a Data Pump export to select only certain table columns. However, you can try the following:

    (1) create table export_selective as select c1, c2 from source_table; (you can create it in the source db)

    (2) expdp the export_selective table

    (3) impdp export_selective into the target database

    Alternatively:
    A database link lets you create this table with only the selected columns using CTAS.
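    A sketch of that alternative, with illustrative names (run in the target database, pulling only the wanted columns across the link):

    create database link src_link connect to scott identified by tiger using 'SRC_TNS';
    create table export_selective as select c1, c2 from source_table@src_link;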

  • Data Pump Export Wizard in TOAD

    Hello
    I am new to the TOAD interface.

    I would like to export tables from one database and import them into another. I intend to use the Data Pump Export/Import Wizard in TOAD.

    I have installed Toad 9.1 on Windows XP, and I connect to the UNIX box (DB server) using the Oracle client.

    I know the command-line Data Pump process, i.e. $expdp and $impdp.

    But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.


    I would like to know the process for this.

    How different is it from using command-line Data Pump? When using TOAD, where do we create the Data Pump DIRECTORY?

    Can I do it on the local computer?

    Basically, I would like to know the process for import/export using TOAD with no direct access to the UNIX machine.

    Thanks in advance.

    user13517642 wrote:
    Basically, I would like to know the process for import/export using TOAD with no direct access to the UNIX machine.

    I don't think you can do that with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying a file to the remote host, using the NETWORK_LINK parameter, as described below:

    For export:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref144
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves the data, and writes it to a dump file on the connected system.

    To import:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref320
    The NETWORK_LINK parameter initiates a network import. This means that the impdp client initiates the import request, typically against the local database. That server contacts the remote source database referenced by source_database_link, retrieves the data directly, and writes it into the target database. There is no dump file involved.
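    A sketch of the export flavor, with illustrative names (note that the dump file is still written to a directory object on the database you are connected to, not on your PC, which is why TOAD on Windows cannot place it locally):

    create database link remote_src connect to scott identified by tiger using 'REMOTE_TNS';

    expdp scott/tiger directory=data_pump_dir dumpfile=remote.dmp logfile=remote_exp.log network_link=remote_src schemas=scott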

    Kamran Agayev A.
    Oracle ACE
    - - - - - - - - - - - - - - - - - - - - -
    My video tutorials of Oracle - http://kamranagayev.wordpress.com/oracle-video-tutorials/
