little doubt about import datapump

Hello

Oracle Version: 10.2.0.1
Operating system: Linux

I have a small doubt here; please help me. It is as follows:

I took a backup of the emp and dept tables. Now I need to import only the emp table, into another schema, restricted to the rows matching this condition:
select * from emp where deptno in (select deptno from dept where loc='DALLAS')
Here is the import command I tried, which failed. Please help me figure out what is wrong:
E:\oracle\dbdump>impdp sample/sample directory=dbdump dumpfile=TABLES.DMP logfile=tales_imp.log remap_schema=scott:sample tables=emp remap_tablespace=users:sample query=\"where deptno in \(select deptno from dept where loc='DALLAS')\"

Import: Release 10.2.0.1.0 - Production on Thursday, 29 October, 2009 17:59:05

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SAMPLE"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SAMPLE"."SYS_IMPORT_TABLE_01":  sample/******** directory=dbdump dumpfile=TABLES.DMP logfile=tales_imp.log remap_schema=scott:sample tables=emp remap_tablespace=users:sample query="where deptno in \(select deptno from dept where loc='DALLAS')"
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "SAMPLE"."EMP" failed to load/unload and is being skipped due to error:
ORA-00911: invalid character
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/POST_TABLE_ACTION
ORA-31685: Object type POST_TABLE_ACTION failed due to insufficient privileges. Failing sql is:
BEGIN
 SYS.DBMS_SNAPSHOT_UTL.SYNC_UP_LOG('SCOTT','EMP');
 END;

Job "SAMPLE"."SYS_IMPORT_TABLE_01" completed with 2 error(s) at 17:59:15

SIDDABATHUNI wrote:
Hello

What I am looking for here is this: I have a dump of the full schema. Now I want to import only a single table, based on a specific condition, rather than importing the full dump.

I get the error even when I use a parfile as you suggested.

Note too, I said that with a parfile there is no need for the backslashes...?

Nicolas.

Tags: Database

Similar Questions

  • little doubt about serialization?

    Hello
    I'm serializing an object that implements Serializable using ObjectOutputStream, but this object's class has no no-argument constructor. When it is deserialized, the object is built back again. But how is that possible? Class.forName().newInstance() also requires that the class have a no-arg constructor. So during deserialization, how is the object built back again even though it has no no-arg constructor? I'm trying to understand how this is possible. Can someone help me understand the logic?

    If it is not compatible with JDK serialization, it is a bug in Android.

  • Little doubt about VMware Fault Tolerance

    I would like to know: if the Virtual Machine goes down or gets corrupted, does the replica Virtual Machine on the secondary host go live?

    Or is FT applicable only when the hardware/host breaks? Please provide details.

    Both machines use the same storage, and the second machine is basically executing the same instructions as the first. If the first is corrupted on disk or hits a memory error, it shows up in the second as well. FT is real-time protection against hardware failures.

  • doubt import DataPump

    Hello

    Oracle Version: 10.2.0.1
    Operating system: Linux

    Can we run two Data Pump jobs, an export as well as an import process, on the same machine at the same time?

    Thank you and best regards,
    Poorna Prasad.

    Neerav999 wrote:
    NO, you cannot run two Data Pump imports at once, as
    the import process starts with the creation of the master process, the worker processes, the client process and the shadow process; no further import is possible until these processes are free

    I don't know what you mean here. Do you mean that we cannot run two parallel import processes because those processes will not be free? Please see below:

    E:\>time
    The current time is: 18:24:17.04
    Enter the new time:
    
    E:\>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:25
    
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_02":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR2"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR2"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR2"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR2"."JOBS"                                6.984 KB      19 rows
    . . imported "HR2"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR2"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR2"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_02" successfully completed at 18:24:44
    

    And the second session:

    E:\Documents and Settings\aristadba>time
    The current time is: 18:24:19.37
    Enter the new time:
    
    E:\Documents and Settings\aristadba>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr
    
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:22
    
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:h
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR1"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR1"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR1"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR1"."JOBS"                                6.984 KB      19 rows
    . . imported "HR1"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR1"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR1"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 18:24:44
    

    You see the timestamps? The second session started just 2 seconds after the first, and both jobs completed fine!

    Would you be so kind as to explain the statement you gave earlier? There is every chance that I have simply not understood it.

    HTH
    Aman...
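
    As a side note, while the two sessions above are running you can confirm that both jobs coexist by querying the standard Data Pump job view (privileges permitting):

    SELECT owner_name, job_name, operation, state
      FROM dba_datapump_jobs;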

  • Doubts about event handlers

    Hello

    I have some doubts about the event handlers in OIM 11.1.1.5...

    (1) I want to use the same event handler for the post-insert and post-update task... Can I use the same handler for this? If yes, then how do I do it?

    (2) Can I create a single plugin.xml, add all the jar files to a single lib folder, and zip them all together? If yes, then what changes do I need to make? Do I only need to add plugin tags for the different classes in the plugin.xml file? Or do I need to do something extra too?

    (3) If I need to change something in any handler class, do I need to unregister the plugin and register it again?
    If yes, do I need to delete the event handler using the weblogicDeleteMetadata command?

    (4) We import the event handlers from the path eventmanager/db/... and we add all the EventHandlers.xml files to this folder, since weblogicImportMetadata recursively imports all files in that folder. Now, if I need to change anything in one of the event handler classes, and we import again from the same eventmanager/db folder, what should we do? Create a copy of the event handlers? Or should I add only the EventHandlers.xml files for the class files in which I made the changes?

    (5) Given that I need to generate emails on user creation during recon, and update the email ID when the first name or surname is updated, should I use (entity-type='User' operation='CREATE') in the EventHandlers.xml, or something else?


    Help me clarify my doubts...

    Yes, on the post-update you first need to check whether the first or last name actually changed before updating the email ID, rather than always recalculating the email ID. You can check whether the name attributes were updated by comparing them with their previous values.

    -Marie

  • I opened a suspicious link and it kept refreshing and nothing loaded; I'm a little worried about being hacked. What can I do to fix things?

    Hello

    I opened a suspicious link and it kept refreshing and nothing loaded. I'm a little worried about being hacked. What can I do to fix things?

    I'm new to iPhones; I hope you can help me.

    Thank you.

    There is no known virus for a non-jailbroken iPhone.

    To close the app, double-tap the home button and swipe the app closed.

  • Doubts about licenses

    Hi all

    I have a few doubts about the price of licenses.

    I understand I can deploy an APEX server on 11g XE free of charge, but what happens if I want to install a non-XE version?

    Imagine a billing application for 10 users, and I will assume that a Standard Edition is sufficient. Using [this price list | http://www.oracle.com/us/corporate/pricing/technology-price-list-070617.pdf], how much exactly will it cost?

    Do I understand correctly that I can get a license per user or per server, or do I have to license both users and the server?

    Kind regards.

    Hello,
    the license metric is Named User Plus or per-processor (CPU) licensing (see the core factor table).

    For a quote, you can take a look in the Oracle Store or ask your Oracle reseller for an exact price.

    Regards,
    Peter

  • I have a doubt about the file .folio and publications

    Hello, I'm new here.

    I want to start working with DPS, but I have a doubt about which version to buy.

    At the moment I have one customer just wants to publish a magazine, but my intention is to have more customers and publish more magazines.

    If I buy the Single Edition of DPS, I read that I can publish a single .folio file. What does that mean? Does each folio file represent a publication?

    Please, I need help understanding this before I purchase the software.

    Thank you very much

    Paul

    Here's a quick blog post I wrote comparing Single Edition and multi-folio apps:

    http://boblevine.us/Digital-Publishing-Suite-101-single-Edition-vs-multi-Folio-apps/

    Bob

  • Doubt about appsutil.zip in R12

    Hi all
    I have a doubt about running Rapid Clone on 12.1.3. I have applied the latest patches using adpatch. After that, the appsutil directory
    in the RDBMS Oracle home must be synchronized. I created appsutil.zip on the application tier and copied it to the RDBMS Oracle home. If I move the old appsutil to appsutil.old and extract appsutil.zip, the new appsutil directory will not contain the context file (I think). So I have to run autoconfig based on the old context file. Below I have summarized the steps I follow. Please check them and correct me if I'm wrong.

    Copy appsutil.zip from $INST_TOP/admin/out to the RDBMS Oracle home
    cp $CONTEXT_FILE /tmp/mytest_vis.xml
    mv appsutil appsutil.orig
    unzip appsutil.zip
    Run autoconfig based on /tmp/mytest_vis.xml.


    Thank you
    Jay

    Jay,

    Is there a reason why you want to use the old context file? What is the difference between the context file that will be generated by adbldxml.pl and the old context file?

    If there are updates in the application, they will be reflected in the new XML file generated by adbldxml.pl, but not in the old file.

    So it is always best to run adbldxml.pl and then autoconfig.
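
    For illustration, on the database tier that sequence would look something like this (a sketch only; paths are typical and adbldxml.pl prompts for details such as host and port):

    cd $ORACLE_HOME/appsutil/bin
    perl adbldxml.pl                  # generates a fresh context file for the DB tier
    sh adconfig.sh contextfile=$ORACLE_HOME/appsutil/<new_context_file>.xml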

    Amulya

  • Doubts about RAC infrastructure with a disk array

    Hello everyone,

    I am writing because we have a doubt about the correct infrastructure to implement RAC.

    Please let me first explain the current design we use for Oracle DB storage. Currently we are running multiple instances on multiple servers, all connected to one SAN disk storage array. As we know that array is a single point of failure, we keep redundant controlfiles, archived logs and redo logs both on the array and on the internal drives of each server; in case the array fails completely we 'just' need to restore the nightly cold backup, apply the archived and redo logs, and everything is OK. This is possible because these are standalone databases and we can accept that downtime of about 1 hour.

    Now we want to use these servers and this array to implement a RAC solution, and since we know this array is our single point of failure, we wonder if it is possible to have a multi-node RAC solution (not just a single node) with redundant controlfiles/archived logs/redo logs on internal drives. Is it possible to have each RAC node write full copies of the controlfiles/archived logs/redo logs to its internal drives, and apply those files manually whenever the ASM storage used by RAC has to be restored (i.e. via a softlink on an internal drive, using a single node)? Or maybe the recommended solution is to have a second array to avoid this single point of failure?

    Thank you very much!

    CSSL wrote:

    Or maybe the recommended solution is to have a second array to avoid this single point of failure?

    Correct. That is the right solution.

    In this case you can also decide to simply stripe data across both arrays and mirror array1 onto array2, using the ASM redundancy options.
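
    For illustration, mirroring array1 against array2 with ASM is done with failure groups and normal redundancy; a minimal sketch (disk group name and disk paths are made up):

    CREATE DISKGROUP data NORMAL REDUNDANCY
      FAILGROUP array1 DISK '/dev/mapper/array1_lun1', '/dev/mapper/array1_lun2'
      FAILGROUP array2 DISK '/dev/mapper/array2_lun1', '/dev/mapper/array2_lun2';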

    Keep in mind that redundancy is also necessary for connectivity. So you need at least 2 switches connecting to the two arrays and two HBA ports on each server, with 2 fibre runs, one to each switch. You will need multipathing driver s/w on the server to deal with the multiple I/O paths to the same storage LUN.

    Similarly, you will need to repeat this for your interconnect: 2 private switches and 2 network cards on each server, bonded together. Then connect these 2 network cards to the 2 switches, one NIC per switch.

    Also, don't forget spare parts: spare switches (one for storage and one for the interconnect), and spare cables - fibre and everything else that is used for the interconnect.

    Bottom line - a redundant solution is not cheap. What you can do is combine the storage protocol/connection layer with the interconnect layer and run both on the same architecture, as Oracle does with the Database Machine and Exadata storage servers. You can run your storage protocol and your interconnect protocol (TCP or RDS) on the same 40 Gb InfiniBand infrastructure.

    Thus 2 InfiniBand switches are needed for redundancy, plus 1 spare, with each server running a dual-port HCA and one cable to each of the 2 switches.

  • Oracle import datapump fix corruption of block?

    Hi all
    I have block corruption on production. I want to export this DB and import it into a test environment using Data Pump, in order to do some tests on it.

    However, I'm concerned that impdp will repair the corruption... and that I will therefore lose my test scenario...
    Will Oracle Data Pump import fix the block corruption?

    Thank you

    Oracle Data Pump does not do a block-by-block import (which would include the corrupt blocks), so the answer to your question is NO: the corruption cannot be reproduced this way.
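
    If the goal is to locate the corruption on the source rather than carry it over, note that RMAN records the corrupt blocks it finds during a validation run in a standard dictionary view; a quick sketch:

    RMAN> BACKUP VALIDATE DATABASE;

    SQL> SELECT file#, block#, blocks, corruption_type
         FROM v$database_block_corruption;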

    Regards

  • ERROR importing DATAPUMP

    Hello


    Oracle Release version: 10.2.0.3.0
    Server: Linux


    We get the following error when importing with Data Pump. We take the export dump on one server and then try to import it on the same server, and we get the error below. Yesterday the production Data Pump import also worked fine, but today we get the error.


    We tried the import on different servers too, but we face the same problem.



    ORA-39014: one or more workers have left prematurely.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31672: Worker process DW01 died unexpectedly.


    Please help me.

    Edited by: SIDDABATHUNI on June 1st, 2009 21:46

    Hello

    When you have attached to the job at the prompt, before running CONTINUE_CLIENT,
    can you issue the STATUS command and check which object it was importing at the time of the crash?

    Then run a new job with the EXCLUDE parameter to exclude that one object.
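
    A sketch of that workflow (the job name and table name are placeholders):

    impdp system/****** attach=SYS_IMPORT_FULL_01    # attach to the stopped job
    Import> status                                   # shows what each worker was processing
    Import> exit_client

    and then a fresh run with, for example, this line added to a parameter file (a parfile avoids shell-escaping issues):

    exclude=TABLE:"IN ('PROBLEM_TABLE')"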

    Thank you.

  • Reg: Objects & missing partitions while import datapump

    Hi Techies,

    I need urgent assistance with the issue below that I am currently facing.

    I imported a schema using Data Pump and it failed with an 'unable to extend tablespace' error, so I added space dynamically when it was requested.

    It is a large schema; the export dump file is 120G.

    The import log showed the job as completed successfully, but when the AD team worked on it they found some of the objects & partitions missing - around 1000+ partitions.

    Can anyone suggest how to solve this problem without refreshing the whole schema again...

    Thanks in advance...
    Satya

    user10972727 wrote:
    Hi Techies,

    I need urgent assistance with the issue below that I am currently facing.
    .. missing some objects & partitions...
    Can anyone suggest how to solve this problem without refreshing again...

    Ask your AD team to pin down exactly which objects and which partitions are missing. With this information you can start a new export of only the missing objects and partitions, then import them, and you will have the full data set without having to redo the whole import you did the first time.

    Can anyone suggest how to solve this problem without refreshing again...

    There is no other way to do it that is logically correct. You could possibly play around with database links and add the missing items through some SQL, but something like what I explained a few lines above is certainly the more normal way to do it for the scenario you described.
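
    A sketch of that targeted re-export/re-import (all object names are placeholders):

    expdp system/****** directory=dpdir dumpfile=missing.dmp tables=SCOTT.SALES:P_2009_Q1,SCOTT.SALES:P_2009_Q2
    impdp system/****** directory=dpdir dumpfile=missing.dmp table_exists_action=append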

    Thanks in advance...
    Satya

  • Parallel degree importing Datapump

    Hello

    For several imports that I ran, I observed a constant degree of parallelism = 10 when creating indexes.

    Moreover, I have parallel_degree_limit = CPU and parallel_degree_policy = MANUAL.

    The first parameter specifies that Oracle sets an upper limit on the DOP.

    Does this explain the value observed?

    Kind regards

    Hello

    A parallel import goes like this:

    all the tables and PL/SQL definitions are created serially;

    all table data is loaded in parallel - not a parallel insert into a single table, but a separate Data Pump worker process per table - so EMP would be loaded by slave1 and DEPT by slave2, both at the same time. In your case 10 separate tables would all be imported at the same time;

    the indexes are built with the parallel degree you specify;

    constraints are enabled serially, by one and the same process, I think.
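
    For reference, the worker count described above comes straight from the PARALLEL parameter on the import, e.g. (directory and file names assumed):

    impdp system/****** directory=dpdir dumpfile=full%U.dmp parallel=10 logfile=imp.log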

    I wrote a short blog post about this earlier that may be useful: it contains pictures! :-)

    Oracle DBA Blog 2.0: What happens during an import parallel datapump?

    Cheers,

    Rich

  • Question about importing video footage in Premiere Pro

    I'm new to Premiere Pro and have a question about it. I need to move all my videos from my old computer (an iMac running iMovie) to my new computer (Windows). Where should I store my footage so I can access it inside Premiere Pro? Should I just store everything on a hard drive and then create projects inside Premiere Pro when I need to import footage, or should I create a 'library' to store all my movies in? Would Bridge be another beneficial way to view my videos (I use Lightroom for my photo library)? I'm just trying to figure out how I should store and organize everything in order to get started with Premiere Pro. Thank you!

    Premiere Pro has no database capability for media in general, only within a specific project. 'Import' is not like in Lightroom; what it does in PrPro is tell the project where to look for the media while you work on that specific project.

    So... you need to store your materials on a particular disk, laid out in a folder structure you design so that you can find them.

    A typical PrPro setup spreads the parts over several disks: OS/programs on the system drive; a disk each for the PrPro media cache & cache database files; preview files; and project files. Media typically goes on another drive, and exports on a final drive.

    Access to the media cache & cache database files, preview files and project files is both read & write, and fairly constant. Media files are only read, and exports only written, so their connections can be a little less robust.
