Specifying import tables in conjunction with a schema remap

I have a problem that I have been facing for a while now.  I export some tables from a schema (say, a test environment) using DBMS_DATAPUMP, and I would like to import just a SINGLE table from this dump file into another schema (say, a dev environment).  At the same time, I'm remapping the source table to a temporary table with the same structure.

Let me start by saying that I used this script to run the export and import in the SAME schema and it worked fine.  The problem only appeared when I went to import the data into another schema using METADATA_REMAP.  Here's the import code.

BEGIN
      SELECT TO_CHAR (SYSDATE, 'YYYYMMDDHH24MISS') INTO L_JOB_NUM FROM DUAL;
      SELECT TO_CHAR (SYSDATE, 'YYYYMMDD') INTO L_SHORT_DT FROM DUAL;
      V_JOB_NUM :=
         DBMS_DATAPUMP.OPEN (OPERATION   => 'IMPORT',
                             JOB_MODE    => 'TABLE',
                             JOB_NAME    => 'BMF_CASE_IMP_' || L_JOB_NUM,
                             VERSION     => 'COMPATIBLE');
                             
      DBMS_DATAPUMP.SET_PARALLEL (HANDLE => V_JOB_NUM, DEGREE => 1);
      DBMS_DATAPUMP.ADD_FILE (
         HANDLE      => V_JOB_NUM,
         FILENAME    => 'BMF_CASE_IMP_BATCH_' || L_SHORT_DT || '.LOG',
         DIRECTORY   => G_DUMP_DIRECTORY,
         FILETYPE    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE); 
      
                                     
      DBMS_DATAPUMP.METADATA_FILTER (HANDLE   => V_JOB_NUM,
                                     NAME     => 'NAME_EXPR',
                                     VALUE    => q'|in ('BATCH')|',
                                     OBJECT_PATH => 'TABLE');
                                     
      DBMS_DATAPUMP.METADATA_REMAP (HANDLE      => V_JOB_NUM,
                                    NAME        => 'REMAP_TABLE',
                                    OLD_VALUE   => 'BATCH',
                                    VALUE       => 'BATCH_TMP');
                                    
                                     
      d('Remapping from schema '|| G_FROM_SCHEMA || ' to ' || G_TO_SCHEMA );
      DBMS_DATAPUMP.METADATA_REMAP (HANDLE      => V_JOB_NUM,
                                    NAME        => 'REMAP_SCHEMA',
                                    OLD_VALUE   => G_FROM_SCHEMA,
                                    VALUE       => G_TO_SCHEMA);
      DBMS_DATAPUMP.ADD_FILE (
         HANDLE      => V_JOB_NUM,
         FILENAME    => 'BMF_CASE_EXP_' || i_case_control_id || '.DMP',
         DIRECTORY   => G_DUMP_DIRECTORY,
         FILETYPE    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);          
      DBMS_DATAPUMP.SET_PARAMETER (HANDLE   => V_JOB_NUM,
                                   NAME     => 'INCLUDE_METADATA',
                                   VALUE    => 0);
      DBMS_DATAPUMP.START_JOB (HANDLE         => V_JOB_NUM,
                               SKIP_CURRENT   => 0,
                               ABORT_STEP     => 0);

If I remove the metadata filter on the BATCH table and run this, the job completes and I get the following output in the log file:

...

. . imported "CMR2_DEV"."NTC_ACTION":"SYS_P1932"  13.84 KB  0 rows

. . imported "CMR2_DEV"."BATCH_TMP":"SYS_P343"  16.70 KB  1 rows

(... and lines for all the other tables in the dump file)

However, as soon as I activate the NAME_EXPR (or NAME_LIST) filter, nothing is imported.  I just get the following errors:

ORA-31627: API call succeeded but more information is available

ORA-31655: no data or metadata objects selected for job

It worked when I was not moving between schemas, so is there another way I need to write my table filter expression so that it identifies the BATCH table when a schema remap is used?

Thanks in advance.

Adam

I think that my earlier advice was not correct. The NAME_LIST filter only takes table names.  If you do not also have a schema filter, the table owner defaults to the schema running the job.  I think you need to add a schema filter specifying the table owner.

If you can't get it working, I can try to find the right calls, but it may take me a while.

Dean
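Building on Dean's suggestion, here is a minimal sketch of the extra call (an assumption based on his advice, not a tested fix): add a SCHEMA_EXPR filter naming the source schema, so the NAME_EXPR is evaluated against the exported owner rather than defaulting to the user running the job. G_FROM_SCHEMA is the variable already used for REMAP_SCHEMA in the code above.

```sql
-- Sketch only: scope the table-name filter to the source schema.
-- Without this, the job resolves 'BATCH' against the importing user.
DBMS_DATAPUMP.METADATA_FILTER (HANDLE      => V_JOB_NUM,
                               NAME        => 'SCHEMA_EXPR',
                               VALUE       => 'IN (''' || G_FROM_SCHEMA || ''')',
                               OBJECT_PATH => 'TABLE');

DBMS_DATAPUMP.METADATA_FILTER (HANDLE      => V_JOB_NUM,
                               NAME        => 'NAME_EXPR',
                               VALUE       => q'|IN ('BATCH')|',
                               OBJECT_PATH => 'TABLE');
```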

Tags: Database

Similar Questions

  • How can I import tables from a different schema into an existing relational model, to add these tables to the existing model?

    How can I import tables from a different schema into the relational model, i.e. add those tables to the existing relational/logical model?

    Note: I already have a relational/logical model ready for one schema, and I need to add more tables to that relational/logical model.

    Can I import them the same way I did before?

    But even if I do, how can I add them to the existing model, since the logical model has already been designed?

    Help, please...

    Thank you

    To view the logical model diagram in Bachman notation, right-click on a blank area of the diagram and select Notation > Bachman Notation.

    David

  • Import tables to a different schema

    Hello
    I use Oracle 10g R2 and I have created one dmp file with the Oracle export utility that contains the objects of two schemas (user1 and user2). I have now created a user "gab10" and I want to import the tables of both schemas into it.
    Can anyone tell me if this is possible using the Oracle import utility, and how?



    Greetings and thanks
    Vikash Chauradia

    Vikash Chauradia (DBA) wrote:
    Hello
    I use Oracle 10g R2 and I have created one dmp file with the Oracle export utility that contains the objects of two schemas (user1 and user2). I have now created a user "gab10" and I want to import the tables of both schemas into it.
    Can anyone tell me if this is possible using the Oracle import utility, and how?

    Yes it is possible, as below

    imp gab10/pass fromuser=(user1,user2) touser=(gab10,gab10) file='your_dmp_file'
    
  • How can I stop DW CS5.5 importing tables from Word with a paragraph tag in every cell?

    I am importing a bunch of text from MS Word into Dreamweaver to clean it up before publishing to a CMS.

    Basically, each sheet contains simple tables like this:

    Order      Rodentia
    Family     Although
    Genus      Rattus
    Species    norvegicus

    Every word in every cell is surrounded by paragraph tags.

    Word produces something like: <tr> <td> <p>word</p> </td> <td> <p>word2</p> </td> </tr>

    It is not only unnecessary, it's messing with my styles.

    Is there a setting somewhere that will solve this problem, or will I simply have to import all the tables and remove the tags manually by clicking on each p in the markup path, right-clicking, and then clicking "remove tag"?

    If there is no easy setting, is there a faster way to mass-edit this?

    Or maybe I should look at it the other way around:
    Is there something I can do in Word to make importing tables into DW easier?

    Use Find & Replace > Specific Tag.  See screenshot.

    Nancy O.

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import data in tables between different schemas. The tables already exist in the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:"IN ('TB_TEST1','TB_ABC')" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Working import script (all tables):

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Failing import script (some tables only):

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:"IN ('TB_TEST1')" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export works fine, but I get the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and the import works fine when I import all the exported tables without the INCLUDE clause.

    Is it possible to import some tables, but not all the tables, from the export file?

    Thanks for the help!

    942572 wrote:

    It works when importing all the tables exported by scott, WITHOUT the INCLUDE clause.

    I get the error only when I try to import some tables with the INCLUDE clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=y

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
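    To put the answer together as a concrete sketch (the parfile name is an assumption): passing INCLUDE through a parameter file avoids the shell-escaping problems visible in the scripts above.

    ```text
    # test_imp.par (assumed name)
    remap_schema=source:target
    directory=datapump_dir
    dumpfile=test.dump
    logfile=test_imp.log
    content=data_only
    table_exists_action=truncate
    include=TABLE:"IN ('TB_TEST1')"
    ```

    Then run: impdp scott/tiger@db12 parfile=test_imp.par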

  • Having problems importing tables for a new user with a new tablespace

    Hi all

    Here is my scenario:

    I have a user whose data is stored in tablespaces - say q, w, e.
    I would like to export all the table data and import it for a new user USER_B. All data imported for USER_B must be kept in the USER_B_PERM tablespace.

    I have tried a lot of things, but imp always tries to import the tables into the original tablespaces.

    What I have done so far:


    Create the new tablespaces:

    create tablespace USER_B_PERM
    DATAFILE '/oradata_big/path/USER_B/USER_B_PERM.DBF' SIZE 126M
    EXTENT MANAGEMENT LOCAL UNIFORM SIZE 125M;

    ALTER DATABASE DATAFILE '/oradata_big/path/USER_B/USER_B_PERM.DBF'
    AUTOEXTEND ON;


    create temporary tablespace USER_B_TEMP
    TEMPFILE '/oradata_big/path/USER_B/USER_B_TEMP.DBF' SIZE 126M
    EXTENT MANAGEMENT LOCAL UNIFORM SIZE 125M;

    ALTER DATABASE TEMPFILE '/oradata_big/path/USER_B/USER_B_TEMP.DBF'
    AUTOEXTEND ON MAXSIZE 3000M;


    Create user USER_B:


    -- USER SQL
    CREATE USER USER_B IDENTIFIED BY USER_B
    DEFAULT TABLESPACE "USER_B_PERM"
    TEMPORARY TABLESPACE "USER_B_TEMP";

    -- ROLES
    GRANT "RESOURCE" TO USER_B;
    GRANT "CONNECT" TO USER_B;
    ALTER USER USER_B DEFAULT ROLE "RESOURCE", "CONNECT";

    -- SYSTEM PRIVILEGES
    GRANT CREATE ANY VIEW TO USER_B;
    REVOKE UNLIMITED TABLESPACE FROM USER_B;
    ALTER USER USER_B QUOTA UNLIMITED ON USER_B_PERM;

    Export the old data:
    exp file=exp_user_a.exp owner=USER_A rows=Y log=exp.log


    Import for the new user:
    imp file=exp_user_a.exp fromuser=USER_A touser=USER_B rows=Y log=imp.log



    Because I revoked the UNLIMITED TABLESPACE privilege, imp still tries to create the tables in the original tablespaces (q, w, e), and the data import fails with:
    ORA-01536: space quota exceeded for tablespace 'q'

    Any ideas? I don't see what I am doing differently from the docs:
    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/exp_imp.htm#i1023312


    We are using Oracle 11.2.0.1, and we still have a few LONG RAW columns, if that matters.

    Thanks in advance,
    Andreas

    I have tried a lot of things, but imp always tries to import the tables into the original tablespaces.

    If the table was created with a specific tablespace clause, then it will go into that particular tablespace.

    I would like to export all the table data and import it for a new user USER_B. All data imported for USER_B must be kept in the USER_B_PERM tablespace.

    This is easy to solve with Data Pump's impdp: use the REMAP_TABLESPACE clause.

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_import.htm
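    As a hedged sketch of that suggestion (note that impdp cannot read dump files produced by classic exp, so the export would have to be redone with expdp; directory and file names here are assumptions):

    ```text
    expdp system/password schemas=USER_A directory=dp_dir dumpfile=user_a.dmp logfile=user_a_exp.log

    impdp system/password directory=dp_dir dumpfile=user_a.dmp remap_schema=USER_A:USER_B remap_tablespace=q:USER_B_PERM remap_tablespace=w:USER_B_PERM remap_tablespace=e:USER_B_PERM logfile=user_a_imp.log
    ```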

  • Data Pump: import a table into another schema in 11g

    Hi all

    I have an Oracle 11.2 database and a requirement to import a few tables into a new schema using my export from the previous month. I can't import the whole schema as it is very large. I checked the REMAP_TABLE option, but it just creates the table in the same schema and only renames the table.


    For example, I have table GAS.EMPLOYE_DATA that I want to import to GASTEST.EMPLOYEE_DATA.

    Is there a way to do it using datapump?



    Appreciate any advice.

    Hello

    You can use the INCLUDE parameter in order to select only one table:

    REMAP_SCHEMA=SCOTT:SCOTT_TEST
    INCLUDE=TABLE:"IN ('EMP')"
    

    Hope this helps.
    Best regards
    Jean Valentine

  • Large table must be moved to another schema

    Hi Oracle gurus,

    I'm using AIX 5.3 with Oracle 11.2.0.2.

    I just finished a job where I took the LOW partition of a partitioned table on our production database and loaded all of that partition's data into its own non-partitioned table on our archive database.  This table on Archive is 1.5 TB.

    Just to give you a brief overview of what I did, successfully:

    - Created a non-partitioned table on Production with exactly the same structure as the partitioned table, apart from the partitioning.  Then I moved all the relevant segments into the same tablespace as the table to make it transportable.

    - I then took an expdp of the table metadata using the transport_tablespaces parameter.

    - After putting the tablespace into read-only mode, I used the cp command to transfer the data files to the new directory.

    - Then, on the ARCHIVE database, I used impdp to import the metadata and point it at the new data files.

    parfile =

    DIRECTORY = DATA_PUMP_DIR

    DUMPFILE = dumpfile.dmp

    LOGFILE = logfile.log

    REMAP_SCHEMA = schema1:schema2

    DB11G = "xx", "xx", "xx"...

    My problem now is that due to some confusion I remapped to the wrong schema. This isn't a major problem, but I would like to tidy it up: instead of REMAP_SCHEMA = schema1:schema2 it should have been REMAP_SCHEMA = schema1:schema3.

    So the question: what is the best way to populate the table in schema3 (it is 1.5 TB, currently in schema2)?


    So the question: what is the best way to populate the table in schema3 (it is 1.5 TB, currently in schema2)?

    The easiest way is to use EXCHANGE PARTITION to just 'swap' the segment in.

    You can only 'swap' between a partitioned table and a non-partitioned table, and you already have a non-partitioned table.

    So, create a partitioned table with one partition in the new schema, and a new non-partitioned table in the new schema as well.

    Then exchange the old schema's table into the partitioned table in the new schema. Then exchange that new partition with the new non-partitioned table in the new schema. No data should move at all - it is just a data dictionary operation.

    Using the HR and SCOTT schemas: suppose you have a copy of the EMP table in SCOTT that you want to move to HR.

    - as SCOTT
    GRANT ALL ON emp_copy TO HR;

    - as HR

    CREATE TABLE EMP_COPY AS SELECT * FROM SCOTT.EMP_COPY WHERE 1 = 0;

    - create a partitioned temp table with the same structure as the table
    CREATE TABLE EMP_COPY_PART
    PARTITION BY RANGE (empno)
    (PARTITION ALL_DATA VALUES LESS THAN (MAXVALUE)
    )
    AS SELECT * FROM EMP_COPY;

    - swap in the table's segment - very fast
    ALTER TABLE EMP_COPY_PART EXCHANGE PARTITION ALL_DATA WITH TABLE SCOTT.EMP_COPY;

    - now exchange again into the target table
    ALTER TABLE EMP_COPY_PART EXCHANGE PARTITION ALL_DATA WITH TABLE EMP_COPY;

  • How to import tables, indexes, tablespaces, etc. from a DMP

    Hello

    I would like to know how to import tables, indexes, tablespaces, etc. from a DMP export of an Oracle 10.2.0.1 database into an Oracle 11.2.0 database. When I import the DMP file, everything goes into a single tablespace, for example the USERS tablespace.

    On the source database we have several tablespaces, each with different content (tables, indexes), and I would like to know if it is possible to import the same schemas, tables, indexes and tablespaces with a single export DMP. I can't use DBUA because the 10.2 software for the database is missing; I only have one DMP file (24 GB)!

    Thanks for the reply,

    Sorry for my English :).

    Kind regards.

    The standard solution is:

    - Make sure that the target user doesn't have the UNLIMITED TABLESPACE privilege, using REVOKE
    - Make sure that the target user has no quota on the default tablespace, using ALTER USER ... QUOTA
    - Make sure that there is quota on the target tablespace(s), using ALTER USER ... QUOTA
    - Import with indexes=n
    - Dump the index creation statements into a separate file:
    imp indexfile=...
    - Run that file
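    A sketch of those steps with generic names (user, tablespace and file names are placeholders, not from the question):

    ```text
    -- quota setup in SQL*Plus, as a DBA
    REVOKE UNLIMITED TABLESPACE FROM target_user;
    ALTER USER target_user QUOTA 0 ON users;
    ALTER USER target_user QUOTA UNLIMITED ON target_ts;

    # import the data without indexes, then extract the index DDL to a script
    imp system/password file=export.dmp fromuser=src_user touser=target_user rows=y indexes=n log=imp.log
    imp system/password file=export.dmp fromuser=src_user touser=target_user indexfile=create_indexes.sql
    ```

    Edit create_indexes.sql so the indexes point at the desired tablespace, then run it in SQL*Plus.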

    --------------
    Sybrand Bakker
    Senior Oracle DBA

  • Newb question: SQL query on an imported table

    Hi all. I'm a newb, as this question will probably tell you!
    But I am very interested in familiarizing myself with the Oracle environment.

    I installed 10g Express.
    I was able to set up a new user (schema) profile.

    I created a new table, by importing an Excel file, saved as a CSV file. It has a PK.

    When I log in as an administrator, I can see the table and view its contents.
    However, I can't run a SQL command against the new table.
    It says the table or view does not exist.

    The table is called "Customers" and contains a column called "City", and I type: Select City From Customers.

    As the Administrator, I have granted all rights on the new table to my newly created profile (schema).
    When I connect as that new user, I can't see the newly imported table in the list of tables, let alone run a query or any SQL against it. What did I miss?

    Any tips will be appreciated. Glyn Gray...

    Published by: user12501005 on February 7, 2010 11:26

    Hello

    You wrote:
    The original table is still visible in the SYSTEM schema, but it still reports that the table does not exist when I try SQL.
    So is there a reason why an imported table is visible to the DBA user, but SQL says it does not exist?
    The other imported tables, the ones the DBA can run SQL against, still aren't visible to other users (schemas).
    The DBA has "granted" EVERYTHING to the default HR schema.
    When I connect as HR, I'm unable to find the table in question, and SQL reports that the table does not exist.

    If I understand correctly, you have created a table in the SYSTEM schema, and you cannot query it with SQL.

    For example, if you create a table A in the SYSTEM schema, you can query table A if you connect to the database
    as SYSTEM or as any user that has the SELECT ANY TABLE privilege.

    If you are connected as a user (for example, HR) that does not have this privilege, you cannot query the table.

    If you grant a privilege to HR as follows:

    connect system/
    grant select on A to HR;
    

    Then you can query table A as HR:

    connect HR/
    select * from system.A;
    

    Hope this helps.
    Best regards
    Jean Valentine

  • schema.table name in an ODI datastore

    Hello

    We have a schema -> Others -> HFM_ESB -> TIME table.

    Now I need to connect to the HFMESB.TIME table. When I try to enter this table in the ODI datastore, I get a message saying the table name is invalid.

    Any suggestions on how to reference and reverse-engineer these tables in ODI?

    We were previously using synonyms and so had no problems. But now we cannot use synonyms, so how do we get this working?

    A quick response would be appreciated as this is somewhat urgent.

    Thanks in advance

    Hello

    Apparently HFM_ESB is your schema.

    First, create the data server in ODI if you have not already done so: go to the Topology tab in ODI Studio, then under Physical Architecture right-click on Oracle and choose New Data Server. Enter a name for your data server, for example TRG_MyName, and specify the username/password, for example USER: ODI_STAGING, PASS: MyPass. On the JDBC tab, specify the JDBC driver and URL, for example oracle.jdbc.OracleDriver and jdbc:oracle:thin:@localhost:1521:orcl respectively. Save and close the window.

    Second, right-click on the (new) data server and click New Physical Schema; specify a name, then set the schema to HFM_ESB and the work schema to ODI_STAGING.

    Please note: if you do not have an ODI_STAGING schema, you can specify your HFM_ESB as both the schema and the work schema.

    Third, under Logical Architecture right-click on the Oracle technology and create a new logical schema; specify a name, leave the Global context, and choose the (new) physical schema in the physical schemas column.

    Fourth, go from the Topology tab to the Designer tab. In the Models bar, click on the folder and then New Model. On the Definition tab, enter a name for your model, choose Oracle as the technology, and choose the logical schema we just created. On the Reverse Engineer tab you can leave everything at the Standard/Table defaults, which means standard reverse-engineering of all tables in the given schema. If you only want to reverse-engineer the PERIOD table, go to the Selective Reverse-Engineering tab, tick all the boxes at the top, then under the Table Name column tick only the PERIOD box, and finally click the Reverse Engineer button (at the top left).

  • Georaster: Import very large images with sdo_geor.importFrom

    Hello

    What is the best way to import very large images (> 50 GB) (GeoTIFF, TIFF, etc.) into a GeoRaster table? I actually don't have the image in a single file, but in 1 GB raster files.

    Approach 1:

    Import each 1 GB image with sdo_geor.importFrom, blocked into small tiles (block settings) within Oracle, and then merge the 50 rasters into one large raster object using SDO_GEOR.mosaic. Is this a recommended method?

    Approach 2:

    Create the actual tiles of all images with GDAL. Is it possible to import the tiles with their meta-information directly into a raster data table?

    Thanks for your suggestions on this matter.

    Yes, for Linux you will need to compile it yourself. I did it without too much trouble on CentOS.

    But the most straightforward approach is to download and install the latest QGIS for Windows. The bundled GDAL is up to date and includes the OCI hooks.

    Cheers,

    Paul

  • Import Table or copy of rows, columns, or cells

    Hi all

    From MS Excel, how can I import 10 cells into a FrameMaker table where I already have 10 empty cells?

    Also, is it possible to copy several rows or columns from MS Excel (or MS Word) into FrameMaker?

    Thank you.

    You can copy and paste Excel cells into a FrameMaker paragraph to create a new FrameMaker table from those cells. You can then copy the new FrameMaker table's cells into an existing FrameMaker table and remove the new table. Specifically, in Excel, copy the desired cells. Then, in an FM paragraph, use Edit > Paste Special to paste the cells in RTF format.

    -Lynne

  • Change all the table triggers in an Oracle schema by using a script - possible?


    Is it possible to modify all the table triggers in an Oracle schema using a single script, or is modifying each table trigger separately the only way?

    A couple of alternatives come to mind.

    (1) You can go into SQL Developer, go to the schema, expand the Triggers node and select all the triggers that you want to change.  Right-click and choose Quick DDL --> Save to Worksheet.  Find and replace in the worksheet, then run the script.

    (2) If the trigger is the same for all 70 tables, you can move the PL/SQL out of the trigger and into a procedure in a package, and just call the procedure from the 70 triggers.  Now the code is kept in one place.  For new tables, you add a call to the procedure and you are done.  If all 70 triggers are not the same, see if you can pass parameters to allow a generalized procedure that they can all use.
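    A minimal sketch of option (2), with assumed names (audit_pkg, updated_by / updated_on) purely for illustration:

    ```sql
    -- Shared logic lives in one package...
    CREATE OR REPLACE PACKAGE audit_pkg AS
      PROCEDURE stamp (p_user OUT VARCHAR2, p_when OUT DATE);
    END audit_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY audit_pkg AS
      PROCEDURE stamp (p_user OUT VARCHAR2, p_when OUT DATE) IS
      BEGIN
        p_user := USER;
        p_when := SYSDATE;
      END stamp;
    END audit_pkg;
    /
    -- ...and each of the 70 triggers becomes a one-line call.
    CREATE OR REPLACE TRIGGER emp_biu
      BEFORE INSERT OR UPDATE ON emp
      FOR EACH ROW
    BEGIN
      audit_pkg.stamp (:NEW.updated_by, :NEW.updated_on);
    END;
    /
    ```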

    You have not indicated what your triggers do.  Are you inserting audit columns, archiving data, inserting into a log table, updating another table, or something else?  What type of trigger is it?  What are you trying to accomplish with your triggers?

    Marcus Bacon

  • Trying to import tables from a datapump file that uses transportable mode



    Hi, I'm using impdp on Oracle 11.2.0.3 and have a dump file that contains an export of tables made using transportable tablespace mode.



    I'm trying to import just 3 of the tables concerned from the file into another database, but it does not work.


    Error

    ORA-39002: invalid operation

    ORA-39061: import conflicts FULL mode with the TRANSPORTABLE export mode


    {code}

    UserID = archive / MDbip25

    DIRECTORY = TERMSPRD_EXTRACTS

    DUMPFILE = archiveexppre.964.dmp

    LOGFILE = por_200813.log

    PARALLEL = 16

    TABLES = ZPX_RTRN_CDN_STG_BAK, ZPX_RTRN_STG_BAK, ZPX_STRN_STG_BAK

    REMAP_TABLESPACE = BI_ARCHIVE_DATA:BI_ARCHIVE_LARGE_DATA

    REMAP_TABLESPACE = BI_ARCHIVE_IDX:BI_ARCHIVE_LARGE_IDX

    {code}

    All ideas

    A transportable export must be imported using a transportable import.  Using FULL=Y is not a valid option.  You might be able to pass in an INCLUDE expression for the tables that you want, but the job must still be transportable.  Your import command should look like:

    impdp user/password directory=dpump_dir dumpfile=your_dumpfile.dmp transport_datafiles=/path1/pathx/dir1/dir5/filename.dbf include=TABLE:"IN ('a','b','c')"

    I see you are using the API, but this would be the command-line equivalent of the import.

    I hope this helps.

    Dean
