Data pump import - can I change the name of the table when importing?

Is it possible to use Data Pump export on a table, and then import (append rows) into a different table that has exactly the same column structure? I don't see a way to change only the name of the table when importing.

Hello

From 11.1 you can remap a table name. Use:

remap_table = old_name:new_name
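For example, a minimal sketch of such an import command (the credentials, directory, dump file and table names here are placeholders, not taken from your system):

impdp scott/tiger directory=DATA_PUMP_DIR dumpfile=emp.dmp tables=scott.emp remap_table=emp:emp_copy table_exists_action=append

table_exists_action=append keeps the rows already in the target table and appends the imported ones, which matches the "add lines" requirement.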

Thank you

Dean

Tags: Database

Similar Questions

  • Data pump import only indexes

    Hello, I have a dumpfile containing 100 million records. I was loading it into another server using the Data Pump import utility. It loads the table data in half an hour or less, but it takes 7 hours when importing the indexes. In the end I had to kill the job, and only the B-tree indexes had been created successfully on all the tables. Plan B is to create them now: I still have the dump file, which contains all the data and metadata. Is it possible with the Data Pump utility to import only the missing indexes, or to import all indexes only, with no tables or other objects? Can I simply use the INCLUDE parameter with INDEX? Please suggest a solution.

    Oracle Student wrote:
    Right, but is that the only way to solve the problem? What I want to do is: can I extract only the indexes from the dump file and import them, rather than the tables and other objects?

    impdp hr/hr directory=data_pump_dir dumpfile=hrdp.dmp include=INDEX content=metadata_only
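    If you would rather review or edit the index DDL before running it, a possible variation (same hypothetical credentials, directory and dump file as above) is to spool the CREATE INDEX statements to a script instead of executing them, using the SQLFILE parameter:

        impdp hr/hr directory=data_pump_dir dumpfile=hrdp.dmp include=INDEX sqlfile=indexes.sql

    With SQLFILE nothing is actually imported; the DDL is just written to indexes.sql, so you can pick out and run only the missing indexes.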

  • I am creating a dynamic shop drawing Certification stamp with text fields 'Name' and 'Date'. I tried different ways within JavaScript, but I can't seem to make it work. The Date and time remain static on the stamp when it is inserted and

    I am creating a dynamic shop drawing Certification stamp with text fields 'Name' and 'Date'. I have tried different ways within JavaScript, but I can't seem to make it work. The Date and time remain static on the stamp when it is inserted and always reflect when the stamp was created, and the name is always my name. I am trying to have the user name (or login name) inserted so that whoever inserts the stamp automatically gets their own name, plus the day and time the stamp is inserted. I can't get this dynamic stamp to work like the default Adobe Acrobat dynamic stamps. Can anyone help with this one? Thank you.

    Have you created a page template for your stamp? Did you name the page template correctly? This is the correct format:

    #InternalStampName = display name of stamp

  • Data pump import

    I cannot get the Data Pump import tool to work. I am on Oracle 10.2.4.0 on Windows Server 2008. I am trying to import a large amount of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since there is so much data and it will take a long time to import, I am trying to make sure that I can make it work for one table or one schema before attempting the whole thing. So I am trying to import the TEST_USER.TABLE1 table using the following command:

    impdp.exe dumpfile=big_file.dmp logfile=big_file_import.log tables=test_user.table1 remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'

    I provide sys as sysdba to connect when it asks (I am not sure how to include this info in the original command). However, I get the following error:

    the "TEST_USER" user does not exist

    My understanding is that the Data Pump utility would create all the necessary schemas for me. Is that not the case? The target database is a fresh installation, so the source schemas do not exist there.

    Even if I create the test_user schema by hand, I then get an error message indicating that the tablespace does not exist:

    ORA-00959: tablespace "TS_1" does not exist

    Even after doing that, which I don't want to have to do first, it does not work. Then it gives me something about how the user has no privileges on the tablespace.

    Isn't the Data Pump utility supposed to do that sort of thing automatically? Do I really need to create all the schemas and tablespaces by hand? That will take a long time. Am I doing something wrong here?

    Thank you
    Dave

    tables = test_user.table1

    The "TABLES" mode does NOT create database accounts.

    The FULL mode creates tablespaces and database accounts before importing the data.

    The SCHEMAS mode creates the database accounts before importing data - but it expects the tablespaces to already exist so that tables and indexes can be created in the correct tablespaces.
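    As a rough illustration (reusing the file names from the question; the account and exact parameters will differ in your environment), a FULL import that also remaps the datafile path might look like:

        impdp.exe system/password full=y dumpfile=big_file.dmp logfile=big_file_import.log remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'

    Run in FULL mode, Data Pump issues the CREATE TABLESPACE and CREATE USER statements itself before loading the schema objects.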

    Hemant K Collette
    http://hemantoracledba.blogspot.com

  • "Resume" a Data Pump import Execution failure

    First of all, here is the "failed" Data Pump import that I want to "resume":
    $ impdp "/ as sysdba" directory=DATA_PUMP_DIR dumpfile=mySchema%U.dmp logfile=Import.log schemas=MYSCHEMA parallel=2

    Import: Release 10.2.0.3.0 - 64bit Production on Tuesday, 16 February, 2010 14:35:15

    Copyright (c) 2003, 2005, Oracle. All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, Real Application Clusters, OLAP and Data Mining options
    Master table "SYS"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_SCHEMA_01": "/******** AS SYSDBA" dumpfile=mySchema%U.dmp directory=DATA_PUMP_DIR logfile=Import.log schemas=MYSCHEMA parallel=2
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "MYSCHE"...
    ...
    ... 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [INDEX:"MYSCHEMA"."OBJECT_RELATION_I2"]
    SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "SYS"."SYS_IMPORT_SCHEMA_01" WHERE process_order BETWEEN :3 AND :4 AND processing_state <> :5 AND duplicate = 0 ORDER BY process_order
    ORA-06502: PL/SQL: numeric or value error
    ORA-06512: at "SYS.KUPW$WORKER", line 12280
    ORA-12801: error signaled in parallel query server P001, instance pace2.capitolindemnity.com:bondacc2 (2)
    ORA-30032: the suspended (resumable) statement has timed out
    ORA-01652: unable to extend temp segment by 128 in tablespace OBJECT_INDX
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.KUPW$WORKER", line 6272
    ----- PL/SQL Call Stack -----
      object      line  object
      handle    number  name
    0x1f9ac8d50     14916  package body SYS.KUPW$WORKER
    0x1f9ac8d50      6293  package body SYS.KUPW$WORKER
    0x1f9ac8d50      3511  package body SYS.KUPW$WORKER
    0x1f9ac8d50      6882  package body SYS.KUPW$WORKER
    0x1f9ac8d50      1259  package body SYS.KUPW$WORKER
    0x1f8431598         2  anonymous block
    Job "SYS"."SYS_IMPORT_SCHEMA_01" stopped due to fatal error at 23:05:45
    As you can see, this job stopped without processing statistics, constraints, PL/SQL, etc. What I want to do is run another impdp command but skip the objects that were already imported successfully, as shown above. Is this possible (using the EXCLUDE parameter, maybe?) with impdp? If so, what would the command be?

    Thank you

    John

    Why not just restart this job? It skips everything that has already been imported.

    Impdp "" / as sysdba "" attach = "SYS." "" SYS_IMPORT_SCHEMA_01 ".

    Import > keep

    Dean

  • Data pump import error

    Hi all

    I am getting the errors below when trying to use Data Pump to import one table from a dump file (taken from a pre-production database) into a different database (prod_db) of the same version. I used expdp to generate the dump file.

    Errors received:
    ORA-39083
    ORA-00959
    ORA-39112

    Any suggestions or advice would be appreciated.

    Thank you


    Import: Release 10.2.0.1.0 - 64 bit Production

    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64 bit Production
    With partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace "OXFORD_DATA_01" does not exist
    Failing sql is:
    CREATE TABLE "xxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE ...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_IDX1" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_PK" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33


    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Either create the tablespace beforehand, or use the REMAP_TABLESPACE clause on the import.
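    For example, assuming you remap to an existing tablespace such as USERS (substitute whatever tablespace actually exists in prod_db), the import could look like:

        impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:USERS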

  • How to update the table when a list item changes in a classic report

    Hello
    I am working with APEX 4.2 and I created a normal classic report with a select list (a named LOV) column. Now I want to update the table when the user changes the list to a new value, but I can't create a dynamic action to do this. I created a checkbox with the primary key and a loop over the checked rows to update the table, but I cannot get the value of the list item. And for speed, the user wants it to happen as soon as the value is changed in the list.

    My questions:
    1. How do I do this in JavaScript: get the value of the list item and update the table with the new value?
    2. Do I have to use an API to create the list item so that I can get the value of the report item, or what?





    Thank you

    Ahmed

    You can find a lot of information in this forum (and outside it, on Google) when you search for AJAX and On Demand application processes. However, the tutorial in the link below should be useful:
    http://www.Oracle.com/WebFolder/technetwork/tutorials/OBE/DB/hol08/apexweb20/ajax_otn.htm

    BTW, if we answered your question, don't forget to mark the appropriate post as correct. It will help all of us in the forum.

  • Updating data via Data Pump import

    Hi all!

    I want to update the data in my database using a full Data Pump import from a different database. But I do not know which option I should use when executing the import for the second time. Or can I run the full import again without adding any option?

    Thank you

    Tien Lai

    If all you want to do is update the data, and the tables already exist, you can use this:

    impdp user/password content=data_only table_exists_action=[truncate|append] ...

    If you use append, the new data is added to the existing data. If you use truncate, the existing data is deleted and then the data in the dumpfile is imported.

    There will be problems if you have referential constraints. Say that table 1 references data in table 2: when Data Pump goes to truncate table 2, the truncate can fail because there is a referential constraint on it. If you have referential constraints, you must disable them before you run impdp and then enable them again once impdp has completed.
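    As a sketch of that last point (the table and constraint names here are hypothetical, not from your schema):

        -- before the import: disable the foreign key on the child table
        ALTER TABLE table1 DISABLE CONSTRAINT fk_table1_table2;

        -- run impdp with content=data_only table_exists_action=truncate ...

        -- after the import completes: re-enable it
        ALTER TABLE table1 ENABLE CONSTRAINT fk_table1_table2;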

    If all you want after the first import is the data, you can add

    content = data_only

    to your expdp command. It would complete much faster.

    Do not forget that the statistics on your tables and indexes will not be reloaded if you use table_exists_action=truncate or append, so the existing statistics would probably be stale.

    If you want to replace the table and the data, then the command would be:

    impdp username/password table_exists_action=replace ...

    This will drop the table, re-create it, load the data, and then create all dependent objects for the table.

    I hope this helps.

    Dean

  • How can I insert the APEX user ID into a table when the user logs in?

    Hello

    I created a database application with SSO. As you know, because of single sign-on my users don't need to enter a user name and password to log into the application. When a user logs into my application, their login ID (email address) automatically appears in the upper right. My question is: how can I capture the user ID into my users table when a user logs in to the application?

    Appreciate any help.

    Thank you

    Mohammad

    Maybe this solution will help:

    1. Under Shared Components > Application Logic you have Application Processes.

    2. Create a process with Process Point 'On New Instance (new session)' and the following source:

    declare
    l_exists integer;
    begin
    select count(*) into l_exists from user_objects
      where object_type = 'TABLE'
      and object_name = 'AUDIT_USERS' AND ROWNUM=1;
    if l_exists = 0 then
    execute immediate 'CREATE TABLE  AUDIT_USERS
           (WHEN DATE,
              USER_NAME VARCHAR2(30)
           )';
    end if;
    insert into AUDIT_USERS values (sysdate, v('APP_USER'));
    end;
    

    Regards

    Ziut

  • 10g to 11gR2 upgrade using Data Pump import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2. That is why I was going to combine the two into one move:

    1. Take a full Data Pump export of the 10g source
    2. Create a new empty 11g database in the target environment
    3. Import the dump file into the target database

    However, a couple of queries are running through my mind about this approach:

    Q1. What happens with the SYS, SYSTEM and SYSAUX objects during the import? Given that I have in fact already created a new dictionary on the empty target database - will importing SYS or SYSTEM simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE for SYS and SYSTEM (and is EXCLUDE better on the export or the import side)?

    Q3. What happens if there are things like scheduled jobs, etc. on the source system? Since these are stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Please ensure that you do not use SYSDBA privileges to run the expdp and impdp commands - see the first "Note" sections here:

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.Oracle.com/CD/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned there, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790
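    For illustration only (the dump file, log file and password values are made up), the full export on the 10g source and the full import on the empty 11gR2 target would be along these lines:

        expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full10g.dmp logfile=full10g_exp.log
        impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full10g.dmp logfile=full10g_imp.log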

    HTH
    Srini

  • Data pump import of a table into another schema in 11g

    Hi all

    I have an Oracle 11.2 database and I have a requirement to import a few tables into a new schema using my export from the previous month. I can't import the whole schema as it is very large. I checked the REMAP_TABLE option, but it only creates the table in the same schema and just renames it.


    For example, I have the table GAS.EMPLOYEE_DATA that I want to import into GASTEST.EMPLOYEE_DATA.

    Is there a way to do it using datapump?



    Appreciate any advice.

    Hello

    You can use the INCLUDE parameter together with REMAP_SCHEMA in order to select only one table:

    REMAP_SCHEMA=SCOTT:SCOTT_TEST
    INCLUDE=TABLE:"IN ('EMP')"
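    Applied to your case it would be something along these lines (the dump file name is assumed):

        impdp system/password directory=DATA_PUMP_DIR dumpfile=monthly_exp.dmp remap_schema=GAS:GASTEST include=TABLE:"IN ('EMPLOYEE_DATA')"

    Depending on your operating system you may need to escape the quotes on the command line, or put the parameters in a parameter file instead.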
    

    Hope this helps.
    Best regards
    Jean Valentine

  • Generate data that is not available in the table

    I have a table that has monthly data for agents. A sample of the table structure is as follows:

    create table agt_dum (
      agent_id   number,
      month      number,
      commission number
    )

    Sample data:

    AGENT_ID   MONTH   COMMISSION
    --------   -----   ----------
           1       1           50
           1       2          100
           1       6           25
           2       1           10
           2       2           20

    Month value can be from 1 to 12.

    So, as you can see, data is missing for the agents for a few months.

    The requirement here is that I need to generate data for each agent for all 12 months. If the data is not already in the table, the generated rows should contain the agent_id, the month number and a commission equal to zero.

    The output for the above data would be:

    AGENT_ID   MONTH   COMMISSION
    --------   -----   ----------
           1       1           50
           1       2          100
           1       3            0
           1       4            0
           1       5            0
           1       6           25
           1       7            0
           1       8            0
           1       9            0
           1      10            0
           1      11            0
           1      12            0
           2       1           10
           2       2           20
           2       3            0
           2       4            0
           2       5            0
           2       6            0
           2       7            0
           2       8            0
           2       9            0
           2      10            0
           2      11            0
           2      12            0

    I tried a SQL query with a full outer join. I was able to generate data for the MONTH and COMMISSION columns using the NVL function, but I have been unable to generate the AGENT_ID.

    Please share your suggestions.

    I'm using Oracle 11gR2.

    Thank you

    Sudhanshu

    You need something like:

    WITH agt_dum (agent_id, month, commission) AS
      (SELECT 1, 1, 50  FROM dual UNION ALL
       SELECT 1, 2, 100 FROM dual UNION ALL
       SELECT 1, 6, 25  FROM dual UNION ALL
       SELECT 2, 1, 10  FROM dual UNION ALL
       SELECT 2, 2, 20  FROM dual),
    all_months AS
      (SELECT rownum AS mon FROM dual CONNECT BY rownum <= 12)
    SELECT *
    FROM   all_months am
           LEFT OUTER JOIN agt_dum ad PARTITION BY (ad.agent_id)
           ON (ad.month = am.mon)
    /

    It is not yet complete, but I'm sure you can handle the rest...
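    For instance, the finished query might look roughly like this (the column aliases and ordering are my own, and I am assuming your real agt_dum table in place of the WITH mock-up above):

        WITH all_months AS (SELECT rownum AS mon FROM dual CONNECT BY rownum <= 12)
        SELECT ad.agent_id,
               am.mon                AS month,
               NVL(ad.commission, 0) AS commission
        FROM   all_months am
               LEFT OUTER JOIN agt_dum ad PARTITION BY (ad.agent_id)
               ON (ad.month = am.mon)
        ORDER  BY ad.agent_id, am.mon;

    The partitioned outer join repeats the 12 generated months for every agent_id, so the agent_id column is populated even in the filled-in rows.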

    HTH

  • Columns appear in the data store that are not in the table

    Hi gurus,

    I have a table that I built in Oracle and added to ODI by reverse-engineering it, but when I add it as a source to my interface, it contains columns that are not physically in the Oracle table when I look at it in TOAD. I dropped the table, deleted it in ODI, and reverse-engineered the table again, but I still get an additional column. Is there an explanation for that? I have the table listed in a stored procedure - should I delete that as well before reverse-engineering, or is it something else?

    Thanks in advance.

    Mike

    Yes, there is a possible explanation. How are you reverse-engineering it? If you use the Oracle RKM, you may be using a customized version of that KM which adds a column to your table...

  • Data not getting inserted into the table in PL/SQL

    Hi all

    Here is the code of my procedure:
    CREATE OR REPLACE PROCEDURE generate_shifts (p_plant_fk_key IN NUMBER,
                                                 p_start_date   IN DATE,
                                                 p_end_date     IN DATE)
    IS
      -- declaration of local variables
      l_plant_fk_key          NUMBER;
      l_sysdate               DATE := SYSDATE;   -- sysdate variable
      l_last_update_system_id NUMBER;
      l_last_update_date      DATE;
      l_plant_pk              VARCHAR2(120);
      l_start_time            VARCHAR2(8);       -- variable to store the start time
      l_end_time              VARCHAR2(8);       -- variable to store the end time
      l_graveyard             VARCHAR2(30);
      l_shift_num             NUMBER;
      l_shift_name            VARCHAR2(240);
      l_line_num              NUMBER;
      l_start_date            DATE;
      l_end_date              DATE;
      l_equipment_fk_key      NUMBER;
      l1_plant_fk_key         NUMBER;
      n                       NUMBER;
      l_creation_system_id    NUMBER;
      l_organization_code     VARCHAR2(120);
      l_system_fk_key         NUMBER;
      l_site_id               NUMBER;
      l_entity_pk_key         NUMBER;
      l_entity_name           VARCHAR2(100);
      l_entity_type           VARCHAR2(30);
      l_production_entity     VARCHAR2(1);
      l_shift_workday_pk_key  NUMBER;
      l1_shift_date           DATE;
      l_from_date             DATE;
      l_to_date               DATE;
    BEGIN
      l_plant_fk_key := p_plant_fk_key;
      l_start_date   := p_start_date;
      l_end_date     := p_end_date;

      DELETE FROM mth_workday_shifts_d
       WHERE plant_fk_key = l_plant_fk_key
         AND shift_date >= l_start_date
         AND shift_date <= l_end_date;

      -- pick up the shift definition, plant and organization details
      SELECT start_time, end_time, graveyard, shift_num, line_num, shift_name
        INTO l_start_time, l_end_time, l_graveyard, l_shift_num, l_line_num, l_shift_name
        FROM mth_site_shift_definitions
       WHERE plant_fk_key = l_plant_fk_key;

      SELECT plant_pk
        INTO l_plant_pk
        FROM mth_plants_d
       WHERE plant_pk_key = l_plant_fk_key;

      SELECT organization_code, system_fk_key
        INTO l_organization_code, l_system_fk_key
        FROM mth_organizations_l
       WHERE plant_fk_key = l_plant_fk_key;

      n := l_end_date - l_start_date;

      FOR i IN 1 .. n
      LOOP
        l_last_update_system_id := -99999;
        l_last_update_date      := l_sysdate;
        l_creation_system_id    := -1;

        INSERT INTO mth_workday_shifts_d
          (shift_workday_pk_key, shift_workday_pk, shift_date, shift_date_julian,
           plant_fk_key, line_num, shift_type, graveyard_shift, from_date, to_date,
           shift_num, shift_name, source_org_code, system_fk_key,
           creation_date, last_update_date, creation_system_id, last_update_system_id)
        VALUES
          (mth.mth_workdays_shifts_s.NEXTVAL,
           l_start_date || '-' || l_shift_num || '-' || l_plant_pk,
           l_start_date,
           TO_NUMBER(TO_CHAR(l_start_date, 'J')),
           l_plant_fk_key, l_line_num, NULL, l_graveyard,
           l_start_date + ((TO_NUMBER(SUBSTR(l_start_time, 1, 2))
                            + (TO_NUMBER(SUBSTR(l_start_time, 4, 2)) / 60)
                            + (TO_NUMBER(SUBSTR(l_start_time, 7, 2)) / 3600)) / 24),
           l_start_date + ((TO_NUMBER(SUBSTR(l_end_time, 1, 2))
                            + (TO_NUMBER(SUBSTR(l_end_time, 4, 2)) / 60)
                            + (TO_NUMBER(SUBSTR(l_end_time, 7, 2)) / 3600)) / 24),
           l_shift_num, l_shift_name, l_organization_code, l_system_fk_key,
           l_sysdate, l_sysdate, l_creation_system_id, l_last_update_system_id);

        l_start_date := l_start_date + 1;
      END LOOP;
      COMMIT;

      DELETE FROM mth_equipment_shifts_d
       WHERE availability_date >= l_start_date
         AND availability_date <= l_end_date
         AND equipment_fk_key IN (SELECT DISTINCT NVL(a.equipment_fk_key, 0)
                                    FROM mth_equipment_shifts_d a, mth_equipments_d b
                                   WHERE b.equipment_pk_key = a.equipment_fk_key
                                     AND b.plant_fk_key = l_plant_fk_key);

      INSERT INTO mth_equipment_shifts_d
        (equipment_fk_key, availability_date, shift_workday_fk_key, from_date, to_date,
         line_num, availability_flag, entity_type, creation_date, last_update_date,
         creation_system_id, last_update_system_id)
      SELECT b.entity_pk_key        entity_pk_key,
             a.shift_date           shift_date,
             a.shift_workday_pk_key shift_workday_fk_key,
             a.from_date            from_date,
             a.to_date              to_date,
             a.line_num             line_num,
             'Y',
             b.entity_type          entity_type,
             SYSDATE, SYSDATE, -1, -99999
        -- INTO l_site_id, l_entity_pk_key, l_entity_name, l_entity_type, l_production_entity,
        --      l_shift_workday_pk_key, l1_shift_date, l_from_date, l_to_date
        FROM mth_workday_shifts_d a,
             (SELECT site_id, entity_pk_key, entity_name, entity_type, production_entity
                FROM mth.mth_equip_entities_mst
              UNION ALL
              SELECT plant_pk_key site_id, plant_pk_key entity_pk_key, plant_name entity_name,
                     'Site' entity_type, production_site production_entity
                FROM mth.mth_plants_d
              UNION ALL
              SELECT plant_fk_key site_id, resource_pk_key entity_pk_key, resource_name entity_name,
                     'Resource' entity_type, production_resource production_entity
                FROM mth.mth_resources_d
              UNION ALL
              SELECT plant_fk_key site_id, equipment_pk_key entity_pk_key, equipment_name entity_name,
                     'Equipment' entity_type, production_equipment production_entity
                FROM mth.mth_equipments_d
             ) b
       WHERE b.site_id = l_plant_fk_key
         AND a.plant_fk_key = l_plant_fk_key
         AND a.shift_date >= l_start_date
         AND a.shift_date <= l_end_date;

      COMMIT;
    END generate_shifts;

    Data is not getting inserted into the mth_equipment_shifts_d table. If I run the same query outside the procedure, 25000 rows get inserted. Can someone tell me what is wrong with it?

    Kind regards
    Amrit

    The start date you passed in is changed before the execution of your insert on this table:

    l_start_date := l_start_date+1;
    

    Maybe set it back to the original value before the insert:

    l_start_date := p_start_date;
    

    See you soon!
    Bobin

  • How to insert/DML data into a table when the data in a related table changes

    Hello guys!

    I have come across a problem that I need to get fixed. Because I don't know how to start getting it resolved, I wanted to ask for your expertise.

    The scenario is as follows:

    I have a table 'a' in my 10g database and a view 'ab' which combines table 'a' with table 'b'. However, table 'b' is a table in a schema of another database, accessible (with read-only rights) via a database link.

    Now here it is: whenever the data changes in table 'b', for example 2 new rows are inserted, I need to automatically insert the values of these 2 rows into my table 'a'. The same goes for updates and deletes in table 'b'.

    The action that inserts the data into table 'a' must be initiated from my database; I have only limited access to the other one. Can I somehow use a trigger on my view 'ab' to insert data into table 'a'? Or is it possible to use the database change notification feature using the view as the reference?

    I desperately need help, and any hint or example code is greatly appreciated. I am very new to Oracle and not very fond of PL/SQL routines, so please be so kind as to give me some details.

    Thanks in advance - I hope you have any ideas how I can get this problem resolved.

    Sebastian

    >

    ... it does not work, since DDL operations are not permitted on remote databases (ORA-02021). I can't create a trigger on a view either. :-(
    So what ways are left to insert data into the table 'a' when the related table changes?

    Please, help if you have an idea!

    Yes,
    You can't perform DDL (create the trigger, ...) on remote databases, as you have seen...
    Try creating the trigger in your local database instead, and have it perform the DML (insert into ...) against the remote database:

        CREATE OR REPLACE TRIGGER local_forward_pt_after_insert
         AFTER INSERT
             ON N2K_INV_PT
             FOR EACH ROW
    
         BEGIN
             -- Insert records into table "a"
             INSERT INTO TBL_PUNKTDATEN@remote_database_sid
              ( INT_NUMMER,
                STR_GEBIET
                 )
             VALUES
              ( :new.INT_INV_PT_NR,
                :new.GEBIET );
         END;
    

    Thank you

    Good luck
