Incremental data loading:

I have two tables, EMP1 and EMP2_HIST. EMP1 is an OLTP table; EMP2_HIST is a history table.
Daily incremental data is loaded from EMP1 into EMP2_HIST.

How can I achieve this through PL/SQL? Can you please explain?

Hello

If I understand correctly: you receive each day's data, load it into a test (staging) table, and then compare it with your live table; if a record already exists it should be updated, and if it does not exist it must be inserted.

Hopefully you have created a primary key on your live table; that will make comparing it with the test table easy.

For this, I think you have two options:
(1) use MERGE
(2) use a cursor.

How to use MERGE:

create or replace procedure test
as
begin
   MERGE INTO live_table a
   USING test_table b
   ON (a.primary_key = b.primary_key)   -- you can specify multiple columns here
   WHEN MATCHED THEN
      UPDATE
      SET a.column_name1 = b.column_name1,
          a.column_name2 = b.column_name2
   WHEN NOT MATCHED THEN
      INSERT (a.column_name1, a.column_name2)   -- list the columns you need to insert
      VALUES (b.column_name1, b.column_name2);  -- values come from the source table b
exception
   when others then
      raise;   -- handle any exception you want to catch here
end;

How to use a cursor:

create or replace procedure test
as
   cursor cur is select * from test_table;
   v_1 number;
begin
   for rec in cur
   loop
      select count(*)
      into   v_1
      from   live_table a
      where  a.primary_key = rec.primary_key;
      if v_1 > 0 then
         null;   -- your UPDATE statement here
      else
         null;   -- your INSERT statement here
      end if;
   end loop;
   -- add an EXCEPTION section here if you need one
end;

Note:
It is recommended to use a single MERGE statement rather than row-by-row updates in a cursor FOR loop.
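
Applied to the original question (EMP1 into EMP2_HIST), a minimal sketch might look like the following; emp_id, the column list, and the last_updated change-tracking filter are assumptions, so substitute your real key, columns, and whatever marks a row as changed:

create or replace procedure load_emp_hist
as
begin
   merge into emp2_hist h
   using (select *
          from   emp1
          where  last_updated >= trunc(sysdate) - 1) e   -- hypothetical change-tracking column
   on (h.emp_id = e.emp_id)
   when matched then
      update set h.ename = e.ename,
                 h.sal   = e.sal
   when not matched then
      insert (h.emp_id, h.ename, h.sal)
      values (e.emp_id, e.ename, e.sal);
   commit;
end;
/

You could then schedule it daily with DBMS_SCHEDULER.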

Tags: Database

Similar Questions

  • Built-in photo editor (Photos) does not work. It says "Cannot load data. Check your connection."

    Can someone please help? When I tap a photo to edit it and choose the second editor (Photos), it loads for a bit but then says "cannot load data, check your connection". It used to work before, and I am connected to wifi!

    Try force-stopping the app and erasing its data. It might be a little difficult because the app does not show up under Settings > Applications > All. However, you should be able to open the app from the app menu, then press Home. After that, open the recent applications manager and long-press on Photos; you should get the App Info screen.

  • Loading data from an external XML file

    Hello people!

    I have an XML file on my server, updated by cron every 10 minutes, and I want to load its data into my WebWorks application, but the jQuery ajax function does not support cross-domain requests. So here's my question: how should I load this information? Can I somehow download the file and then use ajax?

    Welcome!

    in your config.xml file, add an authorization to access your server:

    <access uri="http://www.yourserver.com" subdomains="true" />

    This will get you past cross-origin issues.

  • ODI-1228: Task Load Data - LKM File to SQL - fails on the target connection: table or view does not exist

    While executing a mapping (present in a package) that loads file data into a table, the mapping fails at the LKM File to SQL step with the above SQL error.

    The task runs for a good 30 minutes, loading about 30 to 40 million rows into the ODI C$ temporary table.

    Before the task completes it fails, and the C$ table also gets dropped.

    Any possible resolution for the above issue?

    The problem has been solved.

    In our case, the names of all the data stores had the prefix SRC_, so the alias of every data store became SRC, and the C$ table name depends on the datastore alias.

    So when two mappings were executing, each mapping's C$ table was getting dropped by the other mapping because of the identical C$ table name.

    Changing the alias to a unique name solved the problem.

  • FDMEE LOAD DATA shows RUNNING status

    We are NOT able to reset the status of an FDMEE data load task. It has shown RUNNING since yesterday; we tried restarting Foundation & FDMEE.

    I tried updating the AIF_PROCESSES and AIF_PROCESS_DETAILS tables to set the corresponding job status to SUCCESS and restarted FDMEE, but in vain. Surprisingly, the PROCESS MONITOR section shows it as successful, and in the back end ODI lists all these jobs as 'SUCCESS', but on the FDMEE data load rule page it still shows RUNNING.

    This blocks us from rerunning the data load.

    Version: 11.1.2.3.700

    Application: HFM

    Take a look at the AIF_BALANCE_RULES table and its STATUS column
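
    For example, something along these lines; only the table name and the STATUS column come from the reply above, the filter and reset values are assumptions, so check what your stuck row actually shows and take a backup first:

    -- inspect the stuck load rule first
    select * from aif_balance_rules where status = 'RUNNING';

    -- hypothetical manual reset; run only with FDMEE stopped
    update aif_balance_rules set status = 'FAILED' where status = 'RUNNING';
    commit;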

    Cheers

    John

  • Error loading data in FDMEE

    Hello all, I am trying to load data using FDMEE but I got these errors and I do not understand what they mean:

    "

    2016-01-25 12:34:31,792 INFO [AIF]: FDMEE process start, process ID: 104
    2016-01-25 12:34:31,792 INFO [AIF]: FDMEE logging level: 4
    2016-01-25 12:34:31,793 INFO [AIF]: User:
    2016-01-25 12:34:31,793 INFO [AIF]: Location: APC_Data_location (Partitionkey:11)
    2016-01-25 12:34:31,793 INFO [AIF]: Period name: Dec-2015 (period key: 12/31/15 12:00 AM)
    2016-01-25 12:34:31,793 INFO [AIF]: Category name: Actual (category key: 1)
    2016-01-25 12:34:31,793 INFO [AIF]: Rule name: Data_loadRule_1 (rule ID:13)
    2016-01-25 12:34:33,792 INFO [AIF]: FDM version: 11.1.2.4.000
    2016-01-25 12:34:33,792 INFO [AIF]: Log file encoding: UTF-8
    2016-01-25 12:34:34,997 INFO [AIF]: - START IMPORT STEP -
    2016-01-25 12:34:35,157 INFO [AIF]: Executing the following script: /u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py
    2016-01-25 12:34:35,171 INFO [AIF]: FusionCloudAdapter.importDataFromFusion - START
    2016-01-25 12:40:57,601 INFO [AIF]: Standard output: INFO: Starting FusionCloudAdapter script...
    Proxy configuration for SEEP Production deployment
    Starting FusionAdapter main program...
    Run mode: importDataFromFusion, pid: 104
    FusionAdapter initialized during initialization.
    fusionGlWebServiceWsdl: http://
    fusionGlWebServiceUser: sysadmin
    fusionProductType: GL
    FusionAdapter main program complete.
    INFO: The FusionCloudAdapter script failed as described above.
    2016-01-25 12:40:57,601 INFO [AIF]: Standard error: com.sun.xml.ws.wsdl.parser.InaccessibleWSDLException: 2 counts of InaccessibleWSDLException.
    java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"
    java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"
        at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.tryWithMex(RuntimeWSDLParser.java:182)
        at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:153)
        at com.sun.xml.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:284)
        at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:246)
        at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:197)
        at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:187)
        at weblogic.wsee.jaxws.spi.WLSServiceDelegate.<init>(WLSServiceDelegate.java:86)
        at weblogic.wsee.jaxws.spi.WLSProvider$ServiceDelegate.<init>(WLSProvider.java:632)
        at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:143)
        at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:117)
        at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:88)
        at javax.xml.ws.Service.<init>(Service.java:77)
        at com.hyperion.aif.ws.client.ErpIntegrationService.ErpIntegrationService_Service.<init>(ErpIntegrationService_Service.java:74)
        at com.hyperion.aif.fusion.FusionAdapter.<init>(FusionAdapter.java:177)
        at com.hyperion.aif.fusion.FusionAdapter.main(FusionAdapter.java:85)
    2016-01-25 12:40:57,602 ERROR [AIF]: The script failed to run:
    2016-01-25 12:40:57,605 FATAL [AIF]: Error in Comm.executeJythonScript
    Traceback (most recent call last):
      File "<string>", line 528, in executeJythonScript
      File "/u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 163, in <module>
        fusionCloudAdapter.importDataFromFusion()
      File "/u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 60, in importDataFromFusion
        raise RuntimeError
    RuntimeError
    2016-01-25 12:40:57,692 FATAL [AIF]: Error in Comm Pre Import Data
    2016-01-25 12:40:57,697 INFO [AIF]: FDMEE process end, process ID: 104

    Thank you

    The script attempts to connect to a WSDL URL and fails with the error:

    "java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect""

    I see that it is from SEEP; maybe you have not set the correct details for Fusion Cloud. Are you sure that you configured the appropriate WSDL URL in the Source Connection Configuration section in FDMEE?

    Cheers

    John

  • Load data into a table

    Hi friends,

    I'm trying to load records into the rules table from the product table with the following...

    create table product (
    prod_id   varchar2(20),
    prod_grp  varchar2(20),
    from_amt  number(10),
    to_amt    number(10),
    share_amt number(10)
    );

    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'STK', 1, 18);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'NSTK', 1, 16.2);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'NSTK', 1, 5000, 12);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'STK', 5001, 10000, 16);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10038', 'STK', 10001, 20);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10039', 'NSTK', 1, 8000, 10);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10039', 'STK', 8001, 12);

    create table rules (
    rule_id  varchar2(30),
    rule_grp varchar2(10),
    rate_1   number(10),
    point_1  number(10),
    rate_2   number(10),
    point_2  number(10),
    rate_3   number(10),
    point_3  number(10)
    );

    Criteria for loading into the rules table:

    rule_id  - 'RL' || product.prod_id
    rule_grp - product.prod_grp
    rate_1   - product.share_amt where from_amt = 1
    point_1  - product.to_amt
    rate_2   - if product.to_amt in point_1 is not NULL, then find product.share_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt (current record's point_1) + 1
    point_2  - if product.to_amt in point_1 is not NULL, then find product.to_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt (current record's point_1) + 1
    rate_3   - if product.to_amt in point_2 is not NULL, then find product.share_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt (current record's point_2) + 1
    point_3  - if product.to_amt in point_2 is not NULL, then find product.to_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt (current record's point_2) + 1

    I tried to load the first columns (rule_id, rule_grp, rate_1, point_1, rate_2, point_2) via SQL*Loader.

    SQL> select * from product;

    PROD_ID              PROD_GRP               FROM_AMT     TO_AMT  SHARE_AMT
    -------------------- -------------------- ---------- ---------- ----------
    10037                STK                           1                    18
    10037                NSTK                          1                    16
    10038                NSTK                          1       5000         12
    10038                STK                        5001      10000         16
    10038                STK                       10001                    20
    10039                NSTK                          1       8000         10
    10039                STK                        8001                    12

    product.dat

    PROD_ID|PROD_GRP|FROM_AMT|TO_AMT|SHARE_AMT
    '10037'|'STK'|1||18
    '10037'|'NSTK'|1||16.2
    '10038'|'NSTK'|1|5000|12
    '10038'|'STK'|5001|10000|16
    '10038'|'STK'|10001||20
    '10039'|'NSTK'|1|8000|10
    '10039'|'STK'|8001||12

    Product.CTL

    options (skip=1)
    load data
    into table rules
    fields terminated by '|'
    optionally enclosed by "'"
    trailing nullcols
    (rule_id POSITION(1) "'RL' || :rule_id"
    , rule_grp
    , from_amt BOUNDFILLER
    , point_1
    , share_amt BOUNDFILLER
    , rate_1 "CASE WHEN :from_amt = 1 THEN :share_amt END"
    , rate_2 expression "(select pr.share_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = :from_amt + 1)"
    , point_2 expression "(select pr.to_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = :from_amt + 1)"
    )

    It does not load any values into rate_2 or point_2... no error either... Not sure if there is another method to do this...

    Please give your suggestions... Thank you very much for your time.

    Hello

    Thanks for posting the CREATE TABLE and INSERT statements for the sample data; that's very useful!

    Don't forget to post the exact results you want from this sample data, i.e. what you want the rules table to contain once the task is complete.

    As has already been said, there is no point in using SQL*Loader to copy data from one table to another in the same database.  Use INSERT, or perhaps MERGE.

    2817195 wrote:

    Thank you for your answers... I thought it would be easier to manipulate the data using SQL*Loader... I tried to use INSERT but do not know how to populate the rate_2, point_2, rate_3, point_3 columns... For example, when point_1 is not null, I need to find the next record with the same rule_id, and if that record's pr.from_amt = point_1 + 1 then RATE_2 should be populated with that record's pr.share_amt...

    SQL> insert into rules (
      2  rule_id,
      3  rule_grp,
      4  rate_1,
      5  point_1,
      6  rate_2,
      7  point_2,
      8  rate_3,
      9  point_3)
     10  select
     11  'RL' || pr.prod_id RULE_ID,
     12  pr.prod_grp RULE_GRP,
     13  CASE WHEN pr.from_amt = 1 THEN pr.share_amt END RATE_1,
     14  pr.to_amt POINT_1,
     15  (select pr.share_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) RATE_2,
     16  (select pr.to_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) POINT_2,
     17  (select pr.share_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) RATE_3,
     18  (select pr.to_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) POINT_3
     19  from product pr;

    (select pr.share_amt from product pr where point_1 is not null and point_1 = pr.from_amt + 1) RATE_2,
    *
    ERROR at line 15:
    ORA-00904: "POINT_1": invalid identifier

    Help, please... Thank you very much

    This is what causes the error:

    The subquery on line 15 references only one table in its FROM clause, and that table is product.  There is no point_1 column in product.

    A scalar subquery like this can be correlated to a table in the outer query, but the only table in the outer FROM clause (line 19) is also product.  Since the only table you are reading is product, the only columns you can reference are the columns of the product table.

    You use the same table alias (pr) to mean 5 different things.  That's very confusing.  Make table aliases unique within any one SQL statement.  (Whatever you're trying to do, I bet you can do it without all these subqueries, in any case.)
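
    For instance, if "next record" means the next from_amt band for the same prod_id, analytic functions can replace the subqueries entirely. A sketch along those lines (the exact ordering rule and the null handling are assumptions, so check the output against what you expect):

    insert into rules (rule_id, rule_grp, rate_1, point_1, rate_2, point_2, rate_3, point_3)
    select rule_id, rule_grp, rate_1, point_1, rate_2, point_2, rate_3, point_3
    from (
          select 'RL' || prod_id  as rule_id,
                 prod_grp         as rule_grp,
                 share_amt        as rate_1,
                 to_amt           as point_1,
                 lead(share_amt)    over (partition by prod_id order by from_amt) as rate_2,
                 lead(to_amt)       over (partition by prod_id order by from_amt) as point_2,
                 lead(share_amt, 2) over (partition by prod_id order by from_amt) as rate_3,
                 lead(to_amt, 2)    over (partition by prod_id order by from_amt) as point_3,
                 from_amt
          from   product
         )
    where from_amt = 1;   -- keep one row per product: the band that starts at 1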

  • Question on loading data using SQL*Loader into staging tables and then into the main tables

    Hello

    I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in pipe-separated csv files.

    I have developed a shell script to load the data and it works fine except for one thing.

    Here are the details and data to re-create the problem.

    Staging table structures into which the data will be loaded using SQL*Loader:

    create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));

    create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));

    create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));

    DATA in the csv files -

    for stg_cmts_data -

    cmts_map_03092015_1.csv

    WNLB-CMTS-01-1|10.15.0.1
    WNLB-CMTS-02-2|10.15.16.1
    WNLB-CMTS-03-3|10.15.48.1
    WNLB-CMTS-04-4|10.15.80.1
    WNLB-CMTS-05-5|10.15.96.1

    for stg_dhcp_data -

    dhcp_map_03092015_1.csv

    DHCP-1-1-1|10.25.23.10, 25.26.14.01
    DHCP-1-1-2|56.25.111.25, 100.25.2.01
    DHCP-1-1-3|25.255.3.01, 89.20.147.258
    DHCP-1-1-4|10.25.26.36, 200.32.58.69
    DHCP-1-1-5|80.25.47.369, 60.258.14.10

    for stg_link_data -

    cmts_dhcp_link_map_0309151623_1.csv

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
    DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
    DHCP-1-1-3|WNLB-CMTS-01-1
    DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
    DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
    WNLB-DHCP-1-13|WNLB-CMTS-02-2

    Now, after loading this data into the staging tables, I have to populate the main database tables:

    create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));

    create table link (link_nm varchar2 (50));

    The SQL scripts that I created to load the data are as follows.

    spool load_cmts.log
    set serveroutput on

    DECLARE
       CURSOR c_stg_cmts IS
          SELECT *
          FROM   stg_cmts_data;

       TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;

       l_stg_cmts t_stg_cmts;
       l_cmts_cnt NUMBER;
       l_cnt      NUMBER;
       l_cnt_1    NUMBER;
    BEGIN
       OPEN c_stg_cmts;
       FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
       CLOSE c_stg_cmts;

       FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
       LOOP
          SELECT COUNT(1)
          INTO   l_cmts_cnt
          FROM   subntwk
          WHERE  subntwk_nm = l_stg_cmts(i).cmts_token;

          IF l_cmts_cnt < 1 THEN
             INSERT INTO subntwk (subntwk_nm)
             VALUES (l_stg_cmts(i).cmts_token);

             DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
          ELSE
             DBMS_OUTPUT.put_line('token is already present');
          END IF;

          EXIT WHEN l_stg_cmts.COUNT = 0;
       END LOOP;

       COMMIT;
    EXCEPTION
       WHEN OTHERS THEN
          dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for dhcp


    spool load_dhcp.log
    set serveroutput on

    DECLARE
       CURSOR c_stg_dhcp IS
          SELECT *
          FROM   stg_dhcp_data;

       TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;

       l_stg_dhcp t_stg_dhcp;
       l_dhcp_cnt NUMBER;
       l_cnt      NUMBER;
       l_cnt_1    NUMBER;
    BEGIN
       OPEN c_stg_dhcp;
       FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
       CLOSE c_stg_dhcp;

       FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
       LOOP
          SELECT COUNT(1)
          INTO   l_dhcp_cnt
          FROM   subntwk
          WHERE  subntwk_nm = l_stg_dhcp(i).dhcp_token;

          IF l_dhcp_cnt < 1 THEN
             INSERT INTO subntwk (subntwk_nm)
             VALUES (l_stg_dhcp(i).dhcp_token);

             DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
          ELSE
             DBMS_OUTPUT.put_line('token is already present');
          END IF;

          EXIT WHEN l_stg_dhcp.COUNT = 0;
       END LOOP;

       COMMIT;
    EXCEPTION
       WHEN OTHERS THEN
          dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for link -

    spool load_link.log
    set serveroutput on

    DECLARE
       l_cmts_1      VARCHAR2(4000 CHAR);
       l_cmts_add    VARCHAR2(200 CHAR);
       l_dhcp_cnt    NUMBER;
       l_cmts_cnt    NUMBER;
       l_link_cnt    NUMBER;
       l_add_link_nm VARCHAR2(200 CHAR);
    BEGIN
       FOR r IN (
          SELECT dhcp_token, cmts_to_add || ',' cmts_add
          FROM   stg_link_data
       )
       LOOP
          l_cmts_1   := r.cmts_add;
          l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));

          SELECT COUNT(1)
          INTO   l_dhcp_cnt
          FROM   subntwk
          WHERE  subntwk_nm = r.dhcp_token;

          IF l_dhcp_cnt = 0 THEN
             DBMS_OUTPUT.put_line('Device not found: ' || r.dhcp_token);
          ELSE
             WHILE l_cmts_add IS NOT NULL
             LOOP
                l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;

                SELECT COUNT(1)
                INTO   l_cmts_cnt
                FROM   subntwk
                WHERE  subntwk_nm = TRIM(l_cmts_add);

                SELECT COUNT(1)
                INTO   l_link_cnt
                FROM   link
                WHERE  link_nm = l_add_link_nm;

                IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
                   INSERT INTO link (link_nm)
                   VALUES (l_add_link_nm);

                   DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
                ELSIF l_link_cnt > 0 THEN
                   DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
                ELSIF l_cmts_cnt = 0 THEN
                   DBMS_OUTPUT.put_line('NO CMTS FOUND for device to create the link: ' || l_cmts_add);
                END IF;

                -- consume the token just processed and pick up the next one
                l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
                l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
             END LOOP;
          END IF;
       END LOOP;

       COMMIT;
    EXCEPTION
       WHEN OTHERS THEN
          dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    control files -

    LOAD DATA
    INFILE 'cmts_data.csv'
    APPEND
    INTO TABLE STG_CMTS_DATA
    WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
     AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (cmts_token "RTRIM(LTRIM(:cmts_token))",
     cmts_ip    "RTRIM(LTRIM(:cmts_ip))")

    for dhcp -


    LOAD DATA
    INFILE 'dhcp_data.csv'
    APPEND
    INTO TABLE STG_DHCP_DATA
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token "RTRIM(LTRIM(:dhcp_token))",
     dhcp_ip    "RTRIM(LTRIM(:dhcp_ip))")

    for link -

    LOAD DATA
    INFILE 'link_data.csv'
    APPEND
    INTO TABLE STG_LINK_DATA
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token  "RTRIM(LTRIM(:dhcp_token))",
     cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")

    SHELL SCRIPT -

    if [ ! -d ./log ]
    then
        mkdir log
    fi

    if [ ! -d ./done ]
    then
        mkdir done
    fi

    if [ ! -d ./bad ]
    then
        mkdir bad
    fi

    nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_cmts.sql

    nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_dhcp.sql

    nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_link.sql

    mv *.log ./log

    The problem I encounter is in loading data into the link table: I check whether the DHCP is present in the subntwk table, otherwise log an error and continue; likewise, if the CMTS is not found, skip creating the link and log another error.

    Note that multiple CMTS can be associated with a single DHCP.

    So the script creates the links, but for the last CMTS in each comma-separated list from stg_link_data it logs CMTS NOT FOUND.

    for example

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2

    Here it is supposed to link dhcp-1-1-1 with wnlb-CMTS-01-1 and wnlb-CMTS-02-2.

    All of these are present in the subntwk table, but it still logs that wnlb-CMTS-02-2 could NOT be FOUND, even though it was already loaded into subntwk.

    The same thing happens with the last CMTS of every row of stg_link_data (I think you see what I'm trying to explain).

    But when I run the SQL scripts in SQL Developer separately, all valid links are inserted into the link table.

    It should create 9 rows in the link table, whereas now it creates only 5.

    I use COMMIT in my script too, but it does not help.

    Run these scripts on your machine and let me know if you get the same behavior.

    Please give me a solution; I have tried many things since yesterday, but it's always the same.

    This is the link load log:

    link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
    NO CMTS FOUND for device to create the link: wnlb-CMTS-02-2
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-5
    NO CMTS FOUND for device to create the link: wnlb-CMTS-01-1
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-8
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-6
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-0
    NO CMTS FOUND for device to create the link: wnlb-CMTS-03-3
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-7
    Device not found: wnlb-dhcp-1-13

    IF YOU NEED MORE INFORMATION PLEASE LET ME KNOW

    Thank you

    I realized later that night that while loading into the staging tables, each line carried a DOS carriage return that the UNIX machine treated as part of the data; that is why the last CMTS in each list was never found. I ran a DOS-to-UNIX conversion on the files and it started to work perfectly.

    It was a dos2unix error!
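
    For anyone hitting the same thing, stripping the carriage returns before loading is a one-liner (dos2unix as used here; tr shown as a fallback in case dos2unix is not installed):

    dos2unix cmts_dhcp_link_map_0309151623_1.csv
    # or, equivalently:
    tr -d '\r' < cmts_dhcp_link_map_0309151623_1.csv > link_data.csv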

    Thank you all for your interest; I learned new things, as I have only about 10 months of experience in PL/SQL and SQL.

  • Inserting page item values when loading data

    Hi all

    I am using APEX 5.0 and the Data Load wizard to upload data.

    My table has 7 fields, and I need to populate 3 of them myself while loading the data.

    I have created a process after 'Parse Uploaded Data' as below.

    The result is that the 5th field gets 'Y' in the First Row column, and there are no 6th and 7th columns; only the 5th field has a value.

    When I click Next to go to the data validation page, it shows the 5th field with a column name and the 6th without one.

    Please help me find where I must change my code.

    BEGIN
        APEX_COLLECTION.ADD_MEMBER(
            p_collection_name => 'PARSE_COL_HEAD',
            p_c001 => 'HUBCODE',
            p_c002 => 'UPLOAD_FILENAME',
            p_c003 => 'UPLOAD_DATE');

        FOR UPLOAD_ROW IN (SELECT SEQ_ID
                           FROM APEX_COLLECTIONS
                           WHERE COLLECTION_NAME = 'SPREADSHEET_CONTENT')
        LOOP
            APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
                p_collection_name => 'SPREADSHEET_CONTENT',
                p_seq => UPLOAD_ROW.SEQ_ID,
                p_attr_number => '5',
                p_attr_value => :P2_HUB_CODE);

            APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
                p_collection_name => 'SPREADSHEET_CONTENT',
                p_seq => UPLOAD_ROW.SEQ_ID,
                p_attr_number => '6',
                p_attr_value => :P2_FILE_NAME);

            APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
                p_collection_name => 'SPREADSHEET_CONTENT',
                p_seq => UPLOAD_ROW.SEQ_ID,
                p_attr_number => '7',
                p_attr_value => SYSDATE);
        END LOOP;
    END;

    Want to close this loop, hoping it will be useful for others.

    In the end, I managed to upload the document using the latest excel2collections plugin for APEX 5.0.

    As MK pointed out, I can control my data using the plugin.

    Another plus is that users can upload xls or xlsx files directly; there is no need to convert them to csv format before uploading.

    I thank all of you for the help.

  • Dynamo admin: method to start loading data?

    Hello

    The files get generated in the expected folder. Now I need to test that the data load takes place successfully rather than waiting until the next scheduled run.

    Which method in the Dynamo admin should I call in order to start loading data?

    I tried calling the loadAllAvailable() method under each of the loaders, but it keeps returning 0 even though there are a lot of log files in the folder.

    Please let me know.

    Thank you

    Saud

    Hello

    Finally managed to find where I had made the mistake. Even though the ARF.base module had been added to the modules for the production build, it was not in the list of modules in the startup script for the production instance.

    Thus the LogRotationSink component was not enabled in dynamoMessagingSystem.xml.

    Once the ARF.base module was included in the startup script, everything worked like a charm.

    Thank you

    Saud

  • How to load data into a multicurrency Planning application through FDMEE?

    Hi all

    I am new to FDMEE and I am stuck with loading data. Here's my problem:

    My application is a multicurrency application with USD as the presentation currency and a few more currencies like GBP, INR etc. as the base currencies of individual entities. The requirement is to load actual historical data into the Planning application in both the local currencies and USD.

    I have one data file for USD and several files for the local currencies; all data files are in the same format. I tried to load with the location's functional currency set to USD under the integration configuration and the data value as USD, but still only 'Local' data loads. When I checked data retrieval in Smart View, keeping the Currency dimension on the Page and selecting USD, it shows no data, but if I choose Local it retrieves data!  Is it possible to load data in both local currency and USD?

    Any help on this is much appreciated, thanks in advance.

    Kind regards

    Kamal

    Hi all

    Now I have managed to load both USD (the presentation currency) and Local (the entity-specific base currency). Earlier I had missed setting the currency (USD / Local) in the functional currency field of the location.

    Thank you all,

    Kind regards

    Kamal

  • Loading data into the DWH environment

    Hello

    We have a requirement to delete and reload 20 million records once a week. Some bitmap indexes exist on the table.

    Please suggest which option I should consider to load the data efficiently:

    (1) Drop the indexes, load the data, re-create the indexes, and gather table/index stats after the load.

    (2) Set the indexes unusable and rebuild them afterwards, gathering table/index statistics after the load (reference: https://asktom.oracle.com/pls/apex/f?p=100:11:0:P11_QUESTION_ID:2680568300346966968 ).

    If I choose option 2, is it required to run the command below during the loading process?

    alter session set skip_unusable_indexes=true;
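
    For reference, the full option 2 sequence I have in mind looks like this (the index and table names are placeholders):

    alter index sales_region_bix unusable;
    alter session set skip_unusable_indexes = true;
    -- load the 20 million rows here
    alter index sales_region_bix rebuild;
    exec dbms_stats.gather_table_stats(user, 'SALES', cascade => true)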

    Thank you very much in advance for your help.

    -Suri

    A small question - since we are transferring data into the main table, is there a need to build the indexes on the work table?

    The indexes on the partitioned table and the non-partitioned table must be compatible.

    I've never found it useful to mark bitmap indexes unusable; they need to be rebuilt anyway. Each bitmap index entry has bits for MULTIPLE rows, and setting a bit in one entry means the bit has to be CLEARED in all the other entries.

    So it does not make much sense to take an unusable bitmap index and try to update it to make it current.

    We have no partitioning on the main table. Can I still use the partition exchange technique to load the data?  (Sorry for asking questions without trying; I'll try tomorrow.)

    Only one of the tables needs to be partitioned. Given that the main table is NOT partitioned, you partition the work table. The work table has a single partition segment, and that segment is exchanged with the segment of the non-partitioned table.
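
    A sketch of that setup (all names, the partition key, and the INCLUDING INDEXES choice are illustrative):

    create table work_tab
    partition by range (id) (partition p_all values less than (maxvalue))
    as select * from main_tab where 1 = 0;

    -- load work_tab, build indexes compatible with main_tab's, then:
    alter table work_tab exchange partition p_all with table main_tab
      including indexes without validation;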

    On a different note, which option is recommended to load the data when we have no license to use partitioning?

    Drop the indexes and rebuild them. As I said above, bitmap indexes need to be rebuilt anyway.
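
    In outline, with placeholder names (sales_stage standing in for wherever the weekly data arrives):

    drop index sales_region_bix;

    insert /*+ append */ into sales
    select * from sales_stage;   -- the weekly 20 million rows
    commit;

    create bitmap index sales_region_bix on sales (region_cd);

    exec dbms_stats.gather_table_stats(user, 'SALES', cascade => true)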

  • load data from csv file into table

    Hello

    I'm working on Oracle 11 g r2 on UNIX platform.

    We have a requirement to load data into a table from a flat file, but with one condition: we need to compare on the primary key field; if the record is already present then update another field, and if the record is not present we need to insert it.

    How can we achieve this?

    Use SQL*Loader to load the CSV file data into a staging table.

    Then use the MERGE SQL command to insert/update rows from the staging table into the target table.
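
    In outline (staging_t, target_t, and the column names are placeholders for your real ones):

    merge into target_t t
    using staging_t s
    on (t.pk_col = s.pk_col)
    when matched then
       update set t.other_col = s.other_col
    when not matched then
       insert (t.pk_col, t.other_col)
       values (s.pk_col, s.other_col);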

    Hemant K Chitale

  • Issue while loading data using an Essbase rules file

    Hi all

    I am facing a problem while loading data using a rules file. In the rules file I set up rejection of several members in two fields (two dimensions). Now when I load the data using the rules file, I get errors for all members in the dataload.err file. If I reject multiple members of a single field, the data loads without any errors in dataload.err.

    I want to know how to reject members from several fields while loading data using an Essbase rules file. Is it possible?

    Okay, okay... I think you must set the Global Select / Reject Boolean in the data load settings to 'Or':

  • Loading data from Microsoft Dynamics AX via FDM

    Hello world

    I was wondering if anyone has used FDM to load data from Microsoft Dynamics AX into HFM, and how this was done. We are currently on version 11.1.2.2.0. Thanks for any ideas.

    The easiest way would be to get the MS Dynamics team to provide a flat file extract of the required data and then use FDM to import it into HFM. There are no out-of-the-box adapters to interface directly with MS Dynamics. Another option would be to build an integration script that reads the relevant MS Dynamics tables directly via FDM, but that would be more involved, would require knowledge of the underlying MS Dynamics database and FDM scripting, and would be much more difficult to maintain for the average user.
