Ignore insert errors in a MERGE statement?
Hi all, I wonder if it is possible, somehow, to ignore failures of the insert branch of a bulk MERGE statement. The error encountered is a primary key violation. What is happening is that different transactions share the same PL/SQL procedure, so it can happen that two transactions try to insert rows with the same primary key.
MERGE INTO table
USING the collection
ON (table.id = collection.id)
WHEN MATCHED THEN
   UPDATE ...
WHEN NOT MATCHED THEN
   INSERT ...   -- it may fail with duplicates here
Is the only solution to use 'bulk update + bulk insert (ignoring errors)' instead of a bulk MERGE?
Any help is very appreciated,
WF
What about using DML Error Logging (http://www.oracle-base.com/articles/10g/DmlErrorLogging_10gR2.php#merge)?
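As a sketch of that suggestion (table and column names here are made up; the LOG ERRORS clause itself is the documented 10gR2+ feature), duplicate-key failures from the insert branch land in an error table instead of aborting the whole statement:

```sql
-- One-time setup: create the error log table ERR$_MY_TABLE
BEGIN
   DBMS_ERRLOG.CREATE_ERROR_LOG ('MY_TABLE');
END;
/

-- The MERGE keeps going past ORA-00001; rejected rows go to ERR$_MY_TABLE
MERGE INTO my_table t
USING (SELECT id, val FROM my_staging) s
ON (t.id = s.id)
WHEN MATCHED THEN
   UPDATE SET t.val = s.val
WHEN NOT MATCHED THEN
   INSERT (id, val) VALUES (s.id, s.val)
LOG ERRORS INTO err$_my_table ('merge_run_1') REJECT LIMIT UNLIMITED;
```

Afterwards the rejected rows, with their ORA error codes, can be inspected in ERR$_MY_TABLE.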
Tags: Database
Similar Questions
-
Question about passing string values to the partition clause in a MERGE statement
Hi all,
I use the code below to update the data of a specific subpartition using an Oracle MERGE statement.
I'm getting the name of the subpartition and passing this string to the SUBPARTITION clause.
The MERGE statement fails, indicating that the specified subpartition does not exist. But the subpartition does exist for the table.
We use an Oracle 11gR2 server.
Here is the code I use to fill in the data.
declare
   ln_min_batchkey      PLS_INTEGER;
   ln_max_batchkey      PLS_INTEGER;
   lv_partition_name    VARCHAR2 (32767);
   lv_subpartition_name VARCHAR2 (32767);
begin
   FOR m1 IN (SELECT (year_val + 1) AS year_val, year_val AS orig_year_val
                FROM (SELECT DISTINCT
                             TO_CHAR (batch_create_dt, 'YYYY') year_val
                        FROM stores_comm_mob_sub_temp
                       ORDER BY 1)
               ORDER BY year_val)
   LOOP
      lv_partition_name :=
         scmsa_handset_mobility_data_build.fn_get_partition_name (
            p_table_name    => 'STORES_COMM_MOB_SUB_INFO',
            p_search_string => m1.year_val);

      FOR m2 IN (SELECT DISTINCT
                        'M' || TO_CHAR (batch_create_dt, 'MM') AS month_val
                   FROM stores_comm_mob_sub_temp
                  WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val)
      LOOP
         lv_subpartition_name :=
            scmsa_handset_mobility_data_build.fn_get_subpartition_name (
               p_table_name     => 'STORES_COMM_MOB_SUB_INFO',
               p_partition_name => lv_partition_name,
               p_search_string  => m2.month_val);

         DBMS_OUTPUT.PUT_LINE ('lv_subpartition_name => ' || lv_subpartition_name
                               || ' and lv_partition_name => ' || lv_partition_name);

         IF lv_subpartition_name IS NULL
         THEN
            DBMS_OUTPUT.PUT_LINE ('INSIDE IF => ' || m2.month_val);

            INSERT INTO stores_comm_mob_sub_info (ntlogin,
                                                  first_name,
                                                  last_name,
                                                  job_title,
                                                  store_id,
                                                  batch_create_dt)
               SELECT t2.ntlogin,
                      t2.first_name,
                      t2.last_name,
                      t2.job_title,
                      t2.store_id,
                      t2.batch_create_dt
                 FROM stores_comm_mob_sub_temp t2
                WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val
                  AND 'M' || TO_CHAR (batch_create_dt, 'MM') = m2.month_val;
         ELSIF lv_subpartition_name IS NOT NULL
         THEN
            DBMS_OUTPUT.PUT_LINE ('INSIDE ELSIF => ' || m2.month_val);

            MERGE INTO (SELECT *
                          FROM stores_comm_mob_sub_info
                               SUBPARTITION (lv_subpartition_name)) t1
                 USING (SELECT *
                          FROM stores_comm_mob_sub_temp
                         WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val
                           AND 'M' || TO_CHAR (batch_create_dt, 'MM') = m2.month_val) t2
                    ON (t1.store_id = t2.store_id
                        AND t1.ntlogin = t2.ntlogin)
            WHEN MATCHED THEN
               UPDATE SET
                  t1.postpaid_totalqty =
                       NVL (t1.postpaid_totalqty, 0)
                     + NVL (t2.postpaid_totalqty, 0),
                  t1.sales_transaction_dt =
                     GREATEST (
                        NVL (t1.sales_transaction_dt, t2.sales_transaction_dt),
                        NVL (t2.sales_transaction_dt, t1.sales_transaction_dt)),
                  t1.batch_create_dt =
                     GREATEST (
                        NVL (t1.batch_create_dt, t2.batch_create_dt),
                        NVL (t2.batch_create_dt, t1.batch_create_dt))
            WHEN NOT MATCHED THEN
               INSERT (t1.ntlogin,
                       t1.first_name,
                       t1.last_name,
                       t1.job_title,
                       t1.store_id,
                       t1.batch_create_dt)
               VALUES (t2.ntlogin,
                       t2.first_name,
                       t2.last_name,
                       t2.job_title,
                       t2.store_id,
                       t2.batch_create_dt);
         END IF;
      END LOOP;
   END LOOP;

   COMMIT;
end;
/
Really appreciate your input here.
Thank you,
MK.

Hello,
You can use EXECUTE IMMEDIATE; that works.
Thank you
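To expand on that suggestion (a sketch only; the table, join columns and variable name are taken from the question, but the column lists are trimmed for illustration): the SUBPARTITION clause takes an identifier, not a bind or PL/SQL variable, so the statement has to be assembled as a string and run with EXECUTE IMMEDIATE:

```sql
-- lv_subpartition_name holds a dictionary-derived identifier; it cannot
-- be bound, so it is concatenated into the statement text.
EXECUTE IMMEDIATE
      'MERGE INTO (SELECT * FROM stores_comm_mob_sub_info SUBPARTITION ('
   || lv_subpartition_name
   || ')) t1
     USING (SELECT * FROM stores_comm_mob_sub_temp) t2
        ON (t1.store_id = t2.store_id AND t1.ntlogin = t2.ntlogin)
      WHEN MATCHED THEN
         UPDATE SET t1.batch_create_dt = t2.batch_create_dt
      WHEN NOT MATCHED THEN
         INSERT (t1.ntlogin, t1.store_id, t1.batch_create_dt)
         VALUES (t2.ntlogin, t2.store_id, t2.batch_create_dt)';
```

Since the name comes from the data dictionary rather than user input, concatenation is acceptable here; ordinary data values should still be passed with USING binds.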
-
Ignore numeric errors with EXCEPTION
Hello
I have a PROC that reads all of the records from one table and INSERTs them into another table; quite simple. The only thing is I want to IGNORE numeric errors. So, if I get a (numeric) error, I want Oracle to IGNORE the error and continue with my FOR LOOP (selecting records) and the insertion into the table. What seems to happen is that if I get a NUMERIC error, it exits the FOR loop. Here is the code:
Here are reports that it is currently reading:PROCEDURE Insert_Current_Data( p_create_date IN DATE, p_StatusId OUT NUMBER ) IS e_constraint_error EXCEPTION; e_numeric_error EXCEPTION; PRAGMA EXCEPTION_INIT (e_constraint_error, -2291); PRAGMA EXCEPTION_INIT (e_numeric_error, -1722); /* to_number(NOC_CODE) NOC_CODE, to_number(ECONOMIC_REGION_CODE) ECONOMIC_REGION_CODE, */ CURSOR ei_claimant_ext_cur IS SELECT NOC_CODE, ECONOMIC_REGION_CODE, CASE EI_PROV_CODE WHEN '00' THEN 1 WHEN '01' THEN 4 WHEN '02' THEN 2 WHEN '03' THEN 3 WHEN '04' THEN 5 WHEN '05' THEN 6 WHEN '06' THEN 7 WHEN '07' THEN 8 WHEN '08' THEN 9 WHEN '09' THEN 10 WHEN '10' THEN 11 WHEN '11' THEN 12 ELSE 13 END EI_PROV_CODE, POSTAL_CODE FROM ei_claimant_external WHERE ei_prov_code <> 12 AND economic_region_code <> 99 AND postal_code is NOT NULL; -- AND ROWNUM < 1000; ei_claimant_ext_rec ei_claimant_ext_cur%ROWTYPE; v_create_date VARCHAR2(20); v_econ_reg_prov NUMBER; BEGIN dbms_output.put_line('Date passed: '||p_create_date); dbms_output.put_line('INSERT Current Data '); -- p_StatusId := 0; FOR claimant_row IN ei_claimant_ext_cur LOOP /* OPEN ei_claimant_ext_cur; LOOP FETCH ei_claimant_ext_cur INTO ei_claimant_ext_rec; -- v_staging_count := v_staging_count + 1; EXIT WHEN ei_claimant_ext_cur%NOTFOUND; */ BEGIN --- Get Econ Region Province SELECT PROVINCE_ID INTO v_econ_reg_prov FROM cd_econ_regions WHERE ECONOMIC_REGION_ID = ei_claimant_ext_rec.ECONOMIC_REGION_CODE; EXCEPTION WHEN NO_DATA_FOUND THEN v_econ_reg_prov := 0; END; BEGIN IF v_econ_reg_prov = ei_claimant_ext_rec.EI_PROV_CODE THEN INSERT INTO ei_claimant_curr_year VALUES (EI_SEQ.nextval, ei_claimant_ext_rec.noc_code, ei_claimant_ext_rec.EI_PROV_CODE, ei_claimant_ext_rec.ECONOMIC_REGION_CODE, p_create_date, --- CURRENT Month ei_claimant_ext_rec.POSTAL_CODE ); COMMIT; END IF; EXCEPTION WHEN e_constraint_error THEN dbms_output.put_line('CONSTRAINT Error '); fwutil_pkg.logerror (SQLERRM||' '||ei_claimant_ext_rec.noc_code||' Prov Code: 
'||ei_claimant_ext_rec.EI_PROV_CODE||' Econ Region: '||ei_claimant_ext_rec.ECONOMIC_REGION_CODE|| ' Postal Code: '||ei_claimant_ext_rec.POSTAL_CODE||' Date: '||p_create_date, SQLCODE, NULL, 'Populate_EI_Claimant_Main -- Insert into EI_Claimant_curr_year ' ); WHEN e_numeric_error THEN dbms_output.put_line('NUMERIC Error On INSERT'); fwutil_pkg.logerror (SQLERRM||' '||ei_claimant_ext_rec.noc_code||' Prov Code: '||ei_claimant_ext_rec.EI_PROV_CODE||' Econ Region: '||ei_claimant_ext_rec.ECONOMIC_REGION_CODE|| ' Postal Code: '||ei_claimant_ext_rec.POSTAL_CODE||' Date: '||p_create_date, SQLCODE, NULL, 'Populate_EI_Claimant_Main -- Insert into EI_Claimant_curr_year ' ); WHEN OTHERS THEN fwutil_pkg.logerror (SQLERRM||' '||ei_claimant_ext_rec.noc_code||' Prov Code: '||ei_claimant_ext_rec.EI_PROV_CODE||' Econ Region: '||ei_claimant_ext_rec.ECONOMIC_REGION_CODE|| ' Postal Code: '||ei_claimant_ext_rec.POSTAL_CODE||' Date: '||p_create_date, SQLCODE, NULL, 'Populate_EI_Claimant_Main -- Insert into EI_Claimant_curr_year ' ); END; END LOOP; -- CLOSE ei_claimant_ext_cur; EXCEPTION WHEN e_constraint_error THEN dbms_output.put_line('CONSTRAINT Error '); fwutil_pkg.logerror (SQLERRM||' '||ei_claimant_ext_rec.noc_code||' Prov Code: '||ei_claimant_ext_rec.EI_PROV_CODE||' Econ Region: '||ei_claimant_ext_rec.ECONOMIC_REGION_CODE|| ' Postal Code: '||ei_claimant_ext_rec.POSTAL_CODE||' Date: '||p_create_date, SQLCODE, NULL, 'Populate_EI_Claimant_Main -- Insert into EI_Claimant_curr_year ' ); WHEN e_numeric_error THEN dbms_output.put_line('NUMERIC Error ON Cursor'); fwutil_pkg.logerror (SQLERRM||' '||ei_claimant_ext_rec.noc_code||' Prov Code: '||ei_claimant_ext_rec.EI_PROV_CODE||' Econ Region: '||ei_claimant_ext_rec.ECONOMIC_REGION_CODE|| ' Postal Code: '||ei_claimant_ext_rec.POSTAL_CODE||' Date: '||p_create_date, SQLCODE, NULL, 'Populate_EI_Claimant_Main -- Insert into EI_Claimant_curr_year ' ); END Insert_Current_Data;
It inserts the first 2 records, ignoring the record with "0" (which is good). Then it leaves the LOOP when it reaches the record with a bunch of "A"s: 00094440903E4P1T5 00000000000000000 00066230703E1E1G4 AAAAAAAAAAAAAAAAA 00012210903E4K2K8 00082620803E5V1L3 99999999999999999 00084410803E5G2J3 00074120903E4N2E3 ----------------- 00094630903E4N1B7 00082620903E4N2J6 ))))))))))))))))) 00082620903E4M2C4
Help, please!
Thanks in advance.

First fix your code... the FOR LOOP has not been implemented correctly. I think you first tried with OPEN cursor ... LOOP ... FETCH ... EXIT WHEN ... END LOOP ... CLOSE cursor, then converted it to a cursor FOR loop ... END LOOP, and forgot to change code like:
-- WHERE ECONOMIC_REGION_ID = ei_claimant_ext_rec.ECONOMIC_REGION_CODE;
(inside the loop you should reference the loop variable claimant_row, not ei_claimant_ext_rec). Try this type of code...
DECLARE
   TYPE mixed_typ IS TABLE OF VARCHAR2 (1)
      INDEX BY PLS_INTEGER;
   mixed_ty        mixed_typ;
   e_numeric_error EXCEPTION;
   PRAGMA EXCEPTION_INIT (e_numeric_error, -6502);
   v_num           NUMBER;
BEGIN
   mixed_ty (1) := '1';
   mixed_ty (2) := 'A';
   mixed_ty (3) := '2';
   mixed_ty (4) := '3';

   FOR i IN 1 .. mixed_ty.COUNT
   LOOP
      BEGIN
         v_num := TO_NUMBER ('11' || mixed_ty (i));
         DBMS_OUTPUT.PUT_LINE (v_num);
      EXCEPTION
         WHEN e_numeric_error THEN
            DBMS_OUTPUT.PUT_LINE ('ignored: ' || SQLERRM);
      END;
   END LOOP;
EXCEPTION
   WHEN OTHERS THEN
      DBMS_OUTPUT.PUT_LINE (SQLERRM);
END;
-
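A set-based alternative for this "skip the bad rows" pattern (a sketch, not from the thread; it assumes, purely for illustration, that the two tables from the question have identical column lists) is FORALL ... SAVE EXCEPTIONS, which performs the insert in bulk and collects the failing rows instead of stopping:

```sql
DECLARE
   TYPE t_rows IS TABLE OF ei_claimant_external%ROWTYPE;
   l_rows      t_rows;
   bulk_errors EXCEPTION;
   PRAGMA EXCEPTION_INIT (bulk_errors, -24381);
BEGIN
   SELECT * BULK COLLECT INTO l_rows FROM ei_claimant_external;

   -- SAVE EXCEPTIONS keeps going past individual row failures
   FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
      INSERT INTO ei_claimant_curr_year VALUES l_rows (i);
EXCEPTION
   WHEN bulk_errors THEN
      FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT
      LOOP
         DBMS_OUTPUT.PUT_LINE (
               'Row '     || SQL%BULK_EXCEPTIONS (j).ERROR_INDEX
            || ' failed: ORA-' || SQL%BULK_EXCEPTIONS (j).ERROR_CODE);
      END LOOP;
END;
```

This avoids the row-by-row loop entirely while still reporting each rejected row.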
ODI interface error when inserting
Hello world
I have the script with several variables because I need to run several queries.
I have a table with a primary key + a column for each variable within the scenario of ODI.
The ODI process must be run once a month and create one record in this table each time.
So, I've created an interface in the ODI scenario. I mapped these variables to the target datastore. I have no datastore as a source.
I use the IKM SQL Control Append.
During the execution of this interface, I get the following error:
In the step "Insert flow into I$ table" I get: java.sql.SQLSyntaxErrorException: ORA-00928: missing SELECT keyword
The code in this step is:
insert into NLANGAI.II_ORG_DATOS_ACTIVIDAD (
)
select  from  where (1 = 1)

I don't know how to solve this problem, or whether this is the best way to do what I want.
Thanks in advance
Hello 956682,
Your explanation is a bit confusing; I think you mixed up "ODI scenario" with another ODI concept, maybe package or even project (although datastores are not stored at the project level). Could you use ODI-specific terms?
I also suspect that you are working with ODI 11g here.
Why not just use an ODI procedure to load your target table, since you don't have a real data source? I think it is not a good idea to use an interface here, because you would have to use a workaround to make it work.
Kind regards,
JeromeFr
-
MERGE statement doing FTS; runs too slow
The following merge statement does a Full Table Scan on a 40 GB table:
merge into big_table
using (select ..., case when ..., ...) a
on (... condition1, condition2, ...)
when matched then
   update ...
when not matched then
   insert (...)
   values (...)
The statement did not finish after 5 hours and shows 10% in longops.
Both tables were analyzed only the day before.
Please suggest how to tune this query; where should I start?

The query you gave is nothing more than what is in the doc.
I'm speechless. You give us nothing - start by posting your query (unless you are not allowed to paste it here) - which could help us understand, rather than expecting advice without even an explain plan.
Start by posting your query and the table/index structure, then:
Oracle version?
Operating system?
Statistics up to date?
Is the 40 GB table partitioned? Is it the only table involved? Do you have a WHERE clause somewhere?
Then run: explain plan for <your_real_query>
followed by: select * from table(dbms_xplan.display);
Nicolas.
-
Ignore "Bad Parent" dimension build error for member (3307)
Hello,
Is it possible to ignore the following error message:
Bad Parent [XYZ] for member ABC (3307)
The error appears because the member is already in the outline. Is it possible to ignore this error?
Thanks in advance

The reason for the error is that you have several lines in your dimension build file with contradictory information: one line says to associate the member with parent A and another line says to go with parent B. TimG's suggestion to set the build rule to allow moves could get rid of the error, but the member will end up associated with the last line of your data. If you do not allow moves (as it appears you don't), there is no way to disable the error. In your case, this is what I call a soft error: you know about it but don't want to do anything about it. I suggest that you clean your data so it does not cause this problem. It is awkward for me when the incoming data has this type of inconsistency and the solution is to ignore it.
-
Getting an error in a merge statement
Oracle version: Oracle Database 11g Express Edition Release 11.2.0.2.0 - Production
Hi gurus, I am trying to run a merge statement using the WITH clause, but I get an error message. See my sample data below:
Examples of data
DROP TABLE BILLING;
DROP TABLE GROUP_MST;
DROP TABLE AGE_MAX;

CREATE TABLE BILLING
(
   BILLING_ID  NUMBER (5),
   RATE        NUMBER (5),
   MAXIMUM_AGE NUMBER (5),
   package_id  NUMBER (5)
);

INSERT INTO BILLING
   (SELECT 11,1,10,100 FROM DUAL
    UNION ALL
    SELECT 12,2,15,100 FROM DUAL
    UNION ALL
    SELECT 13,3,20,100 FROM DUAL
   );

CREATE TABLE GROUP_MST
(group_number NUMBER (5), package_id NUMBER (5)
);

INSERT INTO GROUP_MST
   (SELECT 5000, 100 FROM DUAL
   );

-------

CREATE TABLE AGE_MAX
(group_number NUMBER (5), max_age NUMBER (5)
);

INSERT INTO AGE_MAX
   (SELECT 5000,60 FROM DUAL
   );
Merge statement

MERGE INTO billing drc
USING
   (WITH datum1 AS
      (SELECT a.group_number,
              b.billing_id,
              b.rate,
              b.maximum_age,
              b.package_id
         FROM group_mst a,
              billing b
        WHERE a.package_id = b.package_id
      ),
    datum2 AS
      (SELECT max_age, group_number FROM age_max
      )
    SELECT * FROM datum1 a, datum2 b WHERE a.group_number = b.group_number
   ) src
ON (drc.billing_id = src.billing_id)
WHEN MATCHED THEN
   UPDATE SET drc.maximum_age = src.max_age - 1;
Error

Error starting at line 33 in command (the MERGE statement above):
Error at command line: 50, column: 1
Error report:
SQL error: ORA-00918: column ambiguously defined
00918. 00000 - "column ambiguously defined"
* Cause:
* Action:
Thanks in advance
Regards,
Shu
WITH datum1 AS (SELECT a.group_number,
                       b.billing_id,
                       b.rate,
                       b.maximum_age,
                       b.package_id
                  FROM group_mst a, billing b
                 WHERE a.package_id = b.package_id),
     datum2 AS (SELECT max_age, group_number FROM age_max)
SELECT a.*, b.max_age, b.group_number AS group_number_b -- ERROR WAS HERE
  FROM datum1 a, datum2 b
 WHERE a.group_number = b.group_number
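Putting that fix back into the MERGE (a sketch; it simply aliases the duplicated group_number column so that SELECT * no longer yields two columns with the same name, which is what raised ORA-00918):

```sql
MERGE INTO billing drc
USING
   (WITH datum1 AS
      (SELECT a.group_number, b.billing_id, b.rate, b.maximum_age, b.package_id
         FROM group_mst a, billing b
        WHERE a.package_id = b.package_id),
    datum2 AS
      (SELECT max_age, group_number FROM age_max)
    -- alias the second group_number to make every column name unique
    SELECT a.*, b.max_age, b.group_number AS group_number_b
      FROM datum1 a, datum2 b
     WHERE a.group_number = b.group_number
   ) src
ON (drc.billing_id = src.billing_id)
WHEN MATCHED THEN
   UPDATE SET drc.maximum_age = src.max_age - 1;
```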
-
Error in a MERGE statement using a dblink
Hello,
I am facing the following error when using a MERGE statement over a dblink:
ORA-02069: global_names parameter must be set to TRUE for this operation
I can use the same dblink in my SELECT, INSERT and UPDATE statements, but when I try to use MERGE it raises this error. There is also no syntax error, as the same statement works on tables of the local database instead of over the dblink.
Please suggest; any help will be much appreciated.
Thanks in advance
Hi Aqeel,
If insert and update statements work fine through the db link, then it should not be a problem with the merge statement itself. But since you are facing the issue, please check the whole SQL statement together with the tnsnames entries on both sides. Please share the tnsnames for the two DBs along with the dblink DDL.
Regards,
Jihane Narain Sylca
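For what it's worth (this is general background, not from the thread): ORA-02069 is typically raised when remote DML references local objects, and the documented condition is that global_names is TRUE and the db link is named after the remote database's global name. A sketch of how one might check and adjust this (the link name remote_db is hypothetical):

```sql
-- Check the current setting (SHOW PARAMETER is a SQL*Plus command)
SHOW PARAMETER global_names

-- Enable it for the session before running the MERGE
ALTER SESSION SET global_names = TRUE;

-- Verify the remote global name matches the db link name
SELECT * FROM global_name@remote_db;
```

If the names do not match, the link usually has to be recreated with the remote global name.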
-
Logging individual row errors in a MERGE statement
Is it possible to log individual row errors in a MERGE statement (update/insert)?
I'm unable to log these errors. If, instead of MERGE, I update the table in a cursor loop, then I am able to record the individual errors, but that process takes too long.
Thanks in advance.
Deba

Hi Deba,
Use DML error logging:

SQL> create table tab1 (x number(1));

Table created.

SQL> exec dbms_errlog.create_error_log('tab1')

PL/SQL procedure successfully completed.

SQL> merge into tab1 t
  2  using (select 1 x from dual union all
  3         select 112 x from dual) s
  4  on (t.x = s.x)
  5  when not matched
  6  then insert (x) values (s.x)
  7  log errors into err$_tab1 reject limit unlimited;

1 row merged.

SQL> select * from tab1;

    X
-----
    1

SQL> select ora_err_number$, x from err$_tab1;

ORA_ERR_NUMBER$ X
--------------- ----
           1438 112
Regards,
Peter -
Insert/update counts in a MERGE statement
In a MERGE statement, can we count the number of rows inserted and updated separately?
SQL%ROWCOUNT simply returns the number of rows merged.
It does not tell us the inserted and updated rows separately.

For an INSERT the rowcount is the number of rows inserted, for an UPDATE it is the number of rows updated, and for a MERGE it is the total number of rows merged.
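One workaround (not from the thread; the table and column names target_t, source_t, id, val are hypothetical): count the target rows before and after the MERGE. The difference is the number of inserts, and the remainder of SQL%ROWCOUNT is updates. This sketch assumes no concurrent DML on the target:

```sql
DECLARE
   l_before PLS_INTEGER;
   l_merged PLS_INTEGER;
   l_after  PLS_INTEGER;
BEGIN
   SELECT COUNT (*) INTO l_before FROM target_t;

   MERGE INTO target_t t
   USING source_t s
   ON (t.id = s.id)
   WHEN MATCHED THEN UPDATE SET t.val = s.val
   WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);

   l_merged := SQL%ROWCOUNT;
   SELECT COUNT (*) INTO l_after FROM target_t;

   DBMS_OUTPUT.PUT_LINE ('inserted: ' || (l_after - l_before));
   DBMS_OUTPUT.PUT_LINE ('updated:  ' || (l_merged - (l_after - l_before)));
END;
```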
-
Ignore duplicates in an INSERT statement
I'm trying to create a script to import a CSV file into a MySQL database. The script works very well. However, it occurred to me that there could be duplicates at some point in time.
How do I get this to insert only records that are not already in the database? Please keep in mind that an Excel sheet may have tens or hundreds of thousands of records to insert.
Any ideas? Here is my code so far:
<?php
if (isset($_POST["Import"])) {
    $host = "localhost";
    $db_user = "***";
    $db_password = "***";
    $db = 'test';
    $conn = mysql_connect($host, $db_user, $db_password) or die(mysql_error());
    mysql_select_db($db) or die(mysql_error());
    echo $filename = $_FILES["file"]["tmp_name"];
    //echo $ext = substr($filename, strrpos($filename, "."), (strlen($filename) - strrpos($filename, ".")));
    if ($_FILES["file"]["size"] > 0) {
        $file = fopen($filename, "r");
        while (($emapData = fgetcsv($file, 10000, ",")) !== FALSE) {
            $sql = "INSERT into leads2(fname, lname, dog)
                    values('$emapData[0]','$emapData[1]','$emapData[2]')";
            mysql_query($sql);
        }
        fclose($file);
        echo "SUCCESS!";
    }
    else
        echo "FAILED!";
}
?>
Two ways I can think of. The first is to check whether the primary key exists before executing each insert statement. This is going to be an expensive operation with 100K records! A better solution would be to load the file into a temporary table and then use an INSERT INTO ... SELECT FROM ... statement from the temporary table into the live table. You can use a WHERE clause, a GROUP BY clause, or the DISTINCT keyword in the SELECT to remove all dupes.
See http://www.mysqlfaqs.net/MySQL-FAQs/Data-back-up/Import-Data/How-to-use-LOAD-DATA-INFILE-s
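A sketch of that staging-table approach (the table and columns follow the question's leads2(fname, lname, dog); the file path and the unique key on (fname, lname) are assumptions for illustration):

```sql
-- Load the CSV into a staging table with the same structure
CREATE TEMPORARY TABLE leads2_stage LIKE leads2;

LOAD DATA LOCAL INFILE '/tmp/import.csv'
INTO TABLE leads2_stage
FIELDS TERMINATED BY ',';

-- Copy over only rows not already present; INSERT IGNORE skips
-- rows that collide with an existing unique key
INSERT IGNORE INTO leads2 (fname, lname, dog)
SELECT DISTINCT fname, lname, dog
FROM leads2_stage;

DROP TEMPORARY TABLE leads2_stage;
```

This replaces tens of thousands of single-row INSERTs with one bulk load and one set-based insert.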
-
RETURNING clause in a MERGE statement
Hello,
I am using Oracle version 10g.
I tried the code below using UPDATE with a RETURNING clause and MERGE with a RETURNING clause.
I found no error while working with the UPDATE statement. Here is the code with the UPDATE statement:
DECLARE
   TYPE empno_list IS TABLE OF emp.empno%TYPE;
   vempno_list empno_list;
BEGIN
   UPDATE emp
      SET comm = 11
    WHERE deptno IN (SELECT deptno FROM dept)
   RETURNING empno BULK COLLECT INTO vempno_list;

   FOR i IN vempno_list.FIRST .. vempno_list.LAST
   LOOP
      DBMS_OUTPUT.put_line ('Values of EMP ' || vempno_list (i));
   END LOOP;
END;
But I get PL/SQL error ORA-00933: SQL command not properly ended when working with the MERGE statement:
declare
   type empno_list is table of emp.empno%type;
   vempno_list empno_list;
begin
   merge into emp tgt
   using dept src
   on (src.deptno = tgt.deptno)
   when matched then
      update set tgt.comm = 12
   returning tgt.empno bulk collect into vempno_list;

   for i in vempno_list.first .. vempno_list.last loop
      dbms_output.put_line ('Values of EMP ' || vempno_list (i));
   end loop;
end;
Please suggest me
Probably because the RETURNING INTO clause does not belong to the MERGE statement. It is available only for INSERT, UPDATE, and DELETE. Here's the quote from the Oracle documentation:

The static RETURNING INTO clause belongs to a DELETE, INSERT, or UPDATE statement. The dynamic RETURNING INTO clause belongs to an EXECUTE IMMEDIATE statement.

And here is the link.
It will be useful.
Ishan
-
ORA-22813 in a MERGE statement
Hi gems... good afternoon...
My database version is 11.2.0.1.0 on Solaris 64-bit.
I am facing an 'ORA-22813: operand value exceeds system limits' during the execution of a procedure.
I used loggers and found that it fails in a MERGE statement.
This merge statement is used to merge a table with a collection. GET_BALANCE_HIST (V_MERGE_REC) is a function whose parameter is a collection type. The code is as below:

MERGE /*+ INDEX(P BALANCE_HISTORIC_INDEX) */ INTO balance_hold_historic p
USING TABLE (GET_BALANCE_HIST (V_MERGE_REC)) m
ON (    p.customer_id = m.customer_id
    AND p.book_id     = m.book_id
    AND p.product_id  = m.product_id
    AND p.sub_book_id = m.sub_book_id)
WHEN MATCHED THEN UPDATE <set .....>
WHEN NOT MATCHED THEN INSERT <.....>

Now GET_BALANCE_HIST (V_MERGE_REC) is a pipelined function; we used that because the V_MERGE_REC collection can become huge with data.
This process worked very well since the beginning, but since yesterday it has constantly been throwing the ORA-22813 error on this line.
Help, please... Thanks in advance...

Gogol wrote:
The code flow is as below: there is a SQL query whose output is a set of several lakhs of records. We fetch the result of the SQL query into a collection with a limit of 1000 (FETCH cur_sql BULK COLLECT INTO v_balance_rec LIMIT 1000). Then we do a lot of processing and calculation on this collection.
After the processing and calculation, we fill another collection (V_MERGE_REC) from this transformed collection (V_BALANCE_REC). Thus, if the loop iterates 1000 times, then V_MERGE_REC is filled with 1000 * 500 = 500000 processed records.
And then we pass this huge V_MERGE_REC collection as a parameter to the function.

I don't know what "lakhs" are. Please use international English.
Why can't you do the math in a SQL query?
If it's really too hard to do in SQL (SQL queries can perform fairly complex processing), then I would look at writing the results to a table and using that. -
Different results using a MERGE statement
I have 2 SQL statements:

MERGE INTO PT_CQS_AGGR pca
USING PT_CQS_AGGR_TEMP pcah
ON (concat (concat (pca.from_city_id, pca.to_city_id), pca.query_timestamp_hh) =
    concat (concat (pcah.from_city_id, pcah.to_city_id), pcah.query_timestamp_hh))
WHEN MATCHED THEN
   UPDATE SET pca.search_count = pca.search_count + pcah.search_count
WHEN NOT MATCHED THEN
   INSERT (from_city_id, to_city_id, query_timestamp_hh, search_count)
   VALUES (pcah.from_city_id, pcah.to_city_id, pcah.query_timestamp_hh, pcah.search_count);

AND

MERGE INTO PT_CQS_AGGR pca
USING PT_CQS_AGGR_TEMP pcah
ON (    pca.from_city_id = pcah.from_city_id
    AND pca.to_city_id = pcah.to_city_id
    AND pca.query_timestamp_hh = pcah.query_timestamp_hh)
WHEN MATCHED THEN
   UPDATE SET pca.search_count = pca.search_count + pcah.search_count
WHEN NOT MATCHED THEN
   INSERT (from_city_id, to_city_id, query_timestamp_hh, search_count)
   VALUES (pcah.from_city_id, pcah.to_city_id, pcah.query_timestamp_hh, pcah.search_count);

The first statement merges the data correctly, but uses FULL TABLE SCANS, so it's rather slow. The second statement uses a UNIQUE INDEX on the columns from_city_id, to_city_id, and query_timestamp_hh in the EXPLAIN PLAN, but during execution I get ORA-00001: unique constraint (PODOWNER.IDX_CQS_AGGR_3COL) violated. What is the problem with the second statement - especially with this line:
ON (pca.from_city_id = pcah.from_city_id AND pca.to_city_id = pcah.to_city_id AND pca.query_timestamp_hh = pcah.query_timestamp_hh)
Can I use a more complex condition in the MERGE statement? Because it seems that not all the conditions were taken into account...
???

select pcah.from_city_id, pcah.to_city_id, pcah.query_timestamp_hh from PT_CQS_AGGR_TEMP pcah
minus
select pca.from_city_id, pca.to_city_id, pca.query_timestamp_hh from PT_CQS_AGGR pca

with whatever WHERE conditions limit the time
(or whatever columns make up the failing constraint).
In addition, select each column in the failing constraint where the column is null. If one of these columns is null, the equality fails even if both sides are null, because null does not equal null. So you will get a "no match" even when all the values are the same, if one of them is null.
Edit: Oh, slap forehead, that is why the first one works and does the full scan. As SomeoneElse said, it normalizes the data type and concatenates, so the null compares with no effect - and it defeats any usual index.
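One way to keep index-friendly equality while still matching NULL values (a sketch, not from the thread; the sentinel value -1 is a made-up placeholder that must never occur in real data):

```sql
MERGE INTO pt_cqs_aggr pca
USING pt_cqs_aggr_temp pcah
ON (    pca.from_city_id = pcah.from_city_id
    AND pca.to_city_id   = pcah.to_city_id
    -- NULL-safe comparison: both NULLs map to the same sentinel
    AND NVL (pca.query_timestamp_hh, -1) = NVL (pcah.query_timestamp_hh, -1))
WHEN MATCHED THEN
   UPDATE SET pca.search_count = pca.search_count + pcah.search_count
WHEN NOT MATCHED THEN
   INSERT (from_city_id, to_city_id, query_timestamp_hh, search_count)
   VALUES (pcah.from_city_id, pcah.to_city_id, pcah.query_timestamp_hh, pcah.search_count);
```

Note that wrapping the target column in NVL can again defeat the unique index unless a matching function-based index exists, so the cleanest fix is usually to make the key columns NOT NULL.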
-
Problem with the MERGE statement
Hi all
Consider the following:
create table xx_test1 (invoice_id number, line_group_number number, LINE_TYPE_LOOKUP_CODE VARCHAR2(25) ,LINE_NUMBER NUMBER(15) );
Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,1,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,1,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,2,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,2,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,3,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,3,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,4,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,4,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,5,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,5,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,12,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,12,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,7,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,7,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,8,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,8,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,9,'ITEM',null); Insert into 
XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,9,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,10,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,10,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,11,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,11,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,12,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,12,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,13,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,13,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,14,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,14,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,15,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,15,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,16,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,16,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,17,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,17,'TAX',null); Insert into 
XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,18,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,18,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,19,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,19,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,20,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,20,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,21,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,21,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,22,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,22,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,23,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,23,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,24,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,24,'TAX',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,25,'ITEM',null); Insert into XX_TEST1 (INVOICE_ID,LINE_GROUP_NUMBER,LINE_TYPE_LOOKUP_CODE,LINE_NUMBER) values (131549,25,'TAX',null);
So, if we want to MERGE this table and update the lines as below:

merge into xx_test1 dst
using (select LINE_GROUP_NUMBER,
              LINE_TYPE_LOOKUP_CODE,
              row_number () over (order by LINE_TYPE_LOOKUP_CODE, LINE_GROUP_NUMBER) as LINE_NUM
         from xx_test1
        where INVOICE_ID = 131549) src
on (    dst.LINE_GROUP_NUMBER = src.LINE_GROUP_NUMBER
    and dst.LINE_TYPE_LOOKUP_CODE = src.LINE_TYPE_LOOKUP_CODE
    and dst.invoice_id = 131549)
when matched then
   update set dst.LINE_NUMBER = src.LINE_NUM;
I get an error message: SQL error: ORA-30926: unable to get a stable set of rows in the source tables
30926. 00000 - "unable to get a stable set of rows in the source tables"
* Cause: A stable set of rows could not be got because of large dml
         activity or a non-deterministic where clause.
* Action: Remove any non-deterministic where clauses and reissue the dml.
I don't know why I'm getting this. Any ideas?
Version: 11g
Kind regards,
Alex
Whenever I get this error I convert my merge into a select to see where the duplication is:

select dst.LINE_GROUP_NUMBER, dst.LINE_TYPE_LOOKUP_CODE, count(*)
  from xx_test1 dst
  join (select LINE_GROUP_NUMBER,
               LINE_TYPE_LOOKUP_CODE,
               row_number () over (order by LINE_TYPE_LOOKUP_CODE, LINE_GROUP_NUMBER) as LINE_NUM
          from xx_test1
         where INVOICE_ID = 131549) src
    on (dst.LINE_GROUP_NUMBER = src.LINE_GROUP_NUMBER
        and dst.LINE_TYPE_LOOKUP_CODE = src.LINE_TYPE_LOOKUP_CODE
        and dst.invoice_id = 131549)
 group by dst.LINE_GROUP_NUMBER, dst.LINE_TYPE_LOOKUP_CODE
having count(*) > 1

You have 2 rows for the same dst.LINE_GROUP_NUMBER, dst.LINE_TYPE_LOOKUP_CODE combination; that duplication is the cause.
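A sketch of one common fix (not from the thread itself): make the USING query deterministic by collapsing the duplicate (LINE_GROUP_NUMBER, LINE_TYPE_LOOKUP_CODE) source rows first, so each target row is matched by at most one source row:

```sql
merge into xx_test1 dst
using (select LINE_GROUP_NUMBER,
              LINE_TYPE_LOOKUP_CODE,
              row_number () over (order by LINE_TYPE_LOOKUP_CODE, LINE_GROUP_NUMBER) as LINE_NUM
         -- DISTINCT collapses the duplicate combinations before numbering
         from (select distinct LINE_GROUP_NUMBER, LINE_TYPE_LOOKUP_CODE
                 from xx_test1
                where INVOICE_ID = 131549)) src
on (    dst.LINE_GROUP_NUMBER = src.LINE_GROUP_NUMBER
    and dst.LINE_TYPE_LOOKUP_CODE = src.LINE_TYPE_LOOKUP_CODE
    and dst.invoice_id = 131549)
when matched then
   update set dst.LINE_NUMBER = src.LINE_NUM;
```

With one source row per combination, ORA-30926 can no longer occur for this statement; whether numbering distinct combinations (rather than individual rows) is the desired LINE_NUMBER semantics is for the asker to decide.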