Redirect data to another table using a before-insert trigger
Dear all, how can I redirect data to another table in a BEFORE INSERT trigger? My database is Oracle 10g.
I have a table EMP (EMP_ID, LAST_NAME, SALARY).
I have another EMP_COPY table with the same structure. I also have a before Insert trigger on the EMP table.
Based on a condition, I have to redirect the data into the EMP_COPY table. Let's say the condition is EMP_ID = 100.
I fire an insert on the EMP table, for example INSERT INTO EMP (EMP_ID, LAST_NAME, SALARY) VALUES (100, 'Dev', 500).
Inside the before-insert trigger on the EMP table, I have this code:
IF :NEW.EMP_ID = 100 THEN
    INSERT INTO EMP_COPY (EMP_ID, LAST_NAME, SALARY)
    VALUES (:NEW.EMP_ID, :NEW.LAST_NAME, :NEW.SALARY);
    COMMIT;
ELSE
    NULL;
END IF;
But the problem here is that the data also goes into the original EMP table, which I don't want. It should insert into EMP only if EMP_ID != 100.
One way would be to raise a user-defined exception inside the IF statement and not handle it, so that the original insert on EMP fails while the insert into EMP_COPY succeeds. But since the exception is unhandled inside the trigger, it propagates to the calling environment. And I cannot handle the exception outside, because the calling environment is a standard Oracle Apps form that cannot be customized.
Any help will be highly appreciated; I have been fighting this for more than two weeks.
Thanks in advance
Dev
Remove the autonomous transaction pragma... and then try again.
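A BEFORE INSERT row trigger cannot silently suppress the triggering row, so one commonly suggested workaround (an illustration, not something proposed in this thread) is to rename the base table, expose a view under the old name, and route rows with an INSTEAD OF trigger on the view. The name EMP_BASE below is an assumption for the sketch:

```sql
-- Sketch only: assumes the application can transparently work against a view named EMP.
ALTER TABLE emp RENAME TO emp_base;
CREATE OR REPLACE VIEW emp AS
    SELECT emp_id, last_name, salary FROM emp_base;

-- An INSTEAD OF trigger fires in place of the DML on the view,
-- so each row can be routed without raising an exception.
CREATE OR REPLACE TRIGGER emp_ioi
INSTEAD OF INSERT ON emp
FOR EACH ROW
BEGIN
    IF :NEW.emp_id = 100 THEN
        INSERT INTO emp_copy (emp_id, last_name, salary)
        VALUES (:NEW.emp_id, :NEW.last_name, :NEW.salary);
    ELSE
        INSERT INTO emp_base (emp_id, last_name, salary)
        VALUES (:NEW.emp_id, :NEW.last_name, :NEW.salary);
    END IF;
END;
/
```

Whether this is viable depends on the Oracle Apps form tolerating a view in place of the table, which would need to be verified.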
Tags: Database
Similar Questions
-
How can I insert data from another table into a table containing a timestamp column
How do you insert data from another table into a table if the target table contains a timestamp column? I tried to set the column's default value to GETDATE() in the target table, but it does not work.
I use MS SQL. Sorry, I managed to get around this by inserting NULL as the value.
-
Generate about 100 MB of data in a table using a loop
DB version: 11.2
How can I create about 100 MB of test data in a table using the minimum number of records?
If I use the below FOR loop, it takes 100,000 records to fill only 2 MB.

SQL> create table a2 (mynum1 number, mynum2 number);

Table created.

begin
  for i in 1..100000 loop
    insert into a2 values (i, i*2);
  end loop;
end;
/

select segment_name, bytes/1024/1024 MB
from   dba_segments
where  segment_name = 'A2' and owner = 'SCOTT';

SEGMENT_NAME  BYTES/1024/1024
------------  ---------------
A2                          2
Hello
Is this what you are looking for?

SQL> CREATE TABLE tbl1 (c1 CHAR(1024));  -- ~1 KB per row

Table created.

Elapsed: 00:00:00.00
SQL> INSERT /*+ APPEND */ INTO tbl1
     SELECT 'X' FROM dual CONNECT BY LEVEL < 100000;  -- ~100K rows * 1 KB = ~100 MB plus some overhead

99999 rows created.

Elapsed: 00:00:13.02
SQL> SELECT bytes/1024/1024 MB FROM user_segments WHERE segment_name = 'TBL1';

        MB
----------
    113.75
Lukasz
-
SQL Loader loading data into two Tables using a single CSV file
Dear all,
I have a requirement where I need to load data into 2 tables using a single CSV file.
So I wrote the following control file. But it loads only into the first table, and there is also nothing useful in the log file.
Please suggest how to achieve this.
Sample data
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Control file script
================
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
  Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
  Record_type char "ltrim(rtrim(:Record_type))",
  Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
  Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
when 1 = 1
fields terminated by ',' optionally enclosed by '"'
(
  Vendor_name char "ltrim(rtrim(:Vendor_name))",
  Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
  Address_line1 char "ltrim(rtrim(:Address_line1))",
  Address_line2 char "ltrim(rtrim(:Address_line2))",
  Address_line3 char "ltrim(rtrim(:Address_line3))"
)

The problem here is that it loads into only one table, the first one (table1).
Please guide me.
Thank you
Kumar
When you do not provide a starting position for the first field in table2, it starts with the field following the last one referenced in table1, so it starts with Vendor_site_code instead of Vendor_name. What you need to do instead is specify POSITION(1) for the first field in table2 and use FILLER fields. In addition, SQL*Loader does not like WHEN 1 = 1, and it is not needed anyway. See the example below, including the corrected control file.
Scott@orcl12c > HOST TYPE test.dat
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Scott@orcl12c > HOST TYPE test.ctl
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
  Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
  Record_type char "ltrim(rtrim(:Record_type))",
  Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
  Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
fields terminated by ',' optionally enclosed by '"'
(
  source_system_code FILLER POSITION(1),
  record_type FILLER,
  source_system_vendor_number FILLER,
  Vendor_name char "ltrim(rtrim(:Vendor_name))",
  Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
  Address_line1 char "ltrim(rtrim(:Address_line1))",
  Address_line2 char "ltrim(rtrim(:Address_line2))",
  Address_line3 char "ltrim(rtrim(:Address_line3))"
)
Scott@orcl12c > CREATE TABLE table1
  (Source_system_code VARCHAR2(13),
   Record_type VARCHAR2(11),
   Source_System_Vendor_number VARCHAR2(27),
   Vendor_name VARCHAR2(11))
/

Table created.

Scott@orcl12c > CREATE TABLE table2
  (Vendor_name VARCHAR2(11),
   Vendor_site_code VARCHAR2(16),
   Address_line1 VARCHAR2(13),
   Address_line2 VARCHAR2(13),
   Address_line3 VARCHAR2(13))
/

Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL = test.ctl DATA = test.dat LOG = test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1

Table TABLE1:
  1 row loaded successfully.
Table TABLE2:
  1 row loaded successfully.

Check the log file:
  test.log
for more information about the load.
Scott@orcl12c > SELECT * FROM table1
/

SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER VENDOR_NAME
------------- ----------- --------------------------- -----------
Victor        New         Ven001                      Vinay

1 row selected.

Scott@orcl12c > SELECT * FROM table2
/

VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
----------- ---------------- ------------- ------------- -------------
Vinay       Vin001           abc           def           xyz

1 row selected.
Scott@orcl12c >
-
Does inserting from a temporary table into another table use the TEMP tablespace?
Hi all,
I am inserting data from a temporary table into ds_info.
I am inserting 16.5 GB of data into the table.
I have 15 GB of temp tablespace.
Please confirm whether I have enough space in TEMP_TS,
OR
please clarify the question:
does an INSERT statement use the temp tablespace?
Thanks in advance.

An INSERT into a table that queries another table (or tables) can involve the use of the TEMP tablespace if the INSERT involves any of:
a. an ORDER BY
b. a GROUP BY
c. a JOIN between two or more source tables that is run as a hash join or as a merge join (a sorting operation)
d. an APPEND hint with indexes present on the target table (index maintenance for the new rows is deferred when APPEND is used)
Hemant K Chitale
http://hemantoracledba.blogspot.com -
How to post data into a table using Forms
Oracle Forms 6i
Hi All,
I have to load data into the table from Forms.
My table T1 columns are: in, out, intrin, introut, empname, empno
and another table T2 consists of three columns: empno, date, time.
In table T2 the time field carries a flag I or O, meaning IN or OUT.
My question: an employee may have 3 I flags (three IN times),
0815 I, 1200 I, 1415 I, and 3 OUT times, 1245 O, 1445 O, 1715 O, with empno 001.
If there is no record for empno 001, then first insert 0815 I into table T1's IN column; then
update 1200 into intrin and 1245 into introut, and 1415 into addin and 1445 into addout;
finally, 1715 into outtime.
If it is possible to do this without hardcoding the times, please show a good example.
Thanks in advance
Srikkanth.M

Now things are clear... whenever I have some free time I'll post code showing you how to do this.
For now, a few tips that may help you:
- create a table to store shift timings, so you don't hardcode them in your code
- create a cursor on T1 to loop through all the records
- check whether the current cursor row's empno and action (I or O) exist for the same date in T2 or not
- if no row exists, insert a new record; also check which SHIFT the time falls in, e.g. if it is between 0815 and 1645 and the action is 'I', then insert the record and set the column values accordingly
- if it exists, then update the record where empno = cur.empno and attendance_date = mydate;
Hope it helps,
Baig,
[My Oracle Blog | http://baigsorcl.blogspot.com/] -
How to add data to a table using a POST handler for a RESTful APEX application
Hi all,
I managed to create a RESTful web service using a GET handler for the RESTful service module. I am able to get a row of data on submitting a table row id in the application. But I cannot find an appropriate example of how new data can be POSTed into the table, or deleted. I created a POST handler for a URI template and am looking for guidance on how to proceed. Any help would be really appreciated.
Source for the POST handler:
begin
insert into ALL_BOOKS values (:id, :book);
end;
I also created the two parameters, id and book.
Hi jerry2134,
jerry2134 wrote:
I managed to create a RESTful web service using a GET handler for the RESTful service module. I am able to get a row of data on submitting a table row id in the application. But I cannot find an appropriate example of how new data can be POSTed into the table, or deleted. I created a POST handler for a URI template and am looking for guidance on how to proceed. Any help would be really appreciated.
Source for the POST handler:
begin
insert into ALL_BOOKS values (:id, :book);
end;
I also created the two parameters, id and book.
Check out the following OBE tutorials, which explain creating GET and POST RESTful web services and how to use them in APEX:
- Creating and Using RESTful Web Services in Application Express 4.2
- Creating and Using RESTful Web Services in Application Express 5.0
Also, what do you mean by "looking forward on how to proceed"? Do you want to use/consume the hosted RESTful web services you created in your Oracle APEX application?
If yes, then in your application you must create a RESTful Web Service Reference under Shared Components. Then create a form/report based on the web service reference.
Kind regards
Kiran
-
Loading XML data into a table using SQL/PLSQL
Hi experts,
Could you please help with the following requirement: I have the XML below (an .xml file on a server). I need to access this file, read the XML, and insert the data into a DB table using SQL and PL/SQL. Is it possible with the CDATA below? And there is a nested table in it.
Could someone please guide me if you have sample code and an xml file.
<?xml version="1.0" encoding="UTF-8"?>
<authxml>
  <generation_date><![CDATA[17/11/2015]]></generation_date>
  <generated_by><![CDATA[Admin Admin]]></generated_by>
  <year><![CDATA[2015]]></year>
  <month><![CDATA[01]]></month>
  <authors>
    <author><![CDATA[user author]]></author>
    <author_firstname><![CDATA[user]]></author_firstname>
    <author_lastname><![CDATA[author]]></author_lastname>
    <author_country><![CDATA[UAE]]></author_country>
    <author_email><![CDATA[[email protected]]]></author_email>
    <author_data_01><![CDATA[]]></author_data_01>
    <author_data_02><![CDATA[]]></author_data_02>
    <articles>
      <article_item>
        <article_id><![CDATA[123456]]></article_id>
        <publication><![CDATA[Al Bayan]]></publication>
        <section><![CDATA[Local]]></section>
        <issue_date><![CDATA[11/11/2015]]></issue_date>
        <page><![CDATA[2]]></page>
        <article_title><![CDATA[title.]]></article_title>
        <number_of_words><![CDATA[165]]></number_of_words>
        <original_price><![CDATA[200]]></original_price>
        <original_price_currency><![CDATA[AED]]></original_price_currency>
        <price><![CDATA[250]]></price>
        <price_currency><![CDATA[AED]]></price_currency>
      </article_item>
    </articles>
    <total_amount><![CDATA[250]]></total_amount>
    <total_amount_currency><![CDATA[AED]]></total_amount_currency>
  </authors>
</authxml>
Thanks in advance,
Suman
Using XMLTABLE...

SQL> ed
Wrote file afiedt.buf

with t(xml) as (
  select xmltype('... the sample XML document from the question ...')
  from   dual
)
-- end of sample data
--
-- assumptions:
-- (a) the XML may have several <authors> tags
-- (b) each <authors> may contain more than one <article_item>
--
select x.gen_by, x.gen_date, x.mn, x.yr
     , y.author, y.auth_fn, y.auth_ln, y.auth_cnt, y.auth_em, y.auth_d1, y.auth_d2
     , z.id, z.pub, z.sec, z.iss_dt, z.pg, z.art_ttl, z.num_wrds, z.oprice, z.ocurr, z.price, z.curr
from   t
     , xmltable('/authxml'
         passing t.xml
         columns gen_date varchar2(10) path './generation_date'
               , gen_by   varchar2(15) path './generated_by'
               , yr       varchar2(4)  path './year'
               , mn       varchar2(2)  path './month'
               , authors  xmltype      path '.'
       ) x
     , xmltable('/authxml/authors'
         passing x.authors
         columns author   varchar2(15) path './author'
               , auth_fn  varchar2(10) path './author_firstname'
               , auth_ln  varchar2(10) path './author_lastname'
               , auth_cnt varchar2(3)  path './author_country'
               , auth_em  varchar2(20) path './author_email'
               , auth_d1  varchar2(5)  path './author_data_01'
               , auth_d2  varchar2(5)  path './author_data_02'
               , articles xmltype      path './articles'
       ) y
     , xmltable('/articles/article_item'
         passing y.articles
         columns id       number       path './article_id'
               , pub      varchar2(10) path './publication'
               , sec      varchar2(10) path './section'
               , iss_dt   varchar2(10) path './issue_date'
               , pg       varchar2(3)  path './page'
               , art_ttl  varchar2(20) path './article_title'
               , num_wrds varchar2(5)  path './number_of_words'
               , oprice   varchar2(5)  path './original_price'
               , ocurr    varchar2(3)  path './original_price_currency'
               , price    varchar2(5)  path './price'
               , curr     varchar2(3)  path './price_currency'
       ) z
/

GEN_DATE   GEN_BY      YR   MN AUTHOR      AUTH_FN AUTH_LN AUT AUTH_EM           ID     PUB      SEC   ISS_DT     PG ART_TTL NUM_W OPRIC OCU PRICE CUR
---------- ----------- ---- -- ----------- ------- ------- --- ----------------- ------ -------- ----- ---------- -- ------- ----- ----- --- ----- ---
17/11/2015 Admin Admin 2015 01 user author user    author  UAE [email protected] 123456 Al Bayan Local 11/11/2015 2  title.  165   200   AED 250   AED

Of course, you'll want to change the data types, etc., as needed.
I assumed that the XML can contain several <authors> sections and that each section can contain several <article_item> entries. Thus the XMLTABLE aliased as 'x' extracts the header information of the XML and supplies the authors data to the XMLTABLE aliased 'y', which gets the multiple authors and in turn supplies the articles data to the XMLTABLE aliased 'z' for each article_item. The CDATA sections are handled automatically by SQL/XML (the XML functionality integrated into Oracle's SQL).
-
To access the EFS data from another disk using original certificates/keys
I use EFS on a workgroup XP Pro SP3 machine and have backed up my keys. I am trying to access the files on this disk, now mounted in another machine (also XP Pro SP3, in a workgroup). I can load the certificates from the first machine on the second machine by double-clicking them. Once the certificates are loaded, how do you decrypt? I get "Access denied" when I right-click and try to decrypt the files on the original drive. The MS help talks about designating a "File Recovery Agent", but that procedure seems to be aimed at machine domains and gets a bit fuzzy for workgroups. BTW, I'm testing my file recovery process; no data is at risk. Can you point me to a FAQ on decrypting a file on a disk that has been moved to another machine, using the original (CER & PFX files) encryption certificates/keys?
Hello Stephen,
The question you posted would be better suited for the IT Pro audience on TechNet. I would recommend posting your query in the TechNet forums to get help:
-
My select statement fails with the error:
ORA-19011: Character string buffer too small
The select statement looks like:
SELECT TO_CLOB(
         XMLELEMENT("accounts",
           XMLELEMENT("count",
             XMLATTRIBUTES(
               rownum AS "recordId",
               to_date('20130520','YYYYMMDD') AS "dateStarted",
               123456 AS "previousBatchId",
               56789 AS "previousRecordId"
             ),
             ....
             ....
             .....
             XMLFOREST(
               SIG_ROLE AS "SignatoryRole",
               to_char(TRANSFER_DATE,'YYYY-MM-DD') AS "TransferDate",
               NVL(REASON,0) AS "Reason"
             ) AS "Transfer"
           )
         )
       ) AS CRDTRPT
FROM ANY_TABLE;
- It looks like I can select only 4000 characters using the SELECT statement (please correct me if I'm wrong).
I would have used the XMLGEN package, but the environment team says there will be no mounted drives in the future with the arrival of EXADATA:
NO MOUNTED DRIVES, NO ACCESS TO THE DATABASE DIRECTORIES
No UTL_FILE
I need to use SPOOL to spool the resulting XML data of the SELECT query.
SQL is the standard in my org, but I can live with a PL/SQL solution too, to load the data into a table (cannot use SPOOL with PL/SQL).
What I would do is:
- load a CLOB column of an xml_report table from the above SELECT query
- then use SELECT * FROM xml_report to SPOOL the data to a file report.xml
No need for XMLTYPE data afterwards; an XML data stream is fine for me.
In addition, I need to validate the XML against an XSD.
The problem is that the resulting rows of the select query are expected to be 15000 to 20000 bytes long.
Oracle database version: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64 bit Production
A suggestion or a solution to this problem would be appreciated.
(Sorry for the use of BOLD, just to make it more readable and to highlight the important points.)
Cheers!
Rahul
It looks like I can select only 4000 characters using the SELECT statement (please correct me if I'm wrong)
You are not using the right method.
There is an implicit conversion from XMLType to the VARCHAR2 data type, as expected by the TO_CLOB function; hence the limitation, and the error.
To serialize XMLType to CLOB, use the XMLSerialize function:
SELECT XMLSerialize(DOCUMENT
         XMLELEMENT("accounts",
           ...
         )
       )
FROM ANY_TABLE;
For the rest of the requirement, I wish you good luck trying to spool the XML correctly.
You may need to play around with the SET LONG and SET LONGCHUNKSIZE commands to make it work.
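As an illustration of the SET LONG / SET LONGCHUNKSIZE advice (a sketch only; the file name and sizes are assumptions chosen to match the 15000-20000 byte rows mentioned above):

```sql
-- SQL*Plus session settings typically needed to spool long CLOB/XML output.
SET LONG 20000          -- max bytes of a LONG/CLOB value to display
SET LONGCHUNKSIZE 20000 -- fetch the value in one chunk
SET PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON FEEDBACK OFF HEADING OFF
SPOOL report.xml
SELECT XMLSerialize(DOCUMENT XMLELEMENT("accounts", ...)) FROM any_table;
SPOOL OFF
```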
-
Select table data not in another table
Hello
I want to select data from table A which is not in table B.
Currently I am doing:

select snoA,
       nameA,
       dobA
from   A
where  snoA not in
       (select snoB
        from   B
        where  snoA = snoB
        and    nameA = nameB)

But the above is very slow.
Can I do something like:

select snoA,
       nameA,
       dobA
from   A, B
where  EXCLUDE (snoA = snoB and nameA = nameB)

Please note that I need the WHERE condition on both columns.
Any help will be appreciated.
-Harvey

What are the approximate data volumes of A and B?
What is 'very slow'?
What version of Oracle?
What is the query plan?
Without knowing anything about your system, my first thought would be to see if a NOT EXISTS happened to be faster for your data:

SELECT snoA, nameA, dobA
FROM   a
WHERE  NOT EXISTS ( SELECT 1
                    FROM   b
                    WHERE  a.snoA = b.snoB
                    AND    a.nameA = b.nameB )

Of course, I don't know why you want to join A & B in your NOT IN subquery. It would seem that you just need a correlated subquery, i.e.

SELECT snoA, nameA, dobA
FROM   a
WHERE  snoA NOT IN ( SELECT snoB
                     FROM   b
                     WHERE  a.snoA = b.snoB
                     AND    a.nameA = b.nameB )

This should be more efficient than the original query. The NOT EXISTS version may or may not be more efficient than NOT IN, depending on data volumes.
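One caveat worth adding (a general SQL point, not from this thread): NOT IN behaves unexpectedly when the subquery can return a NULL, whereas NOT EXISTS does not. A minimal sketch with hypothetical one-column tables:

```sql
-- Hypothetical tables to show the NULL pitfall of NOT IN.
CREATE TABLE a (snoA NUMBER);
CREATE TABLE b (snoB NUMBER);
INSERT INTO a VALUES (1);
INSERT INTO b VALUES (NULL);

-- Returns no rows: "1 NOT IN (NULL)" evaluates to UNKNOWN, never TRUE.
SELECT * FROM a WHERE snoA NOT IN (SELECT snoB FROM b);

-- Returns the row with snoA = 1: the correlated predicate is simply
-- never true for the NULL row in b, so NOT EXISTS holds.
SELECT * FROM a
WHERE  NOT EXISTS (SELECT 1 FROM b WHERE b.snoB = a.snoA);
```

If snoB or nameB is nullable, this difference alone can change the results, not just the performance.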
Justin
-
BEFORE INSERT trigger to create partitions problem
Hello
I'm having a problem with the following situation in Oracle 8i:
I have a table TEST_TABLE, which is partitioned by range on a DATE column. The idea is to have a partition for each month, so the HIGH_VALUE of a partition is always the first day of the month following the one the partition represents.
I created a BEFORE INSERT trigger on the table TEST_TABLE, which tests whether the partition for the month of the record being inserted exists and, in case it doesn't, calls an AUTONOMOUS_TRANSACTION procedure to create the partition.
Running the code below, one can see that even though partitions are created as expected, when you try to insert a record with a date beyond the last partition for the first time, this error is returned:
ORA-14400: inserted partition key is beyond highest legal partition key
Note that if you run the same insert statement again, it is inserted correctly into the partition that was created on the first try.
I'll appreciate any help on this matter.
code
----------------
CREATE TABLE TEST_TABLE (
    ID NUMBER,
    DT DATE
)
TABLESPACE USERS
PARTITION BY RANGE (DT)
(
    PARTITION PART_B42009 VALUES LESS THAN (TO_DATE('2009-01-01 00:00:00','YYYY-MM-DD HH24:MI:SS','NLS_CALENDAR=GREGORIAN'))
    LOGGING
    TABLESPACE USERS
);
/
CREATE OR REPLACE PROCEDURE SP_ADD_PARTITION (P_DATE TEST_TABLE.DT%TYPE)
IS
    PRAGMA AUTONOMOUS_TRANSACTION;
    V_STR VARCHAR2(500);
BEGIN
    V_STR := 'ALTER TABLE TEST_TABLE ADD '
          || 'PARTITION BIRD' || TO_CHAR(P_DATE, 'YYYYMM')
          || ' VALUES LESS THAN (TO_DATE('''
          || TO_CHAR(ADD_MONTHS(P_DATE, 1), 'YYYY-MM') || '-01 00:00:00'','
          || '''SYYYY-MM-DD HH24:MI:SS'',''NLS_CALENDAR=GREGORIAN''))';
    EXECUTE IMMEDIATE (V_STR);
END SP_ADD_PARTITION;
/
CREATE OR REPLACE TRIGGER TR_B_I_R_TEST_TABLE
BEFORE INSERT
ON TEST_TABLE FOR EACH ROW
DECLARE
    V_PARTITION_EXISTS NUMBER;
BEGIN
    IF :NEW.DT >= TO_DATE('2009-01-01 00:00:00','YYYY-MM-DD HH24:MI:SS') THEN
        EXECUTE IMMEDIATE 'SELECT COUNT(1) '
                       || 'FROM all_tab_partitions atp '
                       || 'WHERE atp.table_name = ''TEST_TABLE'' '
                       || 'AND atp.partition_name = :v1'
        INTO V_PARTITION_EXISTS
        USING 'BIRD' || TO_CHAR(:NEW.DT, 'YYYYMM');
        IF V_PARTITION_EXISTS = 0 THEN
            DBMS_OUTPUT.put_line('Partition [' || 'BIRD' || TO_CHAR(:NEW.DT, 'YYYYMM') || '] does not exist!');
            DBMS_OUTPUT.put_line('Creating...');
            SP_ADD_PARTITION(:NEW.DT);
            DBMS_OUTPUT.put_line('Success.');
            EXECUTE IMMEDIATE 'SELECT COUNT(1) '
                           || 'FROM all_tab_partitions atp '
                           || 'WHERE atp.table_name = ''TEST_TABLE'' '
                           || 'AND atp.partition_name = :v1'
            INTO V_PARTITION_EXISTS
            USING 'BIRD' || TO_CHAR(:NEW.DT, 'YYYYMM');
            IF V_PARTITION_EXISTS = 1 THEN
                DBMS_OUTPUT.put_line('It''s visible at this point.');
            ELSE
                DBMS_OUTPUT.put_line('It''s not visible at this point.');
            END IF;
        ELSE
            DBMS_OUTPUT.put_line('Partition [' || 'BIRD' || TO_CHAR(:NEW.DT, 'YYYYMM') || '] already exists!');
        END IF;
    END IF;
    DBMS_OUTPUT.put_line('Continuing with insert...');
END TR_B_I_R_TEST_TABLE;
/
-- Goes to the low partition
INSERT INTO TEST_TABLE VALUES (1, TO_DATE('2008-12-31 23:59:59','YYYY-MM-DD HH24:MI:SS'));
-- Returns the error on the first try
INSERT INTO TEST_TABLE VALUES (2, TO_DATE('2009-01-01 00:00:01','YYYY-MM-DD HH24:MI:SS'));
----------------

It is the use of the pragma AUTONOMOUS_TRANSACTION. Your current transaction cannot see the result of that DDL, since it occurs outside of the current transaction. The clue is in the name.
Of course, you cannot run DDL in a trigger without using this pragma, so you're pretty much stuck. There is a solution in 11g, but that will not help you here. Unfortunately, your only option is to pre-create the required partitions ahead of need. For example, you might have a DBMS_JOB that creates a partition for the next month and runs on the last day of each month (or on a logical company date).
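The 11g solution alluded to is presumably interval partitioning, which creates monthly partitions automatically on insert. A sketch of the table above rebuilt that way (11g+ only, so not applicable to the 8i case in the question):

```sql
-- 11g interval partitioning: Oracle adds a new monthly partition
-- automatically when an insert maps beyond the highest existing one.
CREATE TABLE test_table (
    id NUMBER,
    dt DATE
)
PARTITION BY RANGE (dt)
INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
(
    PARTITION part_b42009 VALUES LESS THAN
        (TO_DATE('2009-01-01', 'YYYY-MM-DD'))
);

-- No trigger needed: this insert transparently creates the 2009-01 partition.
INSERT INTO test_table
VALUES (2, TO_DATE('2009-01-01 00:00:01', 'YYYY-MM-DD HH24:MI:SS'));
```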
Cheers, APC
blog: http://radiofreetooting.blogspot.com
-
Updating data in a table using XMLTYPE data
I have done inserts using XMLTYPE data but have never done an update. Can someone give me some advice?
PROCEDURE ADD_LABORDER_CODES (
    IN_ORDERCODESXML IN CLOB DEFAULT NULL,
    OUT_AFFECTEDROWS OUT NUMBER
)
AS
    X SYS.XMLTYPE;
BEGIN
    X := SYS.XMLTYPE.CREATEXML(IN_ORDERCODESXML);
    INSERT INTO MAINT_LABORD_CODES (
        LABORD_CODE_ID,
        COMPENDIUM_ID,
        ORDER_CODE,
        ORDER_DESC,
        LOINC_CODE,
        ACTIVE,
        TIMESTAMP,
        MODIFIED_BY)
    SELECT MLOCDS_SEQ.NEXTVAL,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/COMPENDIUM_ID') AS COMPENDIUM_ID,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/ORDER_CODE') AS ORDER_CODE,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/ORDER_DESC') AS ORDER_DESC,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/LOINC_CODE') AS LOINC_CODE,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/ACTIVE') AS ACTIVE,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/TIMESTAMP') AS TIMESTAMP,
           EXTRACTVALUE(VALUE(MLOC), '/ORDERCODE/MODIFIED_BY') AS MODIFIED_BY
    FROM TABLE(XMLSEQUENCE(EXTRACT(X, '/ORDERCODES/ORDERCODE'))) MLOC;
    OUT_AFFECTEDROWS := SQL%ROWCOUNT;
EXCEPTION
    WHEN OTHERS THEN
        dbms_output.put_line(SQLERRM);
        RAISE_APPLICATION_ERROR(-20001, SQLERRM);
END;
Example using MERGE:
if the row exists in the target table (based on the COMPENDIUM_ID and ORDER_CODE values), it is UPDATEd; if not, it is INSERTed:

declare in_ordercodesxml clob := '
500 696231 ABO Group & Rh Type NULL 12345 Y 2014-08-13 1

Also note that I used XMLTABLE instead of TABLE/XMLSEQUENCE, which is much easier to use (and not deprecated in the latest versions).
You did not specify the date format in the TIMESTAMP element, so I assumed one conforming to W3C.
If you have a problem with that part, revert to a VARCHAR2 projection and use TO_DATE with the actual format mask.
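The MERGE code itself did not survive the page formatting, so here is a minimal sketch of the pattern the answer describes (XMLTABLE feeding a MERGE keyed on COMPENDIUM_ID and ORDER_CODE; the XPath and column names follow the procedure in the question, while the column sizes are assumptions):

```sql
-- Sketch: upsert rows parsed from the ORDERCODES document,
-- to be used inside a procedure in place of the INSERT above.
MERGE INTO maint_labord_codes t
USING (
    SELECT x.compendium_id, x.order_code, x.order_desc,
           x.loinc_code, x.active, x.modified_by
    FROM XMLTABLE('/ORDERCODES/ORDERCODE'
                  PASSING XMLTYPE(in_ordercodesxml)
                  COLUMNS compendium_id NUMBER        PATH 'COMPENDIUM_ID'
                        , order_code    VARCHAR2(30)  PATH 'ORDER_CODE'
                        , order_desc    VARCHAR2(100) PATH 'ORDER_DESC'
                        , loinc_code    VARCHAR2(30)  PATH 'LOINC_CODE'
                        , active        VARCHAR2(1)   PATH 'ACTIVE'
                        , modified_by   VARCHAR2(30)  PATH 'MODIFIED_BY'
                 ) x
) s
ON (t.compendium_id = s.compendium_id AND t.order_code = s.order_code)
WHEN MATCHED THEN UPDATE SET
    t.order_desc  = s.order_desc,
    t.loinc_code  = s.loinc_code,
    t.active      = s.active,
    t.modified_by = s.modified_by
WHEN NOT MATCHED THEN INSERT
    (labord_code_id, compendium_id, order_code, order_desc,
     loinc_code, active, modified_by)
    VALUES
    (mlocds_seq.NEXTVAL, s.compendium_id, s.order_code, s.order_desc,
     s.loinc_code, s.active, s.modified_by);
```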
-
Copy a column of data into another table?
Hello world
I am an SQL beginner who has spent ridiculous amounts of time trying to find a solution to a no doubt very simple problem.
I have two tables of 162,000 rows each. One table has the following format:
table T1
PID NUMBER
PG NUMBER
PCS NUMBER
PS NUMBER
PL NUMBER
DRIFTWOOD NUMBER
(contrived, but simple data)
The DRIFTWOOD column is empty; all the others are completely filled.
The other table has only one filled column:
table T2
CALCULATEDCHANCE NUMBER
My intention is to copy all the rows of the CALCULATEDCHANCE column in T2 into the DRIFTWOOD column in T1.
INSERT will not work because it adds records. I tried to use UPDATE, but maybe I don't get the syntax right. All I managed was to copy the first value of CALCULATEDCHANCE into all rows of DRIFTWOOD.
Thanks in advance for your answers!

Try:
update T2 set calculatedchance = dbms_random.value(0,100);
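As a follow-up sketch (an assumption, not from the thread): with no shared key between T1 and T2, a row-by-row copy needs an explicit correlation, and one common workaround pairs rows by ROWNUM:

```sql
-- Pair the Nth row of T2 with the Nth row of T1 and copy the value.
-- Caution: without an ORDER BY tied to a real key, the pairing is arbitrary.
MERGE INTO t1
USING (
    SELECT x.t1_rid, y.calculatedchance
    FROM  (SELECT rowid AS t1_rid, ROWNUM AS rn FROM t1) x
    JOIN  (SELECT calculatedchance, ROWNUM AS rn FROM t2) y
    ON    (x.rn = y.rn)
) s
ON (t1.rowid = s.t1_rid)
WHEN MATCHED THEN UPDATE SET t1.driftwood = s.calculatedchance;
```

If the two tables do share a real key, joining on that key instead of ROWNUM is the safe approach.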
-
Updating data in a table using LAG/LEAD
Hello!
I have a table that looks like:
CREATE TABLE customer_info_test
(
  account_num VARCHAR2(40 BYTE),
  phone       VARCHAR2(100 BYTE),
  email       VARCHAR2(300 BYTE),
  start_dt    DATE,
  change_dt   DATE,
  end_dt      DATE
);
The example data:
INSERT INTO customer_info_test VALUES ('BOB', '555-1234', '', TO_DATE('2011-01-01','YYYY-MM-DD'), TO_DATE('2011-01-06','YYYY-MM-DD'), TO_DATE('2011-01-10','YYYY-MM-DD'));
INSERT INTO customer_info_test VALUES ('BOB', '555-1234', 'BOB@GMAIL.COM', TO_DATE('2011-01-01','YYYY-MM-DD'), TO_DATE('2011-01-11','YYYY-MM-DD'), NULL);
INSERT INTO customer_info_test VALUES ('BOB', '555-1234', 'BOB@GMAIL.COM', TO_DATE('2011-01-01','YYYY-MM-DD'), TO_DATE('2011-01-15','YYYY-MM-DD'), NULL);
INSERT INTO customer_info_test VALUES ('JACK', '555-4321', '', TO_DATE('2011-03-01','YYYY-MM-DD'), TO_DATE('2011-03-06','YYYY-MM-DD'), NULL);
INSERT INTO customer_info_test VALUES ('JACK', '555-4321', 'JACK@GMAIL.COM', TO_DATE('2011-03-01','YYYY-MM-DD'), TO_DATE('2011-03-11','YYYY-MM-DD'), NULL);
My question:
How can I set end_dt (if null) to the next change_dt minus one?
This query shows what I want:

select rowid, account_num, phone, email, start_dt, change_dt, end_dt,
       nvl(end_dt, lead(change_dt-1, 1) over (partition by account_num order by start_dt)) enddt
from   customer_info_test
where  end_dt is null;

So I want to update the table itself with the date in enddt. But how do I do this?
It must be done in a single statement...
Thanks in advance
Richard
Published by: user6702107 on 05-Jan-2011 09:11
Edited by: Rydman on April 17, 2012 15:01

Please post sample data!
If your query returns the desired results, you can use MERGE:

SQL> select * from customer_info_test;

ACCOUNT_NU PHONE      EMAIL                     START_DT CHANGE_D END_DT
---------- ---------- ------------------------- -------- -------- --------
BOB        555-1234                             01-01-11 06-01-11 10-01-11
BOB        555-1234   BOB@GMAIL.COM             01-01-11 11-01-11
BOB        555-1234   BOB@GMAIL.COM             01-01-11 15-01-11
JACK       555-4321                             01-03-11 06-03-11
JACK       555-4321   JACK@GMAIL.COM            01-03-11 11-03-11

5 rows selected.

SQL> merge into customer_info_test a
     using ( select rowid rid
                  , nvl(end_dt, lead(change_dt-1, 1) over (partition by account_num order by start_dt)) new_end_dt
             from   customer_info_test
             where  end_dt is null
           ) b
     on (a.rowid = b.rid)
     when matched then update set a.end_dt = b.new_end_dt;

4 rows merged.

SQL> select * from customer_info_test;

ACCOUNT_NU PHONE      EMAIL                     START_DT CHANGE_D END_DT
---------- ---------- ------------------------- -------- -------- --------
BOB        555-1234                             01-01-11 06-01-11 10-01-11
BOB        555-1234   BOB@GMAIL.COM             01-01-11 11-01-11 14-01-11
BOB        555-1234   BOB@GMAIL.COM             01-01-11 15-01-11
JACK       555-4321                             01-03-11 06-03-11 10-03-11
JACK       555-4321   JACK@GMAIL.COM            01-03-11 11-03-11

5 rows selected.