sum of columns query
Hi friends
I am using Oracle 10g on Windows Server 2008. How can I get the sums of the values of different columns based on different conditions?
Name of the table: stv_dtls
inv_type varchar2(5)
credit number(12,2)
cash number(12,2)
The inv_type column holds four kinds of values: type1, type2, type3, and null (empty).
I want to sum the credit column under a heading for each of type1, type2 and type3, sum the cash amounts under a cash heading for each type, and also show the totals of credit and cash; the null rows I want shown separately.
TYPE_1_CR    TYPE_1_CA   TYPE_2_CR    TYPE_2_CA   TYPE_3_CR    TYPE_3_CA   TOTAL_CR     TOTAL_CA    NULL_CR      NULL_CA
sum(credit)  sum(cash)   sum(credit)  sum(cash)   sum(credit)  sum(cash)   sum(credit)  sum(cash)   sum(credit)  sum(cash)
Kindly give me a suggestion to solve this query
Regards
RDK
Hello
It looks like you want the SUM aggregate function, with CASE expressions where you want to include only certain types:
SELECT SUM (CASE WHEN inv_type = 'type1' THEN credit END) AS type_1_cr
     , SUM (CASE WHEN inv_type = 'type1' THEN cash   END) AS type_1_ca
     , SUM (CASE WHEN inv_type = 'type2' THEN credit END) AS type_2_cr
     , SUM (CASE WHEN inv_type = 'type2' THEN cash   END) AS type_2_ca
     , SUM (CASE WHEN inv_type = 'type3' THEN credit END) AS type_3_cr
     , SUM (CASE WHEN inv_type = 'type3' THEN cash   END) AS type_3_ca
     , SUM (cash)   AS total_ca
     , SUM (credit) AS total_cr
     , SUM (CASE WHEN inv_type IS NULL THEN credit END) AS null_cr
     , SUM (CASE WHEN inv_type IS NULL THEN cash   END) AS null_ca
FROM   stv_dtls
;
This will work in Oracle 8.1 and higher, but starting in version 11.1 you may prefer SELECT ... PIVOT.
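For anyone who wants to try the conditional-aggregation pattern outside Oracle, here is a minimal runnable sketch using Python's sqlite3; the table and rows are made-up sample data for illustration only, not the poster's real data:

```python
import sqlite3

# In-memory stand-in for stv_dtls; the rows are made-up sample data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE stv_dtls (inv_type TEXT, credit REAL, cash REAL)")
con.executemany(
    "INSERT INTO stv_dtls VALUES (?, ?, ?)",
    [("type1", 100, 10), ("type1", 50, 5),
     ("type2", 200, 20), ("type3", 300, 30),
     (None, 40, 4)],
)

# SUM over a CASE expression adds up only the rows matching each type;
# non-matching rows yield NULL, which SUM ignores.
row = con.execute("""
    SELECT SUM(CASE WHEN inv_type = 'type1' THEN credit END) AS type_1_cr,
           SUM(CASE WHEN inv_type = 'type1' THEN cash   END) AS type_1_ca,
           SUM(credit)                                       AS total_cr,
           SUM(CASE WHEN inv_type IS NULL THEN credit END)   AS null_cr
    FROM stv_dtls
""").fetchone()
print(row)  # (150.0, 15.0, 690.0, 40.0)
```

The same query text (with the remaining type2/type3 and cash columns added back) runs unchanged against Oracle.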
I hope that answers your question.
If not, post a little sample data (CREATE TABLE and INSERT statements for only the relevant columns) and the results you want from that data.
See the FAQ forum: https://forums.oracle.com/message/9362002#9362002
Tags: Database
Similar Questions
-
Dear members,
Suppose I have a table named order, and this table has 3 columns named order_number, trip and net_weight.
The data are as follows:

Trip   Order#   Net_weight
--------------------------
1876   1234     450
1876   5678     300
7865   6783     250
7865   9738     350

From the above you can see that there are 2 trips. I need to write SQL in such a way that my data looks like below, where the new sum column is the sum of net_weight based on the trip:

Trip   Order#   Net_weight   Sum
--------------------------------
1876   1234     450          750
1876   5678     300          750
7865   6783     250          600
7865   9738     350          600
sum = 750-> (450 + 300 based on trip 1876)
sum = 600-> (250 + 350 based on trip 7865)
How can I write a SQL query to get the data as described above?
Thank you
Sandeep

Use analytic SUM:
with t as (
  select 1876 trip, 1234 order#, 450 net_weight from dual union all
  select 1876, 5678, 300 from dual union all
  select 7865, 6783, 250 from dual union all
  select 7865, 9738, 350 from dual
) -- end of on-the-fly data sample
select trip,
       order#,
       net_weight,
       sum(net_weight) over(partition by trip) total_net_weight
from   t
order  by trip
/

      TRIP     ORDER# NET_WEIGHT TOTAL_NET_WEIGHT
---------- ---------- ---------- ----------------
      1876       1234        450              750
      1876       5678        300              750
      7865       6783        250              600
      7865       9738        350              600

SQL>
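The same analytic SUM works in any engine with window functions; as a runnable sketch, here it is with Python's sqlite3 (the ORDER# column is renamed order_no because # is not legal in an unquoted SQLite identifier):

```python
import sqlite3

# In-memory stand-in for the trip data from the post.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (trip INTEGER, order_no INTEGER, net_weight INTEGER)")
con.executemany(
    "INSERT INTO t VALUES (?, ?, ?)",
    [(1876, 1234, 450), (1876, 5678, 300),
     (7865, 6783, 250), (7865, 9738, 350)],
)

# The analytic SUM keeps every detail row and repeats the per-trip total,
# unlike a GROUP BY, which would collapse the rows.
rows = con.execute("""
    SELECT trip, order_no, net_weight,
           SUM(net_weight) OVER (PARTITION BY trip) AS total_net_weight
    FROM t
    ORDER BY trip, order_no
""").fetchall()
for r in rows:
    print(r)
# (1876, 1234, 450, 750)
# (1876, 5678, 300, 750)
# (7865, 6783, 250, 600)
# (7865, 9738, 350, 600)
```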
SY.
-
sum of a column that contains values in time format
Hi all
I give you a piece of my code below; kindly help.

SELECT dif.EMPLOYEE_NUMBER Employee#,
       dif.FULL_name EmployeeName,
       TO_CHAR(START_DATE,'dd-Mon-rrrr') DOJ,
       dif.DEPT_NAME,
       POSITION,
       DATE_ENTRAY AttendanceDate,
       to_char(DATE_ENTRAY,'DY') day,
       TO_CHAR(LNE1,'hh24:mi') TimeIn1,
       TO_CHAR(LNE2,'hh24:mi') TimeOut1,
       TO_CHAR(LNE3,'hh24:mi') TimeIn2,
       TO_CHAR(LNE4,'hh24:mi') TimeOut2,
       TO_CHAR(LNE5,'hh24:mi') TimeIn3,
       TO_CHAR(LNE6,'hh24:mi') TimeOut3,
       to_char(decode(LNE8,null,
                 decode(LNE7,null,
                   decode(LNE6,null,
                     decode(LNE5,null,
                       decode(LNE4,null,
                         decode(LNE3,null,
                           decode(LNE2,null,LNE2,LNE2),
                         LNE3),
                       LNE4),
                     LNE5),
                   LNE6),
                 LNE7),
               LNE8),'hh24:mi') TimeOuts,
       ACT_HOUR Work_Hrs,
       decode(DLY_ABSENT_TYPE,'Late',DED_ABS,'Late (Deduction)',DED_ABS,'00:00') Late_Hrs,
       ACT_OVT Over_Time
FROM   jjj_PUNCH_DATA_EMP_LIST trn,
       jjj_emp_def dif,
       jjj_PUNCH_CARD_ELEG ele
WHERE  trn.EMPLOYEE_NUMBER = dif.EMPLOYEE_NUMBER
AND    ele.EMPLOYEE_NUMBER = trn.EMPLOYEE_NUMBER
AND    DATE_ENTRAY between '23-Aug-2009' and '24-Aug-2009'

I need to find the sum of the Work_Hrs column. ACT_HOUR (Work_Hrs) belongs to the table jjj_PUNCH_DATA_EMP_LIST trn, and the datatype of act_hour is varchar(10 byte). The values of the act_hour column (I need the sum of this column):
08:00
07:22
06:08
Thanking you in advance
Regards
Oracle user

Hello
Thanks for posting the CREATE TABLE and INSERT. That really helps.
Do you want '19:33' as the output? That makes it look like 19 hours and 33 minutes. Most people would represent 19.33 hours (that is, 19 and roughly 1/3 hours) as '19:20'. Shouldn't the sum of 'x:00' and 'y:20' be 'z:20'? If you really want '19:33', see Hoek's solution.
If you really want '19:20', then you were on the right track.
I think you were trying to do this:

WITH got_total_hours AS
(
    SELECT SUM ( TO_NUMBER (SUBSTR (act_hour, 1, 2))
               + ( TO_NUMBER (SUBSTR (act_hour, 4, 2)) / 60 )
               ) AS total_hours
    FROM   p
)
SELECT TO_CHAR (FLOOR (total_hours))
       || ':'
       || TO_CHAR ( MOD (total_hours, 1) * 60
                  , 'fm00'
                  ) AS hh_mm
FROM   got_total_hours
;
It looks like you were trying to compute total_hours in a subquery, then use the total_hours alias in a super-query, which is exactly right. You're just confused about how to write a subquery.
There are two basic ways to write subqueries:
(1) WITH clause:
WITH sub_query AS
(
    SELECT ...
)
SELECT ...
FROM   sub_query
;
(2) Inline view:
SELECT ...
FROM  (  -- Begin sub_query
         SELECT ...
      )  -- End sub_query
;
Looks like you tried a bit of each method.
In most cases (including this problem) either one will work.
Other problems are much easier using a WITH clause, and WITH clauses are usually easier to read and understand, so I recommend that you always use a WITH clause rather than an inline view.

You can also change Hoek's solution to get the '19:20' result without a subquery (or without using any complicated expression more than once, which is the only reason I suggested a subquery). Since Oracle date arithmetic works in days, not hours, where Hoek's solution computed the number of hours you would divide by 24 to get the number of days. If you then use TO_CHAR to format the time, however, the results can be misleading if the total is 24 hours or more; you might be better off using NUMTODSINTERVAL.
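The hh:mi arithmetic itself is easy to sanity-check outside the database; here is a small Python sketch of the same substring-and-divide-by-60 approach, using the three sample act_hour values from the post:

```python
# Sum 'hh:mi' strings the same way the SQL does: hours plus minutes/60,
# then format the fractional total back as hours:minutes.
values = ["08:00", "07:22", "06:08"]  # sample act_hour values from the post

total_hours = sum(int(v[:2]) + int(v[3:5]) / 60 for v in values)

hh = int(total_hours)
mm = round((total_hours - hh) * 60)
print(f"{hh}:{mm:02d}")  # 21:30
```

This confirms the point above: 8:00 + 7:22 + 6:08 is 21 hours 30 minutes, i.e. '21:30', not the result you would get by adding the strings as decimal numbers.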
-
I have a table with 3 columns A, B, C. I want to store the sum of columns A and B in column C without using DML statements. Can anyone please help with how to do this?
In 11.1 and above you have virtual columns:
SQL> create table t
  2  (
  3    a number
  4  , b number
  5  , c generated always as (a+b) virtual
  6  );

Table created.

SQL> insert into t (a, b) values (1, 2);

1 row created.

SQL> select * from t;

         A          B          C
---------- ---------- ----------
         1          2          3
Before that, a before-insert trigger:
SQL> create table t
  2  (
  3    a number
  4  , b number
  5  , c number
  6  );

Table created.

SQL> create or replace trigger t_default before insert on t for each row
  2  begin
  3    :new.c := :new.a + :new.b;
  4  end;
  5  /

Trigger created.

SQL> insert into t (a, b) values (1, 2);

1 row created.

SQL> select * from t;

         A          B          C
---------- ---------- ----------
         1          2          3
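Other engines offer the same two approaches; as a runnable illustration, here is the trigger variant in Python's sqlite3 (SQLite also has generated columns in recent versions, but the trigger form is the more portable sketch, and SQLite triggers cannot assign to the new row directly the way Oracle's :new.c can):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a NUMBER, b NUMBER, c NUMBER)")

# SQLite has no ":new.c := ..." assignment in BEFORE INSERT triggers,
# so this AFTER INSERT trigger fills in c = a + b for the inserted row.
con.execute("""
    CREATE TRIGGER t_default AFTER INSERT ON t
    FOR EACH ROW
    BEGIN
        UPDATE t SET c = NEW.a + NEW.b WHERE rowid = NEW.rowid;
    END
""")

con.execute("INSERT INTO t (a, b) VALUES (1, 2)")
print(con.execute("SELECT a, b, c FROM t").fetchone())  # (1, 2, 3)
```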
-
Master table column updated based on the sum of a detail table column
Using JDev 11.1.1.6. I have a master-detail table based on a BC view link.
The master table has a column that displays either an InputText or an OutputText, based on the value in another column.
If the InputText is displayed, the user can enter a value and the database will be updated with that value.
If the OutputText is displayed, it must be the sum of a column in the detail table; this value will also be written to the database.
Question:
How can I populate the OutputText in the master table with the sum of the values in a column of the detail table?
The detail table column is a manually entered InputText field.
Thank you.
Create a transient attribute in the master view object and set its expression as follows: DetailVoAccessorName.sum("ColumnName");
This will calculate the sum of the detail table column; you can then copy the transient attribute's value into the database-backed attribute in the save operation.
Ashish
-
Summing columns across combined queries in Answers, then calculating off the summed columns
I have three Answers queries (each with 4 columns) whose results I have combined.
The four columns are: Campaign, # Promoted, # Responses, # Promoted Responses
Query 1: Campaign, # Promoted, 0, 0
Query 2: Campaign, 0, # Responses, 0
Query 3: Campaign, 0, 0, # Promoted Responses
My query returns the following 3 rows (assuming the campaign code is ABC):

Campaign   # Promoted   # Responses   # Promoted Responses
ABC        100          0             0
ABC        0            12            0
ABC        0            0             9

I want to set it up so that it aggregates by campaign to get one result row, as follows:

Campaign   # Promoted   # Responses   # Promoted Responses
ABC        100          12            9

I can get this result by converting to a pivot table (I would rather not go that route if possible because I have other calculations that I need to add). Once I get this summary down to a single-row result I then make a few calculated columns, such as:
- Response % (# Responses / # Promoted)
- Promoted response % (# Promoted Responses / # Promoted)
When I add the percentage calculations to the combined query I get null or zero for my results. I used the Add button to insert a calculation and then I used the following formula: saw_2 / saw_3
Here is what I would prefer:
- Sum the columns in a Table view (not a pivot table).
- Add calculated columns to the combined query that produce results based on the summed columns.
- If I do use the pivot table, then how do I get the calculated values to work?
Thank you...
Ben,
Try using the Aggregation Rule on the columns in the Table view and see if that makes a difference.
I just tried this and the SUM of each of the criteria gave the results I wanted, which is what you are looking for.
If nothing works for you, send me a screenshot of what you are trying to do at vijaybez at gmail and I can properly figure out what is happening...
-
Hello
Can someone show me the best query to get the output below?
The query I used gives me the output below, but I am confused about how to get the grand total.

select grade, sum(losal) lowsal, sum(hisal) hisal, sum(losal+hisal) as total
from SALGRADE
group by grade

OUTPUT:
GRADE   LOWSAL   HISAL   TOTAL
1       1000     2422    3422
2       1801     2490    4291
4       2001     3000    5001
5       3001     9999    13000
3       1401     2000    3401

plus a grand total of the "TOTAL" column.
CREATE TABLE SALGRADE
(
  GRADE NUMBER,
  LOSAL NUMBER,
  HISAL NUMBER
);

INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (1, 300, 1222);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (2, 300, 545);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (1, 700, 1200);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (2, 1201, 1400);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (3, 1401, 2000);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (4, 2001, 3000);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (5, 3001, 9999);
INSERT INTO SALGRADE (GRADE, LOSAL, HISAL) VALUES (2, 300, 545);
COMMIT;
Thank you
Edited by: GIRIKAIMAL on May 11, 2012 09:58
Edited by: GIRIKAIMAL on May 11, 2012 09:59

Grand total for all columns:
select grade, sum(losal) lowsal, sum(hisal) hisal, sum(losal+hisal) total
from   salgrade
group  by grouping sets((),(grade))
order  by grade nulls last
/

     GRADE     LOWSAL      HISAL      TOTAL
---------- ---------- ---------- ----------
         1       1700       3622       5322
         2       3002       3890       6892
         3       2802       4000       6802
         4       4002       6000      10002
         5       6002      19998      26000
                17508      37510      55018

6 rows selected.
Grand total just for the TOTAL column:
select grade,
       sign(grade) * sum(losal) lowsal,
       sign(grade) * sum(hisal) hisal,
       sum(losal+hisal) total
from   salgrade
group  by grouping sets((),(grade))
order  by grade nulls last
/

     GRADE     LOWSAL      HISAL      TOTAL
---------- ---------- ---------- ----------
         1       1700       3622       5322
         2       3002       3890       6892
         3       2802       4000       6802
         4       4002       6000      10002
         5       6002      19998      26000
                                      55018

6 rows selected.

SQL>
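GROUPING SETS is Oracle syntax; in engines without it, the same per-grade-plus-grand-total report can be emulated with a UNION ALL. Here is a runnable sketch with Python's sqlite3, loaded with the question's SALGRADE data (NULL plays the role of the grand-total grade):

```python
import sqlite3

# The question's SALGRADE data, loaded into an in-memory database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE salgrade (grade INTEGER, losal INTEGER, hisal INTEGER)")
con.executemany(
    "INSERT INTO salgrade VALUES (?, ?, ?)",
    [(1, 300, 1222), (2, 300, 545), (1, 700, 1200), (2, 1201, 1400),
     (3, 1401, 2000), (4, 2001, 3000), (5, 3001, 9999), (2, 300, 545)],
)

# Per-grade rows plus one grand-total row, like GROUPING SETS ((), (grade)).
# SQLite sorts NULL first, so the grand-total row comes out on top here.
rows = con.execute("""
    SELECT grade, SUM(losal) lowsal, SUM(hisal) hisal, SUM(losal + hisal) total
    FROM salgrade GROUP BY grade
    UNION ALL
    SELECT NULL, SUM(losal), SUM(hisal), SUM(losal + hisal) FROM salgrade
    ORDER BY grade
""").fetchall()
for r in rows:
    print(r)
# (None, 9204, 19911, 29115)  <- grand total
# (1, 1000, 2422, 3422)
# ... one row per grade ...
```

Note the per-grade figures here match the question's expected output (grade 1 = 1000 / 2422 / 3422, and so on); the answer's figures above differ because the answerer evidently ran against different data.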
SY.
-
Question about SUM in SQL query
I have a SQL statement, listed below with its output.
For the NUM_REQ column, I expect to get the number 832 because the pm_requirements_table contains 16 records and they all have a weekly frequency:
16 * 52 = 832. The query returns 12480. I can see that it multiplies the 832 by 15, which matches the number of entries in the LPM table: 16 * 52 * 15 = 12480.
I need the LPM table in there for other reasons, so my question is how can I return 832 while still having the LPM table as part of the query.
Thank you
George
SQL:

SELECT 'NAS WIDE' as target,
       1 as years,
       count(distinct lpm.fac_ident) as facilities,
       SUM(CASE upper(req.frequency)
             WHEN 'DAILY' THEN 365
             WHEN 'WEEKLY' THEN 52
             WHEN 'MONTHLY' THEN 12
             WHEN 'QUARTERLY' THEN 4
             WHEN 'SEMIANNUALLY' THEN 2
             WHEN 'ANNUALLY' THEN 1
             ELSE 0
           END) as num_req
FROM lpm, pm_requirements_table req
group by 'NAS WIDE';

OUTPUT:

"TARGET","YEARS","FACILITIES","NUM_REQ"
"NAS WIDE",1,1,12480
-- PM_REQUIREMENTS_TABLE "PUBLICATION_ORDER","PUBLICATION_PARAGRAPH_NUMBER","DESCRIPTION","FREQUENCY","CHECK_OR_MAINTENANCE","PRTANTAC_ID" "6310.19A","161A","Check transmitter average rf power output","WEEKLY","",2 "6310.19A","161B","Check transmitter VSWR","WEEKLY","",3 "6310.19A","161C","Check RMS transmitter pulse width","WEEKLY","",4 "6310.19A","161D(1)","Check filament current","WEEKLY","",5 "6310.19A","161D(2)","Check focus coil current","WEEKLY","",6 "6310.19A","161D(3)","Check Klystron voltage","WEEKLY","",7 "6310.19A","161D(4)","Check Klystron current","WEEKLY","",8 "6310.19A","161D(5)","Check PFN voltage","WEEKLY","",9 "6310.19A","161D(6)","Check vacuum pump current","WEEKLY","",10 "6310.19A","161E","Check target receiver MDS","WEEKLY","",11 "6310.19A","161F","Check target receiver NF","WEEKLY","",12 "6310.19A","161G","Check target receiver recovery","WEEKLY","",13 "6310.19A","161H","Check weather receiver MDS","WEEKLY","",14 "6310.19A","161I","Check weather receiver NF","WEEKLY","",15 "6310.19A","161J","Check weather receiver recovery","WEEKLY","",16 "6310.19A","161K","Check spare modem operation","WEEKLY","",17
Published by: George Heller on July 15, 2011 10:32-- LPM table "LOG_ID","FAC_IDENT","FAC_TYPE","CODE_CATEGORY","SUPPLEMENTAL_CODE","MAINT_ACTION_CODE","INTERRUPT_CONDITION","ATOW_CODE","SECTOR_CODE","LOG_STATUS","START_DATE","START_DATETIME","END_DATE","END_DATETIME","MODIFIED_DATETIME","WR_AREA","SHORT_NAME","EQUIPMENT_IDENT","INTERVAL_CODE","EARLIEST_DATE","EARLIEST_DATETIME","SCHEDULED_DATE","SCHEDULED_DATETIME","LATEST_DATE","LATEST_DATETIME","WR_CREW_UNIT","WR_WATCH","PUBLICATION_ORDER","PUBLICATION_ORDER_ORIGINAL","PUBLICATION_PARAGRAPH","PUBLICATION_PARAGRAPH_ORIGINAL","NUMBER_OF_TASKS","LOG_SUMMARY","COMMENTS","RELATED_LOGS","LPMANTAC_ID" 108305902,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",07-MAY-10,"05/07/2010 3:24",07-MAY-10,"05/07/2010 3:28","05/07/2010 3:31","RADAR","SYS","SYSTEM","W","05/02/2010","05/02/2010 0:00",05-MAY-10,"05/05/2010 0:00",08-MAY-10,"05/08/2010 0:00","RADR","","6310.19A","6310.19A","161.K","161. K","1","","","",1 108306002,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",02-MAY-10,"05/02/2010 21:00",02-MAY-10,"05/02/2010 21:30","05/03/2010 1:07","RADAR","SYS","CHAN B","W","05/02/2010","05/02/2010 0:00",05-MAY-10,"05/05/2010 0:00",08-MAY-10,"05/08/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",2 108306102,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",02-MAY-10,"05/02/2010 21:00",02-MAY-10,"05/02/2010 21:30","05/03/2010 1:07","RADAR","SYS","CHAN A","W","05/02/2010","05/02/2010 0:00",05-MAY-10,"05/05/2010 0:00",08-MAY-10,"05/08/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",3 104188702,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",29-APR-10,"4/29/2010 10:09",29-APR-10,"4/29/2010 10:11","4/29/2010 10:30","RADAR","SYS","SYSTEM","W","4/25/2010","4/25/2010 0:00",28-APR-10,"4/28/2010 0:00",01-MAY-10,"05/01/2010 0:00","RADR","","6310.19A","6310.19A","161.K","161. 
K","1","","","",4 104188402,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",26-APR-10,"4/26/2010 13:33",26-APR-10,"4/26/2010 13:46","4/26/2010 15:23","RADAR","SYS","CHAN A","W","4/25/2010","4/25/2010 0:00",28-APR-10,"4/28/2010 0:00",01-MAY-10,"05/01/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",5 104188502,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",26-APR-10,"4/26/2010 13:33",26-APR-10,"4/26/2010 13:46","4/26/2010 15:23","RADAR","SYS","CHAN B","W","4/25/2010","4/25/2010 0:00",28-APR-10,"4/28/2010 0:00",01-MAY-10,"05/01/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",6 101223702,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",19-APR-10,"4/19/2010 1:30",19-APR-10,"4/19/2010 2:10","4/19/2010 3:12","RADAR","SYS","CHAN B","W","4/18/2010","4/18/2010 0:00",21-APR-10,"4/21/2010 0:00",24-APR-10,"4/24/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",7 101223802,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",19-APR-10,"4/19/2010 1:30",19-APR-10,"4/19/2010 2:10","4/19/2010 3:12","RADAR","SYS","CHAN A","W","4/18/2010","4/18/2010 0:00",21-APR-10,"4/21/2010 0:00",24-APR-10,"4/24/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",8 101223602,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",19-APR-10,"4/19/2010 1:00",19-APR-10,"4/19/2010 1:09","4/19/2010 3:12","RADAR","SYS","SYSTEM","W","4/18/2010","4/18/2010 0:00",21-APR-10,"4/21/2010 0:00",24-APR-10,"4/24/2010 0:00","RADR","","6310.19A","6310.19A","161.K","161. K","1","","","",9 96642602,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",12-APR-10,"04/12/2010 10:25",12-APR-10,"04/12/2010 10:34","04/12/2010 17:49","RADAR","SYS","SYSTEM","W","04/11/2010","04/11/2010 0:00",14-APR-10,"4/14/2010 0:00",17-APR-10,"4/17/2010 0:00","RADR","","6310.19A","6310.19A","161.K","161. 
K","1","","","",10 96642402,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",11-APR-10,"04/11/2010 11:10",11-APR-10,"04/11/2010 11:15","04/11/2010 12:51","RADAR","SYS","CHAN B","W","04/11/2010","04/11/2010 0:00",14-APR-10,"4/14/2010 0:00",17-APR-10,"4/17/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",11 96642302,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",11-APR-10,"04/11/2010 11:05",11-APR-10,"04/11/2010 11:10","04/11/2010 12:51","RADAR","SYS","CHAN A","W","04/11/2010","04/11/2010 0:00",14-APR-10,"4/14/2010 0:00",17-APR-10,"4/17/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",12 92805502,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",07-APR-10,"04/07/2010 18:10",07-APR-10,"04/07/2010 18:22","04/07/2010 19:04","RADAR","SYS","CHAN A","W","04/04/2010","04/04/2010 0:00",07-APR-10,"04/07/2010 0:00",10-APR-10,"04/10/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",13 92805402,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",07-APR-10,"04/07/2010 17:53",07-APR-10,"04/07/2010 18:05","04/07/2010 19:04","RADAR","SYS","CHAN B","W","04/04/2010","04/04/2010 0:00",07-APR-10,"04/07/2010 0:00",10-APR-10,"04/10/2010 0:00","RADR","","6310.19A","6310.19A","161A-J","161 A-J","15","","","",14 92805302,"ATL","ASR",50,"0","P","","WEQ1C","SO1LB","C",07-APR-10,"04/07/2010 9:55",07-APR-10,"04/07/2010 10:05","04/07/2010 10:29","RADAR","SYS","SYSTEM","W","04/04/2010","04/04/2010 0:00",07-APR-10,"04/07/2010 0:00",10-APR-10,"04/10/2010 0:00","RADR","","6310.19A","6310.19A","161.K","161. K","1","","","",15
-
Max and sum in a query using Group by and dense_rank
Hi all
I am running Oracle 10g on Windows 2003.
I have two tables, RT_DY_ZONE1EDYCONS and MV_ZONE_HOURLY. I need a query that will give me, for each month, the SUM of MR_OL01_VT from RT_DY_ZONE1EDYCONS, and from MV_ZONE_HOURLY the maximum values of MR_OL01_FI_0S and MR_OL02_FI_0S together with the time of each maximum for that month. I can't combine the two queries I came up with on these forums into a single query.
I need the following result, any help would be appreciated:

datetime      SUM MR_OL01_VT   max value MR_OL01_FI_0S   max time MR_OL01_FI_0S   max value MR_OL02_FI_0S   max time MR_OL02_FI_0S
January 2010  8.373765         4.96935                   2010-01-15:01            5.96835                   2010-01-15:17

I used the following query to obtain the SUM of MR_OL01_VT:

SELECT TRUNC(VOL.TIMESTAMP, 'MM') datetime,
       SUM(VOL.MR_OL01_VT)
FROM   RT_DY_ZONE1EDYCONS VOL
GROUP BY TRUNC(VOL.TIMESTAMP, 'MM')
ORDER BY TRUNC(VOL.TIMESTAMP, 'MM')

and this query for the maximum value/time of MR_OL01_FI_0S and MR_OL02_FI_0S:

select TAG_NAME,
       max(tag_avg) keep (dense_rank last order by tag_avg)
         over (partition by tag_name) Max_Value,
       max(datetime) keep (dense_rank last order by tag_avg)
         over (partition by tag_name) AS MAX_DATE
from   mv_zone_hourly
WHERE  tag_name LIKE 'MR_OL0%_FI_0S'

Here is the first table,
with the sample data:

CREATE TABLE RT_DY_ZONE1EDYCONS
(
  TIMESTAMP  DATE NOT NULL ENABLE,
  HB_OL00_VT NUMBER(12,6),
  OR_RES0_VT NUMBER(12,6),
  OP_OL01_VT NUMBER(12,6),
  FP_OL01_VT NUMBER(12,6),
  BD_OL01_VT NUMBER(12,6),
  MR_OL01_VT NUMBER(12,6),
  Z1E_VT     NUMBER(12,6)
)
and the second tableInsert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:00','YYYY-MM-DD:HH24'),4.443056,1.088,1.224927,0.663266,0,0.387499,1.079364); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:01','YYYY-MM-DD:HH24'),4.352083,null,0.692914,0.044029,0,0.373536,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:02','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:03','YYYY-MM-DD:HH24'),4.300694,null,0.662924,0,0,0.380275,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:04','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:05','YYYY-MM-DD:HH24'),0.025694,null,0.650406,0,0,0.342299,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:06','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:07','YYYY-MM-DD:HH24'),0.122917,-2.992,0.673062,0,0,0.423474,2.018381); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:08','YYYY-MM-DD:HH24'),0.106944,null,1.27403,0.768119,0,0.449303,null); Insert into RT_DY_ZONE1EDYCONS 
(TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:09','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:10','YYYY-MM-DD:HH24'),0.122917,-2.448,0.637977,0,0,0.418056,1.514884); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:11','YYYY-MM-DD:HH24'),0.183333,-2.992,0.649855,0,0,0.401947,2.123531); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:12','YYYY-MM-DD:HH24'),1.157639,-2.992,1.039931,0.463684,0,0.43389,2.212134); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:13','YYYY-MM-DD:HH24'),4.536111,1.36,0.972226,0.381604,0,0.461941,1.36034); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:14','YYYY-MM-DD:HH24'),4.496528,2.176,0.647979,0,0,0.45611,1.216439); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:15','YYYY-MM-DD:HH24'),4.409028,2.72,0.665355,0,0,0.440141,0.583532); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:16','YYYY-MM-DD:HH24'),4.380556,1.36,0.886389,0.256387,0,0.429446,1.448334); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:17','YYYY-MM-DD:HH24'),4.433333,0.272,1.21716,0.656324,0,0.434169,1.85368); Insert into RT_DY_ZONE1EDYCONS 
(TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:18','YYYY-MM-DD:HH24'),4.409722,2.176,0.653266,0,0,0.436253,1.144203); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:19','YYYY-MM-DD:HH24'),4.44375,2.448,0.67917,0,0,0.436947,0.879633); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:20','YYYY-MM-DD:HH24'),4.420833,0,1.273057,0.733813,0,0.428474,1.985489); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:21','YYYY-MM-DD:HH24'),4.390278,2.176,0.895212,0.280419,0,0.418195,0.620452); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:22','YYYY-MM-DD:HH24'),4.336806,1.904,0.670843,0,0,0.412711,1.349252); Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:23','YYYY-MM-DD:HH24'),4.305556,2.448,0.689441,0,0,0.409099,0.759016);
And the second table, with the sample data:

CREATE TABLE MV_ZONE_HOURLY
(
  TAG_NAME    VARCHAR2(30),
  TAG_DESCRIP VARCHAR(100),
  DATETIME    DATE,
  TAG_AVG     NUMBER(12,6)
)
Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:00','YYYY-MM-DD:HH24'),4.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:01','YYYY-MM-DD:HH24'),4.96935); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:02','YYYY-MM-DD:HH24'),4.367); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:03','YYYY-MM-DD:HH24'),4.67788); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:04','YYYY-MM-DD:HH24'),4.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:05','YYYY-MM-DD:HH24'),3.23456); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:06','YYYY-MM-DD:HH24'),4.2333555); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:07','YYYY-MM-DD:HH24'),4.5890); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:08','YYYY-MM-DD:HH24'),4.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:09','YYYY-MM-DD:HH24'),4.96735); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:10','YYYY-MM-DD:HH24'),4.8456); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 
1',to_date('2010-01-15:11','YYYY-MM-DD:HH24'),4.2468); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:12','YYYY-MM-DD:HH24'),4.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:13','YYYY-MM-DD:HH24'),3.9746); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:14','YYYY-MM-DD:HH24'),4.2333555); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:15','YYYY-MM-DD:HH24'),4.47); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:16','YYYY-MM-DD:HH24'),4.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:17','YYYY-MM-DD:HH24'),4.96835); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:18','YYYY-MM-DD:HH24'),4.6890); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:19','YYYY-MM-DD:HH24'),4.42345); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:20','YYYY-MM-DD:HH24'),4.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:21','YYYY-MM-DD:HH24'),3.4579); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:22','YYYY-MM-DD:HH24'),4.2333555); Insert into MV_ZONE_HOURLY 
(TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:23','YYYY-MM-DD:HH24'),4.45789); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:00','YYYY-MM-DD:HH24'),5.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:01','YYYY-MM-DD:HH24'),5.97835); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:02','YYYY-MM-DD:HH24'),5.367); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:03','YYYY-MM-DD:HH24'),5.67788); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:04','YYYY-MM-DD:HH24'),5.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:05','YYYY-MM-DD:HH24'),4.23456); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:06','YYYY-MM-DD:HH24'),5.2333555); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:07','YYYY-MM-DD:HH24'),5.5890); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:08','YYYY-MM-DD:HH24'),5.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:09','YYYY-MM-DD:HH24'),5.95635); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 
2',to_date('2010-01-15:10','YYYY-MM-DD:HH24'),5.8456); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:11','YYYY-MM-DD:HH24'),5.2468); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:12','YYYY-MM-DD:HH24'),5.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:13','YYYY-MM-DD:HH24'),4.9746); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:14','YYYY-MM-DD:HH24'),5.2333555); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:15','YYYY-MM-DD:HH24'),5.47); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:16','YYYY-MM-DD:HH24'),5.166712); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:17','YYYY-MM-DD:HH24'),5.96835); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:18','YYYY-MM-DD:HH24'),5.6890); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:19','YYYY-MM-DD:HH24'),5.42345); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:20','YYYY-MM-DD:HH24'),5.06335); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:21','YYYY-MM-DD:HH24'),4.4579); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) 
values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:22','YYYY-MM-DD:HH24'),5.2333555); Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:23','YYYY-MM-DD:HH24'),5.45789);
Hello
Thanks for posting the CREATE TABLE and INSERT statements; that's really helpful!
Your voluminous sample data all fall within a single month. Since your first query groups by month, I assume that won't always be the case.
In that case, you can run the GROUP BY queries on the two tables separately, in two subqueries, and then join the results.
To get separate columns for the values from mv_zone_hourly, I GROUP BY the tag name in the subquery, then pivot them into two columns each in the main query:
WITH edycons_summary AS
(
    SELECT  TRUNC (tmstmp, 'MM')    AS mnth
    ,       SUM (mr_ol01_vt)        AS sum_mr_ol01_vt
    FROM    rt_dy_zone1edycons
    GROUP BY TRUNC (tmstmp, 'MM')
)
, hourly_summary AS
(
    SELECT  tag_name
    ,       TRUNC (datetime, 'MM')  AS mnth
    ,       MAX (tag_avg)           AS max_avg
    ,       MAX (datetime) KEEP (DENSE_RANK LAST ORDER BY tag_avg NULLS FIRST)
                                    AS max_avg_datetime
    FROM    mv_zone_hourly
    WHERE   tag_name IN ('MR_OL01_FI_0S', 'MR_OL02_FI_0S')
    GROUP BY tag_name
    ,       TRUNC (datetime, 'MM')
)
SELECT  TO_CHAR (v.mnth, 'fmMonth YYYY')  AS datetime
,       v.sum_mr_ol01_vt
,       MAX (CASE WHEN h.tag_name = 'MR_OL01_FI_0S' THEN h.max_avg END)          AS max_avg_01
,       MAX (CASE WHEN h.tag_name = 'MR_OL01_FI_0S' THEN h.max_avg_datetime END) AS datetime_01
,       MAX (CASE WHEN h.tag_name = 'MR_OL02_FI_0S' THEN h.max_avg END)          AS max_avg_02
,       MAX (CASE WHEN h.tag_name = 'MR_OL02_FI_0S' THEN h.max_avg_datetime END) AS datetime_02
FROM    edycons_summary  v
JOIN    hourly_summary   h  ON  h.mnth = v.mnth
GROUP BY v.mnth
,       v.sum_mr_ol01_vt
;
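As the earlier answer mentioned, from Oracle 11.1 onward the manual MAX(CASE...) pivot in the main query could also be written with SELECT ... PIVOT. A sketch against the same hourly_summary subquery (the t01/t02 aliases are illustrative, not from the original post):

```sql
-- Pivot the per-tag monthly summaries into one row per month.
-- The aliases in the IN list (t01, t02) become part of the output
-- column names, e.g. T01_MAX_AVG, T01_DATETIME, and so on.
WITH hourly_summary AS
(
    SELECT  tag_name
    ,       TRUNC (datetime, 'MM')  AS mnth
    ,       MAX (tag_avg)           AS max_avg
    ,       MAX (datetime) KEEP (DENSE_RANK LAST ORDER BY tag_avg NULLS FIRST)
                                    AS max_avg_datetime
    FROM    mv_zone_hourly
    WHERE   tag_name IN ('MR_OL01_FI_0S', 'MR_OL02_FI_0S')
    GROUP BY tag_name
    ,       TRUNC (datetime, 'MM')
)
SELECT  *
FROM    hourly_summary
PIVOT   (   MAX (max_avg)            AS max_avg
        ,   MAX (max_avg_datetime)   AS datetime
            FOR tag_name IN ( 'MR_OL01_FI_0S'  AS t01
                            , 'MR_OL02_FI_0S'  AS t02
                            )
        );
```

The pivoted result can then be joined to edycons_summary on the month column exactly as in the query above.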
-
SUM OF COLUMN IN A TABLE.
How can I sum the rows of a column in a tabular form in Oracle APEX 4.2.6?
I want to perform a validation on the adj_amount column in the tabular form. If the adj_amount column for the selected row is <= 0, then return false.
How can I reference the tabular form column by name in the validation?
Hi Maxence,
CORINE wrote:
How can I sum the rows of a column in a tabular form in Oracle APEX 4.2.6?
I want to perform a validation on the adj_amount column in the tabular form. If the adj_amount column for the selected row is <= 0, then return false.
How can I reference the tabular form column by name in the validation?
Please always provide the necessary information with your question.
Forum members cannot guess details such as the type of tabular form you are using (i.e., wizard-generated, or manual based on APEX_ITEM).
From the information you have provided, I conclude that you want a validation on a certain column: the sum of that column must be greater than zero.
You can use a page-level validation of type "PL/SQL Function Returning Boolean" using the APEX_APPLICATION.G_FXX arrays.
Pick a column that is NOT NULL, loop through it, and accumulate the adj_amount column. If the resulting sum is not greater than zero, fail the validation.
Validation sample code would be:
DECLARE
    L_ADJ_AMOUNT_SUM NUMBER := 0;
BEGIN
    FOR I IN 1 .. APEX_APPLICATION.G_F01.COUNT LOOP
        L_ADJ_AMOUNT_SUM := L_ADJ_AMOUNT_SUM
                          + TO_NUMBER (NVL (APEX_APPLICATION.G_F02(I), 0));
    END LOOP;
    IF L_ADJ_AMOUNT_SUM <= 0 THEN
        RETURN FALSE;
    ELSE
        RETURN TRUE;
    END IF;
END;
Here, APEX_APPLICATION.G_F01 is a NOT NULL column and APEX_APPLICATION.G_F02 is the adj_amount column, for example.
NOTE: You can use the code-inspection tools in your browser to identify which column is mapped to which APEX_APPLICATION.G_FXX array.
Just inspect an element in the column and look at its name attribute. If name="f04", then it is mapped to APEX_APPLICATION.G_F04.
I hope this helps!
Kind regards
Kiran
-
Hello
I created a block with 15 records of marks, for example for 15 subjects. Now I want to sum this column, but when I take a text item or display item with the Summarized property set to this column, it appears 15 times; I need it to appear only once.
If I move it to another block, it throws error FRM-30377: summarized item must reside in a single-record block or in the same block as the item being summarized.
Please, help me to sum up this column.
Thank you.
"If I move it to another block, it throws an error"
Place the summary total item in a control block. In that block's Property Palette, under the Records group, set the Single Record property to Yes.
-
Hello
My question is: I would like to sum the "Volume" column.
Current output:
ENVIRONMENT    Volume
Production     79
QA             66
Development    30

Desired output:
ENVIRONMENT    Volume
Production     79
QA             66
Development    30
*Total         175*

select *
from (
    select ci.environment, count(1) "Volume"
    from   tsgdw_task@awhsncp1                 t
    ,      tsgdw_operational_process@awhsncp1  op
    ,      tsgdw_support_group@awhsncp1        sg
    ,      tsgdw_person@awhsncp1               p
    ,      tsgdw_configuration_item@awhsncp1   ci
    where  sg.support_group_seq_id in (
               select sg2.support_group_seq_id
               from   tsgdw_support_group@awhsncp1 sg2
               start with sg2.search_code = 'DBNONEC'   --p_target_name
               connect by prior sg2.support_group_seq_id = sg2.parent_support_group_seq_id
           )
    and    t.task_seq_id             = op.task_seq_id
    and    sg.support_group_seq_id   = op.support_group_seq_id
    and    op.assigned_person_seq_id = p.person_seq_id
    and    op.open_config_item_seq_id = ci.configuration_item_seq_id
    and    t.task_type = 'Change Task'
    and    t.state     = 'Closed'
    --and  sg.search_code = p_target_name
    and    ci.environment in ('Production','Development','Staging','QA')
    and    op.est_actual_end_datetime
               between to_date('01-Apr-2012,00:00:00','DD-Mon-YYYY,HH24:MI:SS')
               and     to_date('08-Apr-2012,23:59:59','DD-Mon-YYYY,HH24:MI:SS')
    --and  t.last_update_date between ??EMIP_BIND_START_DATE?? and ??EMIP_BIND_END_DATE??
    group by ci.environment
    order by 2 desc
)
where rownum < 6;
Use GROUP BY ROLLUP.
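A minimal sketch of the idea, using a hypothetical your_detail_query in place of the multi-table join in the query above (the NVL label and the ordering are just one way to present the total row):

```sql
-- GROUP BY ROLLUP adds one extra row, with environment = NULL,
-- holding the grand total; NVL relabels that row as 'Total'.
SELECT  NVL (environment, 'Total')  AS environment
,       COUNT (*)                   AS volume
FROM    your_detail_query           -- stands in for the join above
GROUP BY ROLLUP (environment)
ORDER BY GROUPING (environment)     -- detail rows first, total row last
,       volume DESC;
```

GROUPING(environment) returns 0 for the detail rows and 1 for the rollup row, which is why it sorts the total to the bottom; it also distinguishes the rollup row from a genuine NULL environment value.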
-
I created a form with 12 columns and 23 fixed rows. Row 1 of the table holds a value; the remaining 22 rows let the user enter a number. What I would like is for the footer row to add up the figures in the 22 rows of each column and then multiply that by the row-1 value, for a total.
Any help would be appreciated.
Hi Steve,
For now, I'll send back your PDF with the calculation script.
You must use the following script in each column's total field:
- Sterling is the object name of the user-input field in column 1, in each row
- Other is the object name of the user-input field in column 2, in each row
- ...
Column 1: $ = Sum(Row1[*].Sterling) * Row1.TTValue
Column 2: $ = Sum(Row1[*].Other)    * Row1.OtherValue
Column 3: $ = Sum(Row1[*].Banking)  * Row1.BankingValue
Does that make sense?
You must use [*] for the repeated element; in your form, that is Row1. You can see them as Row1[0], Row1[1], and so on. And your amount, for example 4 or 7.50, needs a static cross-reference per column.
I hope this helps.
Mandy
-
Help! Calculating a percentage between two fields that are sums of columns
Good day, forum members!
I'm hoping to get help with an RTF template I use to summarize the output of a query in PeopleSoft.
I have a row with two query fields and a third field that calculates a percentage from those query fields:
Row1, Field1: EXPR14_14
Row1, Field2: EXPR1_1
Row1, Field3: <?xdofx: if EXPR1_1 <> 0 and EXPR14_14 <> 0 then (EXPR1_1 div EXPR14_14) * 100 else 0 end if?>
I have a second row that has these fields:
Row2, Field1: <?sum(EXPR14_14)?>
Row2, Field2: <?sum(EXPR1_1)?>
Row2, Field3: I need this third field to calculate the percentage from the sums of the two fields.
I have tried different ways to insert this calculation, with no luck. This is my first time creating an RTF report, and any help would be greatly appreciated.
Thank you in advance for your help.
Kind regards
Raquel
For the percentage calculation you can use something similar to this (adjust the formula as you need):
<?(sum(current-group()/EXPR1_1)) div (sum(current-group()/EXPR14_14)) * 100?>
For Row1, Field3: <?(EXPR1_1 div EXPR14_14) * 100?>
Hope that helps. If you need more help, send me the XML file and the template at [email protected]
Thank you
Bipuser
-
Missing sum symbol on a measure column
Hi, I need a Sum (∑) on one of the measures, and this column is missing the ∑ symbol. Why would that be? All the other columns have ∑.
Thank you.
It won't be shown on that measure.
One more thing: which version are you using? Do you have a screenshot of it?