Query to group dates by days of week

Hello

I have a table with the following structure:

Create table testtable(dates date, day_identifier number);

The day_identifier column contains the day-of-week number of the corresponding date, i.e.

to_char(dates, 'd')

The table contains the following sample data:

Dates             Day_identifier
October 1, 2013   3
October 2, 2013   4
October 4, 2013   6
October 6, 2013   1
October 8, 2013   3
October 9, 2013   4
October 11, 2013  6
October 18, 2013  6
October 21, 2013  2
October 23, 2013  4
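
For reference, the day_identifier values above can be derived straight from the dates themselves; a minimal sketch for loading these sample rows (assuming NLS_TERRITORY = AMERICA, where to_char(dates,'d') returns 1 for Sunday):

insert into testtable (dates, day_identifier)
select d, to_number(to_char(d, 'd'))
  from (select date '2013-10-01' d from dual union all
        select date '2013-10-02'   from dual union all
        select date '2013-10-04'   from dual union all
        select date '2013-10-06'   from dual union all
        select date '2013-10-08'   from dual union all
        select date '2013-10-09'   from dual union all
        select date '2013-10-11'   from dual union all
        select date '2013-10-18'   from dual union all
        select date '2013-10-21'   from dual union all
        select date '2013-10-23'   from dual);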

I am looking for a query that will consolidate the data above, based on the values in the day_identifier column, into the following format:

October 1, 2013    October 11, 2013   1346

October 18, 2013   October 23, 2013   246

The consolidated data above works out as follows:

all dates between October 1, 2013 and October 11, 2013 whose day-of-week value is in (1, 3, 4, 6)

and

all dates between October 18, 2013 and October 23, 2013 whose day-of-week value is in (2, 4, 6)

will give me the result set from the table above.

Please help me to solve the issue.

Thank you.

Like this? (The result of the previous example is unchanged.)

with
flights as
(select 1 flight_no, to_date('01-10-2013','dd-mm-yyyy') flight_date, 3 day_identifier from dual union all
 select 1, to_date('03-10-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('08-10-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('10-10-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('15-10-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('17-10-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('22-10-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('24-10-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('29-10-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('31-10-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('05-11-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('07-11-2013','dd-mm-yyyy'), 5 from dual union all
-- select 1, to_date('12-11-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('14-11-2013','dd-mm-yyyy'), 5 from dual union all
-- select 1, to_date('19-11-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('21-11-2013','dd-mm-yyyy'), 5 from dual union all
-- select 1, to_date('26-11-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('28-11-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('03-12-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('05-12-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('10-12-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('12-12-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('17-12-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('19-12-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('24-12-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('26-12-2013','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('31-12-2013','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('02-01-2014','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('07-01-2014','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('09-01-2014','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('14-01-2014','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('16-01-2014','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('21-01-2014','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('23-01-2014','dd-mm-yyyy'), 5 from dual union all
 select 1, to_date('28-01-2014','dd-mm-yyyy'), 3 from dual union all
 select 1, to_date('30-01-2014','dd-mm-yyyy'), 5 from dual
),

all_identifiers as /* collect the day_identifiers present */
(select flight_no,
        min(from_date) from_date,
        max(till_date) till_date,
        listagg(day_identifier,'') within group (order by day_identifier) day_identifiers
   from (select flight_no,
                min(flight_date) from_date,
                max(flight_date) till_date,
                day_identifier
           from flights
          group by flight_no, day_identifier
        )
  group by flight_no
),

generated_rows as /* generate all flight_dates implied by the collected day_identifiers */
(select flight_no,
        flight_date,
        day_identifier
   from (select flight_no,
                from_date + level - 1 flight_date,
                case when instr(day_identifiers, to_char(from_date + level - 1, 'd')) != 0
                     then to_number(to_char(from_date + level - 1, 'd'))
                end day_identifier
           from all_identifiers
        connect by level <= till_date - from_date + 1
               and prior flight_no = flight_no
               and prior sys_guid() is not null
        )
  where day_identifier is not null
),

matched_rows as /* matching flights against generated rows reveals partially cancelled flights */
(select g.flight_no,
        g.flight_date generated_date,
        g.day_identifier,
        f.day_identifier flights_day_identifier /* null if flight cancelled */
   from generated_rows g
        left outer join
        flights f
     on g.flight_no = f.flight_no
    and g.flight_date = f.flight_date
    and g.day_identifier = f.day_identifier
),

grouped_rows as /* grouping rows before the final answer */
(select flight_no,
        generated_date,
        day_identifier,
        flights_day_identifier,
        case when count(day_identifier) over (partition by flight_no, day_identifier
                                              order by generated_date
                                              rows between unbounded preceding and unbounded following
                                             ) =
                  count(flights_day_identifier) over (partition by flight_no, day_identifier
                                                      order by generated_date
                                                      rows between unbounded preceding and unbounded following
                                                     )
             then 'complete'   /* count_of_all = count_of_not_null */
             else 'incomplete'
        end day_identifier_type,
        row_number() over (partition by flight_no, day_identifier order by generated_date) -
        row_number() over (partition by flight_no, day_identifier, flights_day_identifier order by generated_date) grp /* tabibitosan */
   from matched_rows
)

select flight_no,
       to_char(min(from_date),'dd-mon-yyyy') from_date,
       to_char(max(till_date),'dd-mon-yyyy') till_date,
       listagg(day_identifier,'') within group (order by day_identifier) day_identifiers
  from (select flight_no,
               min(generated_date) from_date,
               max(generated_date) till_date,
               day_identifier
          from grouped_rows
         where day_identifier_type = 'complete'
         group by flight_no, day_identifier
       )
 group by flight_no
union all
select distinct
       flight_no,
       to_char(first_value(generated_date) over (partition by flight_no, grp
                                                 order by generated_date
                                                ),
               'dd-mon-yyyy'
              ) from_date,
       to_char(first_value(generated_date) over (partition by flight_no, grp
                                                 order by generated_date desc
                                                ),
               'dd-mon-yyyy'
              ) till_date,
       to_char(day_identifier) day_identifiers
  from grouped_rows
 where day_identifier_type = 'incomplete'
   and flights_day_identifier is not null
 order by flight_no, day_identifiers, from_date

FLIGHT_NO FROM_DATE   TILL_DATE   DAY_IDENTIFIERS
        1 01-oct-2013 05-nov-2013 3
        1 03-dec-2013 28-jan-2014 3
        1 03-oct-2013 30-jan-2014 5

Regards

Etbin
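
For the simpler case in the original question (no cancelled occurrences to detect, as the query above does), a gap-based consolidation is one possible approach. A minimal sketch, assuming Oracle 11.2 for LISTAGG and treating a gap of more than 6 days between consecutive dates as the start of a new range:

with marked as
(select dates,
        day_identifier,
        case when dates - lag(dates) over (order by dates) > 6 then 1 else 0 end new_grp
   from testtable
),
grouped as
(select dates,
        day_identifier,
        sum(new_grp) over (order by dates) grp
   from marked
)
select min(min_d) from_date,
       max(max_d) till_date,
       listagg(day_identifier,'') within group (order by day_identifier) day_identifiers
  from (select grp,
               day_identifier,
               min(dates) min_d,
               max(dates) max_d
          from grouped
         group by grp, day_identifier
       )
 group by grp
 order by from_date;

On the sample data this returns the two requested ranges (October 1 - October 11 with 1346, and October 18 - October 23 with 246); the 6-day gap threshold is an assumption taken from the sample, not a general rule.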

Tags: Database

Similar Questions

  • SQL query to group data by Code and dates

    Hello

    I have the following table structure

    col1 col2 col3
    January 21, 2012 tested Code1
    January 20, 2012 tested Code1
    June 1, 2012 tested Code1
    June 1, 2012 tested Code2
    June 4, 2012 tested Code3

    so now

    The output should be something like

    code     week1  week2  week3  week4  week5  ... until the last 14 weeks from the date we run it
    Code1    1      0      0      0      0
    Code2    1      0      0      0      0
    Code3    0      1      0      0      0

    where 1 and 0 are actually counts (not sums), and since we are now in the second week of June, the week headings in this case should be

    code ... week3may  week4may  week1jun  week2jun


    Was looking for suggestions on how to achieve this.

    I guess that this would require some kind of a pivot query?

    Thank you
    Sun

    Hello

    Here's how you can make this pivot in Oracle 10.2. (In fact, it will work in Oracle 9.1 or higher.)

    WITH  got_week_num  AS
    (
         SELECT  error_code, date_logged
         ,     1 + FLOOR ( ( TO_DATE (:end_dt_txt, 'DD-Mon-YYYY') - date_logged)
                         / 7
                     )     AS week_num
         FROM    data_analysis
         WHERE     date_logged     >= TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
         AND     date_logged     <  TO_DATE (:end_dt_txt,   'DD-Mon-YYYY') + 1
    )
    ,     all_weeks     AS
    (
         SELECT     LEVEL               AS week_num
         ,     TO_CHAR ( 1 + TO_DATE (:end_dt_txt, 'DD-Mon-YYYY')
                       - (7 * LEVEL)
                   , 'fmDD-Mon-YYYY'
                   )          AS heading
         FROM    dual
         CONNECT BY     LEVEL <= 1 + FLOOR ( ( TO_DATE (:end_dt_txt,   'DD-Mon-YYYY')
                                             - TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
                                  )
                                / 7
                                )
    )
    SELECT       NULL                                   AS error_code
    ,       MIN (CASE WHEN week_num =  1 THEN heading END)     AS week_1
    ,       MIN (CASE WHEN week_num =  2 THEN heading END)     AS week_2
    --       ...
    ,       MIN (CASE WHEN week_num =  5 THEN heading END)     AS week_5
    FROM       all_weeks
           --
         UNION ALL
                --
    SELECT       error_code
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  1 THEN 1 END))     AS week_1
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  2 THEN 1 END))     AS week_2
    --       ...
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  5 THEN 1 END))     AS week_5
    FROM       got_week_num
    GROUP BY  error_code
                 --
    ORDER BY  error_code     NULLS FIRST
    ;
    

    Output:

    ERROR_CODE WEEK_1      WEEK_2      WEEK_5
    ---------- ----------- ----------- -----------
               4-Jun-2012  28-May-2012 7-May-2012
    a          3           0           0
    b          0           2           1
    c          0           0           1
    

    Once more, the number of columns, as well as their aliases, is hard-coded into the query.
    If you want the number of columns or their aliases to depend on the data in the table, then you need dynamic SQL. See {message:id=3527823}

    Did you ever define what a "week" is in this query?
    The query above makes week_1 end at the given date (:end_dt_txt). The first week (in other words, the one including :start_dt_txt) may have fewer than 7 days.
    If you want all the weeks to start on Monday (in which case the first and last weeks may have fewer than 7 days), see Stew's solution, using TRUNC (date_logged, 'IW').
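
    A minimal sketch of that ISO-week variant (weeks starting on Monday), reusing the table and bind variables from the query above:

    SELECT    error_code
    ,         TRUNC (date_logged, 'IW')     AS week_starting
    ,         COUNT (*)                     AS cnt
    FROM      data_analysis
    WHERE     date_logged     >= TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
    AND       date_logged     <  TO_DATE (:end_dt_txt,   'DD-Mon-YYYY') + 1
    GROUP BY  error_code
    ,         TRUNC (date_logged, 'IW')
    ORDER BY  error_code
    ,         week_starting
    ;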

  • SQL Query to extract data between days and hours

    Hi friends,

    I need this for a report.

    Description of the requirement:

    I need the data in a table that has been in the OPEN state for more than 30 minutes.

    Well, this requirement can be met with:

    Select * from xyz where status = 'OPEN' and last_update_date <= sysdate - (30/1440)  -- 30 minutes, passed as a parameter

    The query above will fetch all data that has been in the OPEN state since before sysdate - (30/1440). Now I want to change the query to restrict the data further, by adding another parameter, DAYS.

    For example, if I give 10 days, it should retrieve only the data within the last 10 days that is also older than sysdate - 30 minutes.

    We use the last_update_date column to restrict by days.

    If I do not give any days as input, it must retrieve all records older than sysdate - 30 minutes.

    If I don't give minutes, it must retrieve all the records in the OPEN state.

    is the question clear enough? my English is bad.

    Please suggest me a query...

    Thank you and best regards,

    Arun Thomas T

    Hello

    select * from xyz
    where status = 'OPEN'
    and last_update_date between nvl(sysdate - :days, last_update_date)
                             and nvl2(:days, last_update_date, nvl(sysdate - :minutes/1440, last_update_date));

    That does the following:

    If the days parameter is entered, then start at sysdate - :days and end at last_update_date.

    If days is not entered, then start from the beginning and end at:

    1. sysdate - :minutes/1440 if minutes are entered

    2. last_update_date if minutes are not entered
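
    The same logic, spelled out with explicit conditions (a sketch, assuming :days and :minutes are optional numeric parameters):

    select *
    from   xyz
    where  status = 'OPEN'
    and    ( :days is null or last_update_date >= sysdate - :days )
    and    ( :days is not null
             or :minutes is null
             or last_update_date <= sysdate - :minutes/1440 );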

  • regional settings/date problems; first day of week / browser language

    Hello
    I tried to create an event calendar portlet in PL/SQL. My problem is that it reacts to the language of the browser or something; it goes all weird when viewed with browsers in another language.

    How can I force my portlet to be Finnish and ONLY Finnish? I DON'T want the first day of the week (Monday or Sunday) to change according to the browser the user is using. I tried using the third parameter of to_char (p_date, 'dd.mm.yyyy', 'NLS_DATE_LANGUAGE = Finnish') everywhere, but it has no effect. The database locale is correct (I think, I'm not the admin), but it still happens. Chrome, Firefox and IE show different results depending on the language of the browser.

    I'll admit it's not the most elegant coding ever (if it were, I would have taken this into account and dealt with it!), but is it possible to just force my dates and date variables to be Finnish, period?

    Edited by: wand on March 9, 2010 10:37

    I have it.

    For public sessions, Portal will indeed look at the language of the browser. Depending on the language of the browser, the portal may display different languages. Only when users log in and explicitly set a language do we maintain that language setting, unless the user switches browsers again.

    The portal DAD settings will indeed override your system-wide NLS_TERRITORY setting. The dads.conf will contain something like AMERICAN_AMERICA.UTF8. If you want to override this in your portlet, you have to find a way to override the NLS_TERRITORY decision in the session. This can be done with dbms_session.set_nls ('nls_territory', 'FINLAND');

    for example, building on my previous example:

    declare
      date1 varchar2(2000);
      date2 varchar2(2000);
      date3 varchar2(2000);
    begin
      select to_char(sysdate, 'Dy DD-Mon-YYYY') into date1 from dual;
      select to_char(sysdate + (1 - to_char(sysdate,'D')), 'Dy DD-Mon-YYYY') into date2 from dual;
      select to_char(sysdate + (7 - to_char(sysdate,'D')), 'Dy DD-Mon-YYYY') into date3 from dual;
      htp.p('territory set to UNITED STATES');
      htp.hr;
      htp.p('today: ' || date1);
      htp.br;
      htp.p('first day of week: ' || date2);
      htp.br;
      htp.p('last day of the week: ' || date3);
      htp.br;

      htp.p('territory set to FINLAND');
      htp.hr;

      dbms_session.set_nls('nls_territory', 'FINLAND');
      select to_char(sysdate, 'Dy DD-Mon-YYYY') into date1 from dual;
      select to_char(sysdate + (1 - to_char(sysdate,'D')), 'Dy DD-Mon-YYYY') into date2 from dual;
      select to_char(sysdate + (7 - to_char(sysdate,'D')), 'Dy DD-Mon-YYYY') into date3 from dual;
      htp.hr;
      htp.p('today: ' || date1);
      htp.br;
      htp.p('first day of week: ' || date2);
      htp.br;
      htp.p('last day of the week: ' || date3);
    end;
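
    As an aside, the same dependency can be checked in a plain SQL session (a minimal sketch; the 'D' format element follows NLS_TERRITORY, not NLS_DATE_LANGUAGE):

    alter session set nls_territory = 'AMERICA';
    select to_char(sysdate, 'D') from dual;   -- Sunday = 1, Monday = 2, ...

    alter session set nls_territory = 'FINLAND';
    select to_char(sysdate, 'D') from dual;   -- Monday = 1, ..., Sunday = 7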

    Thank you
    EJ

  • How to get the date - 7 days (week)

    Hi all

    I am a beginner in Java and JSP.

    I have a JSP search form (VO) where I need to set a default value for the start date of today - 7 days.

    OK, I can get it to display today's date, but how do I get it to display a date 7 days back?


    <%
    DateFormat df = new SimpleDateFormat("dd.MM.yyyy");
    Date date = new Date();
    String defaultDate = "StartDate=" + df.format(new Date());
    %>

    ID
    Thank you

    Simple approach, if you're not worried about leap years:

    Date todayMinus7 = new Date(System.currentTimeMillis() - 7 * 24 * 60 * 60 * 1000);
    

    Alternatively take a look at the GregorianCalendar class and the add() method.

  • Oracle query to generate date and calculate fees

    Hi, please help me to create a query that gets the end result below from the two tables:

     CREATE TABLE detention_charge_slot
       (slot_no NUMBER(5),
      from_days NUMBER(10),
      to_days NUMBER(10),
      charge_amount NUMBER(10,2));
    
     INSERT INTO detention_charge_slot  VALUES (1,1,4,0);
     INSERT INTO detention_charge_slot  VALUES (2,5,9,10);
     INSERT INTO detention_charge_slot  VALUES (3,10,14,20);
     INSERT INTO detention_charge_slot  VALUES (4,15,999,25);
    
       CREATE TABLE detention_invoice
       (invoice_no NUMBER(10),
      invoice_dt DATE,
      delivery_dt DATE);
    
       INSERT INTO detention_invoice VALUES(1,'10-JAN-2015','25-JAN-2015');
    

    The expected result for Invoice_no = 1 is:

    Start_date        End_date          Days  Charge_Amount
    January 10, 2015  January 13, 2015  4     0
    January 14, 2015  January 18, 2015  5     10
    January 19, 2015  January 23, 2015  5     20
    January 24, 2015  January 25, 2015  2     25

    If you expect more than one row in DETENTION_INVOICE, use the following query:

    WITH DATES
    AS
    (SELECT DI.INVOICE_DT + FROM_DAYS - 1 START_DATE,
            LEAST(DI.INVOICE_DT + TO_DAYS - 1, DELIVERY_DT) END_DATE,
            CHARGE_AMOUNT
       FROM DETENTION_CHARGE_SLOT DCS, DETENTION_INVOICE DI
      WHERE DI.INVOICE_NO = 1)
    SELECT START_DATE, END_DATE, END_DATE - START_DATE + 1 DAYS, CHARGE_AMOUNT
      FROM DATES
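
    A sketch extending the same idea to every invoice (assuming slots that start after the delivery date should be dropped):

    SELECT DI.INVOICE_NO,
           DI.INVOICE_DT + DCS.FROM_DAYS - 1                       START_DATE,
           LEAST(DI.INVOICE_DT + DCS.TO_DAYS - 1, DI.DELIVERY_DT)  END_DATE,
           LEAST(DI.INVOICE_DT + DCS.TO_DAYS - 1, DI.DELIVERY_DT)
             - (DI.INVOICE_DT + DCS.FROM_DAYS - 1) + 1             DAYS,
           DCS.CHARGE_AMOUNT
      FROM DETENTION_INVOICE DI
      JOIN DETENTION_CHARGE_SLOT DCS
        ON DI.INVOICE_DT + DCS.FROM_DAYS - 1 <= DI.DELIVERY_DT
     ORDER BY DI.INVOICE_NO, START_DATE;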

  • query to display data in table with several detail table

    Hi all

    I have a case with a header table that has 3 or more detail tables, and I have to generate a query to show all the data horizontally.

    tblHdr has column A (PK)

    tblDtl1 has columns A (FK), B (PK)

    tblDtl2 has columns A (FK), C (PK)

    tblDtl3 has columns A (FK), D (PK)

    and I need a query to display data like this:

    A  B1  C3  D1
    A  B2  C4  D2
    A      C5  D3
    A      C6

    All the detail tables should display data based on their relationship to tblHdr (A).

    tblDtl1 has only 2 rows of data for FK A

    tblDtl2 has 4 rows of data for FK A

    tblDtl3 has only 3 rows of data for FK A

    Another example:

    A  B1  C1  D1
    A  B2  C2
    A  B3

    tblDtl1 has only 3 rows of data for FK A

    tblDtl2 has only 2 rows of data for FK A

    tblDtl3 has only 1 row of data for FK A

    Please shed some light. For the record, I'll use this query in ADF, so using PL/SQL is second priority; I prefer to do it in plain SQL.

    Thank you

    Here are 3 ways. First, the test data:

    drop table ta purge;
    create table ta as
    SELECT 'A' AS A FROM dual
    union all
    select 'B' from dual;
    
    drop table tb purge;
    create table tb as
    SELECT 'A' AS A, 'B1' AS B FROM dual
    UNION ALL
    SELECT 'A', 'B2' FROM dual ;
    
    drop table tc purge;
    create table tc as
    SELECT 'A' AS A, 'C1' AS C FROM dual
    UNION ALL
    SELECT 'A', 'C2' FROM dual
    UNION ALL
    SELECT 'A', 'C3' FROM dual
    UNION ALL
    SELECT 'A', 'C4' FROM dual ;
    
    drop table td purge;
    create table td as
    SELECT 'A' AS A, 'D1' AS D FROM dual
    UNION ALL
    SELECT 'A', 'D2' FROM dual
    UNION ALL
    SELECT 'A', 'D3' FROM dual;
    

    Now 3 solutions: full join, group by and pivot:

    with b as (
      select a, b,
      row_number() over(partition by a order by b) rn
      from tb
    )
    , c as (
      select a, c,
      row_number() over(partition by a order by c) rn
      from tc
    )
    , d as (
      select a, d,
      row_number() over(partition by a order by d) rn
      from td
    )
    select a, b, c, d
    from ta left join b using(a)
    full join c using(a, rn)
    full join d using(a, rn)
    order by a, rn;
    
    select a, max(b) b, max(c) c, max(d) d
    from (
      select a, null b, null c, null d, 1 rn
      from ta
      union all
      select a, b, null, null,
      row_number() over(partition by a order by b) rn
      from tb
      union all
      select a, null, c, null,
      row_number() over(partition by a order by c) rn
      from tc
      union all
      select a, null, null, d,
      row_number() over(partition by a order by d) rn
      from td
    )
    group by a, rn
    order by a, rn;
    
    select A,B,C,D from (
      select 'A' tab, a, null val, 1 rn from ta
      union all
      select 'B' tab, a, b,
      row_number() over(partition by a order by b) rn
      from tb
      union all
      select 'C' tab, a, c,
      row_number() over(partition by a order by c) rn
      from tc
      union all
      select 'D' tab, a, d,
      row_number() over(partition by a order by d) rn
      from td
    )
    pivot(max(val) for tab in('B' B, 'C' C, 'D' D))
    order by a, rn;
    
    A B C D
    A B1 C1 D1
    A B2 C2 D2
    A C3 D3
    A C4
    B

    Personally, I would prefer to view the data by using a "union join", in order to avoid giving the impression that the different detail records are somehow related.

    select ta.a, b, c, d
    from tb full join tc on 1=0
    full join td on 1=0
    right join ta on ta.a in (tb.a, tc.a, td.a);
    
    A B C D
    A B1
    A B2
    A C1
    A C2
    A C3
    A C4
    A D1
    A D2
    A D3
    B
  • Archive logs grouped by dates?

    Hi all

    10.2.0.4

    SLES 11

    It's our archivelog folder groups:
    oracle@sdb51:/u01/app/oracle/flash_recovery_area/PROD/archivelog> ls -lt
    total 32
    drwxr-x--- 2 oracle oinstall 4096 2013-04-30 12:42 2013_04_30
    drwxr-x--- 2 oracle oinstall 4096 2013-04-29 12:35 2013_04_29
    drwxr-x--- 2 oracle oinstall 4096 2013-04-28 23:42 2013_04_28
    drwxr-x--- 2 oracle oinstall 4096 2013-04-24 12:20 2013_04_24
    drwxr-x--- 2 oracle oinstall 4096 2013-04-23 22:00 2013_04_23
    drwxr-x--- 2 oracle oinstall 4096 2013-04-21 02:01 2013_04_21
    drwxr-x--- 2 oracle oinstall 4096 2013-04-19 22:00 2013_04_19
    drwxr-x--- 2 oracle oinstall 4096 2013-04-18 18:46 2013_04_18
    Aren't these supposed to be complete records, grouped by date, for every day?
    So am I missing the archivelogs for the 27th, 26th, 25th, etc.?


    Thank you very much

    zxy

    That's right, you have no problem with your archived logs.
    Your database was simply shut down normally between the 25th and the 27th.
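
    One way to confirm this from the database itself is to count the archived logs per day (a sketch using the standard v$archived_log view); days with no rows simply generated no archived redo:

    select trunc(completion_time) dy, count(*) archived_logs
    from   v$archived_log
    group  by trunc(completion_time)
    order  by dy;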

    Regards
    Mr. Mahir Quluzade

    P.S. Please mark the thread as answered if your question has been clearly answered on the forum.

  • Data per day, per month, per year

    Hello, there is a field in my table that includes dates.

    I counted my data records per day using the aggregator operator and

    this simple SELECT DATE, COUNT(*) FROM TABLE GROUP BY DATE

    Can you tell me how I can count my records per month or per year? Thank you.

    Hello

    You can use functions to retrieve information from the date and use it, for example;

    select to_char(hiredate, 'YYYY') yr, sum(sal) from emp group by to_char(hiredate, 'YYYY');

    So in OWB if you have an expression before the aggregation operator, you can define the expressions and use them in the aggregation.
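
    Along the same lines, a per-month count could look like this (a sketch reusing the emp/hiredate example above):

    select to_char(hiredate, 'YYYY-MM') mth, count(*) cnt
    from   emp
    group  by to_char(hiredate, 'YYYY-MM')
    order  by mth;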
    Cheers
    David

  • Need help with query outputting group names

    I'm trying to find a way to output group headers, and then all the records under each header, and so on. It would be easy, except there is a catch in what I want to do.

    Normally, if I had this data set (which I've "borrowed" from a site that showed the closest thing to what I was looking for):

    Example table:

    TABLE [number]

    (Name, NUMBER)

    Dave Bower 843-444-4444

    Dave Bower 843-555-5555

    Matthew Small 843-111-1111

    Matthew Small 843-222-2222

    Matthew Small 843-333-3333

    I could use the following code:

    <cfoutput query="somequery" group="name">

    #name#<br>

    <cfoutput>

    #phonenumber#<br>

    </cfoutput>

    <hr>

    </cfoutput>

    And get this:

    Dave Bower

    843-444-4444

    843-555-5555

    -------------------

    Matthew Small

    843-111-1111

    843-222-2222

    843-333-3333

    -------------------

    BUT my actual tables are not set up like that. Rather than storing the name in every record, I have an ID that is a foreign key to another table.

    Current table set up is as follows:

    TABLE [people]

    (ID, NAME)

    1 Dave Bower

    2 Matthew Small

    TABLE [Phones]

    (PEOPLE_ID, NUMBER)

    1 843-444-4444

    1 843-555-5555

    2 843-111-1111

    2 843-222-2222

    2 843-333-3333

    With my current setup and the query code above, this would actually give me:

    1

    843-444-4444

    843-555-5555

    -------------------

    2

    843-111-1111

    843-222-2222

    843-333-3333

    -------------------

    How can I keep my current setup but create a query that produces the same result as at the top? (The names from the people table as the group headers, but the data from the phones table output under them.)

    You need to join the two tables, and then group the output.

    Something along the lines of this (may vary slightly depending on your DB and the exact table structure):

    SELECT ppl.name, ph.number

    FROM people ppl

    INNER JOIN phones ph ON ppl.id = ph.people_id

    ORDER BY ppl.name

    Cheers

    Kai

  • query for results last 8 days

    Does anyone have a query for results from the last 8 days?
    TIMESTAMPADD(SQL_TSI_DAY, -8, CURRENT_DATE) gives results for just 1 day: if I run this query it displays results for the 8th day back only.

    I want results for all 8 days back.
    Does anyone have a query for it?

    Edited by: user12255470 on April 28, 2010 07:19

    This works very well for me:

    SELECT "D0 Time"."T00 Calendar Date" saw_0, "F1 Revenue"."1-01 Revenue (Sum All)" saw_1 FROM "Sample Sales" WHERE "D0 Time"."T00 Calendar Date" BETWEEN TIMESTAMPADD(SQL_TSI_DAY, -8, CURRENT_DATE) AND TIMESTAMPADD(SQL_TSI_DAY, -1, CURRENT_DATE) ORDER BY saw_0

    Regards

    John
    http://www.obiee101.blogspot.com/

  • Max and sum in a query using Group by and dense_rank

    Hi all

    I am running Oracle 10 G on Windows 2003.

    I have two tables, RT_DY_ZONE1EDYCONS and MV_ZONE_HOURLY. I need a query that will give me the SUM of MR_OL01_VT from RT_DY_ZONE1EDYCONS for each month, plus, from MV_ZONE_HOURLY, the maximum value of MR_OL01_FI_0S and MR_OL02_FI_0S and the time of that maximum value for each month. I can't combine the two queries I came up with from these forums into a single query.

    I need the following result, any help would be appreciated.
    datetime, SUM of MR_OL01_VT, max value MR_OL01_FI_0S, max_time MR_OL01_FI_0S, max value MR_OL02_FI_0S, max_time MR_OL02_FI_0S
    January 2010,8.373765,4.96935,2010-01-15:01,5.96835,2010-01-15:17
    I used the following query to obtain the SUM of the MR_OL01_VT
    SELECT 
        TRUNC(VOL.TIMESTAMP, 'MM') datetime, 
        SUM(VOL.MR_OL01_VT) 
    FROM 
        RT_DY_ZONE1EDYCONS VOL 
    GROUP BY 
        TRUNC(VOL.TIMESTAMP, 'MM')
    ORDER BY
        TRUNC(VOL.TIMESTAMP, 'MM')
    and this query for the maximum value/time MR_OL01_FI_0S and MR_OL02_FI_0S
    select TAG_NAME,
           max(tag_avg) keep (dense_rank last order by tag_avg) over (partition by tag_name) Max_Value,
           max(datetime) keep (dense_rank last order by tag_avg) over (partition by tag_name) AS MAX_DATE
    from mv_zone_hourly
    WHERE tag_name LIKE 'MR_OL0%_FI_0S'
    first table
    CREATE TABLE RT_DY_ZONE1EDYCONS 
       (     TIMESTAMP DATE NOT NULL ENABLE, 
         HB_OL00_VT NUMBER(12,6), 
         OR_RES0_VT NUMBER(12,6), 
         OP_OL01_VT NUMBER(12,6), 
         FP_OL01_VT NUMBER(12,6), 
         BD_OL01_VT NUMBER(12,6), 
         MR_OL01_VT NUMBER(12,6), 
         Z1E_VT NUMBER(12,6)
    )
    with the sample data
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:00','YYYY-MM-DD:HH24'),4.443056,1.088,1.224927,0.663266,0,0.387499,1.079364);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:01','YYYY-MM-DD:HH24'),4.352083,null,0.692914,0.044029,0,0.373536,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:02','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:03','YYYY-MM-DD:HH24'),4.300694,null,0.662924,0,0,0.380275,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:04','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:05','YYYY-MM-DD:HH24'),0.025694,null,0.650406,0,0,0.342299,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:06','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:07','YYYY-MM-DD:HH24'),0.122917,-2.992,0.673062,0,0,0.423474,2.018381);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:08','YYYY-MM-DD:HH24'),0.106944,null,1.27403,0.768119,0,0.449303,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:09','YYYY-MM-DD:HH24'),null,null,null,null,0,null,null);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:10','YYYY-MM-DD:HH24'),0.122917,-2.448,0.637977,0,0,0.418056,1.514884);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:11','YYYY-MM-DD:HH24'),0.183333,-2.992,0.649855,0,0,0.401947,2.123531);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:12','YYYY-MM-DD:HH24'),1.157639,-2.992,1.039931,0.463684,0,0.43389,2.212134);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:13','YYYY-MM-DD:HH24'),4.536111,1.36,0.972226,0.381604,0,0.461941,1.36034);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:14','YYYY-MM-DD:HH24'),4.496528,2.176,0.647979,0,0,0.45611,1.216439);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:15','YYYY-MM-DD:HH24'),4.409028,2.72,0.665355,0,0,0.440141,0.583532);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:16','YYYY-MM-DD:HH24'),4.380556,1.36,0.886389,0.256387,0,0.429446,1.448334);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:17','YYYY-MM-DD:HH24'),4.433333,0.272,1.21716,0.656324,0,0.434169,1.85368);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:18','YYYY-MM-DD:HH24'),4.409722,2.176,0.653266,0,0,0.436253,1.144203);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:19','YYYY-MM-DD:HH24'),4.44375,2.448,0.67917,0,0,0.436947,0.879633);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:20','YYYY-MM-DD:HH24'),4.420833,0,1.273057,0.733813,0,0.428474,1.985489);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:21','YYYY-MM-DD:HH24'),4.390278,2.176,0.895212,0.280419,0,0.418195,0.620452);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:22','YYYY-MM-DD:HH24'),4.336806,1.904,0.670843,0,0,0.412711,1.349252);
    Insert into RT_DY_ZONE1EDYCONS (TIMESTAMP,HB_OL00_VT,OR_RES0_VT,OP_OL01_VT,FP_OL01_VT,BD_OL01_VT,MR_OL01_VT,Z1E_VT) values (to_date('2010-01-15:23','YYYY-MM-DD:HH24'),4.305556,2.448,0.689441,0,0,0.409099,0.759016);
    and the second table
    CREATE TABLE MV_ZONE_HOURLY
    ( TAG_NAME VARCHAR2(30),
      TAG_DESCRIP VARCHAR(100),
      DATETIME DATE,
      TAG_AVG NUMBER(12,6)
    )
    with the sample data
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:00','YYYY-MM-DD:HH24'),4.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:01','YYYY-MM-DD:HH24'),4.96935);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:02','YYYY-MM-DD:HH24'),4.367);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:03','YYYY-MM-DD:HH24'),4.67788);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:04','YYYY-MM-DD:HH24'),4.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:05','YYYY-MM-DD:HH24'),3.23456);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:06','YYYY-MM-DD:HH24'),4.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:07','YYYY-MM-DD:HH24'),4.5890);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:08','YYYY-MM-DD:HH24'),4.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:09','YYYY-MM-DD:HH24'),4.96735);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:10','YYYY-MM-DD:HH24'),4.8456);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:11','YYYY-MM-DD:HH24'),4.2468);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:12','YYYY-MM-DD:HH24'),4.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:13','YYYY-MM-DD:HH24'),3.9746);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:14','YYYY-MM-DD:HH24'),4.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:15','YYYY-MM-DD:HH24'),4.47);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:16','YYYY-MM-DD:HH24'),4.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:17','YYYY-MM-DD:HH24'),4.96835);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:18','YYYY-MM-DD:HH24'),4.6890);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:19','YYYY-MM-DD:HH24'),4.42345);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:20','YYYY-MM-DD:HH24'),4.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:21','YYYY-MM-DD:HH24'),3.4579);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:22','YYYY-MM-DD:HH24'),4.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL01_FI_0S','Montreal Rd Disch Flow 1',to_date('2010-01-15:23','YYYY-MM-DD:HH24'),4.45789);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:00','YYYY-MM-DD:HH24'),5.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:01','YYYY-MM-DD:HH24'),5.97835);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:02','YYYY-MM-DD:HH24'),5.367);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:03','YYYY-MM-DD:HH24'),5.67788);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:04','YYYY-MM-DD:HH24'),5.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:05','YYYY-MM-DD:HH24'),4.23456);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:06','YYYY-MM-DD:HH24'),5.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:07','YYYY-MM-DD:HH24'),5.5890);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:08','YYYY-MM-DD:HH24'),5.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:09','YYYY-MM-DD:HH24'),5.95635);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:10','YYYY-MM-DD:HH24'),5.8456);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:11','YYYY-MM-DD:HH24'),5.2468);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:12','YYYY-MM-DD:HH24'),5.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:13','YYYY-MM-DD:HH24'),4.9746);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:14','YYYY-MM-DD:HH24'),5.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:15','YYYY-MM-DD:HH24'),5.47);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:16','YYYY-MM-DD:HH24'),5.166712);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:17','YYYY-MM-DD:HH24'),5.96835);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:18','YYYY-MM-DD:HH24'),5.6890);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:19','YYYY-MM-DD:HH24'),5.42345);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:20','YYYY-MM-DD:HH24'),5.06335);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:21','YYYY-MM-DD:HH24'),4.4579);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:22','YYYY-MM-DD:HH24'),5.2333555);
    Insert into MV_ZONE_HOURLY (TAG_NAME,TAG_DESCRIP,DATETIME,TAG_AVG) values ('MR_OL02_FI_0S','Montreal Rd Disch Flow 2',to_date('2010-01-15:23','YYYY-MM-DD:HH24'),5.45789);

    Hello

    Thanks for posting the CREATE TABLE and INSERT statements; This is really useful!

    Your voluminous sample data is all for one month. Since your first query groups by month, I assume that this is not always the case.
    In that case, you can do the GROUP BY queries on the two tables separately, in two subqueries, and then join the results.
    To get separate columns for the values from mv_zone_hourly, I GROUP BY tag_name in the subquery, then pivot them into two columns in the main query.

    WITH     edycons_summary       AS
    (
         SELECT       TRUNC (tmstmp, 'MM')     AS mnth
         ,       SUM (mr_ol01_vt)     AS sum_mr_ol01_vt
         FROM       rt_dy_zone1edycons
         GROUP BY  TRUNC (tmstmp, 'MM')
    )
    ,     hourly_summary       AS
    (
         SELECT       tag_name
         ,       TRUNC (datetime, 'MM')     AS mnth
         ,       MAX (tag_avg)                  AS max_avg
         ,       MAX (datetime) KEEP (DENSE_RANK LAST ORDER BY tag_avg NULLS FIRST)
                                              AS max_avg_datetime
         FROM       mv_zone_hourly
         WHERE       tag_name     IN ( 'MR_OL01_FI_0S'
                             , 'MR_OL02_FI_0S'
                           )
         GROUP BY  tag_name
         ,            TRUNC (datetime, 'MM')
    )
    SELECT       TO_CHAR (v.mnth, 'fmMonth YYYY')     AS datetime
    ,       v.sum_mr_ol01_vt
    ,       MAX ( CASE
                     WHEN  h.tag_name = 'MR_OL01_FI_0S'
                  THEN  h.max_avg
                 END
               )                         AS max_avg_01
    ,       MAX ( CASE
                     WHEN  h.tag_name = 'MR_OL01_FI_0S'
                  THEN  h.max_avg_datetime
                 END
               )                         AS datetime_01
    ,       MAX ( CASE
                     WHEN  h.tag_name = 'MR_OL02_FI_0S'
                  THEN  h.max_avg
                 END
               )                         AS max_avg_02
    ,       MAX ( CASE
                     WHEN  h.tag_name = 'MR_OL02_FI_0S'
                  THEN  h.max_avg_datetime
                 END
               )                         AS datetime_02
    FROM       edycons_summary     v
    JOIN       hourly_summary     h     ON     h.mnth     = v.mnth
    GROUP BY  v.mnth
    ,            v.sum_mr_ol01_vt
    ;
    
  • Date range for a week

    Hi All,
    Is there any way to define a value set which can accept a date within a week (from Monday to Saturday, for example)?

    I defined a value set to select any date which is a Monday. My plan is to add 7 days to the selected date, using flex, in another value set.
    But I am not able to do it. I get the following error: ORA-01861: literal does not match format string. Is this correct, or is there another option I can use?

    Regards

    Take a look at [Note: 376034.1 - How to handle the new Date Formats in SQL*Plus and PL/SQL procedures? | https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=376034.1], it may be useful.

  • Do you need to unplug the 10 Pro printer when you do not print for days or weeks?

    Do you need to unplug the 10 Pro printer when you do not print for days or weeks?

    ebiggs1

    Thanks for your reply.

  • Is it legal to use "slmgr -rearm" to extend the usage period in Windows 7 Ultimate, & may I re-install Win 7 and use it for 30 days after the expiration date (29 days)?

    Hello
    I live in Iran.
    Here access to the original windows is not easy.
    I have Win 7 Ultimate edition (eternity).
    It lets me use it for 30 days, and I can use "slmgr -rearm" three times to extend the trial period.
    With this, I can use and update it.
    I have two questions.
    1. Is it legal to use "slmgr -rearm" to extend the trial period?
    2. May I reinstall Win 7 and use it for 30 days after the expiration date (29 days)?

    With special thanks,

    original title: Reinstall windows 7 ultimate

    According to the Microsoft Windows 7 software license, you must activate Windows 7 within 30 days of installation.  You are not allowed to circumvent or bypass product activation.  After 30 days, you must enter a genuine Windows 7 product key for the edition you have installed, or remove Windows 7 by reformatting the hard drive on which it is installed.

    In addition, you must respect the export regulations.
