MAX date for each lookup table entry

I have 2 tables (MED_IMMUNO and LU_MED_IMMUNO). MED_IMMUNO is the base table, and LU_MED_IMMUNO is the lookup table for the column IMM_REC. Please see the DDL and data below:

CREATE TABLE MED_IMMUNO
(
    ID             NUMBER,
    EMP_ID         NUMBER,
    IMM_REC        NUMBER,
    IMM_DATE       DATE,
    IMM_NOTES      VARCHAR2(255),
    IMM_APPLICABLE NUMBER,
    IMM_REFUSED    NUMBER
);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (65,5900,1,to_date('19-JUN-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (66,5900,2,to_date('17-JUL-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (67,5900,3,to_date('20-DEC-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (68,5900,22,to_date('19-JAN-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (69,5900,4,to_date('19-JAN-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (70,5900,14,to_date('19-JAN-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (71,5900,17,to_date('19-JAN-11','DD-MON-RR'),null,null,null);

Insert into MED_IMMUNO (ID, EMP_ID, IMM_REC, IMM_DATE, IMM_NOTES, IMM_APPLICABLE, IMM_REFUSED) values (72,5900,16,to_date('19-JAN-11','DD-MON-RR'),null,null,null);

create table LU_MED_IMMUNO
(
    ID       NUMBER,
    IMM_DESC VARCHAR2(55),
    DSB      NUMBER
);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (20, 'SHOT_1', 20);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (21, 'SHOT_2', 21);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (22, 'SHOT_3', 12);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (1, 'SHOT_4', 1);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (2, 'SHOT_5', 2);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (3, 'SHOT_6', 3);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (4, 'SHOT_7', 13);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (5, 'SHOT_8', 5);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (6, 'SHOT_9', 6);

insert into lu_med_immuno (id, imm_desc, DSB) values (8, 'SHOT_10', 8);

insert into lu_med_immuno (id, imm_desc, DSB) values (9, 'SHOT_11', 9);

insert into lu_med_immuno (id, imm_desc, DSB) values (10, 'SHOT_12', 10);

insert into lu_med_immuno (id, imm_desc, DSB) values (11, 'SHOT_13', 11);

insert into lu_med_immuno (id, imm_desc, DSB) values (14, 'SHOT_14', 14);

insert into lu_med_immuno (id, imm_desc, DSB) values (16, 'SHOT_15', 16);

insert into lu_med_immuno (id, imm_desc, DSB) values (17, 'SHOT_16', 17);

insert into lu_med_immuno (id, imm_desc, DSB) values (18, 'SHOT_17', 18);

Insert into LU_MED_IMMUNO (ID, IMM_DESC, DSB) values (19, 'SHOT_18', 19);

Basically, these 2 tables serve as a vaccination tracking record. My requirement is to create a query that returns the maximum date for each vaccination shot (IMM_REC), and also displays a null if the employee (EMP_ID) never received that immunization. In other words, there are 18 rows in the lookup table, so for each EMP_ID there should be 18 rows, with a date or null for the max of every IMM_REC. I know I'm close with the query below that I developed (though I think even what I have so far is defective), but I can't get any further. I think I should use the lookup table as the basis of the query, or somehow do a left join against it. In any event, this is the query I have at the moment:

with got_rnk as
(
    select p.id, p.emp_id, p.imm_rec, p.imm_date
         , rank() over (partition by p.imm_rec
                        order by p.imm_date desc
                       ) as rnk
    from med_immuno p
)
select id, emp_id, imm_rec, imm_date
from got_rnk
where rnk = 1
order by id
;

Any help would be appreciated...

Try this:

WITH max_date AS
(
    SELECT emp_id, imm_rec, MAX(imm_date) imm_date
    FROM MED_IMMUNO M
    GROUP BY emp_id, imm_rec
),
DIS_MED_IMMUNO AS
(
    SELECT DISTINCT M.emp_id
    FROM MED_IMMUNO M
),
tmp AS
(
    SELECT *
    FROM DIS_MED_IMMUNO
    CROSS JOIN LU_MED_IMMUNO
)
SELECT tmp.emp_id, tmp.ID, tmp.IMM_DESC, M.imm_date
FROM tmp
LEFT OUTER JOIN max_date M
    ON  tmp.emp_id = M.emp_id
    AND tmp.ID = M.IMM_REC
ORDER BY tmp.emp_id, tmp.DSB
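
As a side note, on Oracle 10g and later the same densification can be done without the explicit CROSS JOIN by using a partitioned outer join. This is a minimal sketch against the tables above, not from the original thread:

SELECT m.emp_id, l.id, l.imm_desc, MAX(m.imm_date) AS imm_date
FROM med_immuno m
PARTITION BY (m.emp_id)
RIGHT OUTER JOIN lu_med_immuno l
ON (m.imm_rec = l.id)
GROUP BY m.emp_id, l.id, l.imm_desc, l.dsb
ORDER BY m.emp_id, l.dsb;

The PARTITION BY (m.emp_id) clause repeats the outer join once per employee, so every EMP_ID still gets all 18 lookup rows, with a NULL max date where no shot was recorded.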

Tags: Database

Similar Questions

  • Table comparison and lookup

    Hello

    I am comparing input data against ideal data. So far I have worked out how to do the comparison, but a problem arises when the ideal table has the same element with different respective values; the corresponding elements in the input data also appear to be identical, but after doing the lookup, the final output in the 3rd column of the table looks different than it should.

    I hope you understand my problem.

    thanking you

    BR

    Karine

    The lookup table stops its search when it meets the first match. That is the reason you get 500 for all the "E"s.

    I have attached a project. There are always more elegant ways to do what you want to achieve.

  • Date-wise max and min values

    I have a test table with the following columns


    name value values_date
    -----------------------------
    A 40 01/08/2010
    A 10 02/08/2010
    A 10 03/08/2010
    A 10 04/08/2010
    A 20 03/08/2010
    A 50 02/08/2010
    A 50 03/08/2010
    A 50 04/08/2010

    B 100 01/08/2010
    B 10 02/08/2010
    B 20 03/08/2010
    B 10 01/08/2010
    B 100 11/08/2010
    B 100 12/08/2010
    B 100 13/08/2010
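
    The table DDL is not posted; a minimal definition that matches the inserts below might be the following sketch (the date column is named v_date here only because the answer's query refers to V_DATE):

    CREATE TABLE test
    (
        name   VARCHAR2(10),
        value  NUMBER,
        v_date DATE
    );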


    Insert into test values('A',40,to_date('01/08/2010','DD/MM/YYYY'));
    Insert into test values('A',10,to_date('02/08/2010','DD/MM/YYYY'));
    Insert into test values('A',10,to_date('03/08/2010','DD/MM/YYYY'));
    Insert into test values('A',10,to_date('04/08/2010','DD/MM/YYYY'));
    Insert into test values('A',20,to_date('03/08/2010','DD/MM/YYYY'));
    Insert into test values('A',50,to_date('02/08/2010','DD/MM/YYYY'));
    Insert into test values('A',50,to_date('03/08/2010','DD/MM/YYYY'));
    Insert into test values('A',50,to_date('04/08/2010','DD/MM/YYYY'));
    Insert into test values('B',100,to_date('01/08/2010','DD/MM/YYYY'));
    Insert into test values('B',10,to_date('02/08/2010','DD/MM/YYYY'));
    Insert into test values('B',20,to_date('03/08/2010','DD/MM/YYYY'));
    Insert into test values('B',10,to_date('01/08/2010','DD/MM/YYYY'));
    Insert into test values('B',100,to_date('11/08/2010','DD/MM/YYYY'));
    Insert into test values('B',100,to_date('12/08/2010','DD/MM/YYYY'));
    Insert into test values('B',100,to_date('13/08/2010','DD/MM/YYYY'));


    I would like the output to be:

    name  min_value  min_value_date  max_value  max_value_date
    ----------------------------------------------------------
    A     10         02/08/2010      50         04/08/2010
    B     10         01/08/2010      100        13/08/2010

    Hello...

    Try this:

     SQL> SELECT A.NAME,A.VALUE,MIN(A.V_DATE),B.VALUE,MAX(B.V_DATE)FROM TEST A,TEST B WHERE
      2  (A.NAME,A.VALUE) IN (SELECT NAME,MIN(VALUE) FROM TEST GROUP BY NAME) AND
      3  (A.NAME,B.VALUE) IN (SELECT NAME,MAX(VALUE) FROM TEST GROUP BY NAME) GROUP BY A.NAME,A.VALUE,B.VALUE;
    
    NAME            VALUE MIN(A.V_DATE)        VALUE MAX(B.V_DATE)
    ---------- ---------- --------------- ---------- ---------------
    A                  10 2/8/2010                50 4/8/2010
    B                  10 1/8/2010               100 13/08/2010
    

    Kind regards
    Santosh.Minupurey
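
    An alternative that avoids the self-join is Oracle's FIRST/LAST ("KEEP") aggregates; a sketch, assuming the same v_date column name as above:

    SELECT name,
           MIN(value) AS min_value,
           MIN(v_date) KEEP (DENSE_RANK FIRST ORDER BY value) AS min_value_date,
           MAX(value) AS max_value,
           MAX(v_date) KEEP (DENSE_RANK LAST ORDER BY value) AS max_value_date
      FROM test
     GROUP BY name;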

  • What is the best way to move the app's data and data structures to the server?

    Hi guys,

    I developed my application locally with Apex 4.2 and Oracle 11g XE on Windows 7. Before long it will be time to move the application to an Oracle Apex server. I guess that application export/import is the way to go for the app itself. But what about the app's tables and data (tables like 'customer' and 'account' created specially for the application)? I've been using a modeling tool, so I can run a DDL script to create the data structures on the server. What is the best way to move the application data to the server? Is it possible to move the structures and data in a single process?

    Thank you
    Kim

    There is probably another way to get there, but in SQL Developer, on the navigation tree, expand objects down to your table, right-click, and then click EXPORT... you will see all the options. It is a tedious process and it sucks IMO, but yes, it works. It is painful mainly because 1) it's one table at a time, 2) if your data model is robust and has constraints and sequences and triggers, then you will need to disable them all for the insert, and hopefully you can re-enable constraints, etc. without a hitch (good luck, unless you have only a handful of tables).

    I prefer to use the Oracle EXP command line to export an entire schema, and then on the target server I use IMP to import the schema. This way it is almost foolproof. It does make life messy if you develop multiple applications in a single schema, and I have felt that pain - however - it is much easier to drop the tables and other objects than it is to create them! (So even if the EXP/IMP process moved more than you wanted to "move"... just drop everything you don't want on the target after the fact...)

    You can use the Oracle Data Pump method too.

    If not, what can be done, IF you have access to both servers from your instance of SQL Developer (or if you can already tnsping both from the command line, you can use SQL*Plus), is to run a script that identifies the objects of your Apex applications (usually by the prefix of the object names, such as EBA_PROJ_% etc.) and does all the manual work for you. I've created a script that does just that so that I can push data from dev to prod servers over a dblink. It is tricky because of the order in which it must be executed to disable constraints and then turn them back on, and of course it is more complicated if you don't always prefix ALL your "application objects"... (tables, views, triggers, sequences, functions, procs, indexes, etc.)

  • Max month and year from a table

    Hello

    I need to get the max month and year from a table.
    DESC WHR_REPORT
    REPORTMONTH   NUMBER(2)
    REPORTYEAR    NUMBER(4)
    Sample data in table
    reportmonth    reportyear
    01              2009
    02              2009
    03              2009
    04              2009
    09              2009
    12              2009
    01              2010
    02              2010
    How can I get the max date, which for this table means 022010?

    Thank you
    Sandy
    select max(to_date(to_char(reportyear) || lpad(to_char(reportmonth), 2, '0'), 'yyyymm' ) )
    from whr_report
    
    or
    
    select reportyear, reportmonth
    from
    (
    select reportyear, reportmonth, row_number() over (order by reportyear desc, reportmonth desc) rn
    from whr_report
    )
    where rn = 1
    

    I forgot the lpad around the month.
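
    If the output literally needs the 022010 form, the max date from the first query can be folded back into a string; a small variation (sketch):

    select to_char(
             max(to_date(to_char(reportyear) || lpad(to_char(reportmonth), 2, '0'), 'yyyymm')),
             'mmyyyy') as max_month
    from whr_report;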

    Published by: bluefrog on June 10, 2010 15:59

  • AVG, max and min calculated from form data

    I have two tables. One with the raw data and the other with summary data. There is a form into which the raw data for a column is entered, and on submit I use a cfloop to get the data into the same column of the raw data table. I can't find an easy way to calculate the max, min and average values for the data in the raw data table so they can be put in the summary data table. The data are keyed by date and location.

    This is the code to get the raw data into the table.

    <cfloop index="onerow" from="1" to="#form.numrows#">
    <cfquery name="insertrawdata" datasource="test">
    INSERT INTO test.dbo.rawdata (locationid, date, data)
    VALUES ('#form.locationid#', #DateValue#, '#Form["data" & onerow]#')
    </cfquery>
    </cfloop>

    OK, well, do it in 2 queries then...


    <cfquery name="getSummaryData" datasource="test">
    SELECT max(data) AS max, min(data) AS min, avg(data) AS avg
    FROM test.dbo.rawdata
    WHERE locationid = '#form.locationid#' AND date = #datevalue#
    GROUP BY locationid, date
    </cfquery>


    <cfquery datasource="test">
    INSERT INTO test.dbo.summarydata (locationid, date, max, min, avg)
    VALUES ('#form.locationid#', #datevalue#,
            #getSummaryData.max#, #getSummaryData.min#, #getSummaryData.avg#)
    </cfquery>

  • Finding the max and min of sampled cycle data

    I'm extracting the max and min values of cycle data. We have about 20 data points for each cycle and I need to get the max and min out of them. I tried to write a script for this but it takes way too long to run. It does not complete on our largest files (over 1 million data points) and DIAdem crashes. I also tried the built-in peak finding functions, but they return data points that are not max or min. Here is my script below:

    Dim cyclemin
    Dim cyclemax
    Dim cyclecount
    Dim displacementmax, displacementmin
    Dim channelcount
    Dim i, j, k, g, a, m
    cyclemin = CMin("Cycle Count")
    cyclemax = CMax("Cycle Count")
    channelcount = 0
    Call ChnAlloc("Single Cycle Count", (cyclemax - cyclemin) + 1, 1, DataTypeFloat64)
    Call ChnAlloc("Sample Pressure Min", (cyclemax - cyclemin) + 1, 1, DataTypeFloat64)
    Call ChnAlloc("Sample Pressure Max", (cyclemax - cyclemin) + 1, 1, DataTypeFloat64)
    a = 1
    For cyclecount = cyclemin To cyclemax
      g = 1
      channelcount = channelcount + 1
      i = ChnFind("Ch(""Cycle Count"") = " & Str(cyclecount), a)
      j = ChnFindReverse("Ch(""Cycle Count"") = " & Str(cyclecount), CL("Cycle Count"))
      Call ChnAlloc("Sample Pressure Data Hold", (j - i) + 1, 1, DataTypeFloat64)
      For k = i To j
        ChD(g, "Sample Pressure Data Hold") = ChD(k, "Sample Pressure")
        g = g + 1
      Next
      Call ChnCharacter("Sample Pressure Data Hold")
      ChD(channelcount, "Single Cycle Count") = cyclecount
      ChD(channelcount, "Sample Pressure Max") = CMax("Sample Pressure Data Hold")
      ChD(channelcount, "Sample Pressure Min") = CMin("Sample Pressure Data Hold")
      Call ChnDelete("Sample Pressure Data Hold")
      a = j
    Next

    Can someone please help me find a way to do it quickly? Thank you.

    Hello Steinmeister85

    just in case you are using a newer version of DIAdem, here is an alternative solution.

    Also, I took your example file and concatenated it 50 times to create a file which has about 560,000 values in each channel.

    I ran your script first to get a reference number.

    Sample file

    Original script: 2.5 seconds

    Optimized script: 0.2 seconds

    50 x example file

    Original script: 111.493 seconds

    Optimized script: 0.25 seconds

    I have to admit that I sorted the concatenated file. This way I have 216 cycle segments to be analyzed. If I simply concatenate the files (without sorting) I get 50 x the number of segments, and the new script takes about 9 seconds to execute. The original script doesn't work properly in that case anyway, as you would then have repeated cycle count numbers in different areas of the channel...

    I hope that the new versions work for you.

    Andreas

  • How to find the max and min value for each column in a 2D array?

    How do I find the max and min value for each column in a 2D array?

    For example, in the array the max/min for the first three columns would be 45/23, 14/10, 80/67.

    Thank you

    Chuck,

    With color on your bars, you should have enough experience to understand this.

    You're already looping through the array.  Now you just need the Array Max & Min function inside the loop.  And you may need to transpose the 2D array first.

  • How to get the index of the second highest value in an array (max and min)

    Hello

    I need help finding the index of the second highest value in a data array; how can I do that?

    Attached is a VI snippet.

    Thank you and best regards,

    Simon

    You forgot to make the comparison to see whether the 2nd highest value's index falls before or after the highest.

    You should only add 1 if the 2nd highest comes after the highest.  You added 1 regardless of its position.

  • ORA-31693: Table data object "AWSTEMPUSER"."TEMPMANUALMAPRPT_273" failed to load/unload and is being skipped due to error:

    Dear all,

    OS - Windows server 2012 R2

    version - 11.2.0.1.0

    Server: production server

    ORA-31693: Table data object "AWSTEMPUSER"."TEMPMANUALMAPRPT_273" failed to load/unload and is being skipped due to error:

    ORA-02354: Error exporting/importing data

    ORA-00942: table or view does not exist

    The above error appeared while taking an expdp, but the expdp completed successfully with a warning, as below.

    Work "AWSCOMMONMASTER". "" FULLEXPJOB26SEP15_053001 "finished with 6 errors at 09:30:54

    (1) What is the error?

    (2) Is there any problem in the dump file because of the above error? If yes, then I'll rerun expdp.

    Please advise. Thanks in advance.

    Hello

    I suspect that what has happened is that the application dropped a temporary table during the time that you were running the export - consider this series of events:

    (1) temp table created by the application

    (2) expdp job starts - including this table

    (3) the table metadata is extracted

    (4) the application drops the table

    (5) expdp tries to retrieve the table data - and gets the above error.

    Just confirm with the application team that the table is just a temporary thing - it certainly looks like it from the name.

    Cheers,

    Rich

  • How to join two tables to retrieve the data from the columns of both tables. The tables have a primary and foreign key relationship

    Hello

    I want to join the two tables to retrieve the data from the columns of both tables, passing parameters to the join query. The tables have a primary and foreign key relationship.

    Details of the tables:

    Table 1 - Alert: AlertID (PK), AlertCode (FK)

    Table 2 - AlertDefinition: AlertCode (PK)

    Help, please
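
    For reference, in plain SQL the join implied by these keys would be something like the sketch below (the bind variable name and the use of d.* are placeholders, since only the key columns were given):

    SELECT a.AlertID, a.AlertCode, d.*
      FROM Alert a
      JOIN AlertDefinition d
        ON d.AlertCode = a.AlertCode
     WHERE a.AlertID = :p_alert_id;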


    ----------

    Hi Vincent,

    I think that you have not worked on ADF 12.1.3. In ADF 12.1.3 you don't have to explicitly create the association. When you create the EO for your table, the association xxxxFkAssoc will be created for you automatically by ADF 12.1.3. Please try this and then reply... You can also follow the links below. I solved the problem by using the following link:

    Oracle ADF Guide step by step - Oracle ADF tutorial: creating a relationship of the master / detail using Oracle ADF

    ---

  • CC: How to insert a table row above and below (data)?

    In Dreamweaver CS4, there were two commands under "Insert > Table objects >":

    'Insert row above' / 'Insert row below'.

    In Dreamweaver CC the "Insert > Table objects" menu no longer exists.

    My question:

    How can I insert a table row below in a single step, please? (I don't mean workarounds)

    The same for the columns?

    Thank you.

    If there is no solution via a hidden configuration setting:

    Is there a plugin that offers the missing commands?

    Just for your information:

    I often edit data tables. I don't use tables for page layout.

    Hi mistershortcut,

    There are keyboard shortcuts to add a row above and a column before the selected cell. There is no shortcut to add a row below or a column to the right. But the same thing can be done using the "Insert row or column" option in the Modify > Table menu or the context menu. The Modify > Table menu and context menu options are now enabled in Live View mode as well. You can check it under Dreamweaver Help | New feature summaries.

    Thank you!

    Kusha

  • Read data from an E$ table and insert it into a staging table

    Hi all

    I'm new to ODI. I need your help to understand how to read data from an "E$" table and insert it into a staging table.

    Scenario:

    Two columns from a flat file, the employee name and the employee id, must be loaded into a datastore EMP. A check constraint is added so that only rows with employee names in capital letters are loaded into the datastore. The constraint's control is set to Static. Right-click the datastore, select Control, then Check. The rows that violated the check constraint are kept in the E$_EMP table.

    Problem:

    The problem is that I want to read the data in the E$_EMP table, transform the employee names to capital letters, and move the corrected data from E$_EMP to EMP. Please advise me on how to automatically manage these 'soft' exceptions in ODI.

    Thank you

    If I understand correctly, you want to change the columns in the E$ tables and then load them into the target.

    Now, if you look at how ODI recycles errors, there is an incremental update to the target using the E$ table after it has populated the I$ table.

    I think you can do the same thing by creating an interface using the E$ table as the source and implementing the business logic in this interface to populate the target.
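
    In plain SQL terms, the correction such an interface needs to implement boils down to something like this sketch (table and column names are assumptions from the scenario above; a real E$ table also carries ODI error-metadata columns):

    INSERT INTO EMP (EMP_ID, EMP_NAME)
    SELECT EMP_ID,
           UPPER(EMP_NAME)  -- fix the 'soft' error: force the name to capital letters
      FROM E$_EMP;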

  • Grouping data with max and min dates problem

    Ladies and gentlemen,

    I have a problem that I have tried to attack from different angles. It's probably very easy for some of you, or some of you may have met it before, so any help on this is greatly appreciated. I will describe it below.

    I have the following data:

    Site     User     Station  Code         Dstamp                                         Ref     Qty
    -------- -------- -------- ------------ ---------------------------------------------- ------- -------
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.43.06.566193000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.49.31.364224000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.49.47.413252000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.51.48.906793000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.51.56.947312000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.54.29.396052000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.54.37.444307000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.57.00.237546000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.57.04.285148000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.59.24.745162000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.59.44.774318000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.01.22.434940000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.01.51.291059000 ref_1 1125
    Site_1 user_1 RPT104 Activity_2 16 May 11 14.05.23.572211000 ref_2 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.06.01.058978000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.41.341612000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.375972000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.388699000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.401287000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.413361000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.425675000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.437360000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.449079000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.460697000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.472606000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.484031000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.495551000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.513645000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.530405000 ref_1 1125


    and I'm looking to get it in this format:


    Site     User     Station  Code         Start                          End                                     Ref     Qty
    -------- -------- -------- ------------ ------------------------------ --------------------------------------- ------- -------
    Site_1   user_1   RPT104   Activity_1   16-MAY-11 13.43.06.566193000   16-MAY-11 14.05.23.572211000            ref_1   1125
    Site_1   user_1   RPT104   Activity_2   16-MAY-11 14.05.23.572211000   16-MAY-11 14.06.01.058978000            ref_2   1125
    Site_1   user_1   RPT104   Activity_1   16-MAY-11 14.06.01.058978000   16-MAY-11 14.06.41.341612000            ref_1   1125
    Site_1   user_1   RPT104   Activity_3   16-MAY-11 14.06.41.341612000   16-MAY-11 14.06.49.530405000 + 4 secs   ref_1   1125
                                                                           (i.e. 16-MAY-11 14.06.53.530405000)


    I can get the start and end times without problem by using the initial data twice and offsetting it by one row, but using the MAX and MIN functions on that data I get:

    Site     User     Station  Code         Start                            End                                     Ref     Qty
    -------- -------- -------- ------------ -------------------------------- --------------------------------------- ------- -------
    Site_1   user_1   RPT104   Activity_1   16-MAY-11 13.43.06.566193000     *16-MAY-11 14.06.41.341612000*          ref_1   1125
    Site_1   user_1   RPT104   Activity_2   16-MAY-11 14.05.23.572211000     16-MAY-11 14.06.01.058978000            ref_2   1125
    Site_1   user_1   RPT104   Activity_3   *16-MAY-11 14.06.41.341612000*   16-MAY-11 14.06.49.530405000 + 4 secs   ref_1   1125
                                                                             (i.e. 16-MAY-11 14.06.53.530405000)

    which is missing the 3rd row of the previous dataset (marked between asterisks; hopefully in bold after posting) and assigns the wrong end times.

    I think the solution may have something to do with using the DENSE_RANK() function (with some ORDER BY on code, start) but I'm not too familiar with it, and I think the fact that the data in the Start column is unique affects how it works.

    If anyone can offer any help or point me in the right direction I'll offer eternal gratitude and buy you a drink should we ever meet!

    Cheers

    Published by: MickyMick on June 7, 2011 03:21

    BobLilly wrote:
    Aketi's Tabibitosan method can be applied here (see {thread:id=1005478})

    Site_1 user_1 RPT104 Activity_1 2011-05-16 13.43.06.566193000 2011-05-16 14.05.23.572211000 ref_1 1125
    Site_1 user_1 RPT104 Activity_2 2011-05-16 14.05.23.572211000 2011-05-16 14.06.01.058978000 ref_2 1125
    Site_1 user_1 RPT104 Activity_1 2011-05-16 14.06.01.058978000 2011-05-16 14.06.41.341612000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 2011-05-16 14.06.41.341612000 2011-05-16 14.06.45.341612000 ref_1 1125

    According to the OP we need 16-MAY-11 14.06.49.530405000 + 4 secs. In any case, using the start_of_group method:

    With t as (
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.43.06.566193000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.49.31.364224000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.49.47.413252000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.51.48.906793000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.51.56.947312000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.54.29.396052000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.54.37.444307000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.57.00.237546000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.57.04.285148000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.59.24.745162000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.59.44.774318000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.01.22.434940000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.01.51.291059000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_2' as Code
    , to_timestamp('16-MAY-11 14.05.23.572211000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_2' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.06.01.058978000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.41.341612000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.375972000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.388699000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.401287000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.413361000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.425675000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.437360000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.449079000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.460697000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.472606000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.484031000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.495551000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.513645000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.530405000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual
    ),
    t1 as (
           select  t.*,
                   lead(DTStamp,1,DTStamp + interval '4' second) over(order by DTStamp) ENDTS,
                   case
                     when     lag(Site) over(order by DTStamp)  = Site
                          and
                              lag(Usr) over(order by DTStamp)  = Usr
                          and
                              lag(Station) over(order by DTStamp)  = Station
                          and
                              lag(Code) over(order by DTStamp)  = Code
                          and
                              lag(Ref) over(order by DTStamp)  = Ref
                          and
                              lag(Qty) over(order by DTStamp)  = Qty
                       then 0
                     else 1
                   end start_of_group
             from  t
          ),
    t2 as (
           select  t1.*,
                   sum(start_of_group) over(order by DTStamp) grp
             from  t1
          )
    select  Site,
            Usr,
            Station,
            Code,
            min(DTStamp) STARTTS,
            max(ENDTS) ENDTS,
            Ref,
            Qty
      from  t2
      group by grp,
               Site,
               Usr,
               Station,
               Code,
               Ref,
               Qty
      order by STARTTS
    /
    
    SITE   USR    STATIO CODE       STARTTS                             ENDTS                               REF          QTY
    ------ ------ ------ ---------- ----------------------------------- ----------------------------------- ----- ----------
    Site_1 user_1 RPT104 Activity_1 16-MAY-11 01.43.06.566193000 PM     16-MAY-11 02.05.23.572211000 PM     ref_1       1125
    Site_1 user_1 RPT104 Activity_2 16-MAY-11 02.05.23.572211000 PM     16-MAY-11 02.06.01.058978000 PM     ref_2       1125
    Site_1 user_1 RPT104 Activity_1 16-MAY-11 02.06.01.058978000 PM     16-MAY-11 02.06.41.341612000 PM     ref_1       1125
    Site_1 user_1 RPT104 Activity_3 16-MAY-11 02.06.41.341612000 PM     16-MAY-11 02.06.53.530405000 PM     ref_1       1125
    
    SQL> 
    

    SY.
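
    For comparison, the Tabibitosan method BobLilly mentioned replaces the start_of_group flag with a difference of two ROW_NUMBERs; a minimal sketch reusing the same t CTE (not part of the original thread):

    with t1 as (
           select  t.*,
                   lead(DTStamp,1,DTStamp + interval '4' second) over(order by DTStamp) ENDTS,
                   row_number() over(order by DTStamp)
                 - row_number() over(partition by Site, Usr, Station, Code, Ref, Qty
                                     order by DTStamp) grp
             from  t
          )
    select  Site, Usr, Station, Code,
            min(DTStamp) STARTTS,
            max(ENDTS) ENDTS,
            Ref, Qty
      from  t1
      group by grp, Site, Usr, Station, Code, Ref, Qty
      order by STARTTS
    /

    Rows in one consecutive run of identical (Site, Usr, Station, Code, Ref, Qty) keep a constant difference between the two ROW_NUMBERs, so grp identifies each run without the SUM-over-flag step.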

  • Which data tables store the information about groups and users?

    Hi all

    I want to export all the information about users and groups in the BI Administration tool; currently I can only copy them one by one. Are there other methods?
    Who knows which data tables store the information about groups and users?

    Thank you
    Dan.

    Hi Dan,

    Since you may not be able to access the link, which is very informative, here is its content. I've never implemented it myself, but it is John's suggestion and it should work.

    Courtesy of John:

    OBIEE get all RPD users
    I had to get all the users of a very large repository because they were about to implement a new security model. I wrote a small script to make life easier:

    ' Read_Users.vbs
    ' John Minkjan
    ' http://www.obiee101.blogspot.com/
    ' Get all the users from a repository
    ' 1: do an export of the RPD to UDML using nqgenudml.exe
    ' 2: change the location/name of the file in this script
    ' 3: run the script from the command line: cscript Read_Users.vbs > users.txt

    Const ForReading = 1
    Set objFSO = CreateObject("Scripting.FileSystemObject")

    ' this should point to your UDML export
    Set objFile = objFSO.OpenTextFile("E:\names.txt", ForReading)

    Dim arrFileLines()
    Dim strRLine
    Dim strTemp1
    Dim strTemp2

    i = 0

    Do Until objFile.AtEndOfStream
        strRline = objFile.ReadLine
        If Left(strRline, 12) = "DECLARE USER" Then
            ReDim Preserve arrFileLines(i)
            arrFileLines(i) = strRline
            i = i + 1
        End If
    Loop

    objFile.Close

    ' Then you can iterate over it like this
    For Each strLine In arrFileLines
        strTemp1 = Mid(strLine, 15, 50)
        If InStr(strLine, "{") > 0 Then
            strTemp2 = Mid(strLine, InStr(strLine, "{") + 1, (InStr(strLine, "}") - (InStr(strLine, "{") + 1)))
        Else
            strTemp2 = ""
        End If
        WScript.Echo Mid(strTemp1, 1, InStr(strTemp1, """") - 1) & ";" & strTemp2
    Next

    OBIEE get all users and roles from the RPD
    In this post http://obiee101.blogspot.com/2009/06/obiee-get-all-users-from-rpd.html I showed how to get the users from the RPD. Taking that as a starting point, it is a small step to also get the roles the users have and put the export into an XLS:

    ' Read_Usergroups.vbs
    ' John Minkjan
    ' http://www.obiee101.blogspot.com/
    ' Get all the users and roles from a repository
    ' 1: do an export of the RPD to UDML using nqgenudml.exe
    ' 2: change the location/name of the file in this script
    ' 3: run the script from the command line: cscript Read_Usergroups.vbs > users.txt
    ' 4: put the export in an XLS pivot table

    Const ForReading = 1
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    ' this should point to your UDML export
    Set objFile = objFSO.OpenTextFile("E:\usergroup.txt", ForReading)
    Dim arrFileLines()
    Dim strRLine
    Dim strTemp1
    Dim strTemp2
    Dim strTemp3
    Dim intRoles
    intRoles = 0
    i = 0
    WScript.Echo "USERNAME;FULL_NAME;ROLE;COUNT"
    Do Until objFile.AtEndOfStream
        strRline = objFile.ReadLine
        If Left(strRline, 12) = "DECLARE USER" Then
            ReDim Preserve arrFileLines(i)
            strTemp1 = Mid(strRline, 15, 50)
            strTemp1 = Mid(strTemp1, 1, InStr(strTemp1, """") - 1)
            If InStr(strRline, "{") > 0 Then
                strTemp2 = Mid(strRline, InStr(strRline, "{") + 1, (InStr(strRline, "}") - (InStr(strRline, "{") + 1)))
            Else
                strTemp2 = ""
            End If
            arrFileLines(i) = strTemp1 & ";" & strTemp2
            intRoles = 1
            i = i + 1
        End If
        If intRoles >= 1 Then
            If InStr(strRline, "HAS ROLES (") > 0 Then
                intRoles = 2
            End If
            If intRoles = 2 And InStr(strRline, "HAS ROLES (") = 0 Then
                strTemp3 = Mid(strRline, InStr(strRline, """") + 1, 50)
                strTemp3 = Mid(strTemp3, 1, InStr(strTemp3, """") - 1)
                WScript.Echo arrFileLines(i - 1) & ";" & strTemp3 & ";1"
            End If
            If intRoles = 2 And InStr(strRline, ")") > 0 Then intRoles = 0
        End If
    Loop
    objFile.Close

    UPDATE POST
    You're on the right track; work through these steps and you will find glory... I tried it myself when I needed it.

    Hope this helped you.

    Kind regards
    Murielle.

    Published by: Kranthi.K on June 1st, 2011 02:28
