Hopefully a simple analytic functions question

Hi, we are on 10.2.0.4 under Red Hat Linux.

I have a situation that I suspect can be answered better with analytics, but I'm struggling to find the best solution.

First of all, the sample data:

CREATE TABLE POSITION_ASGN
( EMPLID    VARCHAR2(5) NOT NULL
, ASOFDATE  DATE
, ACT_POSN  VARCHAR2(5) NOT NULL
, SUB_POSN  VARCHAR2(5) NOT NULL
, RPT_POSN  VARCHAR2(5) NOT NULL)
;
INSERT INTO POSITION_ASGN
VALUES ('EMP01', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00065', '00065', '00033')
;
INSERT INTO POSITION_ASGN
VALUES ('EMP01', TO_DATE('01-FEB-2013','dd-mon-yyyy'), '00096', '00065', '00054')
;
INSERT INTO POSITION_ASGN
VALUES ('EMP02', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00096', '00096', '00054')
;
INSERT INTO POSITION_ASGN
VALUES ('EMP03', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00103', '00096', '00054')
;
INSERT INTO POSITION_ASGN
VALUES ('EMP04', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00117', '00096', '00054')
;
INSERT INTO POSITION_ASGN
VALUES ('MGR01', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00054', '00054', '00017')
;
INSERT INTO POSITION_ASGN
VALUES ('MGR02', TO_DATE('01-JAN-2013','dd-mon-yyyy'), '00054', '00054', '00017')
;



The table tracks where a person is in the organization.
ASOFDATE - tracks history over time
ACT_POSN - acting position, where the person physically is
SUB_POSN - substantive position, where the person should be; usually the same as ACT_POSN, but if you're covering for someone else it's your original position
RPT_POSN - the position you report to in your acting position

What I want to do is, for a given date and a given position number, return a data set that shows all acting holders in one column, all substantive holders in a second column, and all holders of the acting position's report-to position in a third column.

Ignoring the notion of date for the moment, I can create a simple union query:

SELECT 'ACTING' AS "MODE", ACT_POSN, EMPLID
FROM POSITION_ASGN
WHERE ACT_POSN = '00096'
UNION
SELECT 'SUBST' AS "MODE", SUB_POSN, EMPLID
FROM POSITION_ASGN
WHERE SUB_POSN = '00096'
UNION
SELECT 'MGR' AS "MODE", A.ACT_POSN, A.EMPLID
FROM POSITION_ASGN A, POSITION_ASGN B
WHERE A.ACT_POSN = B.RPT_POSN
AND B.ACT_POSN = '00096'


This produces a single output:
ACTING  00096  EMP01
ACTING  00096  EMP02
MGR     00054  MGR01
MGR     00054  MGR02
SUBST   00096  EMP02
SUBST   00096  EMP03
SUBST   00096  EMP04

But I want to pivot it into a table of 3 rows according to the value of 'MODE', so that I end up with:
ACTING - SUBST - MGR
EMP01 - EMP02 - MGR01
EMP02 - EMP03 - MGR02
(blank) - EMP04 - (blank)
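Outside SQL, the desired result is just the three holder lists zipped together row by row, with the shorter lists padded out. A quick Python illustration of that alignment (names taken from the sample data above; this is only to show the row pairing, not Oracle code):

```python
from itertools import zip_longest

# Holders for target position '00096', from the sample data above.
acting = ["EMP01", "EMP02"]            # ACT_POSN = '00096'
subst  = ["EMP02", "EMP03", "EMP04"]   # SUB_POSN = '00096'
mgr    = ["MGR01", "MGR02"]            # acting in the RPT_POSN of '00096'

# Align the nth element of each list into one row, padding short lists.
rows = list(zip_longest(acting, subst, mgr, fillvalue=None))
for a, s, m in rows:
    print(a, s, m)
```

The None padding plays the role of the blank cells in the 3-row result.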

I can see how I could generate a RANK() partitioned by POSN, MODE to work out that I need 3 rows (because there are three substantive holders in 00096), maybe use that as a row-number value and then select/join based on MODE and that row number, but it all feels so round the houses that I wonder if analytics can do it better.

Does anyone have advice / sample code I could learn from?

Published by: Paula Scorchio on 17 March 2013 22:49

I forgot to say: this is my rough code that kind of does what I need.
SELECT A.RN
,      MAX (DECODE (B.RN, A.RN, DECODE (PMODE, 'ACTING', EMPLID))) AS ACTING
,      MAX (DECODE (B.RN, A.RN, DECODE (PMODE, 'SUBST',  EMPLID))) AS SUBST
,      MAX (DECODE (B.RN, A.RN, DECODE (PMODE, 'MGR',    EMPLID))) AS MGR
FROM
(SELECT ROWNUM AS RN FROM DUAL CONNECT BY LEVEL <=
  (SELECT MAX (RN) FROM
    (SELECT 'ACTING' AS PMODE, ACT_POSN, EMPLID, RANK () OVER (PARTITION BY ACT_POSN ORDER BY EMPLID) AS RN
     FROM POSITION_ASGN
     WHERE ACT_POSN = '00096'
     UNION
     SELECT 'SUBST', SUB_POSN, EMPLID, RANK () OVER (PARTITION BY SUB_POSN ORDER BY EMPLID)
     FROM POSITION_ASGN
     WHERE SUB_POSN = '00096'
     UNION
     SELECT 'MGR', A.ACT_POSN, A.EMPLID, RANK () OVER (PARTITION BY A.ACT_POSN ORDER BY A.EMPLID)
     FROM POSITION_ASGN A, POSITION_ASGN B
     WHERE A.ACT_POSN = B.RPT_POSN
     AND B.ACT_POSN = '00096'))) A,
(SELECT 'ACTING' AS PMODE, ACT_POSN, EMPLID, RANK () OVER (PARTITION BY ACT_POSN ORDER BY EMPLID) AS RN
 FROM POSITION_ASGN
 WHERE ACT_POSN = '00096'
 UNION
 SELECT 'SUBST', SUB_POSN, EMPLID, RANK () OVER (PARTITION BY SUB_POSN ORDER BY EMPLID)
 FROM POSITION_ASGN
 WHERE SUB_POSN = '00096'
 UNION
 SELECT 'MGR', A.ACT_POSN, A.EMPLID, RANK () OVER (PARTITION BY A.ACT_POSN ORDER BY A.EMPLID)
 FROM POSITION_ASGN A, POSITION_ASGN B
 WHERE A.ACT_POSN = B.RPT_POSN
 AND B.ACT_POSN = '00096') B
GROUP BY A.RN

Hello

Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful!

What does each row of output represent? It seems that the nth output row has the nth distinct acting emplid, the nth substantive emplid and the nth manager, whichever of the many that happens to be. The columns of each row all relate to the same target id ('00096' in this case), but other than that, they seem to have no real connection to each other.
It looks like a Prix Fixe query, like this:

WITH     targets          AS
(
     SELECT     '00096'     AS target_id     FROM dual
)
,     all_modes     AS
(
     SELECT     CASE LEVEL
              WHEN  1  THEN  'ACT'
              WHEN  2  THEN  'SUB'
              WHEN  3  THEN  'RPT'
          END     AS mode_abbr
     FROM     dual
     CONNECT BY     LEVEL     <= 3
)
,     unpivoted_data     AS
(
     SELECT  am.mode_abbr
     ,     t.target_id
     ,     CASE
              WHEN  am.mode_abbr = 'ACT'
               AND  e.act_posn   = t.target_id  THEN  e.emplid
              WHEN  am.mode_abbr = 'SUB'
               AND  e.sub_posn   = t.target_id  THEN  e.emplid
              WHEN  am.mode_abbr = 'RPT'
               AND  e.act_posn   = t.target_id  THEN  m.emplid
          END     AS emplid
     FROM               targets     t
     CROSS JOIN        all_modes     am
     JOIN                  position_asgn  e  ON   t.target_id IN ( e.act_posn
                                              , e.sub_posn
                                          )
     LEFT OUTER JOIN  position_asgn     m  ON   m.act_posn  = e.rpt_posn
)
,     got_r_num     AS
(
     SELECT     u.*
     ,     DENSE_RANK () OVER ( PARTITION BY  target_id
                               ,                    mode_abbr
                         ORDER BY        emplid
                       )         AS r_num
     FROM     unpivoted_data  u
     WHERE     emplid  IS NOT NULL
)
SELECT       MIN (CASE WHEN mode_abbr = 'ACT' THEN emplid END)     AS acting
,       MIN (CASE WHEN mode_abbr = 'SUB' THEN emplid END)     AS substantive
,       MIN (CASE WHEN mode_abbr = 'RPT' THEN emplid END)     AS manager
,       target_id
FROM       got_r_num
GROUP BY  target_id, r_num
ORDER BY  target_id, r_num
;

There may be times when you'll want to do this for multiple targets at once, not just one target (e.g. '00096'). In the first sub-query, targets, specify as many as you want, including just 1.
Instead of doing a 3-way UNION to unpivot the data, it will probably be more efficient to cross-join to a "table" with 3 rows. That's what the next sub-query, all_modes, is for.
The 3rd sub-query, unpivoted_data, finds the relevant rows in position_asgn and unpivots (or splits) them into 3 rows, one per mode.
The next sub-query, got_r_num, assigns a row number to each distinct value to be displayed. This could be done in the previous sub-query, but to avoid repeating the big CASE expression I used a separate sub-query. If you don't care about the order of the items, you could just as easily use DENSE_RANK in unpivoted_data.
The main query then re-merges the nth rows for each target into 1 row. If you have only one target, then of course you don't need to display target_id.
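The unpivot / number / regroup steps can also be sketched in a few lines of Python (purely illustrative; the names here are made up, and the sort stands in for ORDER BY emplid):

```python
# Sketch of the Prix-Fixe steps: unpivot rows into (mode, emplid) pairs,
# number them within each mode (like DENSE_RANK), then group by that
# number and spread each group back across one output row per rank.
from collections import defaultdict

unpivoted = [
    ("ACT", "EMP01"), ("ACT", "EMP02"),
    ("SUB", "EMP02"), ("SUB", "EMP03"), ("SUB", "EMP04"),
    ("RPT", "MGR01"), ("RPT", "MGR02"),
]

per_mode = defaultdict(list)
for mode, emp in unpivoted:
    per_mode[mode].append(emp)

# r_num: position of each emplid within its mode, ordered alphabetically
by_rank = defaultdict(dict)
for mode, emps in per_mode.items():
    for r, emp in enumerate(sorted(emps), start=1):
        by_rank[r][mode] = emp

# one output row per r_num; MIN(CASE ...) becomes a simple dict lookup
result = [(by_rank[r].get("ACT"), by_rank[r].get("SUB"), by_rank[r].get("RPT"))
          for r in sorted(by_rank)]
```

On the sample data this yields the 3-row pivot the question asks for, with None for the blank cells.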

For more information on Prix Fixe queries, see {message:id=4459268}

Tags: Database

Similar Questions

  • Auto rigging is really slow on the Mixamo website

    Auto rigging is really slow on the Mixamo website. I rarely get a rig to generate smoothly since Mixamo was acquired by Adobe. I still use Fuse 1.3. I would like to download Fuse CC, but the download seems stuck at 0%. I really hope the Auto-Rigger isn't going away; it has helped me a lot in my animation career.

    Several questions to address here:

    • Fuse CC has a bug where the download bar is visually stuck at 0%, but it is downloading in the background.  You can just let it run a while and it will eventually finish on its own.  The bug has a fix in place that should come out with the next update to the CC Download Manager.
    • Fuse 1.3 rigging should continue to work, but there is a bug we are investigating that blocks some rigs.  There's nothing you can really do for the moment; we just need time to focus on the issue.
    • The Auto-Rigger is not going away.
  • Help with analytic functions

    Hi all

    I'm using ODI 11g (11.1.1.3.0) and I'm building an interface using an analytic function in the column mapping, something like below:
    SUM(salary) OVER (PARTITION BY ...)

    The problem is that when ODI sees the SUM, it considers it an aggregate function and adds a GROUP BY. Is it possible to make ODI understand that it is not an aggregate function?


    I tried to create an option to specify whether it is analytic, then updated the IKM, with no luck.
    <%if (odiRef.getUserExit("ANALYTIC").equals("1")) {%>
    <%} else {%>
    <%=odiRef.getGrpBy(i)%>
    <%=odiRef.getHaving(i)%>
    <%}%>

    Thanks in advance

    Seth,

    Try this thing posted by Uli:
    http://www.business-intelligence-quotient.com/?p=905

  • Analytic function question

    Hello world

    I used the HR schema to test 2 queries.

    1 - SELECT deptno, empno, sal,
    SUM (sal) OVER (PARTITION BY deptno ORDER BY sal RANGE
    BETWEEN UNBOUNDED PRECEDING AND (sal/2) PRECEDING) CNT_LT_HALF
    FROM emp
    WHERE deptno IN (20, 30)
    ORDER BY deptno, sal

    The first query worked fine,

    but when I ran the second, the results were not clear. Can someone explain to me what the result of the query should be (because I got no data):

    SELECT deptno, empno, sal,
    SUM (sal) OVER (PARTITION BY deptno ORDER BY sal RANGE
    BETWEEN UNBOUNDED PRECEDING AND (sal) PRECEDING) CNT_LT_HALF
    FROM emp
    WHERE deptno IN (20, 30)
    ORDER BY deptno, sal

    MDK.

    Hello

    MDK wrote:
    Hello world

    I used the HR schema to test 2 queries.

    Your hr schema is not the same as mine. I have a similar table in my scott schema.
    >

    1 - SELECT deptno, empno, sal,
    SUM (sal) OVER (PARTITION BY deptno ORDER BY sal RANGE
    BETWEEN UNBOUNDED PRECEDING AND (sal/2) PRECEDING) CNT_LT_HALF
    FROM emp
    WHERE deptno IN (20, 30)
    ORDER BY deptno, sal

    The first query worked fine,

    but when I ran the second, the results were not clear. Can someone explain to me what the result of the query should be (because I got no data):

    Do you mean that the cnt_lt_half column is always NULL?

    SELECT deptno, empno, sal,
    SUM (sal) OVER (PARTITION BY deptno ORDER BY sal RANGE
    BETWEEN UNBOUNDED PRECEDING AND (sal) PRECEDING) CNT_LT_HALF
    FROM emp
    WHERE deptno IN (20, 30)
    ORDER BY deptno, sal

    Let's do both calculations in the same query:

    SELECT  deptno
    ,     empno
    ,     sal
    ,     SUM (sal) OVER ( PARTITION BY  deptno
                               ORDER BY      sal
                    RANGE BETWEEN UNBOUNDED PRECEDING
                          AND     (sal/2)       PRECEDING
                   ) AS cnt_lt_half
    ,     SUM (sal) OVER ( PARTITION BY  deptno
                               ORDER BY      sal
                    RANGE BETWEEN UNBOUNDED PRECEDING
                          AND     (sal)       PRECEDING
                   ) AS cnt_lt_full
    FROM      scott.emp     -- or whatever
    WHERE      deptno IN (20, 30)
    ORDER BY  deptno
    ,            sal
    ;
    

    Output:

    .   DEPTNO      EMPNO        SAL CNT_LT_HALF CNT_LT_FULL
    ---------- ---------- ---------- ----------- -----------
            20       7369        800
            20       7876       1100
            20       7566       2975        1900
            20       7788       3000        1900
            20       7902       3000        1900
            30       7900        950
            30       7521       1250
            30       7654       1250
            30       7844       1500
            30       7499       1600
            30       7698       2850        3450
    

    So you understand cnt_lt_half, but you do not understand cnt_lt_full.
    Let's look at cnt_lt_half first.
    On the first row, where empno = 7369, sal is 800. The RANGE for cnt_lt_half is all rows where sal is at least (800/2) = 400 less than the current sal. The current sal is 800, which means the range goes up to (and including) 800-400 = 400. There is nobody in this department with such a low sal, so cnt_lt_half is NULL.
    On the last row, where empno = 7698, sal is 2850. The RANGE for cnt_lt_half is all rows where sal is at least (2850/2) = 1425 less than the current sal. The current sal is 2850, which means the range goes up to (and including) 2850-1425 = 1425. There are 3 rows in this department with sals in this range: 950 + 1250 + 1250 = 3450, so cnt_lt_half is 3450.

    Now let's look at cnt_lt_full.
    On the first row, where empno = 7369, sal is 800. The RANGE for cnt_lt_full is all rows where sal is at least 800 less than the current sal. The current sal is 800, which means the range goes up to (and including) 800-800 = 0. There is nobody in this department with such a low sal, so cnt_lt_full is NULL.
    On the last row, where empno = 7698, sal is 2850. The RANGE for cnt_lt_full is all rows where sal is at least 2850 less than the current sal. The current sal is 2850, which means the range goes up to (and including) 2850-2850 = 0. There are no rows in this department with sals in that range, so cnt_lt_full is NULL.

    The high end of RANGE BETWEEN ... AND sal PRECEDING will always be 0. If sal is always greater than 0, then there will never be any rows in that range.
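To double-check the window arithmetic, it can be replayed in a few lines of Python (a sketch of the RANGE logic only, not Oracle code; the sal figures are the scott.emp rows shown in the output above):

```python
# Mimic, in plain Python, the window
#   SUM(sal) OVER (PARTITION BY deptno ORDER BY sal
#                  RANGE BETWEEN UNBOUNDED PRECEDING AND (bound) PRECEDING)
# For each row, sum the sals in the same deptno that are <= sal - bound;
# an empty window gives NULL (None here), just as in the output above.
emp = [  # (deptno, empno, sal) from scott.emp, deptno 20 and 30
    (20, 7369,  800), (20, 7876, 1100), (20, 7566, 2975),
    (20, 7788, 3000), (20, 7902, 3000),
    (30, 7900,  950), (30, 7521, 1250), (30, 7654, 1250),
    (30, 7844, 1500), (30, 7499, 1600), (30, 7698, 2850),
]

def range_sum(deptno, sal, bound):
    vals = [s for d, _, s in emp if d == deptno and s <= sal - bound]
    return sum(vals) if vals else None   # empty window -> NULL

cnt_lt_half = {e: range_sum(d, s, s / 2) for d, e, s in emp}
cnt_lt_full = {e: range_sum(d, s, s) for d, e, s in emp}
```

With bound = sal, the cut-off sal - bound is always 0, so cnt_lt_full comes out None for every row, matching the output above.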

    Did you expect something else? If so, what?
    Do you want to get specific results? If so, what?

  • Wait for tools question

    Hello all, I use the vim3WaitToolsStarted action in a workflow.  My problem is that if the timeout is reached, it fails and the workflow ends.  The workflow can end on success, but on failure I would like to continue with the workflow.

    I tried creating an exception in vim3WaitToolsStarted, then making the next step a decision, but I do not think the exception happens, as the tools step just fails.

    Thank you, Zach.

    Hello Zach,

    When the timeout is reached, vim3WaitToolsStarted just throws an exception. You can try the "Handle error" component (near the bottom of the elements in the "Generic" palette), or write JavaScript code to wrap the invocation of the vim3WaitToolsStarted action in a try...catch/finally block.

  • Hierarchy query and analytic functions

    Hi, I have 2 tables:

    ACCOUNT table
    ACCOUNT_ID        PAYING_ACCOUNT_ID           PARENT_ACCOUNT_ID
    4571111               4571111                    4571111   
    4571112               4571112                    4571111
    4571113               4571113                    4571111
    3995313               3995313                    3995313
    3996786               3995313                    3995313
    4008375               3995313                    3995313
    CUSTOMER_STATUS
    CUSTOMER_ID        CUSTOMER_STATUS
    4571111                Active   
    4571112                Active       
    4571113                Active       
    3995313                Active     
    3996786                Deactive       
    4008375                Active       
    I need to produce output like below:
    Level_Label            ACCOUNT_ID       PAYING_ACCOUNT_ID    PARENT_ACCOUNT_ID    MY_TOTAL_PR_A   MY_TOTAL_NPR_A   TOTAL_PR_A   TOTAL_NPR_A     MY_TOTAL_PR_D       MY_TOTAL_NPR_D      TOTAL_PR_D      TOTAL_NPR_D
    3995313                  3995313             3995313              3995313               0            1               0            1                  0                 1                   0               1
          4008375            4008375             3995313              3995313               0            0               0            1                  0                 0                   0               1
          3996786            3996786             3995313              3995313               0            0               0            1                  0                 0                   0               1
    4571111                  4571111             4571111              4571111               2            0               2            0                  0                 0                   0               0
          4571112            4571112             4571112              4571111               0            0               2            0                  0                 0                   0               0
          4571113            4571113             4571113              4571111               0            0               2            0                  0                 0                   0               0
    Here is the logic and rationale used to fill the fields above.
    MY_TOTAL_PR_A            Sum of all child accounts of the current account that are PR (PAYING_ACCOUNT_ID = ACCOUNT_ID) and in states considered Active. 
                                         The current account is not included in the sum, only child accounts
    
    MY_TOTAL_NPR_A          Sum of all child accounts of the current account that are NPR (PAYING_ACCOUNT_ID != ACCOUNT_ID) and in states considered Active. 
                                         The current account is not included in the sum, only child accounts
    
    TOTAL_PR_A                  Sum of all accounts of the structure that are PR and in states considered Active. 
                                         The TOP account is not included in the sum, only the TOP account's children
    
    TOTAL_NPR_A                Sum of all accounts of the structure that are NPR and in states considered Active. 
                                         The TOP account is not included in the sum, only the TOP account's children
    
    
    MY_TOTAL_PR_D            Sum of all child accounts of the current account that are PR and in states considered Deactive. 
                                         The current account is not included in the sum, only child accounts
    
    
    MY_TOTAL_NPR_D          Sum of all child accounts of the current account that are NPR and in states considered Deactive. 
                                          The current account is not included in the sum, only child accounts
    
    
    TOTAL_PR_D                Sum of all accounts of the structure that are PR and in states considered Deactive. 
                                          The TOP account is not included in the sum, only the TOP account's children
    
    
    TOTAL_NPR_D              Sum of all accounts of the structure that are NPR and in states considered Deactive. 
                                           The TOP account is not included in the sum, only the TOP account's children
    This is my code. I managed to calculate the MY_TOTAL_XXX fields but was unable to calculate TOTAL_XXX. I'd appreciate any information / comments. Thank you.
    WITH     got_descendants          AS
    (
         SELECT     CONNECT_BY_ROOT a.account_id     AS ancestor_id
         ,     a.paying_account_id
      ,  a.account_id
      , a.parent_account_id
         ,     LEVEL                    AS lvl
      , c.customer_status
         FROM     account a inner join customer_status c
      on a.account_id = c.customer_id
         CONNECT BY NOCYCLE     PRIOR a.account_id     = a.parent_account_id
              --AND          account_id          != parent_account_id
    ), DUMMY2 AS
    (
    select g.*  from got_descendants g
    ), DUMMY AS
    (
    SELECT       ancestor_id
    ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id  = paying_account_id
                And  ancestor_id != account_id
                AND  customer_status = 'A' THEN 1 END)     AS my_total_pr_a
    ,       COUNT (CASE WHEN ancestor_id  = paying_account_id
               AND  customer_status = 'A' THEN 1 END)     AS total_pr_a
    ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id != paying_account_id
                And  ancestor_id != account_id
                AND  customer_status = 'A' THEN 1 END)     AS my_total_npr_a
    ,       COUNT (CASE WHEN ancestor_id != paying_account_id 
               AND  customer_status = 'A'
               And  ancestor_id != parent_account_id THEN 1 END)     AS total_npr_a
    ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id  = paying_account_id
                And  ancestor_id != account_id
                AND  customer_status = 'D' THEN 1 END)     AS my_total_pr_d
    ,       COUNT (CASE WHEN ancestor_id  = paying_account_id
               AND  customer_status = 'D' THEN 1 END)     AS total_pr_d
    ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id != paying_account_id
                And  ancestor_id != account_id
                AND  customer_status = 'D' THEN 1 END)     AS my_total_npr_d
    ,       COUNT (CASE WHEN ancestor_id != paying_account_id 
               AND  customer_status = 'D' THEN 1 END)     AS total_npr_d
    FROM       DUMMY2
    GROUP BY ancestor_id
    )
    SELECT  lpad(' ', 2*level) || ACCOUNT.ACCOUNT_ID AS LEVEL_LABEL, LEVEL, CONNECT_BY_ISCYCLE "Cycle",
    ACCOUNT.PAYING_ACCOUNT_ID, ACCOUNT.PARENT_ACCOUNT_ID, ACCOUNT.ACCOUNT_ID,
    DUMMY.my_total_pr_a, DUMMY.total_pr_a, DUMMY.my_total_npr_a, DUMMY.total_npr_a,
    DUMMY.my_total_pr_d, DUMMY.total_pr_d, DUMMY.my_total_npr_d, DUMMY.total_npr_d
    from ACCOUNT INNER JOIN DUMMY  ON  ACCOUNT.account_id = DUMMY.ancestor_id
    START WITH ACCOUNT.parent_account_id = ACCOUNT.account_id  
    CONNECT BY NOCYCLE PRIOR ACCOUNT.account_id = ACCOUNT.parent_account_id
    DDL
    CREATE TABLE ACCOUNT
      (
        "CUSTOMER_ID"       NUMBER(20,0) NOT NULL ENABLE,
        "PAYING_ACCOUNT_ID" NUMBER(20,0),
        "PARENT_ACCOUNT_ID" NUMBER(20,0),
        "ACCOUNT_ID"        NUMBER,
        "COMPANY_ID"        NUMBER
      )
    
    CREATE TABLE CUSTOMER_STATUS
      (
        "CUSTOMER_ID"     NUMBER(10,0),
        "CUSTOMER_STATUS" VARCHAR2(1 BYTE)
      )
    
    
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (4571111,4571111,4571111);
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (4571112,4571112,4571111);
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (4571113,4571113,4571111);
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (3996786,3995313,3995313);
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (4008375,3995313,3995313);
    Insert into ACCOUNT (ACCOUNT_ID,PAYING_ACCOUNT_ID,PARENT_ACCOUNT_ID) values (3995313,3995313,3995313);
    
    
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (3996786,'D');
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (4008375,'A');
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (3995313,'A');
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (4571111,'A');
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (4571112,'A');
    Insert into CUSTOMER_STATUS (CUSTOMER_ID,CUSTOMER_STATUS) values (4571113,'A');

    Hello

    user11432758 wrote:
    Hi, this is the error msg

    ORA-00904: "O_NUM": invalid identifier
    00904. 00000 -  "%s: invalid identifier"
    

    The reason for this error is that the only "table" in the query is

    ...
    FROM      dummy
    GROUP BY ancestor_id
    ORDER BY  MIN (o_num)
    

    but dummy has no column named o_num.

    I'm still not sure I understand what you want to do.
    Does this get the results you want from the sample data?

    WITH     got_customer_status     AS
    (
         SELECT  a.account_id
         ,     a.paying_account_id
           ,     a.parent_account_id
           ,     c.customer_status
         ,     CASE
                  WHEN  a.account_id = a.parent_account_id
                  THEN  1
              END     AS is_root
         FROM          account          a
         INNER JOIN     customer_status c  ON     a.account_id = c.customer_id
    )
    ,     got_descendants          AS
    (
         SELECT     CONNECT_BY_ROOT account_id               AS ancestor_id
         ,     paying_account_id
           ,     account_id
           ,     parent_account_id
         ,     LEVEL                              AS lvl
         ,     (CONNECT_BY_ROOT is_root) * LEVEL          AS rlvl
           ,     customer_status
         ,     is_root
         ,     (CONNECT_BY_ROOT is_root) * ROWNUM          AS rnum
         FROM     got_customer_status
         CONNECT BY      PRIOR account_id     = parent_account_id
              AND     account_id          != parent_account_id
    )
    ,     got_my_totals     AS
    (
         SELECT       ancestor_id
         ,       COUNT ( CASE WHEN lvl               > 1
                                  AND  account_id          = paying_account_id
                                 AND  ancestor_id        != account_id
                                 AND  customer_status      = 'A'
                          THEN 1
                      END
                   )     AS my_total_pr_a
         ,       COUNT ( CASE WHEN lvl               > 1
                                  AND  account_id           != paying_account_id
                                       And  ancestor_id      != account_id
                                       AND  customer_status      = 'A'
                          THEN 1
                      END
                   )     AS my_total_npr_a
         ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id  = paying_account_id
                           And  ancestor_id != account_id
                           AND  customer_status = 'D' THEN 1 END)     AS my_total_pr_d
         ,       COUNT (CASE WHEN lvl             > 1
                      AND  account_id != paying_account_id
                           And  ancestor_id != account_id
                           AND  customer_status = 'D' THEN 1 END)     AS my_total_npr_d
         FROM       got_descendants
         GROUP BY  ancestor_id
    )
    SELECT    LPAD (' ', 2 * d.rlvl) || d.account_id     AS level_label
    ,       NVL (d.is_root, 0)                    AS cycle
    ,       d.paying_account_id
    ,       d.parent_account_id
    ,       d.account_id
    ,       t.my_total_pr_a
    ,       FIRST_VALUE (my_total_pr_a)  OVER ( PARTITION BY  d.ancestor_id
                                                 ORDER BY      d.rnum
                                 )              AS total_pr_a
    ,       t.my_total_npr_a
    ,       FIRST_VALUE (my_total_npr_a) OVER ( PARTITION BY  d.ancestor_id
                                                 ORDER BY          d.rnum
                                 )              AS total_npr_a
    ,       t.my_total_pr_d
    ,       FIRST_VALUE (my_total_pr_d)  OVER ( PARTITION BY  d.ancestor_id
                                                 ORDER BY      d.rnum
                                 )              AS total_pr_d
    ,       t.my_total_npr_d
    ,       FIRST_VALUE (my_total_npr_d) OVER ( PARTITION BY  d.ancestor_id
                                                 ORDER BY          d.rnum
                                 )              AS total_npr_d
    FROM       got_descendants     d
    JOIN       got_my_totals     t  ON     d.account_id     = t.ancestor_id
    WHERE       d.rnum     IS NOT NULL
    ORDER BY  d.rnum
    ;
    

    This should be faster than what I posted before, because there is only one CONNECT BY query. It avoids joining in the same query as the CONNECT BY, and does not use NOCYCLE. Once more, see the forum FAQ {message:id=9360003}
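As a cross-check of the counting rules (not a replacement for the CONNECT BY query), here is a small Python sketch of the MY_TOTAL_*_A logic against the sample data; the function and variable names are made up for illustration, and only the Active counters are shown (the Deactive ones are analogous):

```python
# Python sketch of the MY_TOTAL_* counting rule on the sample data above.
# An account is "PR" when it pays for itself (PAYING_ACCOUNT_ID = ACCOUNT_ID);
# the account itself is excluded from its own totals.
accounts = {  # account_id: (paying_account_id, parent_account_id)
    4571111: (4571111, 4571111), 4571112: (4571112, 4571111),
    4571113: (4571113, 4571111), 3995313: (3995313, 3995313),
    3996786: (3995313, 3995313), 4008375: (3995313, 3995313),
}
status = {4571111: "A", 4571112: "A", 4571113: "A",
          3995313: "A", 3996786: "D", 4008375: "A"}

def descendants(root):
    """All accounts strictly below root in the parent hierarchy."""
    kids = [a for a, (_, parent) in accounts.items()
            if parent == root and a != root]
    out = []
    for k in kids:
        out.append(k)
        out.extend(descendants(k))
    return out

def my_totals(acct):
    """(MY_TOTAL_PR_A, MY_TOTAL_NPR_A) for one account."""
    pr_a = npr_a = 0
    for d in descendants(acct):
        pr = accounts[d][0] == d          # PR: pays for itself
        if status[d] == "A":
            pr_a += pr
            npr_a += not pr
    return pr_a, npr_a
```

On the sample data this reproduces the expected rows: 4571111 gets (2, 0) and 3995313 gets (0, 1).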

  • Need a simple answer to a simple Contribute/Firefox question

    Hello all;

    I have tried Adobe CS3 technical support, telephone support, this forum, other forums, and Mozilla, and nowhere can I get a straight answer, so I'll try one last time before giving up on Adobe products.

    I have CS3 Web Premium. The entire suite is on the iMac and MacBook Pro. I need to update a Dreamweaver site on a trip with a MacBook Air.

    Please help me understand this.

    Contribute apparently has the ability to put a plug-in in Firefox so you can update remote sites. It is not clear from all the sources above whether the Contribute module of the suite must be on my MB Air, or if I can just get Firefox on the MB Air and set up the web site for remote publishing using Contribute on the iMac.

    That would be the simplest: use Ct on the iMac to enable portions of the site for editing using Firefox from the MB Air. It seems simple enough to me, but nobody, and no info I can find, can confirm or deny that this is how it would work.

    Please, if anyone knows whether this is how the thing works, save my sanity and convince me to stay with Adobe products by letting me know if I'm right and how to get the Firefox plug-in.

    Thank you!

    Hello;

    Thanks for the reply; you're right that it's as simple as that.

    I'm a little confused about how this is a good solution: if you need to have Ct on your mobile machine, you would also more or less have to have Dw there (if you use the Web Premium package), so why would you also need a Firefox plugin?

    I have this huge, capable package on the iMac and the family laptop, but I am not allowed to use even the smallest of its capabilities, or a plug-in compatible with it, on a third machine. So away from home, I have to either spend a lot more money on another piece of software for the few times a year I travel, or go back to raw HTML in a basic text editor and an FTP program. All I want to do is change one page per day with some new features.

    Shaking my head with frustration at Adobe.

  • 32-bit or 64-bit question

    I need some urgent advice. I have Office on 2 PCs, and a Toshiba Satellite Pro A10 that I am about to replace. It has become slow, and running CS4 on it is extremely slow. I need advice on whether I should replace this aging laptop with a new Toshiba and install Windows 7 64-bit, or stay with 32-bit. I have some questions:

    1. I currently have the CS4 upgrade on my PC. Would I have to buy another version of Photoshop CS4 (either full version or upgrade)?

    2. All my computers are connected to a network, and image files are kept on up to 1 TB of external hard drives; I have a mix of 32 & 64-bit computers on the same network.

    3. Toshiba seem to be offering laptops with Vista (don't want that). I expect I will want to upgrade all computers to Windows 7; if I got the Windows 7 upgrade from Toshiba, could I use that upgrade on my PC? Wouldn't it be better to buy the Windows 7 upgrade from another source, so I can upgrade the Toshiba and a PC without problems?

    4. In general, would the 32-bit upgrade be the best way to go? I realize I may have problems with other software, but I guess Office Pro would work on both, or would it?

    Any guidance would be appreciated.

    superflash26: those are good choices. Thumbs up on all of them.

  • Receiving error messages when trying to install McAfee, and their support said the problem is related to an error in Device Manager

    How can you diagnose errors in Device Manager?

    Tried installing McAfee, but got an error message. Contacted McAfee... they say the problem is in Device Manager. How can I determine the error in Device Manager?

    Are you sure you don't have another antivirus still installed, even one with an expired subscription?

    A lot of AVs will not install if they find another one already installed (and you must have an AV installed in any case).

    Another problem may be how you are trying to install it:

    You need to download and save it to the desktop > then right-click the saved file > select Run as administrator.

    FYI, here's how to open Device Manager:

    http://Windows.Microsoft.com/en-us/Windows-Vista/open-Device-Manager

    See you soon.

  • I have Lightroom CC on my desktop but Lightroom 5 on my laptop. I uninstalled 5 on my laptop in the hope that Creative Cloud would offer CC, but I see only Lightroom 5 offered in Creative Cloud

    What should I do? Lr 5 will not open my Lr CC catalogs.

    Hi CarloCC

    Please check the system requirements for Adobe Lightroom 6/CC 2015

    Reference: system requirements for Adobe Lightroom for Mac OS and Windows

    And as mentioned in earlier posts in this thread, Lightroom 6 only works on 64-bit platforms.

    The last version supported on a 32-bit platform was Lightroom 5.

    So please check with your system vendor whether you can change the operating system to a 64-bit version.

    Hope this helps,

    ~ Mohit

  • I hope the answer is easy. How can I get a slider to only have integer values?

    I am trying a slider from the examples page to allow users to choose 5-100, but the slider also allows double values. Is it possible to restrict the slider to snap only to integer values instead of doubles?

    I'm not sure, but have you tried minorTickCount and snapToTicks?

  • Help with analytic functions: PRECEDING and FOLLOWING

    Hello

    Assume I have data as follows:
    customerid    orderid                      orderdate    
    -------------    ----------                    --------------
    xyz                       1                       01/10/2010
    xyz                       2                       02/11/2010
    xyz                       3                       03/12/2011
    xyz                       4                       03/01/2011
    xyz                       5                       03/02/2011
    xyz                       6                       03/03/2011
    abc                       7                       10/09/2010
    abc                       8                       10/10/2010
    abc                       9                       10/11/2010
    abc                       10                     10/01/2011
    abc                       11                     10/02/2011
    abc                       12                     10/03/2011
    Now I want to generate a report based on the above data as below:

    CustomerID; no. of orders placed in the last 30 days before the new year (01/01/2011); no. of orders placed in the last 60 days before the new year; no. of orders placed in the last 90 days before the new year; no. of orders placed within 30 days after the new year; no. of orders placed within 60 days after the new year; no. of orders placed within 90 days after the new year.


    I am trying to do this using the following code, but could not succeed:
        select c.customerid,
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '30' DAY PRECEDING) as "Last 1 month",
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '60' DAY PRECEDING) as "Last 2 months",
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '90' DAY PRECEDING) as "Last 3 months",
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '30' DAY FOLLOWING) as "Following 1 month",
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '60' DAY FOLLOWING) as "Following 2 months",
        count(*) over (partition by c.customerid order by c.orderdate RANGE interval '90' DAY FOLLOWING) as "Following 3 months"
        from customer_orders c where orderdate < to_date('01/01/2011','dd/mm/yyyy')
    Kindly help. Thanks in advance.

    Published by: 858747 on May 13, 2011 03:40

    Published by: BluShadow on May 13, 2011 11:57
    addition of {noformat}{noformat} tags to retain formatting.  Please read: {message:id=9360002}
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            
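    One possible approach (a sketch only, not tested here): since all six cutoffs are measured from the fixed date 01/01/2011 rather than from each row, conditional aggregation may be simpler than windowed RANGE clauses. Note also that the shorthand RANGE INTERVAL '60' DAY FOLLOWING is not valid syntax (FOLLOWING requires the full RANGE BETWEEN ... AND ... form), and the WHERE clause in the posted query filters out every order after the new year before the counts are taken. Assuming the customer_orders table and columns shown in the question (the alias names here are made up):

        select customerid
        ,      count(case when orderdate >= date '2011-01-01' - 30
                           and orderdate <  date '2011-01-01'      then 1 end) as last_30_days
        ,      count(case when orderdate >= date '2011-01-01' - 60
                           and orderdate <  date '2011-01-01'      then 1 end) as last_60_days
        ,      count(case when orderdate >= date '2011-01-01' - 90
                           and orderdate <  date '2011-01-01'      then 1 end) as last_90_days
        ,      count(case when orderdate >= date '2011-01-01'
                           and orderdate <  date '2011-01-01' + 30 then 1 end) as next_30_days
        ,      count(case when orderdate >= date '2011-01-01'
                           and orderdate <  date '2011-01-01' + 60 then 1 end) as next_60_days
        ,      count(case when orderdate >= date '2011-01-01'
                           and orderdate <  date '2011-01-01' + 90 then 1 end) as next_90_days
        from   customer_orders
        group by customerid;

    COUNT only counts the rows where the CASE expression is non-NULL, so each column counts one date bucket, and one pass over the table produces all six.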

  • Using the analytic function

    Oracle 11g Release 2

    I'm assuming that the best solution is the use of analytical functions.

    create table test3
    ( part_type_id  varchar2(50)
    ,group_id      number
    ,part_desc_id  number
    ,part_cmt      varchar2(50)
    )
    /
    
    insert into test3 values( 'ABC123',1,10,'comment1');
    insert into test3 values( 'ABC123',1,10,'comment2');
    insert into test3 values( 'ABC123',2,15,'comment1');
    insert into test3 values( 'ABC123',2,15,'comment2');
    insert into test3 values( 'EFG123',25,75,'comment3');
    insert into test3 values( 'EFG123',25,75,'comment4');
    insert into test3 values( 'EFG123',25,75,'comment5');
    insert into test3 values( 'XYZ123',1,10,'comment6');
    insert into test3 values( 'XYZ123',2,15,'comment7');
    commit;
    
    select * from test3;
    
    PART_TYPE_ID           GROUP_ID PART_DESC_ID PART_CMT
    -------------------- ---------- ------------ --------------------
    ABC123                        1           10 comment1
    ABC123                        1           10 comment2
    ABC123                        2           15 comment1
    ABC123                        2           15 comment2
    EFG123                        25          75 comment3
    EFG123                        25          75 comment4
    EFG123                        25          75 comment5
    XYZ123                        1           10 comment6
    XYZ123                        2           15 comment7
    
    9 rows selected.
    
    Desired output:
    
    PART_TYPE_ID           GROUP_ID PART_DESC_ID PART_CMT
    -------------------- ---------- ------------ --------------------
    ABC123                        1           10 comment1 
    ABC123                        2           15 comment1
    XYZ123                        1           10 comment1
    XYZ123                        2           15 comment2
    
    RULE: where one part_type_id has multiple (2 or more distinct combinations) of group_id/part_desc_id
    
    NOTE: There are about 12 columns in the table, for brevity I only included 4.
    
    
    
    

    Post edited by orclrunner: updated desired output and rule

    Hello

    Here's one way:

    WITH got_d_count AS
    (
        SELECT  part_type_id, group_id, part_desc_id
        ,       MIN (part_cmt)                             AS min_part_cmt
        ,       COUNT (*) OVER (PARTITION BY part_type_id) AS d_count
        FROM    test3
        GROUP BY part_type_id, group_id, part_desc_id
    )
    SELECT DISTINCT
            part_type_id, group_id, part_desc_id, min_part_cmt
    FROM    got_d_count
    WHERE   d_count > 1
    ;

    Output:

    PART_TYPE_ID           GROUP_ID PART_DESC_ID MIN_PART_CMT
    -------------------- ---------- ------------ ------------
    ABC123                        1           10 comment1
    ABC123                        2           15 comment1
    XYZ123                        1           10 comment6
    XYZ123                        2           15 comment7

    Analytic functions, such as COUNT and MIN, all have aggregate versions that, in many cases, can give the same results.  Use the analytic versions when each row of output corresponds to exactly 1 row of input, and the aggregate/GROUP BY versions when each row of output corresponds to a group of 1 or more input rows.  In this problem, each row of output appears to correspond to a group of input rows having the same part_type_id, group_id, and part_desc_id (I'm just guessing; this was never actually stated), so I used GROUP BY to get 1 row of output for each such group.
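    For comparison, a purely analytic version is also possible (a sketch under the same guesses about the grouping, not tested here). Oracle allows COUNT (DISTINCT ...) as an analytic function, so the filtering condition can be computed without GROUP BY; the '/' separator is just an assumption to keep concatenated group_id/part_desc_id pairs unambiguous:

        SELECT DISTINCT part_type_id, group_id, part_desc_id, min_part_cmt
        FROM   (
                   SELECT  part_type_id, group_id, part_desc_id
                   ,       MIN (part_cmt) OVER (PARTITION BY part_type_id, group_id, part_desc_id) AS min_part_cmt
                   ,       COUNT (DISTINCT group_id || '/' || part_desc_id)
                               OVER (PARTITION BY part_type_id)                                    AS d_count
                   FROM    test3
               )
        WHERE  d_count > 1
        ;

    Here DISTINCT in the outer query collapses the duplicate comment rows, while d_count counts the distinct group_id/part_desc_id combinations per part_type_id.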

  • Can I get the result that follows with an analytic function?

    Hi all

    I am currently using oracle 10g.

    create table
    CREATE TABLE fortest
    (  PROD             VARCHAR2(40 BYTE),
      prodvalues     number);
    INSERT statement
    insert into fortest values ('dental',10);
    insert into fortest values ('dental',4);
    insert into fortest values ('dental',13);
    insert into fortest values ('dental',3);
    insert into fortest values ('vision',2);
    insert into fortest values ('vision',11);
    insert into fortest values ('vision',33);
    insert into fortest values ('vision',7);
    I need the output as follows
    prod        prodvalues <5         prodvalues >=5 and less than 10               prodvalues >=10
    dental         2                                     0                                      2
    vision         1                                     1                                      2
    The first column should give me the distinct prod values; the "prodvalues <5" column should give me the count of rows for that prod with prodvalues less than 5; similarly, the third column should have the count of rows for that prod with prodvalues >= 5 and prodvalues < 10, and so on.

    Please note, the column names in the output table are just for reference; I will not be using them.


    Thanks in advance.

    Hi Bob,

    You don't have to use analytic functions. Here's a solution that doesn't use them:

    with temp as (select prod,
                         case when prodvalues < 5                      then 1 else 0 end as prod5,
                         case when prodvalues >= 5 and prodvalues < 10 then 1 else 0 end as prod5_10,
                         case when prodvalues >= 10                    then 1 else 0 end as prod10
                  from fortest)
    select prod, sum(prod5) as prod5, sum(prod5_10) as prod5_10, sum(prod10) as prod10
    from temp
    group by prod
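    The same counts can also be produced in one pass without the WITH clause, by counting inside conditional CASE expressions. A sketch against the same fortest table (the column aliases are made up; as the asker said, the output names don't matter). COUNT ignores the NULL that a CASE expression returns when its condition is false, so each column counts only its own bucket:

        select prod
        ,      count(case when prodvalues < 5                      then 1 end) as under_5
        ,      count(case when prodvalues >= 5 and prodvalues < 10 then 1 end) as from_5_to_9
        ,      count(case when prodvalues >= 10                    then 1 end) as from_10_up
        from   fortest
        group by prod;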
    
  • Confusion with analytic functions

    I have created an example of where I am right now using analytic functions. However, I need the query below to return an additional column: the result of factored_day_sales * max(sdus). Any ideas?

    The first row selected should have the following results:

    777777, 5791, 10, 1.5, 15, 90, 135, 7050

    The 7050 is the result I don't know how to produce (somehow multiplying factored_day_sales by max(sdus): 15 * 470 = 7050).
    create table david_sales (
    pro_id number(38),
    salesidx number (38,6),
    tim_id number(38));
    
    truncate table david_sales
    
    create table david_compensations (
    pro_id number(38),
    tim_id number(38),
    factor number(38,6));
    
    
    insert into david_sales values
    (777777, 10.00, 5795);
    insert into david_sales values
    (777777,20.00, 5795);
    insert into david_sales values
    (777777, 30.00, 5794);
    insert into david_sales values
    (777777, 40.00, 5794);
    insert into david_sales values
    (777777, 100.00, 5793);
    insert into david_sales values
    (777777, 10.00, 5793);
    insert into david_sales values
    (777777,80.00, 5791);
    insert into david_sales values
    (777777, 10.00, 5791);
    
    insert into david_compensations values
    (777777, 5795, 1.5);
    insert into david_compensations values
    (777777, 5793, 2.0);
    insert into david_compensations values
    (777777, 5792, 1.0);
    insert into david_compensations values
    (777777, 5791, 1.5);
    
    
    
        SELECT  s.pro_id sales_pro
        ,       c.pro_id comp_pro
        ,       s.tim_id sales_tim
        ,       c.tim_id comp_tim
        ,       s.salesidx day_sales
        ,       NVL(c.factor, 1) factor
        ,       s.salesidx * NVL(c.factor, 1) factored_day_sales
        ,       sum(s.salesidx                   ) over (partition by s.pro_id order by s.pro_id, s.tim_id) Sdus
        ,       sum(s.salesidx * NVL(c.factor, 1)) over (partition by s.pro_id order by s.pro_id, s.tim_id) sumMjCj 
          FROM david_sales s
          ,    david_compensations c
          WHERE s.pro_id    = c.pro_id(+)
          AND s.tim_id      = c.tim_id(+)
          AND s.tim_id     BETWEEN 5791  AND 5795
    Thanks for looking

    Is that what you want?

        SELECT  s.pro_id sales_pro
        ,       c.pro_id comp_pro
        ,       s.tim_id sales_tim
        ,       c.tim_id comp_tim
        ,       s.salesidx day_sales
        ,       NVL(c.factor, 1) factor
        ,       s.salesidx * NVL(c.factor, 1) factored_day_sales
        ,       sum(s.salesidx                   ) over (partition by s.pro_id order by s.pro_id, s.tim_id) Sdus
        ,       sum(s.salesidx * NVL(c.factor, 1)) over (partition by s.pro_id order by s.pro_id, s.tim_id) sumMjCj
        , (s.salesidx * NVL(c.factor, 1) * sum(s.salesidx                   ) over (partition by s.pro_id order by s.pro_id, s.tim_id)) summedMulti
          FROM david_sales s
          ,    david_compensations c
          WHERE s.pro_id    = c.pro_id(+)
          AND s.tim_id      = c.tim_id(+)
          AND s.tim_id     BETWEEN 5791  AND 5795
    
    SALES_PRO              COMP_PRO               SALES_TIM              COMP_TIM               DAY_SALES              FACTOR                 FACTORED_DAY_SALES     SDUS                   SUMMJCJ                SUMMEDMULTI
    ---------------------- ---------------------- ---------------------- ---------------------- ---------------------- ---------------------- ---------------------- ---------------------- ---------------------- ----------------------
    777777                 777777                 5791                   5791                   80                     1.5                    120                    90                     135                    10800
    777777                 777777                 5791                   5791                   10                     1.5                    15                     90                     135                    1350  
    

    I get the 1350

    or did you mean:

        SELECT  s.pro_id sales_pro
        ,       c.pro_id comp_pro
        ,       s.tim_id sales_tim
        ,       c.tim_id comp_tim
        ,       s.salesidx day_sales
        ,       NVL(c.factor, 1) factor
        ,       s.salesidx * NVL(c.factor, 1) factored_day_sales
        ,       sum(s.salesidx                   ) over (partition by s.pro_id order by s.pro_id, s.tim_id) Sdus
        ,       sum(s.salesidx * NVL(c.factor, 1)) over (partition by s.pro_id order by s.pro_id, s.tim_id) sumMjCj
        ,  s.salesidx * NVL(c.factor, 1) * (sum(s.salesidx * NVL(c.factor, 1)) over (partition by s.pro_id order by s.pro_id, s.tim_id)) summedMulti
          FROM david_sales s
          ,    david_compensations c
          WHERE s.pro_id    = c.pro_id(+)
          AND s.tim_id      = c.tim_id(+)
          AND s.tim_id     BETWEEN 5791  AND 5795 
    
    SALES_PRO              COMP_PRO               SALES_TIM              COMP_TIM               DAY_SALES              FACTOR                 FACTORED_DAY_SALES     SDUS                   SUMMJCJ                SUMMEDMULTI
    777777                 777777                 5795                   5795                   10                     1.5                    15                     300                    470                    7050
    

    Note, in the second block I changed it to use sumMjCj instead of sDus, which seems to match what you wanted (15 * 470 = 7050), whereas with sdus it would be 15 * 300 = 4500.

    Published by: tanging on December 11, 2009 06:17
