Analytics issue

Not sure what is wrong: any user can connect to Analytics even though we have not configured any AD. We simply use the DefaultAuthenticator, with its control flag set to REQUIRED rather than OPTIONAL.

Has anyone faced this issue before?

Thank you

Ahmed.

I have tried everything; when I set the control flag to OPTIONAL or SUFFICIENT, authentication fails when I connect to Analytics.

Tags: Business Intelligence

Similar Questions

  • Pivot does not work; Analytics issue

    I worked a lot this weekend on a query that is difficult for me, but now I have other questions. You need to run the top query to understand the second one.

    Background:
    This is a period report that includes year-to-date amounts as well as last year's figures.

    My example report is for period 9, which begins on August 24, 2009.
    This period has 5 weeks (most have 4).
    Also, if the report is run during week 3, then only weeks 1 and 2 will be reflected on the report.


    The report looks like this (the sample data provided should produce these numbers, except perhaps the Year-to-date column):

                            WEEK1        WEEK2        WEEK3        WEEK4        WEEK5   Period-to-date   Year-to-date
    Net - Landau        11,485.79    11,416.60    11,609.01    11,049.76    12,867.10        58,428.00     454,231.37
    Net - AW                 0.00         0.00         0.00         0.00         0.00             0.00           0.00
    Net - LJS                0.00         0.00         0.00         0.00         0.00             0.00           0.00
    Net - TB             7,118.17     7,228.13     7,657.94     7,699.53     7,958.53        37,662.00     306,115.59
    Total Net           18,603.96    18,644.73    19,266.95    18,749.29    20,825.63        96,091.00     760,346.96

    Last Year Sales     23,515.95    24,244.37    23,962.74    23,134.79    24,440.87       119,299.00     856,363.36
    Increase           (4,911.99)   (5,599.64)   (4,695.79)   (4,385.50)   (3,615.24)      (23,208.00)    (96,016.40)
    Last Year
      Next Week         24,244.37    23,962.74    23,134.79    24,440.87    23,055.87       118,839.00     879,419.23
    -- Current year dates --
    Beginning of the period (BOP): Mon Aug 24
    Week 1: Mon 24 Aug - 30 Aug
    Week 2: Mon 31 Aug - 6 Sep
    Week 3: Mon 7 Sep - 13 Sep
    Week 4: Mon 14 Sep - 20 Sep
    Week 5: Mon 21 Sep - 27 Sep
    Beginning of the fiscal year (BOY) = 28 December '08

    -- Last year's dates --
    Beginning of the period (BOP_LY): Mon Aug 25
    Week 1: Mon 25 Aug - 31 Aug
    Week 2: Mon 1 Sep - 7 Sep
    Week 3: Mon 8 Sep - 14 Sep
    Week 4: Mon 15 Sep - 21 Sep
    Week 5: Mon 22 Sep - 28 Sep
    Beginning of the fiscal year (BOY) = 31 December '07


    My weekend challenge was to get a full year of data instead of only period data.
    There are 7 columns on the report: the 5 weeks of the period, PeriodToDate (the total of the weeks), and YearToDate (PeriodToDate + the sum of all data from the beginning of the year to the end of the previous period).
    I'm not really worried about the PTD; the program can handle it. I got the BOY data with the following code:

    All data is summarized by week and grouped by storeid and TRUNC(date, 'IW'), which falls on the Monday of each week. (The result set contains data for 2 stores instead of 1; this is only to make sure that my query was filtering correctly. A quick illustration of the week bucketing follows the sample data below.)
    drop table my_csh_main;
    
     CREATE TABLE MY_CSH_MAIN
       (     "FK_STR_MAIN_ID" NUMBER, 
         "BUSI_DATE" DATE, 
         "CONF_NUMB2" NUMBER, 
         "CONF_NUMB49" NUMBER, 
         "CONF_NUMB44" NUMBER, 
         "CONF_NUMB3" NUMBER, 
         "CONF_NUMB4" NUMBER, 
         "CONF_NUMB38" NUMBER, 
         "CONF_NUMB56" NUMBER
       );
     
    REM INSERTING into MY_CSH_MAIN
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('28-AUG-08','DD-MON-RR'),22103.69,0,0,119,0,4605.21,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('27-AUG-09','DD-MON-RR'),18081.37,0,0,0,0,3533.45,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('17-SEP-09','DD-MON-RR'),18211.29,0,0,0,0,3806.32,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('04-SEP-08','DD-MON-RR'),24244.37,0,0,284.94,0,0,9395.63);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('03-SEP-09','DD-MON-RR'),18644.73,0,0,85.48,0,0,7228.13);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('24-SEP-09','DD-MON-RR'),16809.21,0,0,64.99,0,3014.61,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('25-SEP-08','DD-MON-RR'),24440.87,0,0,0,0,0,9469.64);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('28-AUG-08','DD-MON-RR'),23515.95,0,0,0,80.38,0,9379.9);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('24-SEP-09','DD-MON-RR'),20825.63,0,0,73.97,0,0,7958.53);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('17-SEP-09','DD-MON-RR'),18749.29,0,0,0,0,0,7699.53);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('11-SEP-08','DD-MON-RR'),22839.3,0,0,206.39,116.74,4493.28,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('04-SEP-08','DD-MON-RR'),22627.74,0,0,279.98,0,4423.83,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('27-AUG-09','DD-MON-RR'),18603.96,0,0,81.25,0,0,7118.17);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('11-SEP-08','DD-MON-RR'),23962.74,0,0,153.1,0,0,9335.35);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('18-SEP-08','DD-MON-RR'),23134.79,0,0,44.12,0,0,8978.87);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('25-SEP-08','DD-MON-RR'),24950.45,0,0,129.98,0,5330.22,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('10-SEP-09','DD-MON-RR'),19266.95,0,0,0,0,0,7657.94);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('03-SEP-09','DD-MON-RR'),17183.25,0,0,0,0,3487.12,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('18-SEP-08','DD-MON-RR'),21372.82,0,0,0,0,4546.15,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('10-SEP-09','DD-MON-RR'),17688.41,0,0,113.12,0,3424.17,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('31-DEC-08','DD-MON-RR'),611016.24,0,0,1276.62,724.96,122236.02,0);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('31-DEC-08','DD-MON-RR'),667612.63,0,0,1018.81,0,0,269777.87);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('02-JAN-08','DD-MON-RR'),1676737.13,0,0,5652.47,3850.68,345971.1,500.5);
    Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('02-JAN-08','DD-MON-RR'),1786451.11,0,0,3167.61,175.38,0,788438.73);
    
    
    
      CREATE TABLE  LANDAU_REPORT_STORES 
       (     "COMPANYNAME" CHAR(40 BYTE), 
         "DISTRICTNAME" VARCHAR2(50 BYTE), 
         "STOREID" NUMBER(4,0) NOT NULL ENABLE, 
         "STORENAME" VARCHAR2(70 BYTE), 
         "STORENBR" CHAR(4 BYTE) NOT NULL ENABLE
       );
    
    
    REM INSERTING into LANDAU_REPORT_STORES
    Insert into LANDAU_REPORT_STORES (COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 10',64,'N_Main','0004');
    Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 10',65,'Belvidere','0005');
    Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 10',104,'Mulford','0032');
    Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 50',141,'Charleston','0043');
    Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 10',61,'Kilburn','0002');
    Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau                                  ','DIST 10',62,'11th_St','0003');
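
    As a quick sanity check of the TRUNC(date, 'IW') bucketing described above, here is a small query against the sample data just inserted; this is only an illustration, not part of the report query itself:

     -- TRUNC(d, 'IW') returns the Monday of the ISO week containing d,
     -- so every business date in the same Mon-Sun week maps to one week_start.
     SELECT busi_date,
            TRUNC(busi_date, 'IW') AS week_start
     FROM   my_csh_main
     WHERE  busi_date BETWEEN DATE '2009-08-24' AND DATE '2009-09-27'
     ORDER  BY busi_date;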
    
       with WeeklyTotals as 
       ( select                           StoreId
                                         , week_date
                                         , net_sales
                                         , ljs_food_sales
                                         , total_drink_sales
                                         , net_food
                                         , CASE
                                              WHEN net_food is null  then 0
                                              WHEN net_food  = 0  then 0
                                              ELSE Ljs_food_sales / net_food
                                           END as LJS_Percent
                                         , CASE
                                              WHEN net_food is null  then 0
                                              WHEN net_food = 0  then 0
                                              else total_drink_sales * (ljs_food_sales / net_food)
                                           END as LJS_Drinks
                                         , aw_sales
                                         , tb_sales
              FROM                            (Select fk_str_main_id as StoreId, 
                                                                               trunc(csh.busi_date, 'IW') as week_date, 
                                                                         sum(csh.conf_numb2) As net_sales,
                                                                         sum(conf_numb49) as ljs_food_sales,
                                                                         sum(conf_numb44)  As total_drink_sales,
                                                                         sum(csh.conf_numb2) - sum(CONF_NUMB44) - sum(conf_numb3) - sum(conf_numb4) AS net_food,
                                                                         sum(conf_numb38) As aw_sales,
                                                                         sum(conf_numb56)  as tb_sales
                                                     from my_csh_main csh
                                                     WHERE BUSI_DATE BETWEEN  TO_DATE( '28-Dec-08' ,'DD-MON-YY') AND  TO_DATE( '27-SEP-09' ,'DD-MON-YY') 
                                                    and fk_str_main_id in (141, 221)
                                                    GROUP BY CSH.FK_STR_MAIN_ID,    trunc(csh.busi_date, 'IW')
                                               ) 
       )
       ,  WeeklyFoodSalesLY as
     (
        SELECT    fk_str_main_id AS storeid
               ,  TRUNC(busi_date, 'iw') week_date
               ,  SUM(csh.conf_numb2) AS net_sales
        FROM   my_csh_main csh
        WHERE busi_date BETWEEN   to_date('31-DEc-07', 'dd-Mon-yy')  and  to_date('28-Sep-08', 'dd-Mon-yy')
        GROUP BY fk_str_main_id, TRUNC(busi_date, 'iw')
     )
      , StoreDetails  AS
     (
             select * from LANDAU_REPORT_STORES where STORENAME NOT like  '%CLOSED%'  order by DistrictNAme, StoreNbr
     )
       Select  
                 foods.storeid
              ,   CASE
                   WHEN   to_date(Foods.week_date, 'dd-Mon-yy')  =  to_date('24-AUG-09', 'dd-Mon-yy')  then 1
                   WHEN   to_date(Foods.week_date, 'dd-Mon-yy')  =  to_date('24-AUG-09', 'dd-Mon-yy') + 7 then 2
                   WHEN   to_date(Foods.week_date, 'dd-Mon-yy')  =  to_date('24-AUG-09', 'dd-Mon-yy') + 14 then 3
                   WHEN   to_date(Foods.week_date, 'dd-Mon-yy')  =  to_date('24-AUG-09','dd-Mon-yy') + 21 then 4
                   WHEN   to_date(Foods.week_date, 'dd-Mon-yy')  =  to_date('24-AUG-09', 'dd-Mon-yy') + 28 then 5
                   ELSE  0
             end as WeekNBr 
        ,   foods.week_date  as CurrWeekDate
        ,   foodsLY.week_date as LastYearWEekDate    
        ,   ROUND(NVL(foods.net_sales - (aw_sales + tb_sales + ljs_drinks + ljs_food_sales ), 0), 2) as Landau_Net_Sales
        ,   ROUND(NVL(aw_sales, 0), 2) as aw_sales
        ,   ROUND(NVL(tb_sales, 0), 2) as tb_sales
        ,   ROUND(NVL(ljs_drinks + ljs_food_sales, 0),2)    as ljs_net_sales
        ,   ROUND(NVL(foods.Net_Sales, 0), 2) as net_sales
        --==
        -- Last Year Sales and Last Year Next Year Sales
       --==
     ,    ROUND(NVL(foodsLY.Net_Sales, 0),2)    as    WTDLYNetSales
     ,    ROUND(NVL(Foods.Net_Sales -  FoodsLY.Net_Sales, 0),2)  as    WTDSalesIncrease
     -- ,    ROUND(NVL(FoodsLY_NextWeek.Net_Sales, 0), 2)          as    WTDFoodsSalesLY  
     ,   stores.*
     from WEeklyTotals Foods  
     LEFT OUTER JOIN Weeklyfoodsalesly foodsly ON foodsly.storeid = foods.storeid
                                              AND foodsly.week_date = DECODE(foods.week_date,
                                                                             to_date('24-AUG-09', 'dd-Mon-yy') ,       to_date('25-AUG-08', 'dd-Mon-yy') ,
                                                                             to_date('24-AUG-09', 'dd-Mon-yy')  + 7,   to_date('25-AUG-08', 'dd-Mon-yy') + 7,
                                                                             to_date('24-AUG-09', 'dd-Mon-yy')  + 14,  to_date('25-AUG-08', 'dd-Mon-yy') + 14,
                                                                             to_date('24-AUG-09', 'dd-Mon-yy')  + 21,  to_date('25-AUG-08', 'dd-Mon-yy') + 21,
                                                                             to_date('24-AUG-09', 'dd-Mon-yy')  + 28,  to_date('25-AUG-08', 'dd-Mon-yy') + 28)
    LEFT OUTER  JOIN  StoreDetails             stores  ON     stores.storeid   = Foods.storeid;
    This works, with one exception. In a WITH clause I get a result set containing all of last year's data, but I could not work out how to pivot the "last year next week" column (that is, last year's sales for the following week). The problem came when I tried to pivot last year's sales. I pulled out this snippet to test with, but couldn't make it work:

     with   WeeklyFoodSalesLY  as
     (         SELECT  fk_str_main_id AS storeid
                 ,   TRUNC(busi_date, 'iw') week_date
                 ,   sum(conf_numb2)   conf_numb2 
               FROM  my_csh_main csh     
               WHERE busi_date BETWEEN   to_date('31-dec-07', 'dd-Mon-yy')  and  to_date('28-Sep-08', 'dd-Mon-yy') + 7
               and fk_str_main_id = 141   
               GROUP BY  fk_str_main_id , TRUNC(busi_date, 'iw')       
     )
     ,  sales_ly as 
     (    SELECT      storeid 
                    , week_date 
                    , sum(conf_numb2) as net_sales 
          from WeeklyFoodSalesLY
          WHERE week_date BETWEEN   to_date('25-Aug-08', 'dd-Mon-yy')  and  to_date('28-Sep-08', 'dd-Mon-yy')  
          GROUP BY  storeid , week_date     
          UNION ALL
          SELECT      storeid 
                    , week_date 
                    , sum(conf_numb2) as net_sales 
          from WeeklyFoodSalesLY
          WHERE week_date BETWEEN   to_date('25-Aug-08', 'dd-Mon-yy') + 7   and  to_date('28-Sep-08', 'dd-Mon-yy') + 7
          GROUP BY  storeid , week_date 
          UNION ALL
          SELECT      storeid 
                    , week_date 
                    , sum(conf_numb2) as net_sales 
          from WeeklyFoodSalesLY
          WHERE week_date < to_date('25-Aug-08', 'dd-Mon-yy')  
          GROUP BY  storeid , week_date 
     ) 
    ,  pivoted_sales_ly as 
    (    SELECT          storeid 
                         week_date
                    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'),          net_sales))        as lastyear
                    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy') + 7,      net_sales))        as lastyearnextweek
                    ,    MAX(DECODE(week_date, to_date('31-dec-07', 'dd-Mon-yy'),          net_sales))        as lastyeartotal
          from sales_ly
          GROUP BY  storeid, week_date
     ) 
    Select * 
    from pivoted_sales_ly;  
    What am I doing wrong here? Once I get this working, I can fold it back into the original query.


    Analytics:


    Boneist gave me some code to try last week: a combination of ref cursors and analytics that created the PTD column.

    I could never make it work because I was fiddling with the query so much, but is there a way of getting the PTD without adding an additional PTD column for each week?
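
    For what it's worth, here is a minimal sketch of how an analytic SUM over the weekly aggregates could produce a running period-to-date figure without a separate PTD column per week (the date range and columns here are assumptions based on the sample data above, not Boneist's original code):

     -- Running period-to-date per store: the outer SUM(...) OVER accumulates
     -- the weekly SUM(conf_numb2) in week order within each store.
     SELECT fk_str_main_id                           AS storeid,
            TRUNC(busi_date, 'IW')                   AS week_date,
            SUM(conf_numb2)                          AS net_sales,
            SUM(SUM(conf_numb2))
              OVER (PARTITION BY fk_str_main_id
                    ORDER BY TRUNC(busi_date, 'IW')) AS net_sales_ptd
     FROM   my_csh_main
     WHERE  busi_date BETWEEN DATE '2009-08-24' AND DATE '2009-09-27'
     GROUP  BY fk_str_main_id, TRUNC(busi_date, 'IW')
     ORDER  BY fk_str_main_id, week_date;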



    Another thing:
    I would also like to know how to use analytics to roll the report up by division in another cursor.

    Published by: TheHTMLDJ on October 26, 2009 04:50

    Published by: TheHTMLDJ on October 26, 2009 04:59

    TheHTMLDJ wrote:
    Can you tell me why this code in my original query does not pivot correctly?

    ,  pivoted_sales_ly as
    (    SELECT          storeid
    week_date
    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'),          net_sales))        as lastyear
    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy') + 7,      net_sales))        as lastyearnextweek
    ,    MAX(DECODE(week_date, to_date('31-dec-07', 'dd-Mon-yy'),          net_sales))        as lastyeartotal
    from sales_ly
    GROUP BY  storeid, week_date
    ) 
    

    Yes, two things:

    1. You are missing the comma after storeid, which causes the query to treat storeid as though it were the column aliased week_date (confusing!).
    2. When you are pivoting on a particular column (in your case, week_date), you don't want to include that column in the SELECT or GROUP BY clauses.

    This should do what you are looking for:

    with   WeeklyFoodSalesLY  as
     (         SELECT  fk_str_main_id AS storeid
                 ,   TRUNC(busi_date, 'iw') week_date
                 ,   sum(conf_numb2)   net_sales
               FROM  my_csh_main csh
               WHERE busi_date BETWEEN   to_date('31-dec-07', 'dd-Mon-yy')  and  to_date('22-Sep-08', 'dd-Mon-yy') + 7
               --and fk_str_main_id = 141
               GROUP BY  fk_str_main_id , TRUNC(busi_date, 'iw')
     )
     ,  sales_ly as
     (    select storeid,
                 week_date,
                 net_sales,
                 sum(net_sales) OVER (partition by storeid order by week_date) net_sales_ytd
          from   WeeklyFoodSalesLY
     )
    ,  pivoted_sales_ly as
    (    SELECT          storeid,
                         MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'),          net_sales))        as lastyear
                    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy') + 7,      net_sales))        as lastyearnextweek
                    ,    MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'),          net_sales_ytd))        as lastyeartotal
          from sales_ly
          GROUP BY  storeid
     )
    Select *
    from pivoted_sales_ly
    order by storeid;  
    
       STOREID   LASTYEAR LASTYEARNEXTWEEK LASTYEARTOTAL
    ---------- ---------- ---------------- -------------
           141   22103.69         22627.74    1698840.82
           221   23515.95         24244.37    1809967.06
    

    I don't know where the 2nd and 3rd numbers come from, though; I couldn't get the query to generate them.

  • Analytics issue: ORA-30089

    Hello-
    I haven't worked with analytic SQL in a while, so maybe I'm just rusty.

    Goal: run a query that returns 1) current-week data and 2) a 52-week rolling average.

    Sample data and the ddl:

    DDL:
    CREATE TABLE my_fact
    (product_id NUMBER,
     customer_id NUMBER,
     week_no number,
     transaction_date DATE, 
     stock_on_hand NUMBER,
     stock_in_transit number);
    
    INSERT INTO my_fact VALUES (1,45, 29, SYSDATE, 100,70);
    INSERT INTO my_fact VALUES (1,45, 29, SYSDATE, 200,170);
    INSERT INTO my_fact VALUES (1,45, 15, SYSDATE -100, 300,70); -- added
    INSERT INTO my_fact VALUES (1,55, 29, SYSDATE, 100,70);
    INSERT INTO my_fact VALUES (2,32, 15, SYSDATE -100, 100,70);
    INSERT INTO my_fact VALUES (2,32, 10, SYSDATE-130, 100,70);
    INSERT INTO my_fact VALUES (3,32, 20, SYSDATE-66, 100,70);
    INSERT INTO my_fact VALUES (3,78, 29, SYSDATE, 100,70);
    now my query:
    SELECT product_id, customer_id, week_no,
    avg(stock_on_hand)  avg_current_week, 
    avg(avg(stock_on_hand)) OVER (ORDER BY (trunc(transaction_date)) RANGE BETWEEN INTERVAL '364' DAY(3) PRECEDING AND INTERVAL '1' PRECEDING) avg_52week
    FROM my_fact
    WHERE to_char(SYSDATE, 'WW') = week_no -- OTHER THAN THE ANALYTICAL AGGREGATE JUST WANT TO LOOK AT CURRENT WEEK
    GROUP BY product_id, customer_id, week_no, TRUNC(transacion_date)
    I get error ORA-30089 and don't know why.
    Also, given my goal above, is this query structured correctly (the analytic part)?

    Thank you!

    Published by: padawan on July 19, 2010 11:32

    Hello

    It would be better to post the full results you want, not just one row.

    I don't think analytic functions alone will help with this problem. Just use the AVG aggregate with a CASE expression to get an average based on a subset of the results (for example, the current week).

    SELECT        product_id
    ,       customer_id
    ,       MAX (week_no)          AS week_no
    ,       AVG ( CASE
                  WHEN  week_no           = TO_CHAR (SYSDATE, 'WW')
                  AND       transaction_date     > SYSDATE - 7     -- to guard against same week in an earlier year
                  THEN  stock_on_hand
              END
               )               AS avg_current_week
    ,       AVG (stock_on_hand)     AS avg_52_week
    FROM       my_fact
    WHERE       transaction_date     >= TRUNC (SYSDATE) - 364
    AND       transaction_date     <  TRUNC (SYSDATE) + 1
    GROUP BY  product_id
    ,       customer_id
    HAVING       MAX (week_no)     = TO_CHAR (SYSDATE, 'WW')
    ;
    

    If you average per week and then take the average of the averages, you are weighting the averages; in that case, an average of 150 in the current week and 300 in another week would give (150 + 300) / 2 = 225, which is not what you want if you want each of the rows that contributed to the first average (of 150) to be counted separately.
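
    Side note on the ORA-30089 itself ("missing or invalid datetime field"): it usually means an interval literal in a window frame is missing its datetime unit; here the second bound, INTERVAL '1' PRECEDING, has no unit. If the analytic form is still wanted, a sketch with explicit DAY units on both bounds follows (the PARTITION BY is an added assumption, and the transacion_date typo is corrected):

     SELECT   product_id,
              customer_id,
              TRUNC(transaction_date)   AS txn_day,
              AVG(stock_on_hand)        AS avg_current_day,
              -- average of the daily averages over the preceding 364 days,
              -- excluding the current day
              AVG(AVG(stock_on_hand))
                OVER (PARTITION BY product_id, customer_id
                      ORDER BY TRUNC(transaction_date)
                      RANGE BETWEEN INTERVAL '364' DAY(3) PRECEDING
                            AND     INTERVAL '1'   DAY    PRECEDING) AS avg_52week
     FROM     my_fact
     GROUP BY product_id, customer_id, TRUNC(transaction_date);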

  • Revolution Analytics R - rxExec does not use multiple cores when running a random forest model

    I have a random forest survival analysis model to train, and I am trying to run it on a multicore cluster using Revolution Analytics Enterprise R version 7.1.0.

    I used the code below:

    library(RevoScaleR)
    rxSetComputeContext(RxLocalParallel())
    rxOptions(numCoresToUse = 5)

    And then I call rxExec to run the random forest function from the randomForestSRC package.

    I can only see about 10% CPU utilization, which corresponds to the default single core.

    Am I missing something here? Does the randomForestSRC package itself have to support parallel execution?

    This issue is beyond the scope of this site (which is for consumers). To be sure you get the best (and fastest) reply, it should be asked either on TechNet (for IT pros) or MSDN (for developers).

    If you give us a link to the new thread, we can point some resources at it.
  • Cannot open a session in analytics if the user is not mapped to the BIAdministrators group

    Hello

    I created a new user, let's say 'abc'.

    The issue I'm facing is: for the 'abc' user I mapped only the 'BI Author' and 'BI Consumer' groups, since I wanted to test the case where the user is not mapped to the BIAdministrators group.

    With the above user, if I try to log in to Analytics, I get the error message below:

    An error occurred during authentication. Try again later or contact your system administrator.

    When I add the BIAdministrators group to the 'abc' user through the console, I am able to log in successfully.

    The same happens for all users, including the default ones (weblogic, etc.).

    But when I tried the same steps (not adding the BIAdministrators group to a user) on my colleague's system, it works fine.

    Could someone help me with this problem?

    I also checked the Manage Privileges section on both systems and there seems to be no difference.

    Thank you

    Srikanth.

    I found the answer on my own. It seems the AuthenticatedUser role did not have access to the subject area.

    Therefore, open the RPD -> right-click the subject area -> Properties -> Permissions -> change the permission for the authenticated user to Read.

    Then deploy the RPD.

    Problem solved.

  • Has anyone been able to stand up Essbase Analytics Link (EAL) for Financial Management in version 11.1.2.2?

    Has anyone had any luck standing up Essbase Analytics Link (EAL) for Financial Management in 11.1.2.2? We are at the point where, when you click 'Create the Bridge Application', it creates an application and a database and then the Hyperion Essbase Analytics Link - Web Application Server hangs. Before this we were receiving NetRetry and NetDelay errors; after increasing those parameters we no longer get the same errors. I'm curious to know whether there is even a functional version of EAL for 11.1.2.2, or if we are out of luck until the next release. Any comment is appreciated.

    FYI, we were able to get EAL up and running after adding the following registry entries on our servers...

    Solution

    This timeout issue can be fixed by adding two TCP/IP registry settings, but first you must identify the client that was communicating with Essbase when the timeout was exceeded, so that you know which machine the settings need to go on. If all the EAL and EPM components are installed on a single machine, then that machine also hosts the client. If the products are installed in a distributed environment, you determine the client machine based on how the Essbase server is defined in EAL.

    The APS URL is part of the Essbase Analytics Server definition in the EAL link.

    If the (APS URL) value is "embedded", it means that the EAL application server communicates directly with the Essbase server via the JAPI. In this case the EAL app server is the client to the Essbase server, and the following TCP/IP registry settings need to go on the machine running the EAL application server.

    If the (APS URL) value is http://serverName:13080/aps/JAPI, then the EAL application server communicates by way of Hyperion Provider Services (APS). In this case APS proxies the EAL queries to and from the Essbase server. This means that APS is the client to the Essbase server, and the TCP/IP registry settings need to go on the machine running Hyperion Provider Services.

    Once you have determined which machine acts as the Essbase client, set TcpTimedWaitDelay = 30 and MaxUserPort = 65534 via the Windows registry.

    1. Open the Windows registry.

    2. Go to HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\TCPIP\Parameters.

    3. Add a new DWORD value named "TcpTimedWaitDelay":
    - Right-click it and select Modify.
    - Select the "Decimal" radio button and type 30.

    4. Add a new DWORD value named "MaxUserPort":
    - Right-click it and select Modify.
    - Select the "Decimal" radio button and type 65534.

    5. A server restart is necessary.
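
    For reference, the same two values could also be added in one step with a .reg file; this is just a sketch of exactly the keys and values named above (the dword values are hex: 1e = 30, fffe = 65534):

     Windows Registry Editor Version 5.00

     [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
     "TcpTimedWaitDelay"=dword:0000001e
     "MaxUserPort"=dword:0000fffe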

  • DPS Analytics problem

    The DPS Analytics system status shows that everything is working fine, but when I log in to the dashboard for my application, I see a bunch of folios that do not belong to me. What is happening?

    Hi Automator,

    I just want to close the loop on this issue. After a discussion on a separate thread, the client has confirmed that the problem has disappeared.

    After investigating, we found that the reason you saw data that does not belong to you yesterday is that the segment was being ignored by SiteCatalyst for some reason. It is now back to normal.

    Thanks again for helping us investigate this problem,

    Ioan

  • opt-in analytics

    We are looking at the analytics for our folio downloads, and they have been on a steady decline for our quarterly publication, with the current issue terribly low. The question posed to me was whether the "download completed" number is correct. This question came up once we saw the app's dialog message about "tracking its usage."

    My question is: with this "usage tracking" opt-in, can the reader actually opt out of tracking and then complete a download without the analytics dashboard showing it? And how is this opt-in different from the one that asks whether the reader wants to share email and other information with the publisher?

    We are currently an enterprise client, but frankly we can't justify the cost with such figures. My hope is that we are missing something obvious.

    Thank you

    Scott

    If you enabled the usage tracking prompt when you generated your app, then yes, your readers can choose whether their usage is tracked. This means that you will not get analytics numbers in the dashboard for all readers. That is what the prompt is for: it gives your readers a way to disable usage tracking.

    Neil

  • Understanding analytics on folio updates

    We have an application that has been in the iTunes store for more than a year that contains 4 folios.

    Unlike a publication such as "Wired" magazine, our customer does not want previous issues to appear in the app - each monthly issue update effectively replaces the previous issue (this is because each issue has its own monthly set of prices for products within the folio).

    By refreshing the folio, changing the issue name for the new month, updating the relevant thumbnails, etc., we can create a new version of the publication, although it is in effect a whole new folio.

    So far so good.

    The problem for us and our client is that we cannot find a way to determine how many downloads of the folio occur each month. We can look at the consolidated reports etc. to get an idea of the total folios being downloaded, and we can look at the analytics for similar details, but it seems a 'download' is only registered when someone downloads the folio for the first time. If someone has already downloaded the folio and then subsequently downloads the updated folio the next month, it does not seem to register as a download.

    I suspect this is a win for us and a loss for Adobe (so I probably shouldn't mention it here), but we would really like to be able to show the stats to our customer so they can see that people are actually reading each new edition of the publication.

    Does anyone have any suggestions on how we can make the analytics work for us in this case?

    Cheers,

    Eoghan.

    There is no way to split this out, and honestly, it's a really wacky approach to content delivery. When your customer published printed catalogues, did they try to recall the previous month's catalog? No, they just sent out a new one with the new pricing. Why do they want to do it differently with the digital version?

    Neil

  • Checking folio download numbers in Analytics

    Hello everyone,

    in analytics > Folio > download

    I have

    37 downloads started,

    12 downloads completed,

    10 download errors,

    and 7 downloads cancelled.

    I'm a bit confused. I think that completed + errors + cancelled (12 + 10 + 7 = 29) should equal the number of downloads started...

    What does the difference of 8 mean?

    Thank you.

    Hello Andy,

    There are 3 possible reasons why the sum of the download counts (completed + errors + cancelled) differs from the number of downloads started:

    • The user starts downloading folio A (a "download started" event is sent for folio A). Before that download finishes, the user starts downloading folio B. In this case another "download started" event is sent for folio B, and folio A stays in the "download paused" state without reaching a final download status (completed, error, cancelled). Only if the user taps folio A again will its download resume. This is probably the reason why you see different numbers.
    • The user starts a download on the current day, but the final download status is sent the next day. In this case, if the selected interval ends today, the report will not contain the final download state.
    • If the opt-in dialog is enabled, the user can select "Do not allow" tracking after the download starts and before it reaches a final state. In this case, the final events will not be sent.

    Hope this helps,

    Ioan

  • Google Webmaster Tools, Analytics, Tag Manager

    I am trying to optimize my web site using Google's free products. However, I have noticed that Google recommends changing the site's HTML in several cases: adding the Analytics script, the Tag Manager script, and structured data improvements. The structured data, I have noticed, is already in the raw text of my page properties keywords, but it is not seen optimally by Google. Tag Manager recommends that I insert its script tags in the body, but I don't see how that is possible in Muse. If I cut and paste it into the head box in the page properties metadata option, it will of course end up in the head. I noticed I can only do it per file in Business Catalyst by opening the page in HTML, but will editing only the HTML have a destructive impact on my site's ability to upload? I have no experience with code, so I rely exclusively on Muse's design features, but I wonder if this is one of the compromises of using Muse as opposed to Dreamweaver...

    Just in case anyone has this issue in the future, there is now a widget that will do this for you: Google Tag Manager | Free Adobe Muse Widget

  • 9.3.1 Web Analytics: An error has occurred while opening documents. Document ID = {0}

    Hi Experts,

    I'm having a problem opening a Web Analytics dashboard report. When I try to open it, Web Analytics runs for 15 minutes and then I get the following error:

    Error occurred when opening the document. Document ID = {0}

    The report opens with its header, two buttons and the company logo, but the data grid is empty.

    The source of the grid is Essbase. The tuning settings in the Essbase.cfg file are in place.

    Could someone please comment further on this issue? Let me know if any further information is required!

    Please suggest a way to solve it.

    Thank you
    Krishan

    The ISAPI interface time-out parameter (if you are using one) may be smaller than the time Essbase needs to retrieve your information,
    but most likely the analysis application server's JRE heap size is either too small or it needs a lot of time to return the data. Don't forget to increase the heap size, setting the max equal to the min value (try 800 MB, if your hardware has that much available).
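
    For example, setting the minimum and maximum heap to the same value in the analysis application server's Java options would look something like the following (a sketch only; where these options are set depends on your application server and its start scripts):

     -Xms800m -Xmx800m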

  • HR Analytics configuration questions

    Hello

    We are installing Oracle BI Apps on GNU/Linux. The installation phase is complete and we are about to launch a full ETL load, but we are confused about various issues, such as:

    1. For a full ETL load, do we have to configure HR Analytics first, as described in the HR Analytics installation doc? Is this mandatory?

    2. When I try to run the ETL, the very first task fails with an error:

    LOG FILE:
    ======

    Running session... 'SIL_InsertRowInRunTable '.
    Request to start workflow: 'SIL_InsertRowInRunTable' completed with error code 2
    Error message: the specified task name, workflow or folder does not exist.
    Command issued: pmcmd startworkflow -u Administrator -p **** -s appsnet3.wipro.com:4006 -f SILOS -lpf u03.01/DAC_LINUX/Informatica/parameters/input/SILOS.SIL_InsertRowInRunTable.txt SIL_InsertRowInRunTable
    Workflow message:

    =====================================
    STD OUTPUT
    =====================================

    Informatica (r) PMCMD, version [8.1.1 SP4], build [43.0817], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994-2007
    All rights reserved.

    Called to Thu Aug 28 14:00:27 2008

    Connected to Integration Service [appsnet3.wipro.com:4006]
    ERROR: Folder [SILOS] not found.
    Disconnecting from Integration Service

    Completed at Thu Aug 28 14:00:27 2008

    =====================================
    ERROR OUTPUT
    =====================================

    DAC notification: Task 'Load Row into Run Table: SIL_InsertRowInRunTable' failed, and so the other tasks for this step will not be executed
    =========================================================================
    =========================================================================

    It seems the 'SILOS' folder is missing. Can anyone please help with this?

    3. Are there integration steps that we must follow to integrate OBIEE with the EBS R12 suite?

    Thanks in advance,
    Soumya

    Hi Soumya,

    Yes, you will need to follow all the steps in the PDF to set up HR Analytics for EBS R12. As for the error message that the specified task name, workflow or folder does not exist: this problem is due to a mismatch of the folder name between the DAC client and Informatica. The folder name registered in the DAC client must match the Informatica folder into which you restored the Oracle BI Analytics repository. If it does not, you must restore the Informatica repository accordingly.

    If in doubt, please post back.

    Regards,

    Norman John

  • iPhone 7 Plus call issues

    I recently bought the new iPhone 7 Plus. Not only am I having problems with the LTE service, bouncing between 3G and No Service, I am now also having problems with call connectivity. When I try to make an outgoing call, I get an automated response from my carrier (Verizon): "This is Verizon, we are unable to complete your call at this time, please turn your phone off and back on and try again."

    It is a known issue with the iPhone 7 and Verizon. Apple and Verizon are supposedly looking into it. It has been a problem since the phone was released. Do a Google search, or search here on the forums, for a temporary workaround, but not a cure. Basically, switch LTE from Voice & Data to Data Only in Settings.

  • iPhone 6 Plus hang/freeze issues

    My iPhone 6 Plus keeps hanging (freezing); for the past 6 months I have had to reset the whole phone, sometimes up to 2 times, just to get it to work again. I have been a loyal customer (along with my entire family) since the gen 1 iPhone, plus iPads and MacBooks. I am really disappointed with the quality now. I did not bring it to the service center on the first day of the problem because I thought it was just a few bugs and was sure that Apple would fix it, but after many iOS updates so far, the problem still persists.

    The iPhone used to be the best quality I have known; I hope Apple can fix this problem before it finally loses its most loyal customers.

    Hello BennyFoo,

    Thank you for bringing your iPhone freeze questions to the Apple Support Communities. I understand how important it is to have a reliable phone. I am happy to look into this with you.

    The first thing we want to do is make sure that all of the data on your device is safe. You can perform a backup to iCloud or iTunes using this article: Back up your iPhone, iPad, and iPod touch.

    Once your data is safe, try turning your device off and then back on. Sometimes that is all that is needed for the connections to reset and everything to work properly again. This Apple help article will show you how: Restart your iPhone, iPad, or iPod touch. If you are still having issues, then try a force restart. It is designed for when the iPhone does not respond, which looks to be exactly the symptoms you are experiencing right now.

    If you still experience the same problems after the restart, please try restoring your device from a backup using this help article: Restore your iPhone, iPad, or iPod touch from a backup. If you don't have a current backup, please try this help article: If you can't update or restore your iPhone, iPad, or iPod touch. Pay close attention when you get to step 4, which says "When you see the option to Restore or Update, choose Update. iTunes will try to reinstall iOS without erasing your data," to get iOS reinstalled. Then test whether the issue still occurs.

    Thanks again and have a great rest of your day.
