Help with a query using MINUS

Hello Experts

I can't get the record_sequence selected in the output. Please see the desired output below.
Please help me solve this problem.

This is the version of Oracle I'm working on:

Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

Thank you

RB

WITH table1 AS
(
SELECT '28' exam_cd1, '29' exam_cd2, '10' exam_cd3, '111' cand_id FROM dual UNION ALL
SELECT '21' exam_cd1, '39' exam_cd2, '20' exam_cd3, '112' cand_id FROM dual UNION ALL
SELECT '22' exam_cd1, '49' exam_cd2, '30' exam_cd3, '113' cand_id FROM dual UNION ALL
SELECT '23' exam_cd1, '59' exam_cd2, '40' exam_cd3, '114' cand_id FROM dual UNION ALL
SELECT '24' exam_cd1, '69' exam_cd2, '50' exam_cd3, '115' cand_id FROM dual
)
, table2 AS
(
SELECT '28' exam_cd, '111' candid, 1 record_seq FROM dual UNION ALL
SELECT '30' exam_cd, '113' candid, 2 record_seq FROM dual UNION ALL
SELECT '94' exam_cd, '111' candid, 3 record_seq FROM dual UNION ALL
SELECT '69' exam_cd, '115' candid, 4 record_seq FROM dual
)
SELECT exam_cd, candid FROM table2
MINUS
SELECT MAX(CASE l WHEN 1 THEN exam_cd1 WHEN 2 THEN exam_cd2 ELSE exam_cd3 END) exam_code,
       cand_id
FROM   table1,
       (SELECT LEVEL l FROM dual CONNECT BY LEVEL <= 3)
GROUP BY cand_id, l;

The desired output is:

CAND_ID  EXAM_CD  RECORD_SEQ
111      94       3

Hello

Rb2000rb65 wrote:
The solution doesn't have to use MINUS; as long as I get my results using the latest features, it is fine with me.

Good idea!
MINUS is not the best tool for this task. The query you posted gets the exam_cd and the candid you want, but you can't get the record_seq, because there is nothing like record_seq in table1.

You can do it this way:

SELECT     *
FROM     table2     m
WHERE     NOT EXISTS (
                    SELECT  1
                    FROM    table1
                    WHERE   m.candid   = cand_id
                    AND     m.exam_cd  IN ( exam_cd1
                                          , exam_cd2
                                          , exam_cd3
                                          )
                   )
;

I suppose you could use MINUS, like this:

SELECT     *
FROM     table2
WHERE     (exam_cd, candid)
     IN (
            SELECT  ...  -- The MINUS query you posted goes here
        )
;

but it is unnecessarily complicated.
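
Spelled out, that combined version would look roughly like the following. This is only a sketch that plugs the (reconstructed) MINUS query from the original post into the IN subquery; the NOT EXISTS version above remains the simpler choice.

SELECT     *
FROM     table2
WHERE     (exam_cd, candid)
     IN (
            SELECT  exam_cd, candid
            FROM    table2
            MINUS
            SELECT  CASE l WHEN 1 THEN exam_cd1
                           WHEN 2 THEN exam_cd2
                           ELSE        exam_cd3
                    END,
                    cand_id
            FROM    table1,
                    (SELECT LEVEL l FROM dual CONNECT BY LEVEL <= 3)
        )
;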

Tags: Database

Similar Questions

  • Help with a query using SUM() OVER (PARTITION BY)

    I must be missing something small here. I need to calculate an ending inventory.
    The formula is the following:


    For store 501, ending inventory =
        closing inventory                    7292.19
        - closing inventory for supplies     -  30.64
        - closing inventory for buns         - 1002.34
                                             ---------
                                              6259.21
    I can get the closing inventory with analytics, but I cannot get the ending inventory.

    My dollar variance is also inflated. It should be 780.55 for store 501.

            
      CREATE TABLE "SUBQUERY_CATEGORIES" 
       (     "STOREID" NUMBER NOT NULL ENABLE, 
         "WEEK_NBR" NUMBER, 
         "DESCRIPTION" VARCHAR2(100 BYTE) NOT NULL ENABLE, 
         "OPEN_INVENTORY" NUMBER, 
         "CLOSING_INVENTORY" NUMBER, 
         "TRANSFER_IN_COST" NUMBER, 
         "TRANSFER_OUT_COST" NUMBER, 
         "DELV_COST" NUMBER, 
         "PREV_DELV_COST" NUMBER, 
         "TOTAL_COST" NUMBER, 
         "DOLLAR_VARIANCE" NUMBER
       ) ;
     
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Other Foods-I',880.04,837.16,17.32,0,491.92,880.044,837.158,35124.92);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Shortening-I',199.7,200.32,0,0,99.85,199.7,200.324,390.45);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Meat-I',154.69,464.06,168.75,42.188,1406.25,154.688,464.063,1239.85);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Bacon-I',74.99,62.63,19.405,0,154.16,74.992,62.628,1239.85);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Repairs & Maint',0,0,0,0,195,0,0,780.55);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Supplies',25.92,30.64,0,0,139.43,25.923,30.637,37466.58);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Drinks-I',585.06,750.36,0,0,715.87,585.058,750.358,8678.93);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Chili Ingridients-I',177.99,214.47,5.918,5.918,302.88,177.995,214.466,4683.32);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Meat-I',540,264.38,14.063,28.125,1181.25,540,264.375,780.89);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Paper-I',955.71,839.86,0,15.308,600.54,955.71,839.859,19131.9);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Supplies',11.78,9.43,0,0,158.85,11.783,9.427,17570.11);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Produce-I',180.5,98.64,0,0,206,180.498,98.638,3904.47);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Condiments-I',170.14,153.46,8.668,0,164.86,170.14,153.456,6819.16);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Repairs & Maint',0,0,0,0,500,0,0,390.45);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Fries-I',78.22,120.33,54.15,18.05,631.75,78.217,120.333,619.92);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Paper-I',1113.09,1093.63,50.884,14.07,846.22,1113.089,1093.633,43711.01);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Cheese-I',63.78,58.7,0,0,197.07,63.783,58.704,1171.34);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Produce-I',201.56,304.85,0,0,554.85,201.56,304.847,7805.54);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Buns-I',1064.44,793.73,0,0,191.36,1064.44,793.73,780.89);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Bacon-I',251.95,155.9,0,0,115.62,251.95,155.902,780.89);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Paper-I',806.87,871.74,12.113,8.448,674.56,806.871,871.741,30376.25);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Chicken-I',561.16,570.93,94.457,0,1568.81,561.156,570.929,5463.88);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Chicken-I',285.86,534.67,73.007,35.97,1402.86,285.858,534.67,4339.46);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Drinks-I',1061.6,1040.97,0,0,584.59,1061.597,1040.971,5466.26);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Breakfast-I',437.9,376.44,0,0,272.42,437.904,376.438,12488.86);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Condiments-I',173.67,159.72,0,3.251,187.55,173.671,159.721,4294.92);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Chili Ingridients-I',93.59,149.49,2.445,0,253.85,93.594,149.489,3719.54);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Buns-I',873.54,914.48,0,0,441.6,873.54,914.48,1239.85);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Fries-I',39.11,126.35,36.1,36.1,884.45,39.108,126.35,780.55);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Supplies',4.71,4.71,0,0,195.53,4.713,4.713,27896.56);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Other Foods-I',615.63,627.42,1.701,0,374.4,615.63,627.419,26656.71);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Chicken-I',702.5,471.66,0,65.64,1120.39,702.502,471.664,2733.13);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Dairy-I',176.9,128.3,0,0,332.14,176.904,128.304,5463.88);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Dairy-I',171.78,85.3,0,0,109.38,171.783,85.297,2342.68);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Dairy-I',122.71,89.86,0,0,288.98,122.706,89.856,3719.54);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Cheese-I',140.03,109.46,0,0,131.38,140.028,109.461,1859.77);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Produce-I',151.15,169.85,0,3.44,270.85,151.148,169.852,6199.23);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Shortening-I',249.63,259.61,0,0,259.61,249.625,259.61,780.55);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Bacon-I',116.58,115.62,0,0,308.32,116.581,115.62,1561.11);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Repairs & Maint',0,0,0,0,200,0,0,619.92);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Cheese-I',130.9,174.52,0,0,328.45,130.897,174.516,2341.66);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Condiments-I',225.65,207.39,0,0,247.81,225.645,207.394,8586.09);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Chili Ingridients-I',159.45,109.79,5.918,0,159.6,159.45,109.788,2342.68);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Meat-I',752.34,696.09,0,28.125,1800,752.344,696.094,1561.11);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Buns-I',1090.2,1002.34,0,0,524.4,1090.2,1002.34,1561.11);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Fries-I',147.41,88.75,0,0,595.65,147.408,88.746,390.45);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (27,1,'Other Foods-I',658.37,645.92,0,1.701,373.34,658.375,645.925,16789.22);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (501,1,'Drinks-I',1290.03,1153.85,0,0,689.84,1290.032,1153.848,12488.86);
    Insert into SUBQUERY_CATEGORIES (STOREID,WEEK_NBR,DESCRIPTION,OPEN_INVENTORY,CLOSING_INVENTORY,TRANSFER_IN_COST,TRANSFER_OUT_COST,DELV_COST,PREV_DELV_COST,TOTAL_COST,DOLLAR_VARIANCE) values (25,1,'Shortening-I',159.76,179.73,0,0,199.7,159.76,179.73,619.92);
    
    SET DEFINE OFF 
    
    with categorycosts as 
     ( 
             SELECT             storeid 
                                 ,  week_nbr
                                 ,  UPPER(TRIM(description))  description                
                                 ,  NVL(prev_delv_cost + transfer_in_cost + delv_cost - transfer_out_cost - total_cost, 0) AS  cost 
                                 ,  open_inventory 
                                 ,  dollar_variance                         
                                 ,  sum(closing_inventory ) over (partition by storeid, week_nbr )  closing_inventory                   
                           --    ,  closing_inventory
                              from  subquery_categories
                              where storeid = 501
     )
      ,  pivoted_cat_costs AS
     (SELECT  storeid
                 , week_nbr
                , MAX(DECODE(closing_inventory, 0, 0, closing_inventory))  -  NVL(MAX(DECODE(UPPER(TRIM(description)), 'SUPPLIES', cost)), 0)  -       NVL(MAX(DECODE(UPPER(TRIM(description)), 'BUNS-I', cost)), 0)      as   closing_inventory
                , MAX(DECODE(dollar_variance, 0, 0, dollar_variance))                as   dollar_variance
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'SUPPLIES', cost)), 0)                 supplies
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'REPAIRS & MAINT', cost)), 0)                     repairs_and_maint        
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'BUNS-I', cost)), 0)             as buns            
          FROM   categorycosts
          GROUP BY storeid,  week_nbr
     )
    select * from pivoted_cat_costs;
    -- == So I separated it out and I still get an incorrect result:
          SELECT                   storeid   
                                 ,  week_nbr
                                 ,  UPPER(TRIM(description))  description                
                                 ,  NVL(prev_delv_cost + transfer_in_cost + delv_cost - transfer_out_cost - total_cost, 0) AS  cost 
                                 ,     open_inventory 
                                 ,    dollar_variance                         
                             --  ,   sum(closing_inventory ) over (partition by storeid, week_nbr )  closing_inventory                   
                                 ,             closing_inventory
                              from  subquery_categories;
     
    The results should be:

        Store 501    ending inventory = 6259.21    buns s/b 1002.34    inv variance 780.55
        Store 27     ending inventory = 4220       buns s/b  793.73    inv variance 390.45
        Store 25     ending inventory = 4283       buns s/b  914.48    inv variance 619.92

    There is an anomaly in there somewhere, because the correct variance can be produced, but the inv variance my query returns is inflated.
    Can someone tell me what I am doing wrong?

    Edited by: TheHTMLDJ on December 9, 2009 06:42
    Added SET DEFINE OFF and removed the schema name from subquery_categories

    Well, I asked for the logic as a specification, not necessarily a query. :)
    Here's the query that gets your desired ending inventory.

    with categorycosts as
     (
             SELECT             storeid
                                 ,  week_nbr
                                 ,  UPPER(TRIM(description))  description
                                 ,  NVL(prev_delv_cost + transfer_in_cost + delv_cost - transfer_out_cost - total_cost, 0) AS  cost
                                 ,  open_inventory
                                 ,  dollar_variance
                                 ,  sum(closing_inventory ) over (partition by storeid, week_nbr )  closing_inventory
                                 ,  DECODE(UPPER(TRIM(description)), 'SUPPLIES', closing_inventory) supp_cls_inv
                                 ,  DECODE(UPPER(TRIM(description)), 'BUNS-I', closing_inventory) bun_cls_inv
                              from  subquery_categories
                              where storeid = 501
     )
      ,  pivoted_cat_costs AS
     (SELECT  storeid
                 , week_nbr
                , MAX(DECODE(closing_inventory, 0, 0, closing_inventory))  -  NVL(MAX(supp_cls_inv), 0)  -       NVL(MAX(bun_cls_inv), 0)      as   closing_inventory
                , MAX(DECODE(dollar_variance, 0, 0, dollar_variance))                as   dollar_variance
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'SUPPLIES', cost)), 0)                 supplies
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'REPAIRS & MAINT', cost)), 0)                     repairs_and_maint
               ,    NVL(MAX(DECODE(UPPER(TRIM(description)), 'BUNS-I', cost)), 0)             as buns
          FROM   categorycosts
          GROUP BY storeid,  week_nbr
     )
    select * from pivoted_cat_costs;
    

    I still don't know the logic (or specification) needed to derive your expected dollar variance.
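
    As a side note: if only the per-store ending inventory is needed (without the pivoted cost columns), conditional aggregation gets there in one pass. This is just a minimal sketch against the same subquery_categories data; for store 501 it returns the expected 6259.21:

    SELECT   storeid
           , week_nbr
           , SUM(closing_inventory)
             - SUM(CASE WHEN UPPER(TRIM(description)) IN ('SUPPLIES', 'BUNS-I')
                        THEN closing_inventory
                        ELSE 0
                   END)  AS ending_inventory
    FROM     subquery_categories
    GROUP BY storeid, week_nbr;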

  • Help with a query using CASE and COUNT

    Hello!

    I'm having problems with a small part of the query below, which ultimately calculates the invoice price.

    (count (distinct spm97.sample_id) * 36.1) as "PROFILE",
    case count (distinct spm33.sample_id)
        when (count (distinct spm97.sample_id) = 0) then (count (distinct spm33.sample_id) * 26)
        else (count (distinct spm33.sample_id) * 4.75)
    end as "CC",

    The first line works - 0 or 36.10 is returned depending on whether or not a profile has been ordered for the sample.

    The cost of CC is 4.75 if a profile was also ordered; if one wasn't, then the cost is 26. The CASE statement is supposed to check this and return the correct amount, but I can't make it work... I get an error message saying that a closing parenthesis is missing somewhere in the middle of line 3, if that helps.

    Any advice would be much appreciated! Of course, this isn't the entire query - the joins and everything else are working fine, it's just this little section with the COUNT that I can't get right.

    Thank you

    JO

    The piece of code below seems incorrect to me:

        CASE COUNT(DISTINCT spm33.sample_id)
            WHEN
                (
                    COUNT(DISTINCT spm97.sample_id) = 0
                )
            THEN (COUNT(DISTINCT spm33.sample_id) * 26)
            ELSE (COUNT(DISTINCT spm33.sample_id) * 4.75)
        END AS "CC"
    

    It seems that the requirement here is: if COUNT(DISTINCT spm97.sample_id) = 0, then "CC" should be COUNT(DISTINCT spm33.sample_id) * 26; otherwise it should be COUNT(DISTINCT spm33.sample_id) * 4.75.
    If that is the case, the statement should be:

        CASE COUNT(DISTINCT spm97.sample_id)
            WHEN
                0
            THEN (COUNT(DISTINCT spm33.sample_id) * 26)
            ELSE (COUNT(DISTINCT spm33.sample_id) * 4.75)
        END AS "CC"
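
    The same check can also be written as a searched CASE (CASE WHEN <condition> ...), which avoids mixing the two CASE forms. The fragment below is only a sketch meant to replace the "CC" expression inside the original query:

    CASE
        WHEN COUNT(DISTINCT spm97.sample_id) = 0
        THEN COUNT(DISTINCT spm33.sample_id) * 26
        ELSE COUNT(DISTINCT spm33.sample_id) * 4.75
    END AS "CC"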
    
  • Change / stop a query that is using a bad plan

    I am using 11.2.0.3.  I'm running a script with multiple INSERT INTO ... SELECT statements.  One of the inserts has been running for hours because it is using a bad plan due to stale statistics.   I've now updated the statistics. Is there any way I can make Oracle re-parse this insert, or skip this insert and continue with the other inserts in my script? (I don't want to kill the session; I want the other SQLs to run.)

    Also, for the future, is there a way to make Oracle use dynamic sampling rather than stale statistics?

    I was able to cancel the query from another session using:

    exec DBMS_RESOURCE_MANAGER.SWITCH_CONSUMER_GROUP_FOR_SESS(sid, serial#, 'CANCEL_SQL');
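
    For the dynamic sampling part of the question, one option (not discussed in the thread, and shown here only as a sketch with hypothetical table names) is the DYNAMIC_SAMPLING hint on the statement that suffers from the stale statistics:

    -- target_tab and source_tab are hypothetical names used for illustration only
    INSERT INTO target_tab (id, val)
    SELECT /*+ DYNAMIC_SAMPLING(s 4) */ s.id, s.val
    FROM   source_tab s
    WHERE  s.val IS NOT NULL;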

  • Help with keyframes in the Effect Controls panel

    Hello everyone, I hope that someone can help me.

    I'm having a little trouble trying to move images around the screen in my sequence to create an animation of a cursor moving to another location. The problem is that when I add a keyframe in the Effect Controls panel, the image takes its own path, sometimes going up or down or left or right. I can see that there are handles coming off the image's anchor point, like the pen tool handles in Photoshop, and I think that is what is causing the cursor image to move on its own. These handles curve the path between two keyframes into a smooth arc of movement rather than a simple move to the next location.

    I was wondering, since this happens whenever I add a new keyframe in the Motion effect controls, whether there is something causing it that I can disable. I can move these handles away from the anchor point, but that does not make the movement in my sequence any smoother.

    Any help is really appreciated.

    Set your keyframes to Linear:

    Adobe Premiere Pro Help: Control effect changes using keyframe interpolation

  • Need help with a query using aggregation

    If I have a table, defined as follows:

    CREATE TABLE range_test
    (
      range_id                  NUMBER(20)  NOT NULL,
      grade                     CHAR(1)     NOT NULL,
      lower_bound_of_range      NUMBER(5,2) NOT NULL,
      upper_bound_of_range      NUMBER(5,2) NOT NULL,
      received_date_time_stamp  TIMESTAMP   DEFAULT SYSTIMESTAMP NOT NULL
    );

    And I want to query the table to find the range associated with the last row inserted for each 'grade' (for example 'A', 'B', 'C', etc.). How would I go about this?

    I want something like the following, but I know that it will not work right:

    SELECT
      grade,
      lower_bound_of_range,
      upper_bound_of_range,
      MAX(received_date_time_stamp)
    FROM
      range_test GROUP BY received_date_time_stamp;

    Thanks for your help... I'm frustrated with this one and I think it should be possible without having to use PL/SQL (i.e. SQL aggregate functions or subqueries should work).

    Perhaps something along the lines of...

    SQL> ed
    Wrote file afiedt.buf
    
      1  select deptno, empno, ename, hiredate
      2  from emp
      3* order by deptno, empno
    SQL> /
    
        DEPTNO      EMPNO ENAME      HIREDATE
    ---------- ---------- ---------- --------------------
            10       7782 CLARK      09-JUN-1981 00:00:00
            10       7839 KING       17-NOV-1981 00:00:00
            10       7934 MILLER     23-JAN-1982 00:00:00
            20       7369 SMITH      17-DEC-1980 00:00:00
            20       7566 JONES      02-APR-1981 00:00:00
            20       7788 SCOTT      19-APR-1987 00:00:00
            20       7876 ADAMS      23-MAY-1987 00:00:00
            20       7902 FORD       03-DEC-1981 00:00:00
            30       7499 ALLEN      20-FEB-1981 00:00:00
            30       7521 WARD       22-FEB-1981 00:00:00
            30       7654 MARTIN     28-SEP-1981 00:00:00
            30       7698 BLAKE      01-MAY-1981 00:00:00
            30       7844 TURNER     08-SEP-1981 00:00:00
            30       7900 JAMES      03-DEC-1981 00:00:00
    
    14 rows selected.
    
    SQL> ed
    Wrote file afiedt.buf
    
      1  select deptno, empno, ename, hiredate
      2  from (
      3        select deptno, empno, ename, hiredate
      4              ,row_number() over (partition by deptno order by hiredate desc) as rn
      5        from emp
      6       )
      7  where rn = 1
      8* order by deptno, empno
    SQL> /
    
        DEPTNO      EMPNO ENAME      HIREDATE
    ---------- ---------- ---------- --------------------
            10       7934 MILLER     23-JAN-1982 00:00:00
            20       7876 ADAMS      23-MAY-1987 00:00:00
            30       7900 JAMES      03-DEC-1981 00:00:00
    
    SQL>
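
    Applied to the range_test table from the question, the same ROW_NUMBER pattern might look like this (only a sketch, using the column names from the CREATE TABLE above):

    SELECT grade, lower_bound_of_range, upper_bound_of_range, received_date_time_stamp
    FROM  (
            SELECT grade
                 , lower_bound_of_range
                 , upper_bound_of_range
                 , received_date_time_stamp
                 , ROW_NUMBER() OVER (PARTITION BY grade
                                      ORDER BY received_date_time_stamp DESC) AS rn
            FROM   range_test
          )
    WHERE  rn = 1;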
    
  • Need help with a query using the AVG function

    First post here.
    I'm a student taking a SQL class and I'm stuck on a query.
    I think I'm close to getting it, but I can't quite get all the way there.

    Three tables are involved in this problem. Here is a list of the tables and the columns involved:
    orders table:
      order#
      shipstate
    
    orderitems table:
      order#
      isbn
      quantity         (How many copies of book purchased on that order)
    
    books table:
      isbn
      retail    (retail price of book)
    Problem:
    I want to get an average of "total amount" by shipstate.

    For example, in these tables there are 8 records for the state of Florida.
    However, there are only 5 unique order#s for this state.
    The sum of retail * quantity for these 8 records (or 5 orders) is $345.10.
    Now, to get my average, $345.10 should be divided by 5 (the number of unique orders).
    The following query divides $345.10 by 8 (the number of records).

    How do I make this query divide by the number of unique order#s rather than the number of records?
    SELECT shipstate, AVG(quantity * retail)
    FROM orders JOIN orderitems USING (order#)
    JOIN books USING (isbn)
    GROUP BY shipstate
    HAVING SUM(quantity * retail) =ANY
                                  (SELECT SUM(quantity * retail)
                                    FROM books JOIN orderitems USING (isbn)
                                    JOIN orders USING (order#)
                                    GROUP BY shipstate) 
    I figure that once I get this part down, I can work out the rest of the problem.
    The end result I need is to find all the individual orders that have a "total amount due" greater than the "average amount due" for that customer's state.

    Any help, suggestions or comments welcome.
    Matt

    Your average takes all 8 records for the shipstate into account, which is not what you want for the expected results.
    With some sample input data it would have been easier, but here's a try:

    SELECT shipstate, sum(quantity * retail)/count(distinct order#)
    FROM orders JOIN orderitems USING (order#)
    JOIN books USING (isbn)
    GROUP BY shipstate;
    

    Nicolas.

    Removed the alias.
    Edited by: N. Gasparotto on October 3, 2008 19:28
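
    For the second part of the question (finding the individual orders whose total is greater than the average for their state), one possible approach, not from the thread and shown only as a sketch against the same three tables, is to compare each order total with an analytic average of the grouped totals:

    SELECT order#, shipstate, order_total
    FROM  (
            SELECT order#
                 , shipstate
                 , SUM(quantity * retail)                AS order_total
                 , AVG(SUM(quantity * retail))
                       OVER (PARTITION BY shipstate)     AS state_avg
            FROM   orders
            JOIN   orderitems USING (order#)
            JOIN   books      USING (isbn)
            GROUP BY order#, shipstate
          )
    WHERE  order_total > state_avg;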

  • Help with a SELECT query using DUAL

    I've got this manual block where I want to fill a certain field with the student number, using the following in a When-New-Form-Instance trigger for the block:
    select to_char(sysdate, 'YY') || :sequence.seq_stud.next_val into :BLK_OLD.student_number from DUAL;

    and I got the error "DUAL must be declared".

    * I forgot the error code... I'll post it later when I get the laptop.

    Make sure that you are connected to the database, and try sys.dual;

    also ask your DBA whether there is a public synonym for the table DUAL.
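
    A quick way to test both suggestions from a SQL prompt (only a sketch; it assumes the seq_stud sequence from the post exists and that you are connected to the database):

    select to_char(sysdate, 'YY') || to_char(seq_stud.nextval) as student_number
    from   sys.dual;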

  • How to filter data using the (-) minus symbol

    Hello
    We use Oracle EBS as the OLTP source. Sales data is stored in the PO_HEADERS_ALL and PO_LINES_ALL tables, which also contain rejected/cancelled order data.
    Rejected/cancelled data is stored in the same tables. We can identify rejected/cancelled rows because the quantity is shown with a (-) minus symbol (example: Qty: -30), and the Amount column is also shown with a (-) minus symbol. So, how do we filter out the data where the quantity is shown with a minus symbol in OBIEE?

    Kind regards.
    CHR

    Jay wrote:
    If it is defined as numeric, use the CAST function to convert it to CHAR, then use LIKE in the filter condition formula to check for the minus sign to be filtered out.

    If you want to filter all negative values, then use the filter: measure < 0 in the filter condition.

    Hope this will help you.

    Thank you
    Jay.

    If the column were purely numeric, why cast it to CHAR? The OP can simply do what Robert Angel suggested. My suggestion was for the case where there are other non-numeric values in the column, which could be the reason the column is CHAR; in that case using the LIKE filter operand would solve the problem.

    The first part of your suggestion is not necessary if the column is a numeric data type, and the second part is just a repetition of what Robert said.

  • Help with a query for a group total

    Hi Gurus,

    please help me.

    I have this query:

    select
        hp.party_name as customer,
        rct.trx_number as invoice,
        rct.trx_date,
        rctl.attribute3 as faktur_pajak,
        rctl.attribute4 as tin,
        rctl.line_number as no,
        rctl.description,
        gcc.segment1 || '.' || gcc.segment2 || '.' || gcc.segment3 || '.' || gcc.segment4 || '.' || gcc.segment5 || '.' || gcc.segment6 as coa,
        rctl.extended_amount as dpp,
        rctl.extended_amount * avt.tax_rate / 100 / rctl.extended_amount * 100 as tax_rate,
        rctl.extended_amount * avt.tax_rate / 100 as npp
    from
        ra_customer_trx_lines_all rctl,
        ra_customer_trx_all rct,
        hz_cust_accounts hca,
        hz_parties hp,
        ar_vat_tax_all_b avt,
        gl_code_combinations gcc,
        ar_memo_lines_all_b amb
    where 1 = 1
    and rct.org_id = '122'
    and rct.trx_number = 'ExpressMay14.13143'
    and rctl.memo_line_id = amb.memo_line_id
    and amb.gl_id_rev = gcc.code_combination_id
    and hca.party_id = hp.party_id
    and rct.customer_trx_id = rctl.customer_trx_id
    and rct.sold_to_customer_id = hca.cust_account_id
    and avt.vat_tax_id = rctl.vat_tax_id
    and rctl.description is not null
    and rct.trx_date between to_date('05/01/2014','MM/DD/YYYY') and to_date('05/07/2014','MM/DD/YYYY')
    order by trx_date

    and the result is:

    CUSTOMER       INVOICE             TRX_DATE  FAKTUR_PAJAK         TIN                   NO  DESCRIPTION  COA                         DPP      TAX_RATE  NPP
    EVAGENITA, UD  ExpressMay14.13143  2-May-14  040.001-14.81637149  07.358.521.8-311.000  1   BKS-Express  21.W10.544300.0000.000.000  2664750  1         26648
    EVAGENITA, UD  ExpressMay14.13143  2-May-14  040.001-14.81637149  07.358.521.8-311.000  2   Stamp Duty   21.000.807220.0000.000.000  6000     0         0

    Actually, I want to display it as:

    CUSTOMER       INVOICE             TRX_DATE  FAKTUR_PAJAK         TIN                   NO  DESCRIPTION  COA                         DPP      TAX_RATE  NPP    TOTAL
    EVAGENITA, UD  ExpressMay14.13143  2-May-14  040.001-14.81637149  07.358.521.8-311.000  1   BKS-Express  21.W10.544300.0000.000.000  2664750  1         26648
                                                                                            2   Stamp Duty   21.000.807220.0000.000.000  6000     0         0      2697398
    Consider the following, where

    with t
    as
    (
    ...
    )

    stands for your actual query. You can do this:
    
    SQL> with t
      2  as
      3  (
      4  select 'EVAGENITA, UD' customer
      5       , 'ExpressMay14.13143' invoice
      6       , to_date('2-May-14','dd-mon-rr') trx_date
      7       , '040.001-14.81637149' faktur_pajak
      8       , '07.358.521.8-311.000' npwp
      9       , 1 no
     10       , 'BKS-Express' description
     11       , '21.W10.544300.0000.000.000' coa
     12       , 2664750 dpp
     13       , 1 tax_rate
     14       , 26648 pnn
     15    from dual
     16  union all
     17  select 'EVAGENITA, UD' customer
     18       , 'ExpressMay14.13143' invoice
     19       , to_date('2-May-14','dd-mon-rr') trx_date
     20       , '040.001-14.81637149' faktur_pajak
     21       , '07.358.521.8-311.000' npwp
     22       , 2 no
     23       , 'Stamp Duty' description
     24       , '21.000.807220.0000.000.000' coa
     25       , 6000 dpp
     26       , 0 tax_rate
     27       , 0 pnn
     28    from dual
     29  )
     30  select decode(rno, 1, customer) customer
     31       , decode(rno, 1, invoice) invoice
     32       , decode(rno, 1, trx_date) trx_date
     33       , decode(rno, 1, faktur_pajak) faktur_pajak
     34       , decode(rno, 1, npwp) npwp
     35       , no
     36       , description
     37       , coa
     38       , dpp
     39       , tax_rate
     40       , pnn
     41       , case when
     42               rno = cnt then
     43                 sum
     44                 (
     45                    dpp+pnn
     46                 )
     47                 over
     48                 (
     49                   partition
     50                          by customer
     51                           , invoice
     52                           , trx_date
     53                           , faktur_pajak
     54                           , npwp
     55                       order
     56                          by no
     57                 )
     58         end total
     59    from (
     60            select t.*
     61                 , row_number() over
     62                                (
     63                                   partition
     64                                          by customer
     65                                           , invoice
     66                                           , trx_date
     67                                           , faktur_pajak
     68                                           , npwp
     69                                       order
     70                                          by no
     71                                ) rno
     72                 , count(*)     over
     73                                (
     74                                   partition
     75                                          by customer
     76                                           , invoice
     77                                           , trx_date
     78                                           , faktur_pajak
     79                                           , npwp
     80                                ) cnt
     81              from t
     82          );
    
    CUSTOMER      INVOICE            TRX_DATE  FAKTUR_PAJAK        NPWP                         NO DESCRIPTION COA                               DPP   TAX_RATE        PNN      TOTAL
    ------------- ------------------ --------- ------------------- -------------------- ---------- ----------- -------------------------- ---------- ---------- ---------- ----------
    EVAGENITA, UD ExpressMay14.13143 02-MAY-14 040.001-14.81637149 07.358.521.8-311.000          1 BKS-Express 21.W10.544300.0000.000.000    2664750          1      26648
                                                                                                 2 Stamp Duty  21.000.807220.0000.000.000       6000          0          0    2697398
    
    SQL>
    
  • Update query using CASE is updating NULLs

    I need to update a column according to the conditions I've used below, but the update query I used also sets non-matching rows to NULL. How can I stop this and keep the old values when no match is found in the CASE?
    create table sample (name varchar2(10),eno number(10),salary number(10));
    insert into sample (name,eno,salary) values ('emp1',1,100);
    insert into sample (name,eno,salary) values ('emp2',2,200);
    insert into sample (name,eno,salary) values ('emp3',3,300);
    select * from sample;
        
    update sample 
    set salary = 
    case when salary = 100 then 10000 else 
    case when salary = 150 then 15000 else 
    case when salary = 200 then 20000 end end end
    where name is not null;
           
    Actual o/p:
           emp1     1     10000
           emp2     2     20000
           emp3     3     
    
    Required o/p:
            emp1     1     10000
           emp2     2     20000
           emp3     3      300

    Hello

    The WHERE clause controls which rows get updated.
    If you do not have a WHERE clause, then the UPDATE changes all rows in the table.

    update  sample
    set     salary = case
                         when salary = 100 then 10000
                         when salary = 150 then 15000
                         when salary = 200 then 20000
                     end
    where   salary IN (100, 150, 200)
    ;
    

    Note that you do not need to nest CASE expressions here (or almost anywhere else). If the 'salary = 100' condition is TRUE, then its corresponding THEN value is returned, and the remaining conditions are not evaluated. Only if the first condition is not TRUE will the 'salary = 150' condition be evaluated. (The conditions are mutually exclusive in this example anyway, so it does not matter.)
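
    Another common variant (just a sketch) is to keep an ELSE branch that returns the current value, so non-matching rows keep their old salary even without a WHERE clause. Note that every row is still physically updated in this version, which the WHERE-clause version above avoids:

    update  sample
    set     salary = case
                         when salary = 100 then 10000
                         when salary = 150 then 15000
                         when salary = 200 then 20000
                         else salary
                     end;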

    Published by: Frank Kulash, June 5, 2012 13:09

  • Applying cell formatting to only the query values using CFSPREADSHEET

    I'm trying to work out how to correctly apply my custom border formatting to only the query values on an automatically formatted report. Currently I have it hard-coded, but long-term that would not work, because new values will be added to the database and the report's query values would be generated without the border formatting, unless I change the row numbers manually every time.

    Here's the formatting that I use:

    format4.TopBorder = "Thin";
    format4.TopBorderColor = "grey_40_percent";
    format4.BottomBorder = "Thin";
    format4.BottomBorderColor = "grey_40_percent";
    format4.LeftBorder = "Thin";
    format4.LeftBorderColor = "grey_40_percent";
    format4.RightBorder = "Thin";
    format4.RightBorderColor = "grey_40_percent";

    SpreadsheetFormatCellRange(report, format4, 2, 1, 26, 8);

    Have you tried assigning a variable equal to the number of records you pull from the database?  I have done it in several cfspreadsheets and it works, as does defining a counter to count the number of loops through the data.

  • Optimize the query using SUBSTR

    Hi, I wrote the following to get the string 'abc' from input strings such as


    '2010/abc' and
    '2010/inv/abc'

    I have to get the string ('abc') that comes after the last '/'.
    In the input strings there can be one or two '/' characters.

    So I tried the following
    select substr(substr('2010/abc',(instr('2010/abc','/',1,1)+1),length('2010/abc')),
    (instr(substr('2010/abc',(instr('2010/abc','/',1,1)+1),length('2010/abc')),
    '/',1,1)+1),length(substr('2010/abc',(instr('2010/abc','/',1,1)+1),length('2010/abc')))) str from dual
    The SELECT query above works even for the string 'inv/2010/abc' (it only handles up to the 2nd '/').

    Could you please simplify the above query so that it also works when there is a 3rd or 4th '/'?


    Thank you

    Hello

    Alternatively, you can use regexp if you want:

    Scott@my10g SQL>l
      1  with data as (
      2  select '2010/abc' str from dual
      3  union all select 'inv/2010/abc' from dual
      4  union all select 'inv/2010/inv/2010/abc' from dual
      5  union all select 'abc' from dual
      6  )
      7* select str, regexp_substr(str,'[^/]+$') sstr from data
    Scott@my10g SQL>/
    
    STR                   SSTR
    --------------------- ----------
    2010/abc              abc
    inv/2010/abc          abc
    inv/2010/inv/2010/abc abc
    abc                   abc
    

    It takes everything from the last '/' up to the end of the string (or from the start, if there is no '/' in the source string).
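
    A non-regexp alternative (just a sketch) is INSTR with a negative position, which searches backwards and so finds the last '/'; when there is no '/', INSTR returns 0 and SUBSTR simply starts at position 1:

    select str
         , substr(str, instr(str, '/', -1) + 1) as sstr
    from  (
            select '2010/abc' str from dual union all
            select 'inv/2010/abc' from dual union all
            select 'abc' from dual
          );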

  • refine the query using dates

    Hello.

    I currently have a data block which, once queried, brings back a list of records. I'm looking to refine this list of records by specifying a from_date and a to_date so that it retrieves only records >= from_date and <= to_date. The field that must be between these dates is reg_date.

    In my Pre-Query trigger I have:

    :car.reg_date := /* don't know what to put here */

    any help on this would be great, thanks.

    Published by: user13390506 on 02-Sep-2010 07:10

    I suggest you explicitly cast values:

    Declare
      LC$Where  Varchar2(1000);
    Begin
      LC$Where := 'date between TO_DATE(''' || TO_CHAR(:blk.date_start, 'DD.MM.YYYY') || ''',''DD.MM.YYYY'') and TO_DATE(''' || TO_CHAR(:blk.date_end, 'DD.MM.YYYY') || ''',''DD.MM.YYYY'')' ;
      Set_Block_Property( 'BLK', DEFAULT_WHERE, LC$Where ) ;
      Go_Block ('BLK' ) ;
      Execute_query ;
    End;
    
  • Forming a query using XMLForest

    Hello

    I need code that produces the output below. I tried, but I'm getting extra tags.

    Create table patient (pat_mrn varchar2(100));
    Create table encount (pat_mrn varchar2(100), encounter_id varchar2(1000));
    Create table oper    (encounter_id varchar2(1000), comp_name varchar2(1000));

    Insert into patient values ('63280');
    Insert into encount values ('63280', '42');
    Insert into oper    values (42, 'sugar');
    Insert into oper    values (42, 'sbp');
    Insert into oper    values (42, 'dbp');

    CREATE OR REPLACE TYPE COMPONENT AS OBJECT ("ID" VARCHAR2(1000));
    CREATE OR REPLACE TYPE component_list_t AS TABLE OF COMPONENT;
    CREATE OR REPLACE TYPE cm_results_o_t AS OBJECT (RES_LIST component_list_t);

    O/p required:
    <Patient>
      <pat_mrn>63280</pat_mrn>
      <Results>
        <Component><ID>sugar</ID></Component>
        <Component><ID>sbp</ID></Component>
        <Component><ID>dbp</ID></Component>
      </Results>
    </Patient>

    The code I wrote:

    SELECT P.PAT_MRN,
           XMLELEMENT("Patient",
             (XMLELEMENT("pat_mrn", P.pat_mrn)),
             (XMLELEMENT("Results",
                XMLForest(cm_results_o_t(CAST(MULTISET (SELECT O.COMP_NAME AS "ID"
                                                        FROM   oper O
                                                        WHERE  O.ENCOUNTER_ID = E.ENCOUNTER_ID)
                                              AS component_list_t)) AS "Results")))) AS Orderxml
    FROM   PATIENT P
    JOIN   ENCOUNT E
      ON   P.PAT_MRN = E.PAT_MRN
     AND   P.PAT_MRN = '63280'
     AND   E.ENCOUNTER_ID = 42

    As you can clearly see, there are a lot of extra tags... the o/p I get is:

    <Patient>
      <pat_mrn>63280</pat_mrn>
      <Results>
        <Results>
          <RES_LIST>
            <COMPONENT><ID>sugar</ID></COMPONENT>
            <COMPONENT><ID>sbp</ID></COMPONENT>
            <COMPONENT><ID>dbp</ID></COMPONENT>
          </RES_LIST>
        </Results>
      </Results>
    </Patient>

    I'm new to XML, so any help is appreciated.

    Thank you.

    The following is one variant that produces the required result:

    SELECT p.pat_mrn,
           XMLElement("Patient",
             XMLElement("pat_mrn", P.pat_mrn),
             (SELECT XMLAgg(XMLElement("Component",
                             XMLElement("ID",o.comp_name)))
                      FROM oper o
                     WHERE e.encounter_id = o.encounter_id))
      FROM patient p
           INNER JOIN encount e
             ON (P.PAT_MRN = E.PAT_MRN
                 AND P.PAT_MRN = '63280'
                 AND E.ENCOUNTER_ID = 42);
    
