Date-related query

create or replace procedure get_ename (v_txndatetime IN DATE)  -- the procedure name and parameter type were missing from the post; "get_ename" and DATE are placeholders
is
  v_ename varchar2(10);
begin
  select ename
    into v_ename
    from emp
   where txn_datetime between v_txndatetime and v_txndatetime;
end;
/

To this date I want to append these two time-of-day values dynamically. How can I add them? Can you show me an example:


(1) 12:00

(2) 23:59:59

Test_Table desc:

Ename        varchar2(10)
txndatetime  date

Let's see what we can do with what you posted, with a test at the end. BTW: this looks like homework, and if you hand this in your instructor will probably throw it out as cheating.

CREATE OR REPLACE PROCEDURE test(txnDateTime IN emp.hiredate%TYPE) AUTHID DEFINER IS
 empName  emp.ename%TYPE;
 EIString VARCHAR2(256);
 BegDate  DATE := TRUNC(txnDateTime);
 EndDate  DATE := TRUNC(txnDateTime)+(1-(1/86400)); -- fixed from earlier typo
BEGIN
  dbms_output.put_line(TO_CHAR(BegDate));
  dbms_output.put_line(TO_CHAR(EndDate));

  EIString := 'SELECT ename FROM emp WHERE hiredate BETWEEN :B1 AND :B2';
  EXECUTE IMMEDIATE EIString INTO empName USING BegDate, EndDate;

  dbms_output.put_line(empName);
EXCEPTION
  WHEN TOO_MANY_ROWS THEN
    dbms_output.put_line('Too Many Rows because the design is flawed and cannot handle more than one return value');
END test;
/

set serveroutput on

exec test(TO_DATE('17-NOV-1981','DD-MON-YYYY'));

Of course, now that you have the employee's name in empName you need to do something with it, because right now nothing at all happens with it.
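A common refinement worth noting (a sketch, not part of the original reply): rather than computing a 23:59:59 upper bound, use a half-open range, which also catches any sub-second values on the last day:

```sql
-- Sketch assuming the same emp table; :txnDateTime is any value in the target day.
SELECT ename
  FROM emp
 WHERE hiredate >= TRUNC(:txnDateTime)      -- midnight at the start of the day
   AND hiredate <  TRUNC(:txnDateTime) + 1; -- strictly before the next midnight
```

On a plain DATE column the 23:59:59 bound is equivalent, but the half-open form keeps working if the column is ever migrated to TIMESTAMP.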

For anyone who wants to improve this, and it can be improved: read Bryn Llewellyn's white paper on SQL and PL/SQL, available from my site.
www.morganslibrary.org
You will find a link to it on the homepage.

Edited by: damorgan on August 21, 2012 20:45

Tags: Database

Similar Questions

  • CSS: Publishing child TIPs and related spec data

    When you configure CSS in the DataAdminToolkit and set up the publication targets, there is an option to "publish child TIPs", which the configuration guide says controls whether "lower-level item TIPs should be automatically published." I tried this and have not been able to get the TIPs of other specifications published. I also have not found much on the subject in the CSS guides, so I have a few questions:

    - From a trade-spec technical standpoint, are there certain child spec references (packaging, production material, formulation context, etc.) whose TIPs are not checked when publishing?
    - Must the child spec's TIP be at a certain stage of the CSS workflow to publish with the parent?
    - Does each published child specification go out as a separate message, or is it all combined into one message with the trade spec data?

    My second question is about the options for including data from a related specification in the CSS message of another spec. For example, on a trade specification, if I wanted to include the incoming-material BOM of the formula referenced by an output material on the trade spec's related-specs tab, would I need a custom handler to do this, or is there another option to pull that data in automatically?

    Thank you

    I will answer your second question first:

    Technically, it is possible to include some related-specification data without writing any code - you can change the mapping model to do it, much as you modify the print process files (and then you need to change the XSL). However, this approach would probably break the web service that lets you consume the syndication. It can also be difficult to maintain BOM data in the mapping files. If you are syndicating only internally, it may be an easy enough option.

    The other option is to use the export extension point. Basically, each trade spec TIP has a node called CssExtension, which is a catch-all tag that lets you add your own data into it. Custom Section data is syndicated this way. A handler looks in the file config\Extensions\exportExtensions.xml for the spec type and runs all registered handlers to generate the desired output. So you would write your own handler and plug it in there. This keeps the web service contract valid and ensures your changes are not made in the .mll file itself, which simplifies the upgrade process.

    Edited by: Ron M on March 13, 2013 10:49

  • Procedure that dynamically builds a query and inserts the data into a table

    Hi folks,

    I am writing a procedure that dynamically builds a query and inserts the results into the table "dq_summary".
    The procedure executes successfully, but no data is inserted into "dq_summary".

    I think the problem is in the part of the code below
    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    FOR rep IN cur_di_attr
          LOOP
            dbms_output.put_line ('d: ');   
            
            BEGIN
              EXECUTE IMMEDIATE 'SELECT table_name FROM ' || sum_tab || ' WHERE id = ' || rep.attribute_id INTO rep_tab;
              dbms_output.put_line ('rep_tab: '||rep_tab);
              run_query := run_query || ' ' || rep_tab || ' WHERE ' || nvl(wh_cond, '1 = 1');
              EXECUTE IMMEDIATE run_query INTO end_rslt;
            
              EXECUTE IMMEDIATE 'UPDATE dq_summary SET ' || prop || '_' || p_code || ' = ' || end_rslt || ' WHERE attribute_id = ' || rep.attribute_id;
              dbms_output.put_line ('e: ');      
              dbms_output.put_line ('rep_tab: '||rep_tab);
              dbms_output.put_line ('end_rslt: '||end_rslt);
              dbms_output.put_line ('f: '); 
            EXCEPTION
              WHEN no_data_found THEN
                rep_tab := '';
                sum_tab := '';
            END;  
          
          END LOOP;    
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    That block, inside the full procedure below, has to run several times
    create or replace
    PROCEDURE DQ_REPORT_PROC
    AS
      prop                              di_proposition.pro_name%type;
      col_var                           VARCHAR2(100);
      p_code                            dq_parameter.para_code%type;
      sum_tab                           di_proposition.summary_table%type;
      run_query                         dq_parameter.run_query%type;
      wh_cond                           dq_parameter.where_cond%type;
      end_rslt                          VARCHAR2(20);
      rep_tab                           VARCHAR2(50);
      v_error_msg                       VARCHAR2(200);   
      v_error_code                      VARCHAR2(200);  
      v_object_name                     VARCHAR2(50)                          DEFAULT 'DQ_REPORT_PROC';
      v_iss_no                          VARCHAR2(20)                          DEFAULT NULL;
      CURSOR cur_di_prop IS 
        SELECT upper(replace(replace(pro_name, ' '),'-')) pro_name
          FROM di_proposition;
      
      CURSOR cur_di_para IS
        SELECT upper(para_code) para_code, run_query, where_cond
          FROM dq_parameter;
      
      CURSOR cur_di_attr IS 
        SELECT attribute_id
          FROM dq_summary;
    BEGIN
      
      DELETE FROM dq_summary;
    
      INSERT INTO dq_summary (attribute_id, entity_name, attribute_name, data_champ) 
        SELECT a.attribute_id, b.entity_name, a.attribute_name, a.data_champ
          FROM di_attribute_master a, di_entity_master b
         WHERE a.entity_id = b.entity_id;
    
      FOR c_prop IN cur_di_prop
      LOOP
        prop := c_prop.pro_name;
        
        BEGIN
          SELECT distinct SUBSTR(column_name, 1, INSTR(column_name, '_')-1), summary_table
            INTO col_var, sum_tab
            FROM user_tab_cols a, di_proposition b
           WHERE a.table_name = 'DQ_SUMMARY'
             AND upper(replace(replace(b.pro_name, ' '),'-')) = prop
             AND SUBSTR(a.column_name, 1, INSTR(a.column_name, '_')-1) = upper(replace(replace(b.pro_name, ' '),'-'))
             AND upper(b.status) = 'Y';
             
             dbms_output.put_line ('col_var: '||col_var);
             dbms_output.put_line ('sum_tab: '||sum_tab);
             
        EXCEPTION
          WHEN no_data_found THEN
            col_var := '';
            sum_tab := '';
        END;
    
        dbms_output.put_line ('a: ');
    
        FOR para IN cur_di_para
        LOOP
         dbms_output.put_line ('b: ');
          p_code := para.para_code;
          run_query := para.run_query;
          wh_cond := para.where_cond;
          dbms_output.put_line ('c: ');
          FOR rep IN cur_di_attr
          LOOP
            dbms_output.put_line ('d: ');   
            
            BEGIN
              EXECUTE IMMEDIATE 'SELECT table_name FROM ' || sum_tab || ' WHERE id = ' || rep.attribute_id INTO rep_tab;
              dbms_output.put_line ('rep_tab: '||rep_tab);
              run_query := run_query || ' ' || rep_tab || ' WHERE ' || nvl(wh_cond, '1 = 1');
              EXECUTE IMMEDIATE run_query INTO end_rslt;
            
              EXECUTE IMMEDIATE 'UPDATE dq_summary SET ' || prop || '_' || p_code || ' = ' || end_rslt || ' WHERE attribute_id = ' || rep.attribute_id;
              dbms_output.put_line ('e: ');      
              dbms_output.put_line ('rep_tab: '||rep_tab);
              dbms_output.put_line ('end_rslt: '||end_rslt);
              dbms_output.put_line ('f: '); 
            EXCEPTION
              WHEN no_data_found THEN
                rep_tab := '';
                sum_tab := '';
            END;  
          
          END LOOP;    
        END LOOP;
      END LOOP; 
      COMMIT;   
    EXCEPTION
          WHEN OTHERS THEN
             v_error_msg   := SQLERRM;
             v_error_code  := SQLCODE;  
             TRACKER_LOG_EXECEPTION(v_iss_no, v_object_name, CURRENT_TIMESTAMP, v_error_msg, v_error_code);
          COMMIT;        
      
    END DQ_REPORT_PROC;
    Edited by: BluShadow on February 7, 2012 12:04
    Added {noformat}{noformat} tags. Please read {message:id=9360002} and learn to do this yourself in future.

    903830 wrote:

    I am writing a procedure that dynamically builds a query and inserts the results into the table "dq_summary".
    The procedure executes successfully, but no data is inserted into "dq_summary".

    I'm sorry, but there is no nice way to say this. The code is bad. The approach is wrong. It will not perform. It will fragment memory in the shared pool. That fragmentation will leave other sessions unable to parse cursors.

    Not only that: the underlying data model is questionable.

    All of this looks like a perfect candidate as an example of how NOT to design, code, and use Oracle.
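    To make the shared-pool point concrete, here is a hedged sketch (variable names are taken from the posted procedure) contrasting the literal-concatenation style with a bind-variable version:

    ```sql
    -- Literal concatenation: every attribute_id yields a distinct SQL text,
    -- so each loop iteration hard-parses a new cursor into the shared pool.
    EXECUTE IMMEDIATE 'SELECT table_name FROM ' || sum_tab
                   || ' WHERE id = ' || rep.attribute_id
      INTO rep_tab;

    -- Bind variable: one shared cursor per distinct sum_tab,
    -- soft-parsed and reused on every iteration.
    EXECUTE IMMEDIATE 'SELECT table_name FROM ' || sum_tab
                   || ' WHERE id = :1'
      INTO rep_tab
      USING rep.attribute_id;
    ```

    The table name still has to be concatenated (identifiers cannot be bound), but moving the value into a bind cuts the number of distinct cursors from one per attribute to one per table.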

  • Querying database values and passing them to a PDF form


    Hello,
    I have an HTML report that I wrote from a query.
    Now I am assigned to pass the same fields to a PDF form so that the fields in the Adobe form can be filled, or pre-filled.

    can someone help me get started.

    First of all, I would like to know if it is possible.
    Second, what tools do I need to get started?
    Third, how do I do it?

    I'm really confused at this point.

    Can someone give me advice on how to approach this?

    It may be possible using Adobe Acrobat Designer, which is bundled with Acrobat
    7 Professional. It uses XML to create and fill out forms.

    If you had the time, you could also create a form with the fields ready
    and just pass the info to it, so that it generates the PDF file on the
    fly.

  • Need help SQL - query data-

    Hello

    I am trying to query data in my database to do the following...

    'Create a query to show all members of staff who started on a Monday and have worked for more than 86 weeks.'

    I did the following and it works...

    SELECT employee_id, Employee_Name, Hire_Date
    FROM EMPLOYEE
    WHERE months_between(sysdate, Hire_Date) > 12 * 1.52;

    The query returns everyone who has been here for more than 86 weeks.

    What I would like to do is also restrict it to Mondays, but I don't know how to do this.

    Could someone help?

    Try this:

    SELECT employee_id, Employee_Name, Hire_Date
    FROM EMPLOYEE
    WHERE months_between(sysdate, Hire_Date) > 12 * 1.52
    AND to_char(hire_date, 'fmDAY') = 'MONDAY';
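    One caveat on the answer above (a sketch, not from the original reply): comparing against the literal 'MONDAY' depends on the session's date language. A session-independent variant forces the language explicitly:

    ```sql
    SELECT employee_id, Employee_Name, Hire_Date
    FROM EMPLOYEE
    WHERE months_between(sysdate, Hire_Date) > 12 * 1.52
    -- the third TO_CHAR argument pins the day name to English
    AND to_char(hire_date, 'fmDAY', 'NLS_DATE_LANGUAGE=ENGLISH') = 'MONDAY';
    ```

    Alternatively, TRUNC(hire_date, 'IW') = TRUNC(hire_date) is true exactly when hire_date falls on a Monday (ISO weeks start on Monday) and needs no language setting at all.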

    Edited by: P@K on April 14, 2009 18:26

  • Loading XML into the relational Table data

    Hello

    I receive an XML file generated by another (Windows) tool. I am trying to create a Linux shell script that will collect the XML file onto my Linux server and then have Oracle use that file to load the XML data into a relational table. This activity, and the data, will be needed on an ongoing basis.

    I have tried two approaches. First, I loaded the XML document into the database and tried to extract the data directly from the document, but that did not work. Now I want to try reading the data directly from the file on the server via a SELECT; however, I am not getting all the data back. In the SELECT statement below I am querying the data to see what is returned for my tests.

    Create Table ci_results_table (
      transactionID Varchar2(100),  -- transactionID should be the primary key, but inserts errored in testing on NULL values, so the PK was dropped
      message       Varchar2(200),
      PrimaryKey    Varchar2(50),
      Result        XMLType,
      ProcessedDate Date,
      status        Varchar2(50),
      sourceFile    VarChar2(100));

    Select x.*
    from XMLTable(
           'TSPLoadResults/results'
           PASSING xmltype(bfilename('CMDB_DEVADHOCRESULTS_DIR','LoadResults-HP_146.results.xml'), nls_charset_id('AL32UTF8'))
           COLUMNS
             transactionID Varchar2(100) PATH 'TransactionID',
             Result        XMLType       PATH 'Result',
             Message       Varchar2(200) PATH 'Message',
             PrimaryKey    Varchar2(50)  PATH 'PrimaryKey',
             ProcessedDate Date          PATH 'ProcessedDate',
             Status        Varchar2(50)  PATH 'Status',
             SourceFile    VarChar2(100) PATH 'SourceFileName'
         ) x
    ;

    Eventually I will have to build on this to limit the returned data to records where SourceFileName is like '%PA' and insert what is returned into ci_results_table. Attached is an example of the XML results file I am trying to load; it is named "ResultsTransformedtoUnix" because I used dos2unix to convert it to Unix format, which may be good or bad. (The output file I send back must be converted to DOS format so the other application can read it.) The original (pre-conversion) file named in the script is also attached.

    Help, please. Thank you!

    Hello

    I see a few problems in your query.

    (1) The obvious one, explaining why you are not getting the data: there is a typo in the XQuery expression - the element is 'Result', not 'results'.

    (2) ProcessedDate cannot be extracted as a DATE (at least not directly) because it actually represents a timestamp; use the TIMESTAMP WITH TIME ZONE data type and cast back to DATE in the SELECT clause.

    (3) transactionID is an attribute, so it must be accessed with '@' (or the 'attribute::' axis).

    (4) If the file encoding really is ISO-8859-1 as the prolog suggests, then do not use AL32UTF8 but the corresponding character set name: WE8ISO8859P1.

    Here is the working query:

    select x.transactionID
         , x.Message
         , x.Primarykey
         , cast(x.ProcessedDate as date) ProcessDate
         , x.Status
         , x.SourceFile
    from XMLTable(
           '/TSPLoadResults/Result'
           PASSING xmltype(bfilename('XML_DIR','LoadResults-HP_146.results.xml'), nls_charset_id('WE8ISO8859P1'))
           COLUMNS
             transactionID Varchar2(100)            PATH '@transactionID',
             Message       Varchar2(200)            PATH 'Message',
             PrimaryKey    Varchar2(50)             PATH 'PrimaryKey',
             ProcessedDate timestamp with time zone PATH 'ProcessedDate',
             Status        Varchar2(50)             PATH 'Status',
             SourceFile    VarChar2(100)            PATH 'SourceFileName'
         ) x
    ;
    

    Querying directly over the file like this will only perform decently (for large files) on 11.2.0.4 and later.

    On older versions, first load the file into a (temporary) XMLType column using binary XML storage, and SELECT from there.
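    That pre-11.2.0.4 workaround could look roughly like this (a sketch; the staging table name is hypothetical, the directory and file names are the poster's):

    ```sql
    -- Hypothetical staging table using binary XML storage
    CREATE TABLE results_stage OF XMLType
      XMLTYPE STORE AS SECUREFILE BINARY XML;

    INSERT INTO results_stage VALUES (
      xmltype(bfilename('XML_DIR','LoadResults-HP_146.results.xml'),
              nls_charset_id('WE8ISO8859P1')));
    COMMIT;

    -- Then run the same XMLTable query against the stored document
    SELECT x.transactionID, x.Message
    FROM results_stage t,
         XMLTable('/TSPLoadResults/Result'
                  PASSING t.OBJECT_VALUE
                  COLUMNS
                    transactionID Varchar2(100) PATH '@transactionID',
                    Message       Varchar2(200) PATH 'Message') x;
    ```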

    because I used dos2unix to convert it to Unix format, which may be good or bad.

    This conversion should not be necessary.

  • Query data cards

    Hello

    Does anyone have experience querying data cards using the SOAP API? We are doing this via a cloud connector (can we safely use the REST API with E9 and E10 cloud connectors? I have the impression that we cannot).

    We have a data card set implemented which lists appointments. What we want to do, using the API, is pull the information and send it to a scheduling system.

    I am stuck on pulling the data card information.

    To simplify (using C#), I have filled my list of fields:

    var fields = new List<string>();
    fields.Add("DataCardID");

    I can get the DataCard entity type:

    ID = 28
    Type = DataCardSet
    Name = Appointment List

    We have a processed column (internal name Sent1) that we use to indicate whether or not a card has been sent: sent = Y, not sent = N.

    Then, trying to query all processed items, we do this:

    var result = _eloquaInstance.serviceProxy.Query(mergeDataSource, "", fields.Count == 0 ? null : fields.ToArray(), currentPage, 200);

    (currentPage is set to 0 for the first page of data.)

    We get an error back from this query:

    "The creator of this fault did not specify a Reason."

    In the details of the exception I can see:

    [{CloudConnector.EloquaServiceLibrary.EloquaServiceNew.ValidationDetail 2}]

    Any ideas?  What have I done wrong?

    Thank you

    Mark

    Hey Mark,

    I suggest you check this page if you have not already - https://secure.eloqua.com/api/docs/Dynamic/Rest/1.0/Reference.aspx

    You will need to log in to Eloqua via the user interface before you visit the page. The page is a great resource, because it has all of the Eloqua REST 1.0 API calls available for testing.

    Assuming you are stuck at step 3:

    In order to update custom object data, you can POST to /data/customObject/{id}. When posting, you will need to maintain the structure of the GET response. If you do not want to update all data cards in the data card set, you can remove those fieldValue objects.

    Hope this helps,

    SAI

  • Why would this query use UNDO data?

    First, user JOHN launches a query.

    Second, user DANIELLE updates a row that will be included in the query.

    Third, JOHN's query completes.

    Fourth, DANIELLE commits the change.

    Fifth, JOHN runs his query again.


    Will JOHN's first query use the undo data? This book says yes (without any explanation), but I say no.

    Because DANIELLE has not committed her change, the row JOHN is querying still has the value it had when the query started (as seen in the table). Therefore there is no need to read UNDO data (which would degrade performance anyway).

    Where am I wrong?

    Even if DANIELLE did not commit her change, it might still be written to the data file, marked as "not committed". So in this case JOHN would require access to undo to get his read-consistent view.
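    The point can be illustrated with two sessions against the classic SCOTT.EMP table (an illustrative sketch, not from the original thread):

    ```sql
    -- Session 1 (DANIELLE): update without committing.
    -- DBWR may flush this dirty block to the datafile at any time.
    UPDATE emp SET sal = 9999 WHERE empno = 7369;

    -- Session 2 (JOHN): the query must return values as of its start time.
    -- If the block on disk (or in cache) holds DANIELLE's uncommitted change,
    -- Oracle applies undo to reconstruct the older, read-consistent image.
    SELECT sal FROM emp WHERE empno = 7369;
    ```

    Whether the uncommitted change has reached the datafile is invisible to JOHN precisely because undo hides it.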

  • SQL query date with century / year

    We have an Oracle database with about 6 million records. There is a date field called entered_date, with values from 1985 to the present day.
    Most of these records were entered before January 1, 2000.
    If I run a query like
    select count(*) from tablename where entered_date < '1-JAN-00' I get 0.
    If I do
    select count(*) from tablename where entered_date < '31-DEC-99' I get 0.
    BUT IF I DO
    select count(*) from tablename where entered_date > '1-JAN-00' I get 6 million records,
    or
    select count(*) from TREASURY.ctrc where entrydate > '31-DEC-99' I get 6 million records.

    I tried the same queries using 4-digit years but get the same results. It thinks that 2000 is less than 1999.
    How do I fix this?
    Thank you

    Hello

    975204 wrote:
    There are 6 million records in the table;
    about two-thirds have a date prior to January 1, 2000.

    How do you know that? From knowledge of the application you may know that two-thirds of them are supposed to have dates before 2000, but if

    SELECT  COUNT (*)
    FROM      TABLE_NAME
    WHERE      ENTRYDATE < TO_DATE ( '01-JAN-2000'
                           , 'DD-MON-YYYY'
                       )
    ;
    

    returns 0, that is pretty strong evidence that none of them actually do.

    When I look at the dates, they appear as December 31, 86

    Another example of why using 2-digit years is a bad idea.

    I can't actually provide a DUMP of this confidential customer data.

    Seriously, you can't provide DUMP output? You already said it displays as December 31, 86, so even if the fact that one row out of 6 million was entered on December 31, 1986 were such a big secret, it is already out, and you would cause no additional harm by showing the DUMP results.

    I ran the query with the same format as the date, i.e.
    SELECT COUNT(*) FROM TREASURY.CTRC WHERE ENTRYDATE < TO_DATE('31-DEC-99', 'DD-MON-YY')

    but got the same results.
    If the column is defined as a DATE type, does Oracle make a distinction based on the way the data is displayed? That is, does it think December 31, 86 is different from December 31, 1986?

    No, all DATE columns have the same internal format. A DATE can be displayed one way or the other, but it is stored in an internal binary form that always includes the full year, not as text.

    Should I convert all the data to 4-digit years?

    Dates should always be displayed with a 4-digit year.
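    Before converting anything, it is worth checking which four-digit years are actually stored (a sketch; entrydate is the column name used in this thread, and table_name is a stand-in):

```sql
SELECT TO_CHAR(entrydate, 'YYYY') AS yr,  -- real stored year, century included
       COUNT(*)                   AS cnt
FROM   table_name
GROUP  BY TO_CHAR(entrydate, 'YYYY')
ORDER  BY yr;
```

    If the counts cluster in 2080-2099, the 2-digit years were interpreted into the wrong century on insert.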

    Run an update query that says: if year >= 80 and <= 99, add 19 in front of the year; else add 20 in front of the year.

    Let's find out exactly what the problem is first.

    Have you seen Ascheffer's message?^1^ It was posted just a couple of minutes before your last post, so you might not have noticed it. Run it to see what the real 4-digit years are.
    If it shows, for example, that there are a lot of entrydates in the years 2080 to 2099, and if you decide that they should all really be 100 years earlier, use ADD_MONTHS to fix them:

    UPDATE     table_name
    SET     entrydate = ADD_MONTHS ( entrydate
                               , -100 * 12
                          )
    WHERE   entrydate >= TO_DATE ( '01-JAN-2080'
                             , 'DD-MON-YYYY'
                        )
    AND     entrydate <  TO_DATE ( '01-JAN-2100'
                             , 'DD-MON-YYYY'
                        )
    ;
    

    Published by: Frank Kulash, 15 March 2013 18:09

    ^1^ Of course you saw Ascheffer's message; I was still typing the message above when you posted another.

  • querying data from the buffer_cache

    Hello
    I am running an experiment on my own home database to understand how data is retrieved from the buffer cache. I am using Oracle 10g 10.2.0.3.0. I have populated 2 tables with 10 million records each. The table structure is as follows.

    create table bigtable (col1 varchar2(50), col2 varchar2(50));
    create table bigtablechild (col1 varchar2(50), col2 varchar2(50));

    bigtablechild.Col1 is a foreign key to bigtable.col1, and there is no index on the tables.

    When I run this query, it takes about 30 seconds to return the data every time it runs. I can also see hard disk activity in my hard drive monitor utility each time. I thought that once the data was loaded into the buffer cache, the query would not have to scan the hard drive for data. Could someone please help me understand what is happening?

    select b.col2 from bigtable a, bigtablechild b where a.col1 = b.col1 and a.col1 = 'ABC8876';
    -------------------------------------------------------------------------------------------------------
    | Id  | Operation          | Name          | Starts | E-Rows | A-Rows |   A-Time   | Buffers | Reads  |
    -------------------------------------------------------------------------------------------------------
    |*  1 |  HASH JOIN         |               |      1 |      5 |      5 |00:00:23.00 |   93671 |  90663 |
    |*  2 |   TABLE ACCESS FULL| BIGTABLE      |      1 |      1 |      1 |00:00:14.26 |   57799 |  54931 |
    |*  3 |   TABLE ACCESS FULL| BIGTABLECHILD |      1 |      5 |      5 |00:00:08.74 |   35872 |  35732 |
    -------------------------------------------------------------------------------------------------------
    Published by: arizona9952 on January 19, 2013 09:38

    Jonathan Lewis wrote:
    One of the difficulties we face when trying to figure out how table size affects caching on full tablescans is that algorithms exist to protect blocks from being scrapped from the cache by excessive scanning; but if you flush the buffer cache before testing, then there are no blocks that need protection, and the algorithm has a branch that takes advantage of this fact to load more blocks into the cache.

    Oracle is smarter than I realized. Thank you for this explanation; if I ever come up with a reproducible test methodology for it, I'll post it here.

    And in the meantime:

    @arizona9952, I hope you see why it is more than likely that your repeated scans of large tables will always hit disk. The only exception I can see would be if you were to cache the tables in a KEEP pool, as Stefan suggested.
    (By the way, I should warn you that the situation is more complicated in the current version, due in part to changes in parallel processing and serial direct-path I/O.)
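    For reference, a KEEP pool as mentioned above is configured roughly like this (a sketch, not from the thread; it assumes DB_KEEP_CACHE_SIZE can be sized large enough to hold the table, and the 2G figure is illustrative):

```sql
-- Reserve memory for the KEEP buffer pool (instance parameter):
ALTER SYSTEM SET db_keep_cache_size = 2G SCOPE = BOTH;

-- Direct the table's blocks into the KEEP pool:
ALTER TABLE bigtable STORAGE (BUFFER_POOL KEEP);
```

    Blocks in the KEEP pool are exempt from the default pool's aging pressure, which is the only reliable way to keep repeated full scans of a large table in memory.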

  • Pivoting data in a query

    Greetings from Oracle gurus!

    Problem:
    I'm writing a query that performs some data pivoting. I've done this before on smaller data sets and it worked very well. However, now I'm doing it against a table that has more than 1 million records, and I'm looking for the most efficient method. I've seen ways to do it using UNION ALLs in a WITH query, and ways to create columns with the max() and decode() functions. So... what is the best way to pivot the data? I've seen listagg(), but that only comes with Oracle 11+, I think... so I have to bust out some magic SQL here.

    All good things:

    Running Oracle 10.2

    Sample data:
    drop table WO_COMMENTS;
      CREATE TABLE "WO_COMMENTS"
        (
          "ORDER_NO"      varchar2(10),
          "COMMENT_SEQ"   number,
          "COMMENT_TYPE"  VARCHAR2(4) ,
          "COMMENT_TEXT"  VARCHAR2(80)
        );
        
    SET DEFINE OFF;
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',1,'WOMM','Test1');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',2,'WOMM',null);
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',10,'WOMM','The ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',11,'WOMM','big ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',12,'WOMM','blue ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',13,'WOMM','dog ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',14,'WOMM','died ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',20,'WOMM','Yet ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',21,'WOMM','again');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',22,'WOMM',' an ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',23,'WOMM','issue');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',24,'WOMM',null);
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',30,'WOMM','will ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',31,'WOMM','it ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',32,'WOMM','get ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',33,'WOMM','fixed');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',34,'WOMM','?  ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',35,'WOMM','    ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',36,'WOMM','No ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',37,'WOMM','One ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',38,'WOMM','will ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',39,'WOMM','ever ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W00284',40,'WOMM','know!');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',1,'DOCR','Holy ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',2,'DOCR','cow ');
    insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',3,'DOCR','pie! ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',1,'RTMM','This ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',2,'RTMM','is ');
    insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',3,'RTMM','an ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',4,'RTMM','& ');
    Insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',5,'RTMM','!!!  ');
    insert into WO_COMMENTS (ORDER_NO,COMMENT_SEQ,COMMENT_TYPE,COMMENT_TEXT) values ('00W33005',1,'WOMM','Test9');
    commit;
    
    
    SELECT   
          ORDER_NO  as OBJECT_ID       ,
          COMMENT_TYPE as ATTACHMENT_REF     , 
          RTRIM (XMLAGG (xmlelement (E, COMMENT_TEXT || ' ' ) order by comment_seq).extract ('//text()')
          , ',')       as NOTE      
      from WO_COMMENTS a
      where order_no in ('00W00284', '00W33005')
      GROUP BY order_no      ,
        comment_type ;
    What I would like the data to look like:
    OBJECT_ID     ATTACHMENT_REF     NOTE
    00W00284     WOMM     Test1  The  big  blue  dog  died  Yet  again  an  issue  will  it  get  fixed ?        No  One  will  ever  know! 
    00W33005     DOCR     Holy  cow  pie!  
    00W33005     RTMM     This  is  an  &  !!!   
    00W33005     WOMM     Test9 
              
    With the query above, the '&' in the third record appears as '&amp;'. How can I handle special characters in this case?

    I know this data has absolutely nothing to do with XML, but xmlagg happens to do what I need, so it is very easy to implement. I don't know how it affects performance, though. Also, note this is part of a data conversion effort, so some of these columns are expected to come back completely null at the moment. Is this the "most efficient" approach?

    I think I've covered everything that people may need...

    Would be very happy for any help anyone can offer :)

    Edit: New problem with special characters. New data of the sample and the output provided.

    Published by: dvsoukup on August 16, 2012 11:21
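    One common workaround for the escaped '&' (not from this thread; a sketch using DBMS_XMLGEN.CONVERT, whose second argument of 1 means "decode entities") is to unescape the aggregated string after extraction:

```sql
SELECT order_no                        AS object_id,
       comment_type                    AS attachment_ref,
       DBMS_XMLGEN.CONVERT(
         XMLAGG(XMLELEMENT(e, comment_text || ' ')
                ORDER BY comment_seq
               ).EXTRACT('//text()').getStringVal(),
         1)                            AS note   -- 1 = entity decode: '&amp;' -> '&'
FROM   wo_comments
WHERE  order_no IN ('00W00284', '00W33005')
GROUP  BY order_no, comment_type;
```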

    Hello

    Thank you. The simplified problem is much easier to work with. I think it will help you understand the solutions better, too.

    dvsoukup wrote:
    ... What I would like the data to look like:

    OBJECT_ID     ATTACHMENT_REF     NOTE
    00W00284     WOMM     Test1  The  blue  died  again issue will  get  know! ever  will  One  No       ?   fixed it    an  Yet  dog  big
    00W33005     DOCR     Holy  pie!  cow
    00W33005     RTMM     I  !!!   gurus love  SQL
    00W33005     WOMM     Test9           
    

    Be careful what you ask for; you might get it.
    Is this really what you want, or is it an example of the unordered output that you don't want?
    Is this what you really want?

    OBJECT_ID  ATTA NOTE
    ---------- ---- ------------------------------------------------------------
    00W00284   WOMM Test1 The  big  blue  dog  died  Yet  again  an  issue will
                    it  get  fixed ?   No  One  will  ever  know!
    
    00W33005   DOCR Holy  cow  pie!
    00W33005   RTMM I  love  SQL  gurus !!!
    00W33005   WOMM Test9
    

    The NOTE should actually be ordered by comment_seq, but I don't know how to do that.

    To specify that you want the XMLAGG results in order, use XMLAGG(... ORDER BY x), like this:

    select
          ORDER_NO          as OBJECT_ID       ,
          COMMENT_TYPE      as ATTACHMENT_REF     ,
          rtrim ( xmlagg ( xmlelement ( E
                                             , COMMENT_TEXT || ' '
                          )
                     order by  COMMENT_SEQ          -- *****  NEW  *****
                   ).extract ('//text()')
                , ','
             )          as NOTE
    from  WO_COMMENTS      a
    where ORDER_NO        in ('00W00284', '00W33005')
    group by ORDER_NO      ,
                COMMENT_TYPE ;
    

    Have you tried SYS_CONNECT_BY_PATH? To get the right order with CONNECT BY, you must be able to identify the first item in the list and, given any item, the next one. For example, if you have a column called r_num that contains consecutive integers starting with 1, you can use

    ...
    START WITH      r_num  = 1
    CONNECT BY      r_num  = PRIOR r_num + 1
    

    It looks like comment_seq is unique within each group but unpredictable; you can use the analytic function ROW_NUMBER to derive something predictable from it:

    WITH      got_r_num    AS
    (
         SELECT     order_no          AS object_id
         ,      comment_type          AS attachment_ref
         ,      comment_text
         ,     ROW_NUMBER () OVER ( PARTITION BY  order_no
                                   ,                    comment_type
                             ORDER BY        comment_seq
                           )    AS r_num
         FROM     wo_comments
    )
    ...
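    Completing the sketch above (a possible continuation, not from the original post; it assumes r_num from the subquery and uses CONNECT_BY_ISLEAF to keep only the fully built path per group):

```sql
WITH got_r_num AS
(
     SELECT order_no       AS object_id
     ,      comment_type   AS attachment_ref
     ,      comment_text
     ,      ROW_NUMBER () OVER ( PARTITION BY order_no, comment_type
                                 ORDER BY     comment_seq
                               ) AS r_num
     FROM   wo_comments
)
SELECT     object_id
,          attachment_ref
,          LTRIM (SYS_CONNECT_BY_PATH (comment_text, ' '), ' ') AS note
FROM       got_r_num
WHERE      CONNECT_BY_ISLEAF = 1              -- keep only the complete path
START WITH r_num = 1
CONNECT BY r_num          = PRIOR r_num + 1   -- next comment in sequence
       AND object_id      = PRIOR object_id   -- stay within the same group
       AND attachment_ref = PRIOR attachment_ref;
```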
    
  • Querying Dates

    Hi all

    I'm trying to output a few dates from a calendar database. I have a field named StartDate in the table, in mm/dd/yyyy format. I want to output "future dates" for each month, so I wrote this query:

    <cfquery name="January" datasource="dsn">
    SELECT *
    FROM calendar
    WHERE DatePart('m', [StartDate]) = 1 AND StartDate >= #Today# AND Archive = 0
    ORDER BY StartDate ASC, StartTime ASC
    </cfquery>

    FYI: #Today# is in mm/dd/yyyy format.

    This query works, but it also returns past dates for the month. Question 1 is: why doesn't the query above remove dates that have already passed for the month of January? Is there a way to do this?

    Elsewhere on the page, I would like to output all dates before #Today# that have passed for the current calendar year, so I wrote this query:

    <cfquery name="Past" datasource="dsn">
    SELECT *
    FROM calendar
    WHERE DatePart("yyyy", [StartDate]) = #DateFormat(Today, 'yyyy')# AND StartDate < #Today# AND Archive = 0
    ORDER BY StartDate ASC, StartTime ASC
    </cfquery>

    This query returns 0 results, but there are quite a few records that should be counted. Question 2 is: can someone explain why it does not work?

    Thanks in advance for the assistance!

    My guess is a format discrepancy. You should use the cfqueryparam tag or the createODBCDate() function to pass your date into the query.

    <cfquery name="January" datasource="dsn">
    SELECT *
    FROM calendar
    WHERE DatePart('m', [StartDate]) = <cfqueryparam value="1" cfsqltype="cf_sql_integer">
      AND StartDate >= <cfqueryparam value="#Today#" cfsqltype="cf_sql_date">
      AND Archive = <cfqueryparam value="0" cfsqltype="cf_sql_integer">
    ORDER BY StartDate ASC, StartTime ASC
    </cfquery>

    I hope this helps.

  • Date picker query displays "No data found", works from SQL*Plus

    I created 2 date picker items, P2_START_DATE and P2_END_DATE, and set their format to 'DD-MON-YYYY'.

    This is the query that selects the data based on the date picker values:

    select sample_date, reading from meter_data where sample_date between to_date(:P2_START_DATE,'DD-MON-YYYY') and to_date(:P2_END_DATE,'DD-MON-YYYY') order by sample_date;

    P2_START_DATE is '20-JAN-2011', P2_END_DATE is '21-JAN-2011'.

    The query returns "No data found" when it is run in APEX, but when I run this in SQL*Plus on the host I get data:


    select sample_date, reading from meter_data where sample_date between to_date('20-JAN-2011','DD-MON-YYYY') and to_date('21-JAN-2011','DD-MON-YYYY') order by sample_date;

    20-JAN-2011    .39
    20-JAN-2011    .14
    20-JAN-2011    .14
    20-JAN-2011    .18
    21-JAN-2011    .13


    Can someone explain what I am doing wrong? I tried a few different formats, without success.

    TIA

    Please update your handle to something more personal.

    I looked at your application and found several problems. I made a copy of Page 2 (Page 4); it now works.

    Problems found at Page 2:

    select null link, SAMPLE_DATE label, READING value1
    from  "METER_DATA"."METER_READINGS"
    

    had no WHERE clause, so the BETWEEN was missing; on Page 4 I added the WHERE clause from your original message:

    select null link, SAMPLE_DATE label, READING value1
    FROM   meter_readings
    WHERE  sample_date BETWEEN To_date(:P4_START_DATE, 'DD-MON-YYYY') AND To_date(
                               :P4_END_DATE, 'DD-MON-YYYY')
    ORDER  BY sample_date;
    

    The button named "Date range Submit" redirected to Page 3 of the application; I deleted this button and added a Submit button, which submits the page and renders the chart.

    The chart series Maximum Rows was set at 50 rows; I raised the value to 500 in order to get all the dates in your meter_readings table.

    Jeff

  • need help with sql query dates

    Hello

    I have a SQL query that needs to extract some info between two dates. The WHERE clause in this query is:

    WHERE CPD_BUS_UNIT = :ESI_PRM_1
    AND CPD_VOUCHER_DATE >= :P_DATE_FROM
    AND CPD_VOUCHER_DATE < (:P_DATE_TO + 1)

    When I run the query in Toad, I can view the data, but not the execution plan; it gives an error ORA-00932 inconsistent datatypes.
    But when I remove the (+ 1) from :P_DATE_TO, I can see both the execution plan and the data, although the data differs from before.

    Please suggest how to rewrite the query.

    Can you please give it a try?

    WHERE CPD_BUS_UNIT = :ESI_PRM_1
    AND CPD_VOUCHER_DATE >= TO_DATE(:P_DATE_FROM)
    AND CPD_VOUCHER_DATE < (TO_DATE(:P_DATE_TO) + 1)
    

    Regards

  • Querying Dates in Access

    I'm trying to run a query on an Access table that lists a series of past and future events. I would like to pull all the records of past events by comparing the date field in the table to the current time. Is it possible to write this in the query? Something like... WHERE eventDate < #Now()#? Help! :)

    OK, so I just changed something a bit in my code, and it works now! My field is a date data type. In case anyone is interested...
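    For reference, the comparison the poster describes can be written directly in Access SQL (a sketch; the table and field names are assumed from the question, not taken from the poster's actual fix, which was not shown):

```sql
SELECT *
FROM   events
WHERE  eventDate < Now()        -- Now() is evaluated by the Access engine
ORDER  BY eventDate DESC;
```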
