SQL Query to extract data between days and hours

Hi friends,

I need a query; it's for a report.

Description of the requirement:

I need the rows of a table that have been in the OPEN state for more than 30 minutes.

Well, this requirement can be met with:

Select * from xyz where status = 'OPEN' and last_update_date <= sysdate - (30/1440) -- 30 minutes (30/1440 of a day), passed as a parameter

The query above finds all rows that have been in the OPEN state since before sysdate - (30/1440). Now I want to restrict the data further by adding another parameter, DAYS.

For example, if I give 10 days, it should retrieve only the data from the last 10 days that is also older than sysdate - 30 minutes.

We use the last_update_date column to restrict the day.

If I do not give any value for days, it must retrieve all records older than sysdate - 30 minutes.

If I don't give minutes, it must retrieve all the records in the OPEN state.

Is the question clear enough? My English is bad.

Please suggest a query...

Thank you and best regards,

Arun Thomas T

Hello

Select * from xyz
where status = 'OPEN'
and last_update_date between nvl (sysdate - :days, last_update_date)
                         and nvl2 (:days, last_update_date, nvl (sysdate - :minutes/1440, last_update_date));

That means:

If the :days parameter is entered, then start at sysdate - :days and end at last_update_date (i.e. no upper bound).

If :days is not entered, then start from the beginning and set the end to:

1. sysdate - :minutes/1440, if minutes are entered

2. last_update_date (no restriction), if minutes are not entered
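If both parameters must apply at once (rows from the last :days days that are also older than :minutes minutes), here is a minimal sketch, assuming both binds are numeric and may be null:

```sql
Select *
from   xyz
where  status = 'OPEN'
and    last_update_date >= nvl (sysdate - :days, last_update_date)            -- day window, when :days is given
and    last_update_date <= nvl (sysdate - :minutes / 1440, last_update_date); -- age filter, when :minutes is given
```

When a bind is null, its condition collapses to last_update_date compared with itself, which is always true for non-null dates, so that filter simply drops out.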

Tags: Database

Similar Questions

  • SQL query to group data by Code and dates

    Hello

    I have the following table structure

    col1              col2     col3
    January 21, 2012  tested   Code1
    January 20, 2012  tested   Code1
    June 1, 2012      tested   Code1
    June 1, 2012      tested   Code2
    June 4, 2012      tested   Code3

    so now

    The output should be something like

    code     week1  week2  week3  week4  week5  ... up to the last 14 weeks from the date that we run
    Code1    1      0      0      0      0
    Code2    1      0      0      0      0
    Code3    0      1      0      0      0

    where 1 and 0 are actually counts, not sums; and since we are currently in the second week of June, the week headings in this case should perhaps be

    code ... week3-May  week4-May  week1-Jun  week2-Jun


    Was looking for suggestions on how to achieve this.

    I guess that this would require some kind of a pivot query?

    Thank you
    Sun

    Hello

    Here's how you can do this pivot in Oracle 10.2. (In fact, it will work in Oracle 9.1 or higher.)

    WITH  got_week_num  AS
    (
         SELECT  error_code, date_logged
         ,     1 + FLOOR ( ( TO_DATE (:end_dt_txt, 'DD-Mon-YYYY') - date_logged)
                         / 7
                     )     AS week_num
         FROM    data_analysis
         WHERE     date_logged     >= TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
         AND     date_logged     <  TO_DATE (:end_dt_txt,   'DD-Mon-YYYY') + 1
    )
    ,     all_weeks     AS
    (
         SELECT     LEVEL               AS week_num
         ,     TO_CHAR ( 1 + TO_DATE (:end_dt_txt, 'DD-Mon-YYYY')
                       - (7 * LEVEL)
                   , 'fmDD-Mon-YYYY'
                   )          AS heading
         FROM    dual
         CONNECT BY     LEVEL <= 1 + FLOOR ( ( TO_DATE (:end_dt_txt,   'DD-Mon-YYYY')
                                             - TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
                                  )
                                / 7
                                )
    )
    SELECT       NULL                                   AS error_code
    ,       MIN (CASE WHEN week_num =  1 THEN heading END)     AS week_1
    ,       MIN (CASE WHEN week_num =  2 THEN heading END)     AS week_2
    --       ...
    ,       MIN (CASE WHEN week_num =  5 THEN heading END)     AS week_5
    FROM       all_weeks
           --
         UNION ALL
                --
    SELECT       error_code
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  1 THEN 1 END))     AS week_1
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  2 THEN 1 END))     AS week_2
    --       ...
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  5 THEN 1 END))     AS week_5
    FROM       got_week_num
    GROUP BY  error_code
                 --
    ORDER BY  error_code     NULLS FIRST
    ;
    

    Output:

    ERROR_CODE WEEK_1      WEEK_2      WEEK_5
    ---------- ----------- ----------- -----------
               4-Jun-2012  28-May-2012 7-May-2012
    a          3           0           0
    b          0           2           1
    c          0           0           1
    

    Once more, the number of columns, as well as their aliases, is hard-coded into the query.
    If you want the number of columns, or their aliases, to depend on the data in the table, then you need dynamic SQL. See {message identifier: = 3527823}

    Did you notice how a "week" is defined in this query?
    The query above makes week_1 end at the given date (:end_dt_txt). The first week (in other words, the one including :start_dt_txt) may have fewer than 7 days.
    If you want all the weeks to start on Monday (in which case the first and last weeks may have fewer than 7 days), see Stew's solution, using TRUNC (date_logged, 'IW').
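    For comparison, a sketch of that Monday-based bucketing against the same table and bind variables (this only counts rows per ISO week; the pivot above would sit on top of it):

    ```sql
    SELECT    TRUNC (date_logged, 'IW')  AS week_start   -- Monday of the ISO week
    ,         error_code
    ,         COUNT (*)                  AS cnt
    FROM      data_analysis
    WHERE     date_logged >= TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
    AND       date_logged <  TO_DATE (:end_dt_txt,   'DD-Mon-YYYY') + 1
    GROUP BY  TRUNC (date_logged, 'IW'), error_code
    ORDER BY  week_start, error_code
    ;
    ```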

  • SQL query for retrieving data based on a certain pattern

    Hi all

    I want to retrieve the IDs of all the people who have been continuously sitting during the last hour.

    Data are expressed as below:

    -- Creation of the activity table

    CREATE TABLE activity_log

    (

    Username  NUMBER,

    Activity  VARCHAR2 (30),

    StartTime VARCHAR2 (6),

    EndTime   VARCHAR2 (6)

    );

    -- Filling with sample data

    INSERT INTO activity_log VALUES('39','Walking','09:01','09:05');

    INSERT INTO activity_log VALUES('39','Walking','09:06','09:10');

    INSERT INTO activity_log VALUES('39','Sitting','09:11','09:15');

    INSERT INTO activity_log VALUES('39','Sitting','09:16','09:20');

    INSERT INTO activity_log VALUES('39','Sitting','09:21','09:25');

    INSERT INTO activity_log VALUES('39','Standing','09:26','09:30');

    INSERT INTO activity_log VALUES('39','Standing','09:31','09:35');

    INSERT INTO activity_log VALUES('39','Sitting','09:36','09:40');

    INSERT INTO activity_log VALUES('39','Sitting','09:41','09:45');

    INSERT INTO activity_log VALUES('39','Sitting','09:46','09:50');

    INSERT INTO activity_log VALUES('39','Sitting','09:51','09:55');

    INSERT INTO activity_log VALUES('39','Sitting','09:56','10:00');

    INSERT INTO activity_log VALUES('39','Sitting','10:01','10:05');

    INSERT INTO activity_log VALUES('39','Sitting','10:06','10:10');

    INSERT INTO activity_log VALUES('39','Sitting','10:11','10:15');

    INSERT INTO activity_log VALUES('39','Sitting','10:16','10:20');

    INSERT INTO activity_log VALUES('39','Sitting','10:21','10:25');

    INSERT INTO activity_log VALUES('39','Sitting','10:26','10:30');

    INSERT INTO activity_log VALUES('39','Sitting','10:31','10:35');

    INSERT INTO activity_log VALUES('39','Standing','10:36','10:40');

    INSERT INTO activity_log VALUES('39','Standing','10:41','10:45');

    INSERT INTO activity_log VALUES('39','Walking','10:46','10:50');

    INSERT INTO activity_log VALUES('39','Walking','10:51','10:55');

    INSERT INTO activity_log VALUES('39','Walking','10:56','11:00');

    INSERT INTO activity_log VALUES('40','Walking','09:01','09:05');

    INSERT INTO activity_log VALUES('40','Walking','09:06','09:10');

    INSERT INTO activity_log VALUES('40','Sitting','09:11','09:15');

    INSERT INTO activity_log VALUES('40','Sitting','09:16','09:20');

    INSERT INTO activity_log VALUES('40','Sitting','09:21','09:25');

    INSERT INTO activity_log VALUES('40','Standing','09:26','09:30');

    INSERT INTO activity_log VALUES('40','Standing','09:31','09:35');

    INSERT INTO activity_log VALUES('40','Sitting','09:36','09:40');

    INSERT INTO activity_log VALUES('40','Sitting','09:41','09:45');

    INSERT INTO activity_log VALUES('40','Sitting','09:46','09:50');

    INSERT INTO activity_log VALUES('40','Sitting','09:51','09:55');

    INSERT INTO activity_log VALUES('40','Walking','09:56','10:00');

    INSERT INTO activity_log VALUES('40','Sitting','10:01','10:05');

    INSERT INTO activity_log VALUES('40','Standing','10:06','10:10');

    INSERT INTO activity_log VALUES('40','Standing','10:11','10:15');

    INSERT INTO activity_log VALUES('40','Walking','10:16','10:20');

    INSERT INTO activity_log VALUES('40','Walking','10:21','10:25');

    INSERT INTO activity_log VALUES('40','Walking','10:26','10:30');

    INSERT INTO activity_log VALUES('40','Sitting','10:31','10:35');

    INSERT INTO activity_log VALUES('40','Sitting','10:36','10:40');

    INSERT INTO activity_log VALUES('40','Sitting','10:41','10:45');

    INSERT INTO activity_log VALUES('40','Standing','10:46','10:50');

    INSERT INTO activity_log VALUES('40','Walking','10:51','10:55');

    INSERT INTO activity_log VALUES('40','Walking','10:56','11:00');

    Based on the data, user ID 39 must be found, since it has been sitting from 09:36 to 10:35, which is a continuous hour.

    Any guidance on how to do this using a SQL query will be of great help and appreciated.

    Thank you very much

    Kind regards

    Bilal

    So what exactly is wrong with the query that I already gave you?

    Just as a reminder, one untested (because of the lack of insert statements at the time) rewrite according to your new data:

    with grp as (
         select username,
                activity,
                starttime,
                endtime,
                row_number () over (partition by username order by starttime)
              - row_number () over (partition by username, activity order by starttime) rn
         from   activity_log
    )
    select   username,
             min (starttime) starttime,
             max (endtime)   endtime,
             max (activity)  activity
    from     grp
    group by username, rn
    having   round ((max (endtime) - min (starttime)) * 24 * 60) >= 59
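    Since starttime and endtime are VARCHAR2 here, the duration arithmetic above needs a conversion first. A hedged variant restricted to 'Sitting' (assuming all times fall within a single day, as in the sample data):

    ```sql
    with grp as (
         select username, activity,
                to_date (starttime, 'HH24:MI') st,
                to_date (endtime,   'HH24:MI') et,
                row_number () over (partition by username order by starttime)
              - row_number () over (partition by username, activity order by starttime) rn
         from   activity_log
    )
    select   username,
             to_char (min (st), 'HH24:MI') starttime,
             to_char (max (et), 'HH24:MI') endtime
    from     grp
    where    activity = 'Sitting'
    group by username, rn
    having   round ((max (et) - min (st)) * 24 * 60) >= 59  -- 59, because each row starts 1 minute after the previous one ends
    ;
    ```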

  • How to extract data from SAP and COBOL using ODI

    Hi people,

    Can you please let me know the procedure in ODI to extract data from SAP and COBOL?

    Thank you all for the help.

    Hello

    You can download Patch 8571830 from Oracle Metalink.

    It has a new technology, 'SAP ABAP', and KMs to extract and load data:
    RKM SAP ERP and LKM SAP ERP to Oracle.

    Thank you

  • Help me with SQL Query to retrieve data from a view

    Hello Guru,

    I need help in my sql query.
    I am using Teradata SQL.
    I want a result in the following form -

    Open tickets
    Open Month   Break/Fix  N/A  Enhancement  Service Request  Grand Total
    2009-01      2          4    4            5                15
    2009-02      1          0    2            3                6
    2009-03      4          1    2            2                9
    Grand Total  7          5    8            10               30


    I wrote the query as below, where TIME_PERIOD, RQST_TYPE_DM and DEMAND_SUMMARY_FCT are views, and I extract the data from the views only.

    Select NVL (CA.TIME_PERIOD.PERIOD_CD, 'Total') AS year,
    COUNT (CASE WHEN CA.RQST_TYPE_DM.RQSTTYP_DESC LIKE '%Break%' THEN 1 END) AS BreakFix,
    COUNT (CASE WHEN CA.RQST_TYPE_DM.RQSTTYP_DESC LIKE 'N/A' THEN 1 END) AS NA,
    COUNT (CASE WHEN CA.RQST_TYPE_DM.RQSTTYP_DESC LIKE '%Enhancement%' THEN 1 END) AS Enhancements,
    COUNT (CASE WHEN CA.RQST_TYPE_DM.RQSTTYP_DESC LIKE '%Service%' THEN 1 END) AS ServiceRequests,
    COUNT (CA.RQST_TYPE_DM.RQSTTYP_DESC) AS grand_total
    FROM CA.TIME_PERIOD, CA.RQST_TYPE_DM, CA.DEMAND_SUMMARY_FCT
    WHERE (CA.DEMAND_SUMMARY_FCT.RQSTTYP_ID = CA.RQST_TYPE_DM.RQSTTYP_ID)
    AND (CASE
    WHEN CA.DEMAND_SUMMARY_FCT.MONTH_ID = CA.TIME_PERIOD.PERIOD_ID THEN 1
    WHEN {fn concat ({fn concat (SUBSTR (CA.TIME_PERIOD.PERIOD_CD, 3, 4), '-')}, SUBSTR (CA.TIME_PERIOD.PERIOD_CD, 7, 2))} BETWEEN '2009-01' AND '2009-03' THEN 1
    WHEN CA.DEMAND_SUMMARY_FCT.RQSTTYP_ID = '1' THEN 1
    END) = 1
    GROUP BY ROLLUP (CA.TIME_PERIOD.PERIOD_CD)

    After executing the query, I get the following error:
    3076: syntax Error: Data Type 'Time' does not match a defined Type name.
    :( Kindly help me with this and let me know where I'm wrong... Please.

    The message indicates something wrong with your data... It would seem that the data does not match your format mask.

    So check the data, or the format mask.

  • Query to group dates by days of week

    Hello

    I have a table that contains the following format:

    Create table testtable(dates date, day_identifier number);
    
    

    The day_identifier column contains the day-of-week number of the date in the corresponding dates column, i.e.

    to_char(dates, 'd')
    
    

    The table contains the following sample data:

    Dates             Day_identifier
    October 1, 2013   3
    October 2, 2013   4
    October 4, 2013   6
    October 6, 2013   1
    October 8, 2013   3
    October 9, 2013   4
    October 11, 2013  6
    October 18, 2013  6
    October 21, 2013  2
    October 23, 2013  4

    I am looking for a query that will consolidate the above data, based on the day_identifier column, into the following format:

    October 1, 2013   October 11, 2013  1346

    October 18, 2013  October 23, 2013  246

    In other words, the consolidated rows above stand for:

    all dates between October 1, 2013 and October 11, 2013 and the day of the week value in 1,3,4,6

    and

    all dates between October 18, 2013 and October 23, 2013 and having the day of the week in 2,4,6 value

    will give me the result set from the table above.

    Please help me to solve the issue.

    Thank you.

    Like this? (The results of the previous example have not changed.)

    with
    flights as
    (select 1 flight_no, to_date ('01-10-2013', 'dd-mm-yyyy') flight_date, 3 day_identifier from dual union all
     select 1, to_date ('03-10-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('08-10-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('10-10-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('15-10-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('17-10-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('22-10-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('24-10-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('29-10-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('31-10-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('05-11-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('07-11-2013', 'dd-mm-yyyy'), 5 from dual union all
     -- select 1, to_date ('12-11-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('14-11-2013', 'dd-mm-yyyy'), 5 from dual union all
     -- select 1, to_date ('19-11-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('21-11-2013', 'dd-mm-yyyy'), 5 from dual union all
     -- select 1, to_date ('26-11-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('28-11-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('03-12-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('05-12-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('10-12-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('12-12-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('17-12-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('19-12-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('24-12-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('26-12-2013', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('31-12-2013', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('02-01-2014', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('07-01-2014', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('09-01-2014', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('14-01-2014', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('16-01-2014', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('21-01-2014', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('23-01-2014', 'dd-mm-yyyy'), 5 from dual union all
     select 1, to_date ('28-01-2014', 'dd-mm-yyyy'), 3 from dual union all
     select 1, to_date ('30-01-2014', 'dd-mm-yyyy'), 5 from dual
    ),

    all_identifiers as /* collect all the day_identifiers present */
    (select flight_no,
            min (from_date) from_date,
            max (till_date) till_date,
            listagg (day_identifier, '') within group (order by day_identifier) day_identifiers
     from (select flight_no,
                  min (flight_date) from_date,
                  max (flight_date) till_date,
                  day_identifier
           from flights
           group by flight_no, day_identifier
          )
     group by flight_no
    ),

    generated_rows as /* generate all the flight_dates resulting from the day_identifiers collected */
    (select flight_no,
            flight_date,
            day_identifier
     from (select flight_no,
                  from_date + level - 1 flight_date,
                  case when instr (day_identifiers, to_char (from_date + level - 1, 'd')) != 0
                       then to_number (to_char (from_date + level - 1, 'd'))
                  end day_identifier
           from all_identifiers
           connect by level <= till_date - from_date + 1
                  and prior flight_no = flight_no
                  and prior sys_guid () is not null
          )
     where day_identifier is not null
    ),

    matched_rows as /* matching flights against generated rows reveals partially cancelled flights */
    (select g.flight_no,
            g.flight_date generated_date,
            g.day_identifier,
            f.day_identifier flights_day_identifier /* null if flight cancelled */
     from generated_rows g
          left outer join
          flights f
       on g.flight_no = f.flight_no
      and g.flight_date = f.flight_date
      and g.day_identifier = f.day_identifier
    ),

    grouped_rows as /* grouping the rows before the final answer */
    (select flight_no,
            generated_date,
            day_identifier,
            flights_day_identifier,
            case when count (day_identifier) over (partition by flight_no, day_identifier
                                                   order by generated_date
                                                   rows between unbounded preceding and unbounded following
                                                  ) =
                      count (flights_day_identifier) over (partition by flight_no, day_identifier
                                                           order by generated_date
                                                           rows between unbounded preceding and unbounded following
                                                          )
                 then 'complete' /* count_of_all = count_of_not_null */
                 else 'incomplete'
            end day_identifier_type,
            row_number () over (partition by flight_no, day_identifier order by generated_date) -
            row_number () over (partition by flight_no, day_identifier, flights_day_identifier order by generated_date) grp /* tabibitosan */
     from matched_rows
    )
    select flight_no,
           to_char (min (from_date), 'dd-mon-yyyy') from_date,
           to_char (max (till_date), 'dd-mon-yyyy') till_date,
           listagg (day_identifier, '') within group (order by day_identifier) day_identifiers
    from (select flight_no,
                 min (generated_date) from_date,
                 max (generated_date) till_date,
                 day_identifier
          from grouped_rows
          where day_identifier_type = 'complete'
          group by flight_no, day_identifier
         )
    group by flight_no
    union all
    select distinct
           flight_no,
           to_char (first_value (generated_date) over (partition by flight_no, grp
                                                       order by generated_date
                                                      ),
                    'dd-mon-yyyy'
                   ) from_date,
           to_char (first_value (generated_date) over (partition by flight_no, grp
                                                       order by generated_date desc
                                                      ),
                    'dd-mon-yyyy'
                   ) till_date,
           to_char (day_identifier) day_identifiers
    from grouped_rows
    where day_identifier_type = 'incomplete'
      and flights_day_identifier is not null
    order by flight_no, day_identifiers, from_date

    FLIGHT_NO  FROM_DATE    TILL_DATE    DAY_IDENTIFIERS
            1  01-oct-2013  05-nov-2013  3
            1  03-dec-2013  28-jan-2014  3
            1  03-oct-2013  30-jan-2014  5

    Regards

    Etbin

  • SQL query for the date picker

    On a report page there are two date picker items (a DATEFROM and a DATETO) as well as a text field with an autocomplete element.

    When the page loads, this query is called:

    SELECT
    "COL1",
    "COL2",
    "COL3"
    FROM "TABLE"
    WHERE MYDATES BETWEEN :DATEFROM AND :DATETO

    This returns no records, as no dates have been set.

    How can I modify the WHERE clause so that the date condition is applied only when dates are chosen, and by default all records are displayed?

    The other related issue: when I have chosen a from and to date and searched, then return to the report page after visiting other pages in the APEX application, the form fields are still populated. How can I clear the session state for these fields when the user leaves the page?

    I hope that all makes sense?
    To_date('01.01.2100', 'dd.mm.yyyy')
    

    will give erroneous results on 02.01.2100 :D

    A solution (perhaps there's a cleaner...):

    WHERE
    ((:DATEFROM IS NOT NULL AND mydates >= :DATEFROM) OR :DATEFROM IS NULL)
    AND
    ((:DATETO IS NOT NULL AND mydates <= :DATETO) OR :DATETO IS NULL)
    
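    A shorter equivalent of the same idea (a sketch; it behaves the same as long as MYDATES itself is never null):

    ```sql
    WHERE mydates >= NVL (:DATEFROM, mydates)
      AND mydates <= NVL (:DATETO,   mydates)
    ```

    When a bind is null, its comparison collapses to mydates compared with itself, which is always true, so that side of the filter drops out.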

    You can change the "No data found" message by changing the report attribute "When No Data Found Message".

    EDIT: And for the 2nd question, Alex is right, but also be aware that Firefox retains form element values when the page refreshes; you must set the page Security attribute 'Form Auto Complete' to 'Disabled' if you do not want to keep the form element values.

    Published by: Yann39 on June 27, 2012 06:30

  • SQL query search syntax against a varchar2 data type, preserving valid data

    We have a data model that we are not allowed to change, and the column in question is a varchar2 (20). The column has, at this stage, no foreign key to a list of valid values. So, until we can get those who control the data model to make the adjustments, we need a SQL query that roots out the bad data in the meantime.

    Is what we expect to be good data below:

    -Whole number, without floating point
    -Length of 5 or less (greater than zero but less than 99999)
    -Text "No_RP" can exist.

    The demo query below works most of the time, with the exception that 'or Column1 is null' is not catching the null record. I tried changing the logical terms around, but still have not found the correct arrangement. So help would be greatly appreciated if someone could set me straight on how to properly include a null value in the selected record set along with the other error types, so end users can correct their mistakes. Another thing: I suppose there could be a syntactically better approach to finding all offending characters such as *, &, ( and so on.

    WITH Sample_Data AS (SELECT '0' col FROM DUAL UNION ALL
    SELECT '2' col FROM DUAL UNION ALL
    SELECT '99999' col FROM DUAL UNION ALL
    SELECT '100000' col FROM DUAL UNION ALL
    SELECT '1 a' col FROM DUAL UNION ALL
    SELECT 'ABCD' col FROM DUAL UNION ALL
    SELECT 'A1' col FROM DUAL UNION ALL
    SELECT '*' col FROM DUAL UNION ALL
    SELECT '/' col FROM DUAL UNION ALL
    SELECT '-' col FROM DUAL UNION ALL
    SELECT ' ' col FROM DUAL UNION ALL
    SELECT '' col FROM DUAL UNION ALL
    SELECT '4 5 6' col FROM DUAL UNION ALL
    SELECT '24.5' col FROM DUAL UNION ALL
    SELECT '-3' col FROM DUAL UNION ALL
    SELECT 'A' col FROM DUAL UNION ALL
    SELECT 'F' col FROM DUAL UNION ALL
    SELECT 'Z' col FROM DUAL UNION ALL
    SELECT 'Bye' col FROM DUAL UNION ALL
    SELECT 'Hello World' col FROM DUAL UNION ALL
    SELECT '=' col FROM DUAL UNION ALL
    SELECT '+' col FROM DUAL UNION ALL
    SELECT '_' col FROM DUAL UNION ALL
    SELECT '-' col FROM DUAL UNION ALL
    SELECT '(' col FROM DUAL UNION ALL
    SELECT ')' col FROM DUAL UNION ALL
    SELECT '&' col FROM DUAL UNION ALL
    SELECT '^' col FROM DUAL UNION ALL
    SELECT '%' col FROM DUAL UNION ALL
    SELECT '$' col FROM DUAL UNION ALL
    SELECT '#' col FROM DUAL UNION ALL
    SELECT '@' col FROM DUAL UNION ALL
    SELECT '!' col FROM DUAL UNION ALL
    SELECT '~' col FROM DUAL UNION ALL
    SELECT '''' col FROM DUAL UNION ALL
    SELECT '.' col FROM DUAL
    )
    SELECT col FROM Sample_data
    WHERE (TRANSLATE (col, '_0123456789', '_') IS NOT NULL
    OR LENGTH (col) > 5
    OR col = '0'
    OR col IS NULL)
    AND (UPPER (col) <> 'NO_RP');

    One more thing: I also took the regular-expression approach, but could not get it to work. If anyone knows how to do it that way, I would appreciate learning that method as well. Below is as close as I got. I was unable to get a range to work, such as "between 0 and 100000", I'm guessing because of the varchar2-to-number comparison; I even tried using to_char and to_number.

    Select to_number (column1) from testsql where REGEXP_LIKE (column1, '^[[:digit:]]+$') ORDER BY to_number (column1) ASC;

    Thanks in advance for anyone to help.

    Nick

    Hello

    Thanks for posting the sample data in a usable form.
    It would also help if you posted the exact results you want from this data. Do you want the same results as those produced by the query you posted, except that nulls should be included? If so:

    SELECT     col
    FROM     sample_data
    WHERE     CASE
             WHEN  UPPER (col) = 'NO_RP'               THEN  1
             WHEN  col IS NULL                    THEN -1
             WHEN  LTRIM (col, '0123456789') IS NOT NULL     THEN -2
             WHEN  LENGTH (col) > 5               THEN -3
                                           ELSE TO_NUMBER (col)
         END     NOT BETWEEN     1
                  AND          99999
    ;
    

    The requirement that col != 0 makes it that much more difficult. You could easily test for an integer of 1 to 5 digits, but then you would need a separate condition to make sure that the string was not '0', '00', '000', '0000' or '00000'.
    (Unlike Solomon, I assume that you do not want to accept non-0 numbers starting with 0, such as '007' or '02138'.)

    Using regular expressions, you may save a few keystrokes, but you also lose a lot of clarity:

    SELECT     col
    FROM     sample_data
    WHERE     REGEXP_LIKE ( col
                  , '^0{1,5}$'
                  )
    OR NOT     REGEXP_LIKE ( NVL ( UPPER (col)
                     , 'BAD'
                     )
                  , '^(([1-9][0-9]{0,4})|NO_RP)$'
                  )
    ;
    

    Published by: Frank Kulash, December 13, 2010 21:50

    Published by: Frank Kulash, December 13, 2010 22:11
    Added regular expression solution

  • Query to extract data from source DB driver using process mapping pre...

    Hi all
    I have a source query to retrieve the data from the source database. Looking for suggestions on the best approach to extract data from the source database, using OWB 11gR1.
    I am thinking of creating a pre-mapping process that uses execute immediate with a 'Create table T1 as' SQL query, alongside an 'Insert into T1 Select * from Source_Table' from the other database.

    Certainly, we need to create DB users in the source database to get privileges to read the data of the source database.
    Is this a good approach, or is there any other alternative to achieve this?

    That is: create the table in the staging area (the target area, where I am) and use a pre-mapping process to run the select query that retrieves data from the data source.

    Would appreciate your early answer.
    Thank you
    MAK

    You can mark Correct or useful if it solves your purpose.

  • Create the query by combining data from DBSS_Data_Model and HostModel

    Hello

    I am trying to create a dashboard with the host server list and instances of SQL Server running on the host.

    And I am interested in creating a query combining data from the SQL Server data model (DBSS_Data_Model) and the host (Hosts) data model; that is, joining DBSS_SQL_Server_Host and Host.

    I wonder if there is a way to do it?

    Thank you

    Mark

    Something like this function should work:

    def physicalHost = null
    queryStatement = server.QueryService.createStatement("(Host where name = '$hostName')")
    result = server.QueryService.executeStatement(queryStatement)
    objs = result.getTopologyObjects()

    if (objs != null && objs.size() > 0) {
        physicalHost = objs.get(0)
    }
    return physicalHost


    where the input parameter "hostName" is DBSS_SQL_Server_Host.physical_host_name

    Kind regards

    Brian Wheeldon

  • Move data between linux_32 and linux_64 database files

    Hello

    Can I move a database (all data files, controlfiles, etc.) between different architectures? I've already moved databases between servers with the same OS and architecture, but have not moved between 32 and 64 bit...


    Regards
    Ricardo

    If you want to move the database from 32-bit Linux to 64-bit Linux, you can simply restore with RMAN on the 64-bit box from a backup taken on the 32-bit box. Once this is done, run $ORACLE_HOME/rdbms/admin/utlirp.sql and then $ORACLE_HOME/rdbms/admin/utlrp.sql to recompile your objects from the 32-bit to the 64-bit word size.
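    From SQL*Plus, the post-restore steps would look roughly like this (a sketch; the required startup mode for utlirp.sql can vary by release, so check the platform migration note for your version):

    ```sql
    -- after the RMAN restore/recovery on the 64-bit host
    STARTUP UPGRADE
    @?/rdbms/admin/utlirp.sql    -- invalidates PL/SQL units for the new word size
    SHUTDOWN IMMEDIATE
    STARTUP
    @?/rdbms/admin/utlrp.sql     -- recompiles the invalid objects
    ```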

  • Mapping data between BI and another Oracle module

    Hello

    I'm working on a project for a county government in Florida. We have a problem matching Project Accounting R12 data with the OBIEE 11g database. It is more of a field-labeling problem than a data integrity problem - for example, 'Customer number' in Project Accounting is clearly a different data set than 'Customer number' in BI. We know the same data exists in BI, but we do not know what it is called and therefore how to query it for reporting purposes. Why the data labeling is so messed up, I can't say, but I have been put in charge of mapping 'Customer number' (and everything else) in Project Accounting to the equivalent BI field. Supposedly this mapping does not already exist (which seems impossible, since someone set up the flow of data between the two parts of the system at some point!). I hope someone can recommend an approach that doesn't end up being heavy and manual. Thank you!

    Warm greetings to all,

    John

    https://support.us.Oracle.com/OIP/faces/secure/km/DocumentDisplay.JSPX?ID=1274680.1#BODYTEXT

    OR

    Oracle Business Intelligence Applications ETL Data Lineage Guide version 7.9.6.2 and 7.9.6.3 [ID 1274680.1]

    Check this... It gives you a data lineage document... mapping between source, staging and target...

    It will certainly help you in the way that you are looking.

    Mark as helpful if it helps...

    Kind regards
    Rayan Vieira

  • Sharing data between planning and HFM

    Hi all

    How can we share data and metadata between HFM and Planning? Can you help me list the possible ways? Can EPMA be used to do this?

    Thank you.

    Check this one as well:
    http://www.SlideShare.NET/Ranzal/HFM-integration-collaborate-2010
    http://www.Network54.com/Forum/58296/thread/1250272593/HFM+data+to+Essbase

    Cheers...!!!

  • How can I capture serial data between STX and ETX

    I searched this forum and online for what seems like an easy question, but I have not found a good example. I have variable-length data coming from a serial port and I need to capture the data between the STX and ETX characters. Here's what I have, which currently returns all the data when the 2 delimiter characters are present:

    byte[] data = {(byte) 0x02, (byte) 0x03};

    Just read bytes until you get the STX, then read data until you get the ETX, and be sure to properly handle the ESC, which is: ignore it and take the next character as data no matter what it is, without checking it for ETX or ESC.

  • R12 Payables: SQL query for the list of invoices and their status of Validation

    Hi all

    I am looking for a SQL query that gives me the list of all AP invoices and their validation status.

    Thank you
    Anil

    Select invoice_id, invoice_num, invoice_amount, invoice_currency_code, approval_status_lookup_code from AP_INVOICES_V;

    Prasanna-
