Group by date

Hi all

I have a table which has the information below:

LOGON_TIME (timestamp format)
program (varchar)
machine (varchar)

Now, I want to retrieve one day of records from the table above:

select logon_time, program, machine
from   tabsbkp.monitor_sessions
where  trunc(logon_time) = trunc(sysdate - 1)
group by machine, logon_time, program;

How can I change the above to get the number of records per hour, for example:

Date                     Program   Machine
19 December 2013 01:00   sqlplus   server1
19 December 2013 02:00   sqlplus   server2


Please advise.

Try this query:

-----------------------

with t as
(
  select to_timestamp_tz('19-DEC-13 11.27.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 07.57.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server3' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 06.47.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
  union all
  select to_timestamp_tz('19-DEC-13 11.37.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 08.23.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server2' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 08.27.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 08.27.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
  union all
  select to_timestamp_tz('20-DEC-13 09.27.38.278000000 AM +05:30', 'DD-MON-RR HH.MI.SS.FF AM TZH:TZM') logon_time, 'SQLPLUS' program, 'server1' machine from dual
)
select to_char(logon_time, 'dd-mm-yyyy hh24') logon_time
     , program
     , machine
     , count(1) num_conn
from   t
group by to_char(logon_time, 'dd-mm-yyyy hh24'), program, machine;

LOGON_TIME      PROGRAM    MACHINE    NUM_CONN
-------------   --------   --------   --------
19/12/2013 11   SQLPLUS    server1    2
20/12/2013 08   SQLPLUS    server1    2
20/12/2013 09   SQLPLUS    server1    1
20/12/2013 06   SQLPLUS    server1    1
20/12/2013 07   SQLPLUS    server3    1
20/12/2013 08   SQLPLUS    server2    1
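
Applied to the original table, the same grouping looks like this (a sketch, assuming LOGON_TIME is a DATE or TIMESTAMP and that yesterday's records are wanted, as in your first query):

-- Hourly connection counts per program and machine for yesterday
select to_char(logon_time, 'dd-mm-yyyy hh24') logon_hour
     , program
     , machine
     , count(*) num_conn
from   tabsbkp.monitor_sessions
where  trunc(logon_time) = trunc(sysdate - 1)
group by to_char(logon_time, 'dd-mm-yyyy hh24'), program, machine
order by logon_hour, machine;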

Tags: Database

Similar Questions

  • View threads AND grouped by date/conversation

    Hello

    I'm trying to have Thunderbird show messages grouped by date (with a thread for Today, Yesterday, Last Week, etc.) and, within those threads, see email conversations. A conversation that began three weeks ago but received a new email today should appear in the Today thread, starting from the first email of the conversation.
    I couldn't work out whether this is possible, as Grouped By Sort (Date) and Threaded appear to be mutually exclusive. Any ideas?

    Thank you

    As you say, there is only one grouping at a time, so I guess you're out of luck.

    You could file an enhancement bug report and see if one of the developers is interested: https://Bugzilla.Mozilla.org/

  • Group by Date instead of DateTime

    Hello

    I'm looking for some counts that are grouped by date instead of datetime. For example, I run the query below.

    ALTER SESSION SET NLS_DATE_FORMAT = 'mm/dd/yyyy';

    SELECT count(distinct contact_id) as counts, created as created_date
    FROM s_con_chrctr
    WHERE created >= TO_CHAR('07/01/2015') and created <= TO_CHAR('07/10/2015')
    AND created_by = '1-J9455J'
    GROUP BY created;

    The results are as follows.

    COUNTS  CREATED_DATE
    1       02/07/2015
    1       02/07/2015
    1       02/07/2015
    1       02/07/2015

    I think that is happening because this column is a datetime column, so it returns one record for each specific time within the day. The results I'm looking for are...

    COUNTS  CREATED_DATE
    15      02/07/2015
    17      03/07/2015
    35      04/07/2015
    11      05/07/2015

    How do I change my query to group by date and not by datetime?

    If created is a DATE, then do not compare it with TO_CHAR literals.

    If you want to ignore the time part in your counts, use TRUNC:

    SELECT count(distinct contact_id) as counts, TRUNC(created) as created_date
    FROM s_con_chrctr
    WHERE created >= TO_DATE('07/01/2015', 'MM/DD/YYYY') and created < TO_DATE('07/11/2015', 'MM/DD/YYYY')
    AND created_by = '1-J9455J'
    GROUP BY TRUNC(created);

  • GROUP BY date range to identify duplicates revisited!

    Good afternoon

    This is a continuation of a previous discussion I created, GROUP BY date range to identify duplicates.

    Your help with the following would be appreciated (sample data below).

    I've highlighted below what I would flag as returned duplicates:

    example4.jpg

    Definition of a duplicate (note this is slightly different to the previous post):

    the same account_num

    tran_effective_date at most 20 days apart

    tran_processed_date at most 10 days apart

    the same tran_amount

    However, I do not want to return a pair as duplicates unless both records have a tran_priced_date populated.

    So, in light of the foregoing, I don't expect the following account_numbers to be marked as duplicate:

    N100283 - only one of the records has tran_priced_date populated

    N101640 - none of the records have the tran_priced_date filled

    N102395 - same as N101640

    N102827 - same as N101640

    N108876 - although both records have tran_priced_date populated, the tran_effective_dates are more than 20 days apart.

    BUT for the remaining accounts, N100284 and N102396, I want to execute the following logic:

    Compare the 3rd row to the 4th row and ask the following questions:

    Is tran_effective_date at most 20 days apart?

    Is tran_processed_date at most 10 days apart?

    If yes, then report it as a dupe.

    Then compare row 4 to row 5, and keep asking the same questions until the last row is reached. When everything is done, I only want to consider transactions that have a status of Normal, and if the above questions are true for both, return them in my result set as dupes.

    I hope that makes sense! (A sketch of one possible query follows the sample data below.)

    BEGIN
      EXECUTE IMMEDIATE 'DROP TABLE samp_data';
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -942 THEN
          DBMS_OUTPUT.put_line('');
        ELSE
          RAISE;
        END IF;
    END;
    /
    
    
    CREATE TABLE samp_data (
      ACCOUNT_NUM             VARCHAR2(17),
      TRAN_ID                 NUMBER(10),
      TRAN_TYPE               VARCHAR2(50),
      TRAN_EFFECTIVE_DATE     TIMESTAMP(6),
      TRAN_PROCESSED_DATE     TIMESTAMP(6),
      TRAN_STATUS             VARCHAR2(17),
      TRAN_PRICED_DATE        TIMESTAMP(6),
      TRAN_AMOUNT             NUMBER(13,2)
      );
    
    
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100283',140119178,'Regular With',to_timestamp('01-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.34.235000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),200);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE,TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100283',140158525,'Regular With',to_timestamp('13-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.39.14.090000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Normal', null,200);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140118826,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.19.072000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Normal', to_timestamp('20-MAY-15 03.25.05.438000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140158120,'Regular With',to_timestamp('06-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('23-MAY-15 08.38.42.064000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140158120,'Regular With',to_timestamp('06-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('02-JUN-15 08.38.42.064000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('31-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N101640',140118957,'Regular With',to_timestamp('18-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.25.015000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,120);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N101640',140158278,'Regular With',to_timestamp('22-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.56.228000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,130);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102395',140118842,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.19.665000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,250);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102395',140158235,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.53.093000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,250);
    
    
    
    
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140118823,'Regular With',to_timestamp('09-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('18-MAY-15 07.00.18.931000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('19-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('16-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('24-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('16-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('29-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('12-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('14-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('23-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102827',140118850,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.20.045000000 AM','DD-MON-RR HH.MI.SS.FF AM') , 'Normal',null,157.84);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102827',140158118,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.41.861000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,157.84);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N108876',139840720,'Regular With',to_timestamp('01-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('11-MAY-15 08.35.34.646000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('20-MAY-15 03.25.05.438000000 AM','DD-MON-RR HH.MI.SS.FF AM'),1000);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N108876',139889880,'Regular With',to_timestamp('22-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('12-MAY-15 08.49.29.080000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),1000);
    
    
    select * from samp_data
    ORDER BY account_num, tran_effective_date, tran_processed_date;
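
    For reference, here is a minimal sketch of the pairwise comparison described above, using LAG to compare each transaction with the previous one for the same account. The column names come from the DDL above; the exact duplicate rules (both rows priced, same amount, effective dates within 20 days, processed dates within 10 days, Normal status only) are taken from the description and may need adjusting:

    -- Sketch only: flag a row as a candidate duplicate of the previous row
    -- for the same account when both rows have TRAN_PRICED_DATE populated,
    -- the amounts match and the date gaps fall inside the stated windows.
    SELECT account_num, tran_id, tran_effective_date, tran_processed_date, tran_amount,
           CASE
             WHEN tran_priced_date IS NOT NULL
              AND prev_priced      IS NOT NULL
              AND tran_amount       = prev_amount
              AND ABS(EXTRACT(DAY FROM (tran_effective_date - prev_effective))) <= 20
              AND ABS(EXTRACT(DAY FROM (tran_processed_date - prev_processed))) <= 10
             THEN 'DUPE'
           END AS dupe_flag
    FROM  (SELECT s.*,
                  LAG(tran_effective_date) OVER (PARTITION BY account_num ORDER BY tran_effective_date) AS prev_effective,
                  LAG(tran_processed_date) OVER (PARTITION BY account_num ORDER BY tran_effective_date) AS prev_processed,
                  LAG(tran_priced_date)    OVER (PARTITION BY account_num ORDER BY tran_effective_date) AS prev_priced,
                  LAG(tran_amount)         OVER (PARTITION BY account_num ORDER BY tran_effective_date) AS prev_amount
           FROM   samp_data s
           WHERE  tran_status = 'Normal')
    ORDER BY account_num, tran_effective_date;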
    

    Please continue the discussion in your original post.

  • Between operator with group by date in APEX interactive reports

    Hello

    In the interactive report filter, I could not find the "between" operator for the date field (I have a "group by date" in my source SQL query). I wonder, is this because of the group by date clause? Is there a way to show the "between" operator in the interactive report filter?

    Thank you

    I just opened an existing IR-style report, went to Actions > Filter, selected a date column and found "between" at the bottom of the list... Are you sure the date you want to filter on is a real date column?

    Thank you

    Tony Miller
    Webster, TX

    What happens if you really were stalking a paranoid schizophrenic... would they know?

    If this answers your question, please mark the thread as closed and award points where earned...

  • Event listeners for data groups

    Hello

    It's really a silly question, but does anybody know how to define an event listener for a data group? Is there an 'invoked method' property for the 'Custom Layout' presentation control? I don't know how to define a method in a business object so that it shows up in the event listener dropdown.

    Good links on the use of data groups would also be appreciated.

    Thank you very much in advance.

    Yes - when you click on the table/group widget in the form editor, there is a section "Event listener" with a property called "invoked method".

    You may have already figured this out on your own, but if you have trouble seeing a method in the drop-down list, create a method that has an argument of type Fuego.Util.GroupEvent. If you look at this class, you can see the different types of events that can be received (ADD, INSERT_DOWN, INSERT_UP, REMOVE, and REMOVE_LAST).

    Dan

  • XY graph, initialize multiple groups of data and update a single set of data

    Hello

    I have developed a VI to control the XY coordinates of a stage using two motors. Basically a somewhat quick-and-dirty XY plotter.

    The detail I could use help with is the following:

    1. I have 469 (or ~400-ish, there are two modes) discrete points to be initialized/plotted on an XY graph such that a cursor on the XY graph can snap to the discrete points. That is done.

    2. I have to keep track of the current XY position. That is also done.

    3. I have to keep track of the former XY positions where I was able to capture data correctly. This is driven by a switch.

    4. Almost the entire VI sits in one giant while loop, sequenced by passing from one case structure to the next until a final dummy global is reached.

    Now, the signals are sent to the motors through a USB digital signal generator; I don't think the specifics are necessary. Essentially I needed to bitmask each individual movement, so the 'motor movement' part is embedded in the giant while loop. This means that whenever I add a new feature to my VI the motor movements slow down, and every time the operating system decides to run an automatic update the drivers slow down. I do not have the luxury of multi-threading or a microcontroller right now.

    At this point, my motors are terribly slow, and I am trying to find a way to make them run faster. I am sure that the bottleneck is actually the processing for my XY graph data.

    My XY graph manages 3 sets of data:

    a. 469 discrete points (a 2 x 469 array) or b. ~400-ish discrete points (a 2 x ~400 array)

    c. the current (x, y) position

    d. the "saved/captured" positions (a 2 x N array growing up to 469)

    a, b and c are all inside the while loop. A switch determines whether 'a' or 'b' is used.

    The arrays are bundled and then sent to the XY graph inside the while loop. I need the graph to stay interactive for the cursor and the data.

    Now, here is how I would like things to run:

    1. The selection of either a or b is loaded into the XY graph (via a switch).

    2. c is updated constantly.

    3. d is updated whenever data is captured (via a switch).

    Basically, I need a and/or b to be loaded only ONCE at initialization (and when I decide to switch from one to the other) and to remain constant. In the same way, d updates only when its switch has been pressed.

    I have failed to find resources on pre-plotting data to the XY graph and selectively drawing live data on top.

    I have attached an outdated version of my front panel as a reference. I would welcome suggestions. Thank you in advance.

    -JLS


  • Grouping library photos by date

    I have a PC with the 64-bit Windows 7 operating system, and I have 64-bit Windows Media Player version 12.0.7601.17514.  Every time I open the library and click the Pictures folder in the navigation pane, I see the images in the library grouped into sections sorted by date, with a line between each group, in descending order. Is this the only view/sort option available for the Pictures library? If not, where is the setting to change it? Thank you.

    Duplicate of the post:

    Media Player does not follow the folder hierarchy

    I use Windows 7 64-bit with 64-bit Windows Media Player 12.0.7601.17514. I have customized the navigation pane (right-click) to include 'Folders'. When I open Media Player to view the library and then click the folder icon under Pictures in the navigation pane, I see "C:" in the library with a (random?) image from one of my photo folders. If I then double-click this picture above "C:", it seems to descend one level into my directory tree, to 'Users', with the same image as before. If I now double-click this photo above 'Users', once more it seems to go down one more level in my directory tree, to %USERPROFILE%, and once again it shows the same image as before.

    At this point, if I click on my user name, Media Player again drills into my directory tree. However, it does not go straight to the Pictures folder as I would expect; instead, it seems to fork in two directions, showing the Pictures folder and the Videos folder. The image on the Pictures folder is the same one it began with, and the image on the Videos folder is a single (random?) snapshot from one of the video files in my directory. I don't understand, and would like to know, why the program forks in two different and apparently unrelated directions at this point.

    I checked the Media Player settings under Tools > Options > Library tab to confirm that I have not checked the option to add video files to the Pictures library. I would like to know if this undesirable behaviour can be avoided. Does anyone have an explanation for it? Also, is there a way for me to decide which image should be used to represent the Pictures folder in the Media Player library, instead of Media Player choosing one arbitrarily?

    Hello

    Welcome to the Microsoft Community Forum!

    Based on the description of the question, I understand that you need information about the sorting options in Windows Media Player.

    I will definitely help you.

    Please follow the steps.

    (a) Click Organize.

    (b) Click Sort by.

    (c) Choose an option from the list (Title, Album, Album artist, Release date, Date taken, Rating, File name); these are the only sort options in Windows Media Player.

    Reference links.

    Windows Media Player frequently asked questions

    http://Windows.Microsoft.com/en-in/Windows7/Windows-Media-Player-frequently-asked-questions

    The Windows Media Player library: frequently asked questions

    http://Windows.Microsoft.com/en-in/Windows7/Windows-Media-Player-library-frequently-asked-questions

    Work with libraries

    http://Windows.Microsoft.com/en-in/Windows7/working-with-libraries

  • Building a ColdFusion report with output grouped by date

    I am building a report in ColdFusion Report Builder where the records that appear are grouped and sorted by date in the CF query.

    I need to be able to display the grouped date as a heading on each page; if the date changes within the results on the same page, I want the new grouped date to appear on that page as a subheading.

    The output I need is the same as the output that would be generated by the following code on a .cfm page:

    <cfoutput query="queryname" group="TransactionDate">
        #TransactionDate#<br>
        <cfoutput>
            #transactionTime#, #transactiontype#, #transactionamount#
        </cfoutput>
    </cfoutput>

    When I try to generate similar output in CF Report Builder, I am not able to have the TransactionDate appear as a subheading when the date changes.

    the report simply shows the lines containing

    #transactionTime#, #transactiontype#, #transactionamount#

    However, when the date changes, no subheading appears to separate the transactions of one date from the next.

    Guidance on how to solve this problem would be appreciated, and/or a reference to an example .cfr file which produces its results grouped within a page as described above.

    The answer to this is to add the field names to the report parameters so that they are referenced in the report as query.fieldname rather than simply fieldname, then group with subheadings as the grouped value changes.

  • How to add Contacts from a Data Import to a shared list / Contact Group

    Hello

    Is it possible to add contacts imported using a Data Import to a specific shared list or contact group? I see that I can do that when I upload contacts from Contacts -> Upload Contacts, but I don't see a way to do this with a Data Import.

    Thanks for any help,

    Jeremy DeVoss

    Autosynchs in E10 only support synchronization from a CRM system, or some other very specific sources such as Omniture Remarketing segments. Unfortunately, it is not possible to create FTP (or SFTP) autosynchs. Creating an autosynch is now done under Setup > Integration > Inbound > Data Sources: create a new Data Source with an external call. As long as your data source is a CRM system, this wizard will create an autosynch at the end.

    For your specific use case, this unfortunately will not work. To accomplish it anyway, here's what I recommend:

    (1) Set up the data import as usual.

    (2) In the imported file, populate a field with a specific value, something like Lead Source, and make the value specific enough that you can filter for these contacts (for example, Lead Source = SFTP import).

    (3) Create a shared filter to find contacts with your specific value (in this example, SFTP import).

    (4) Use your shared filter as a feeder for a program in Program Builder.

    (5) In this program, use the 'Add to Contact Group' program step to add them to your shared list.

    It's a bit of a workaround, but it's the only way I think it would be possible at the moment.

  • Report grouping dates by month end

    Hi guys

    I need to get 6 months of data grouped by month-end dates...

    I need to automate this query so that it always looks at the data for the last 6 months.

    How do I get there?

    The reason for writing the automated query is that this SQL script is plugged into Business Objects reporting.

    I am using PL/SQL.

    Here's the DDL:

    create table #Something
    (
    base_date datetime, connections int
    )

    Insert #Something
    Select '01 jul 2013', 21 union all
    Select '02 jul 2013', 22 union all
    Select '03 jul 2013', 210 union all

    ....

    ...

    Select '31 jul 2013', 498 union all
    Select '01 aug 2013', 44 union all
    Select '05 aug 2013', 66 union all

    ...

    ....

    Select '03 dec 2013', 456 union all

    .

    .

    Select '31 dec 2013', 788

    ..............................................................................

    Desired output

    base_date    connections

    31 jul       500 (not exact, just random numbers)
    31 aug       600
    30 sep       356
    31 oct       676
    30 nov       544
    31 dec       456

    ..............................................................................

    Hope this helps

    Hello

    In Oracle, you can do this:

    SELECT   LAST_DAY (TRUNC (base_date))  AS last_day_of_month
    ,        SUM (connections)             AS total_connections
    FROM     something
    WHERE    base_date >= TRUNC (ADD_MONTHS (SYSDATE, -6), 'MONTH')
    AND      base_date <  TRUNC (SYSDATE, 'MONTH')
    GROUP BY LAST_DAY (TRUNC (base_date))
    ORDER BY last_day_of_month
    ;

    If you would care to post some CREATE TABLE and INSERT statements, together with the results you want from that sample data, then I could test this.

    Are you using Oracle?  #Something (with a # sign at the beginning) is not a valid table name, and datetime is not a valid data type in Oracle.

    Always say what version of Oracle you are using (for example, 11.2.0.2.0).

    See the FAQ forum: https://forums.oracle.com/message/9362002

    The query above works if base_date is a DATE or TIMESTAMP.

    Since it is now January 2014, what are "the last 6 months"?  The above query assumes they are the last 6 complete months, that is, July to December 2013.

    If you mean the current (incomplete) month and the 5 months before it (i.e. August 2013 to January 2014), then add 1 month to the two cutoffs in the WHERE clause:

    WHERE    base_date >= TRUNC (ADD_MONTHS (SYSDATE, -5), 'MONTH')
    AND      base_date <  TRUNC (ADD_MONTHS (SYSDATE,  1), 'MONTH')
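
    For completeness, a sketch of the full query with that adjusted WHERE clause folded in (same table and column names as above):

    SELECT   LAST_DAY (TRUNC (base_date))  AS last_day_of_month
    ,        SUM (connections)             AS total_connections
    FROM     something
    WHERE    base_date >= TRUNC (ADD_MONTHS (SYSDATE, -5), 'MONTH')
    AND      base_date <  TRUNC (ADD_MONTHS (SYSDATE,  1), 'MONTH')
    GROUP BY LAST_DAY (TRUNC (base_date))
    ORDER BY last_day_of_month;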

  • Urgent - 3 data groups in the data model (BI Pub 10g does not show XML tags)

    Hello
    I have a requirement where the data needs to come from 3 groups.
    In the 10g data model, I have seen master-detail... i.e. 2 groups.

    How do I achieve 3 groups?
    When I try, I am not able to see the 3rd group's (Group 3) XML tags.

    Group 1
    -Select inv_id
    -Returns inv_id

    Group 2
    -Select col2
    where col2 =: inv_id

    Group 3
    -Select data where
    col_1 =: col2

    please help

    >
    Group 1
    -Select inv_id
    -Returns inv_id

    Group 2
    -Select col2
    where col2 =: inv_id

    Group 3
    -Select data where
    col_1 =: col2
    >
    That looks ok.

    When I do, I am not able to see the 3rd group's (Group 3) XML tags

    What is the structure of the dataStructure item?
    It can be something like this:

    
    

    or post more information on the structure of the dataStructure item.

    I prefer uniquely identified names for the related columns:

    group 1
    --select inv_id as grp1_ind_id
    --returns inv_id
    
    group 2
    --select col2 as grp2_col2
    where col2=:grp1_ind_id
    
    group 3
    --select data where
    col_1=:grp2_col2
    

    because I have seen cases where columns with the same name do not work properly.

  • Choose the nearest Date from a group of Dates

    Hey people!
    I need to find the closest date to a given date. The closest date in the group may be earlier or later than the date I am trying to match.

    I have two columns: VISIT_DATE and ACTUAL_DATE. The VISIT_DATE column has a number of records with different dates, while the ACTUAL_DATE column has only one record per student.

    Here is an example of dates:
    Visit Date      Actual Date
    ==========================
    01-APR-09     19-MAR-10
    16-NOV-09     19-MAR-10
    17-MAR-10     19-MAR-10
    21-MAR-10     19-MAR-10
    04-APR-11     19-MAR-10
    15-JUN-11     19-MAR-10
    19-SEP-11     19-MAR-10
    24-FEB-12     19-MAR-10
    The nearest dates to 19-MAR-10 are in fact 17-MAR-10 and 21-MAR-10. In this case, I would need to pick up both records. Any help would be greatly appreciated. :-)

    Thank you!
    SQL> select visit, actual
      2  from
      3  (
      4     select visit, actual
      5           ,dense_rank() over (order by abs(visit - actual)) dr
      6     from   t
      7  )
      8  where dr = 1
      9  ;
    
    VISIT                ACTUAL
    -------------------- --------------------
    17-MAR-2010 00:00:00 19-MAR-2010 00:00:00
    21-MAR-2010 00:00:00 19-MAR-2010 00:00:00
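
    If the ranking needs to be done per student rather than across the whole table, the same idea works with a PARTITION BY; a sketch, assuming a hypothetical STUDENT_ID column on the table:

    -- Closest visit(s) to each student's own actual date
    -- (student_id is assumed here; use whatever identifies the student)
    select student_id, visit, actual
    from  (select student_id, visit, actual
                 ,dense_rank() over (partition by student_id
                                     order by abs(visit - actual)) dr
           from   t)
    where dr = 1;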
    
  • Problem with logic groups combining data at the point of importation

    I really hope that you will be able to help me with a problem I've had for the last few days.

    The scenario I have requires a logic group to be created, as the source data must be mapped differently depending on whether it is +ve or -ve, even though the source account is the same.

    I created a Complex logic group and added the corresponding logic group account to the account mapping table. The group uses the rule definition 001806*.

    The accounts that would be affected by this rule are below the values:
    18060000 - 2093096.69
    18060000 2093096.69
    18060005 - 2474955.48
    18060005 2474955.48
    18060015 11319512.13
    18060015-8000000 (a PEAK partner)
    18060015 - 3319512.13 (a PEAK partner)

    The +ve values need to go to an asset account and the -ve values to a liability account.

    At the import stage, I have a zero value which seems to be the sum of the first 4 values; the bottom 3 values then appear on individual lines.

    At the Validate stage, the asset HFM account displays the 3 +ve values (15.8M), although 2 of them apparently net to zero against the first and third -ve values. However, the liability account only shows the bottom 2 -ve values (11.3M), as it is missing the other 2 -ve values. The asset and liability HFM values should be the same.

    In the account mapping, I'm using the following script for the +ve values (it has been reformatted for this forum):
    If varValues(9) > "0" Then Result = "BSA401000" Else Result = "Ignore" End If

    The script for the -ve values is:
    If varValues(9) < "0" Then Result = "BSL625600" Else Result = "Ignore" End If

    I later changed the logic group to be Simple, but I get a similar result, although what I am trying to achieve seems to be detailed in the FDM Administrator's Guide under the heading on creating accounts within Simple logic groups.

    I understand this is a little lengthy, so apologies; if you need more information please let me know and I will be happy to provide it.

    Thank you, James

    You could even set up two mapping rules and avoid the complexity of a logic rule, if you wish. The mapping will continue to the next line if it does not meet the criterion and no result is returned.

    The asset side would have a rule name of vAsset (you can change this, but it must be unique from the next line):
    If varValues(9) > 0 Then Result = "BSA401000"

    and then you add a second line to your map with a rule name of vLiab and check for the flipped sign:

    If varValues(9) < 0 Then Result = "BSL625600"

    In the past I have avoided logic rules as far as possible, because adding them removes the ability to drill back to the source file.

    Regards
    JTF

  • Grouping datastores by ESX host

    Hi guys,

    I'm trying to create a bit of code that will give me some basic details on datastore free space, etc., but group them by which ESX host they are attached to.

    I created a simple table, but I don't know how to proceed with grouping based on the attached ESX host.

    $vmh = Get-Cluster | Get-VMHost

    $ds = @()

    foreach ($ho in ($vmh | Get-Datastore))
    {
        $my = "" | Select-Object Name, FreespaceMB
        $my.Name = $ho.Name
        $my.FreespaceMB = $ho.FreeSpaceMB
        $ds += $my
    }

    Any ideas would be greatly appreciated

    Thank you

    DC

    $ds = @()
    
    Get-Cluster | Get-VMHost | % {
    
    $myHost = $_
    
    $myHost | Get-DataStore | % { 
    
    $out = "" | Select-Object DSName, FreespaceMB, Host
    
    $out.DSName = $_.Name
    
    $out.FreespaceMB = $_.FreespaceMB
    
    $out.Host = $myHost.Name
    
    $ds += $out 
    
    }
    
    } 
    
    $ds | Sort-Object Host
    
