Event listener for data groups

Hello

It's probably a silly question, but does anyone know how to define an event listener for a data group? Is there an 'invoked method' property for the "Custom Layout" presentation control? I don't know how to define a method on a business object so that it shows up in the event listener dropdown.

Good links on the use of data groups would also be appreciated.

Thank you very much in advance.

Yes - when you click on the table/group widget in the form editor, there is a section "Event listener" with a property called "invoked method".

I think you've probably figured this out on your own, but if you have trouble seeing a method in the drop-down list, create a method that has an argument of type Fuego.Util.GroupEvent. If you look at this class, you can see the different types of events that can be received (ADD, INSERT_DOWN, INSERT_UP, REMOVE, and REMOVE_LAST).

Dan
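
As a rough illustration only: the listener boils down to a dispatch on the event type. The Java-style sketch below is a hypothetical model, not the real Fuego.Util.GroupEvent API; only the five type constants come from the answer above.

// Hypothetical model of the dispatch; GroupEventType mirrors the constants
// listed above and is NOT the real Fuego.Util.GroupEvent class.
enum GroupEventType { ADD, INSERT_DOWN, INSERT_UP, REMOVE, REMOVE_LAST }

class GroupListenerSketch {
    void onGroupEvent(GroupEventType type) {
        switch (type) {
            case ADD:         /* a row was appended to the group      */ break;
            case INSERT_UP:   /* a row was inserted above the current */ break;
            case INSERT_DOWN: /* a row was inserted below the current */ break;
            case REMOVE:      /* a row was removed                    */ break;
            case REMOVE_LAST: /* the last row was removed             */ break;
        }
    }
}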

Tags: Fusion Middleware

Similar Questions

  • Extracting tasks/events for datastores with PowerShell

    Hi, I'm looking to extract tasks/events using PowerShell (the same information you find in the VI client under Home > Inventory > Datastores, on the Tasks & Events tabs) to find 'delete' tasks (for example).

    I would like to use the Get-VIEvent cmdlet, but I'm unable to use -Entity (Get-Datastore datastorename).

    I think passing a datastore object to the Get-VIEvent cmdlet is not possible.

    So I use:


    Get-VIEvent -MaxSamples 100000 | Where-Object {$_.FullFormattedMessage -match 'delete'} | Select-Object CreatedTime, UserName, FullFormattedMessage | Export-Csv "c:\export_ds.csv" -NoTypeInformation -UseCulture

    It works, but it is very slow (my environment is very large)... and I've reached the -MaxSamples limit.

    Is there another way, or do I have to use -Start/-Finish to limit the number of events and produce several reports per month?

    Thank you very much

    Thierry

    The current version doesn't have that, but it's a good idea.

    I'll update the function.
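
    In the meantime, a minimal sketch of the monthly-window approach with -Start/-Finish, reusing the filter from above (the date range and file naming are illustrative):

    # Pull events in monthly windows to stay under the -MaxSamples ceiling,
    # writing one CSV per month. Adjust $from/$to to the period you need.
    $from = Get-Date '2015-01-01'
    $to   = Get-Date '2015-07-01'
    $cursor = $from
    while ($cursor -lt $to) {
        $next = $cursor.AddMonths(1)
        Get-VIEvent -Start $cursor -Finish $next -MaxSamples ([int]::MaxValue) |
            Where-Object { $_.FullFormattedMessage -match 'delete' } |
            Select-Object CreatedTime, UserName, FullFormattedMessage |
            Export-Csv ("c:\export_ds_{0:yyyyMM}.csv" -f $cursor) -NoTypeInformation -UseCulture
        $cursor = $next
    }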

  • Large disks for datastores and other files

    Hello

    I was asked to provide a large amount of storage through our VM cluster.  I was asked for between 100 and 200 terabytes, and I was wondering whether there is a way vSphere can cope with this?  From what I've read, you cannot make a datastore of more than 64 TB (32 x 2 TB LUNs), and you can't make a vmdisk of more than 2 TB.

    Is there a way to bypass these limits?  Does raw device mapping allow larger LUNs?  Does anybody know when the 2 TB limit for LUNs and vmdisks will change?

    Thank you!

    If your storage is iSCSI, you can use a software initiator inside the virtual machine and use GPT disks larger than 2 TB.

    If your storage is FC, you could use a dedicated HBA with VMDirectPath, or try using NPIV (but I'm not sure that NPIV can use larger disks).

    André

  • Event for streaming TCP/IP data

    Hi all

    My application currently has 2 devices streaming frames over TCP/IP, at 100 ms and 50 ms data refresh rates, to a PC. I would like to change from polling for data to generating an event when the data arrives, but I'm not sure how to proceed. I currently have 2 devices streaming via TCP/IP to a LabVIEW routine with 2 loops, and they seem to work as expected, but I have to poll at the moment. The situation is not ideal and will not improve once I have 5 or 6 TCP/IP devices all producing data at different rates.

    I previously used VISA for serial data and came across the VISA event function for serial, which greatly improved my code's behavior once I got rid of my fast polling loop.

    I searched for a solution, but so far I can only find the VISA > Advanced > Event Handling palette, which seems to support most communication types except TCP/IP (I hope I'm wrong). I also vaguely remember a conversation where queues or notifiers could be configured to provide the equivalent of an event for data retrieval.

    I guess there are a number of other ways to produce something that looks like an event.

    Advice appreciated.

    Regards,

    Chris

    I don't know of any way to raise an event when TCP data arrives, but I don't know why you need to. There's no need to poll the TCP connection; just set a long (or even infinite) timeout and wait for the data to come to you. Keep a copy of the TCP connection refnum somewhere else, so if you need to kill the TCP read you can do so by closing the TCP connection (which will cause the TCP Read to return an error).

    If you want to generate an event from this, you could feed the string you read from TCP Read into a user event.

    If you need to handle an unknown number of connections and you don't want to loop through all of them polling each one, you can instead create a reentrant VI and launch a new instance of it for each TCP connection. That said, I've never had a problem with polling over several connections, and I have used that pattern quite a bit.

  • Sending a 1D array of test data to MSSQL 2012

    Hello. Feel free to answer any number of the sub-questions you can. Thanks in advance!

    (1) Being new to MSSQL, I don't know how to send a simple 1D array of floating-point numbers to a table in Microsoft SQL 2012. This is as far as I've gotten. I get the error message on the left with the current VI, and the one on the right when I get rid of the time string concatenation and just wire the numbers to the table name input instead.

    2(a) This is my guinea-pig file, running on a guinea-pig database. Eventually I want to send the data from a test procedure that records one piece of data per millisecond - an array usually ends up with approximately 15,000 items, more or less - to the database. Unfortunately, I know to my misfortune that LabVIEW can only send one row at a time. So how on earth do I make a new column for each new piece of data for a test that will run for an indefinite period?

    2(b) Ideally, I would have a table with a column for each test run, with the title of each column being the time and date at which the test was run. Is there a way to "transpose" the data once it's in MSSQL?

    I used fixed data because it's a better illustration of how things work.

    If you need to write two items in one row at a time, then you need to bundle them together.  The error you are getting is because you have wired a double as input but listed two columns.  Notice that my last example wrote a cluster (which can be created with a Bundle node) of two elements.  You should use Bundle to create a cluster of your time and data, then wire this cluster to your insert node.

    A few other comments:

    Generally, Express VIs are a reasonable place to start when you're new to LabVIEW, but you should learn to use shift registers to carry data between loop iterations.

    When bundling the data and time together, you don't need to turn it all into an array first.  Just put the time in one entry and your data in the other.  The cluster you create is a single row of data.

  • Can I reuse event fields for various events?

    Hello

    Can I reuse event fields for various events?

    Thank you

    Hello

    Yes, you can reuse these email fields in different events as long as the field names are the same. Data is populated based on the event's field.

    Thank you

    edynamic - Eloqua expert

  • Building a ColdFusion report with output grouped by date

    I'm building a report in CF Report Builder where the documents that appear are grouped and sorted by date in the CF query.

    I need to be able to display the date as a group heading on each page; if there is a change of date in the results on the same page, I want the subheading for the new date group to appear on that page as a subtitle.

    The output I need is the same as the output that would be generated by the following code in a CFM page:

    <cfoutput query="queryname" group="TransactionDate">
        #TransactionDate#<br>
        <cfoutput>
            #transactionTime#, #transactiontype#, #transactionamount#.
        </cfoutput>
    </cfoutput>

    When I try to generate similar output in CF Report Builder, I'm not able to have the TransactionDate appear as the subheading when the date changes.

    The report simply shows the lines containing:

    #transactionTime#, #transactiontype#, #transactionamount#.

    However, when the date changes, no subheading appears to separate the transactions of one date from the next via a level-3 heading.

    Guidance on how to solve this problem would be appreciated, and/or a reference to an example CFR file which produces its results grouped within a page as described above.

    The answer to that is that you add the field names in the field parameters so they are referenced in the report as query.fieldname, not simply fieldname, and then group with subheading titles as the grouped value changes.

  • Group by date instead of datetime

    Hello

    I'm looking to get some counts that are grouped by date instead of datetime.  For example, I run the query below.

    ALTER SESSION SET NLS_DATE_FORMAT = 'mm/dd/yyyy';

    SELECT count(distinct contact_id) as counts, created as created_date
    FROM   s_con_chrctr
    WHERE  created >= TO_CHAR('07/01/2015') AND created <= TO_CHAR('07/10/2015')
    AND    created_by = '1-J9455J'
    GROUP BY created;

    The results are as follows.

    COUNTS  CREATED_DATE
         1  02/07/2015
         1  02/07/2015
         1  02/07/2015
         1  02/07/2015

    I think this is happening because the column is a datetime column, and it's returning one record for each specific time within the day.  The results I'm looking for are...

    COUNTS  CREATED_DATE
        15  02/07/2015
        17  03/07/2015
        35  04/07/2015
        11  05/07/2015

    How do I change my query to group by date and not by datetime?

    If created is a date, then do not compare it with TO_CHAR literals.

    If you want to skip the time part in your counts, use TRUNC:

    SELECT count(distinct contact_id) as counts, TRUNC(created) as created_date
    FROM   s_con_chrctr
    WHERE  created >= TO_DATE('07/01/2015', 'MM/DD/YYYY') AND created < TO_DATE('07/11/2015', 'MM/DD/YYYY')
    AND    created_by = '1-J9455J'
    GROUP BY TRUNC(created);

  • GROUP BY date range to identify duplicates revisited!

    Good afternoon

    This is a continuation of a previous discussion I created: GROUP BY date range to identify duplicates.

    Your help with the following would be appreciated (sample data below).

    I've highlighted what I expect to be flagged and returned as duplicates below:

    example4.jpg

    Definition of a duplicate (note this is slightly different from the previous post):

    the same account_num

    tran_effective_date a maximum of 20 days apart

    tran_processed_date a maximum of 10 days apart

    the same tran_amount

    However, I do not want a pair returned as duplicates unless both records have tran_priced_date populated.

    So, in light of the foregoing, I don't expect the following account_numbers to be marked as duplicate:

    N100283 - only one of the records has tran_priced_date populated

    N101640 - none of the records have tran_priced_date populated

    N102395 - same as N101640

    N102827 - same as N101640

    N108876 - although both records have tran_priced_date populated, the tran_effective_dates are more than 20 days apart.

    BUT for the rest of the accounts, N100284 and N102396, I want to execute the following logic:

    Compare the 3rd row to the 4th row and ask the following questions:

    Is tran_effective_date a maximum of 20 days apart?

    Is tran_processed_date a maximum of 10 days apart?

    If yes, then report it as a dupe.

    Then compare row 4 to row 5 and ask the same questions, and so on until the last row. When everything is done, I want to consider only the transactions that have a status of Normal, and if the above questions are true for both rows, return them in my result set as dupes.

    I hope that makes sense!

    BEGIN
      EXECUTE IMMEDIATE 'DROP TABLE samp_data';
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -942 THEN
          DBMS_OUTPUT.put_line('');
        ELSE
          RAISE;
        END IF;
    END;
    /
    
    
    CREATE TABLE samp_data (
      ACCOUNT_NUM             VARCHAR2(17),
      TRAN_ID                 NUMBER(10),
      TRAN_TYPE               VARCHAR2(50),
      TRAN_EFFECTIVE_DATE     TIMESTAMP(6),
      TRAN_PROCESSED_DATE     TIMESTAMP(6),
      TRAN_STATUS             VARCHAR2(17),
      TRAN_PRICED_DATE        TIMESTAMP(6),
      TRAN_AMOUNT             NUMBER(13,2)
      );
    
    
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100283',140119178,'Regular With',to_timestamp('01-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.34.235000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),200);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE,TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100283',140158525,'Regular With',to_timestamp('13-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.39.14.090000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Normal', null,200);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140118826,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.19.072000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Normal', to_timestamp('20-MAY-15 03.25.05.438000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140158120,'Regular With',to_timestamp('06-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('23-MAY-15 08.38.42.064000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N100284',140158120,'Regular With',to_timestamp('06-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('02-JUN-15 08.38.42.064000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('31-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),450);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N101640',140118957,'Regular With',to_timestamp('18-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.25.015000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,120);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N101640',140158278,'Regular With',to_timestamp('22-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.56.228000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,130);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102395',140118842,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.19.665000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,250);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102395',140158235,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.53.093000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,250);
    
    
    
    
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140118823,'Regular With',to_timestamp('09-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('18-MAY-15 07.00.18.931000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('19-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('16-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('24-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('16-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('29-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('12-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102396',140158099,'Regular With',to_timestamp('14-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('23-MAY-15 08.38.39.443000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Reversed', to_timestamp('30-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),750);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102827',140118850,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('22-MAY-15 07.00.20.045000000 AM','DD-MON-RR HH.MI.SS.FF AM') , 'Normal',null,157.84);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N102827',140158118,'Regular With',to_timestamp('03-JUN-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('26-MAY-15 08.38.41.861000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', null,157.84);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N108876',139840720,'Regular With',to_timestamp('01-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('11-MAY-15 08.35.34.646000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('20-MAY-15 03.25.05.438000000 AM','DD-MON-RR HH.MI.SS.FF AM'),1000);
    Insert into samp_data (ACCOUNT_NUM,TRAN_ID,TRAN_TYPE,TRAN_EFFECTIVE_DATE,TRAN_PROCESSED_DATE, TRAN_STATUS, TRAN_PRICED_DATE,TRAN_AMOUNT) 
    values ('N108876',139889880,'Regular With',to_timestamp('22-MAY-15 12.00.00.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('12-MAY-15 08.49.29.080000000 AM','DD-MON-RR HH.MI.SS.FF AM'), 'Normal', to_timestamp('21-MAY-15 03.26.18.954000000 AM','DD-MON-RR HH.MI.SS.FF AM'),1000);
    
    
    select * from samp_data
    ORDER BY account_num, tran_effective_date, tran_processed_date;
    

    Please continue the discussion in your original post.
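
    For what it's worth, a hedged sketch (not from the original thread) of how the pairwise comparison could look with analytic functions over the sample data above; the 20/10-day thresholds and the both-priced rule follow the definition given earlier:

    -- Sketch only: flags a 'Normal' row as a dupe of the previous 'Normal'
    -- row for the same account_num and tran_amount, per the rules above.
    SELECT *
    FROM (
        SELECT s.*,
               LAG(tran_effective_date) OVER (PARTITION BY account_num, tran_amount
                                              ORDER BY tran_processed_date) AS prev_effective,
               LAG(tran_processed_date) OVER (PARTITION BY account_num, tran_amount
                                              ORDER BY tran_processed_date) AS prev_processed,
               LAG(tran_priced_date)    OVER (PARTITION BY account_num, tran_amount
                                              ORDER BY tran_processed_date) AS prev_priced
        FROM   samp_data s
        WHERE  tran_status = 'Normal'
    )
    WHERE tran_effective_date - prev_effective <= INTERVAL '20' DAY
      AND tran_processed_date - prev_processed <= INTERVAL '10' DAY
      AND tran_priced_date IS NOT NULL
      AND prev_priced      IS NOT NULL;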

  • FDMEE events for Write-back

    Hi guys,

    Does anyone know if the FDMEE events (for example AftImport, AftLoad, etc.) are available for the write-back functionality?

    Kind regards

    HN

    Hello

    Event scripts are only available for the data loading process.

    If you want to run any script before or after the write-back process, you can create a batch and assign the custom script before and/or after the write-back rule.

    Regards

  • Getting VM events for a given period

    Hello

    I want to pull events for a particular virtual machine for the last 2 months. I want to know who made changes to the VM's configuration (like adding disks, flash, etc.). Also, this command gives results for all virtual machines, but I'm looking for a particular VM, e.g. VM1. Let me know if there is any script or command in the CLI.

    Thank you

    vm2014

    Pipe the output to Export-Csv.

    Get-VIEvent -Start (Get-Date).AddMonths(-2) -MaxSamples ([int]::MaxValue) -Entity (Get-VM someVM) | Export-Csv -NoTypeInformation 'VM-Events.csv'
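
    If you only care about configuration changes, a hedged variation (treating VmReconfiguredEvent as the event type that covers hardware changes is an assumption):

    # Keep only reconfiguration events for one VM over the last 2 months.
    Get-VIEvent -Entity (Get-VM VM1) -Start (Get-Date).AddMonths(-2) -MaxSamples ([int]::MaxValue) |
        Where-Object { $_ -is [VMware.Vim.VmReconfiguredEvent] } |
        Select-Object CreatedTime, UserName, FullFormattedMessage |
        Export-Csv 'VM1-ConfigEvents.csv' -NoTypeInformation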

  • Difference in the number of records for the same date - 11gR2

    Guys - 11gR2 on Windows 2005, 64-bit.

    BILLING_RECORD_KPN_ESP - a monthly partitioned table.
    BILLING_RECORD_IDX#DATE - a local index on "charge_date" of the table above.

    SELECT /*+ index(BILLING_RECORD_KPN_ESP BILLING_RECORD_IDX#DATE) */
           trunc(CHARGE_DATE) CHARGE_DATE,
           count(1) Record_count
    FROM   "RATOR_CDR"."BILLING_RECORD_KPN_ESP"
    WHERE  CHARGE_DATE = '20-JAN-2013'
    GROUP BY trunc(CHARGE_DATE);

    CHARGE_DATE        RECORD_COUNT
    ------------------ ------------
    20-JAN-13                  2041   ->> some records here

    Here I can see only 2041 records for Jan 20, but the query below shows 192610 for the same date.

    Why this difference in the number of records?

    SELECT /*+ index(BILLING_RECORD_KPN_ESP BILLING_RECORD_IDX#DATE) */
           trunc(CHARGE_DATE) CHARGE_DATE,
           count(1) Record_count
    FROM   "RATOR_CDR"."BILLING_RECORD_KPN_ESP"
    WHERE  CHARGE_DATE > '20-JAN-2013'
    GROUP BY trunc(CHARGE_DATE)
    ORDER BY trunc(CHARGE_DATE);

    CHARGE_DATE        RECORD_COUNT
    ------------------ ------------
    20-JAN-13                192610   ->> more records here
    21-JAN-13                463067
    22-JAN-13                520041
    23-JAN-13                451212
    24-JAN-13                463273
    25-JAN-13                403276
    26-JAN-13                112077
    27-JAN-13                 10478
    28-JAN-13                 39158

    Thank you!

    Because in the second example you also select rows that have a nonzero time component.

    The first example selects only the rows whose time is 00:00:00.

    (By the way, you should ask questions like this in the SQL forum.)
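
    A short illustration of the difference, using the question's charge_date column (a half-open range is the usual way to cover a whole day):

    -- Matches only rows stamped exactly midnight on 20-JAN-13:
    WHERE charge_date = TO_DATE('20-JAN-2013', 'DD-MON-YYYY')

    -- Matches every row on 20-JAN-13, whatever its time component:
    WHERE charge_date >= TO_DATE('20-JAN-2013', 'DD-MON-YYYY')
      AND charge_date <  TO_DATE('21-JAN-2013', 'DD-MON-YYYY')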

  • BETWEEN operator for a group-by-date column in APEX interactive reports

    Hello

    In the interactive report filter, I could not find the "between" operator for the date field (I have a "group by date" in my source SQL query). I wonder, is this because of the group by date clause? Is there a way to show the "between" operator in the interactive report filter?

    Thank you

    I just opened an existing IR report, went to Actions > Filter, selected a date column, and found "between" at the bottom of the list of operators... Are you sure the date you want to filter on is a real date column?

    Thank you

    Tony Miller
    Webster, TX

    What if you were really stalking a paranoid schizophrenic... would they know?

    If this answers your question, please mark the thread as closed and assign points where earned...

  • Groups on top blocking mouse listeners of groups below

    Hi all

    I have a few groups on top of one another, and what I would like is to have the bottom group able to use MouseEvents. Unfortunately, I have found that if one group is on top of another, the bottom group does not receive these events.  Here's pseudocode for an example.

    <s:Group id="parentGroup">

        <s:Group x="0" y="0" width="100%" height="100%" id="bottomGroup"
                 mouseDown="bottomGroupMouseDownHandler(event)" />

        <s:Group x="0" y="0" width="100%" height="100%" id="topGroup"
                 mouseDown="topGroupMouseDownHandler(event)" />

    </s:Group>

    private function bottomGroupMouseDownHandler(event:Event):void
    {
        // does not fire
    }

    private function topGroupMouseDownHandler(event:Event):void
    {
        // fires
    }

    Does anyone know how I can get the bottom group to catch MouseEvents, or am I solving a problem that does not really need to be solved, and the solution lies in the design? Points will be awarded to helpful/correct responses.

    Sincerely,

    Ubu

    You may be looking for mouseEnabledWhereTransparent, for example (the original snippet was mangled by the forum; this is a minimal reconstruction):

    <?xml version="1.0"?>
    <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
                   xmlns:s="library://ns.adobe.com/flex/spark">

        <!-- Bottom group: fills the stage and listens for mouseDown. -->
        <s:Group id="bottomGroup" width="100%" height="100%"
                 mouseDown="trace('bottom group clicked')">
            <s:Rect width="100%" height="100%">
                <s:fill><s:SolidColor color="0xCCCCCC"/></s:fill>
            </s:Rect>
        </s:Group>

        <!-- With mouseEnabledWhereTransparent="false", clicks on the transparent
             parts of this group fall through to bottomGroup underneath. -->
        <s:Group id="topGroup" width="100%" height="100%"
                 mouseEnabledWhereTransparent="false"
                 mouseDown="trace('top group clicked')">
            <s:Rect width="50%" height="50%">
                <s:fill><s:SolidColor color="0x9999FF"/></s:fill>
            </s:Rect>
        </s:Group>

    </s:Application>

  • View threads AND group by date/conversation

    Hello

    I'm trying to have Thunderbird display messages grouped by date (with a group for today, yesterday, last week, etc.) and, within these groups, show email conversations. A conversation that began three weeks ago but received a new email today should show up in today's group, starting from the first email of the conversation.
    I have not figured out whether this is possible, as Grouped By (Date) and Threads seem to be mutually exclusive. Any ideas?

    Thank you

    As you say, there is only one grouping at a time. So I guess you're out of luck.

    You could file an enhancement bug report and see if one of the developers is interested: https://Bugzilla.Mozilla.org/
