How to avoid the accumulation of incremental result records?

I have a working solution that uses composite-event semantics to correlate multiple appraisal records in an overall assessment that happens over a long period of time. In the simplest case, I just want to compare each new appraisal record with the most recent older one, and I've set up my own PROCESS_RULES / CONSUME_PRIM_EVENTS handling that captures the correlation rule I want and consumes the primitive events of the matched appraisal records.

The problem is that the matching logic defined in the rule semantics results in a very large number of incremental result rows in the RLM$PRMINCRSLT_* table. Even for appraisal records that do not match any other event, I always get two rows in the table, and the count seems to grow exponentially when there are matches. My manual consumption only gets rid of the matches that I'm actually interested in, leaving all these rows in the table that are never consumed. As I keep adding and processing appraisals, each appraisal is processed more and more slowly. Depending on the number of matched pairs in my data set, 100,000 primitive events leave between 195,000 and 225,000 records in RLM$PRMINCRSLT, even after I've consumed the pairs I am interested in.

Is there something fundamentally wrong with the way I'm trying to set up my matches? Or are match-based composite events only suitable for primitive events with a limited lifetime? Here is my setup right now:

BEGIN
DBMS_RLMGR.CREATE_EVENT_STRUCT (event_struct => 'AppraisalPair');

DBMS_RLMGR.ADD_ELEMENTARY_ATTRIBUTE (
    event_struct => 'AppraisalPair',
    attr_name    => 'appr1',
    tab_alias    => RLM$TABLE_ALIAS ('APPRAISALTEST'));

DBMS_RLMGR.ADD_ELEMENTARY_ATTRIBUTE (
    event_struct => 'AppraisalPair',
    attr_name    => 'appr2',
    tab_alias    => RLM$TABLE_ALIAS ('APPRAISALTEST'));
END;
/

BEGIN
DBMS_RLMGR.CREATE_RULE_CLASS (
    rule_class   => 'AppraisalPairRC',
    event_struct => 'AppraisalPair',
    action_cbk   => 'AppraisalPairCBK',
    actprf_spec  => 'EVENT_TYPE VARCHAR2(20), EVENT_PARAM VARCHAR2(20)',
    rslt_viewnm  => 'NewAppraisalPairs',
    rlcls_prop   => '<composite equal="appr1.ADDRESS_ID, appr2.ADDRESS_ID" consumption="RULE"/>');
END;
/

SET DEFINE OFF
INSERT INTO AppraisalPairRC (rlm$ruleid, score_component_type, score_value, rlm$rulecond)
VALUES ('Rule1', 'ABC', 'DEF',
'<condition>
  <and join="Appr2.APPRAISAL_DATE &gt; Appr1.APPRAISAL_DATE and Appr2.VALUE &gt; Appr1.VALUE">
    <object name="Appr1" />
    <object name="Appr2" />
  </and>
</condition>');

Hello

The way the rule is written, each incoming event could be a candidate for both the first appraisal record and the second appraisal record. Although your application may generate the appraisal dates in ascending order, that is not implicit in the rule. For example, either of the two partial matches captured in the incremental results could fire the rule, depending on the appraisal date of the next incoming event. So there is no easy way to trim the incremental results table to capture only one role and not the other. And if matches on the address IDs are rare, even keeping single-row state for all the non-matching events may not be the best option. Unless you have more scenarios (rules) you want to model using the event structure, you may be better off using simple events with user-defined functions (to check for older appraisals). However, if you have more rules that you want to test against matching appraisal records, you could redesign the application to pull the older records for each incoming event and add them to the rule class as events of another type. That way, your events exist only for the duration of the session/call and they do not contribute to any incremental state.
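A minimal sketch of the simple-event alternative described above; the event structure, function and column names here are assumptions for illustration, not from the original setup:

BEGIN
  DBMS_RLMGR.CREATE_EVENT_STRUCT (event_struct => 'AppraisalEvt');

  DBMS_RLMGR.ADD_ELEMENTARY_ATTRIBUTE (
      event_struct => 'AppraisalEvt',
      attr_name    => 'address_id',
      attr_type    => 'NUMBER');

  DBMS_RLMGR.ADD_ELEMENTARY_ATTRIBUTE (
      event_struct => 'AppraisalEvt',
      attr_name    => 'appraisal_date',
      attr_type    => 'DATE');
END;
/

-- A user-defined function the simple-event rule condition can call: it looks
-- up older appraisals for the same address directly, so no composite state
-- ever accumulates in RLM$PRMINCRSLT.
CREATE OR REPLACE FUNCTION has_older_appraisal (
    p_address_id IN NUMBER,
    p_date       IN DATE) RETURN NUMBER
IS
    l_cnt NUMBER;
BEGIN
    SELECT COUNT (*)
      INTO l_cnt
      FROM appraisaltest
     WHERE address_id = p_address_id
       AND appraisal_date < p_date
       AND ROWNUM = 1;
    RETURN l_cnt;   -- 1 if at least one older appraisal exists, else 0
END;
/

The rule condition then becomes an ordinary predicate such as has_older_appraisal(address_id, appraisal_date) = 1, and each event can be consumed as soon as it has been processed.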

Hope this helps,
-Aravind.

Tags: Oracle

Similar Questions

  • How do I avoid the accumulation of color / opacity where two brush strokes overlap? In other words, I want to use more than one path with the Paintbrush tool, but see no additive effect where strokes overlap. What am I missing?

    I use it all the time. Turn your opacity, density and flow all up to 100%.

    Benjamin

  • Displaying counts for dimension values in results records.

    Hi experts,


    In my application, I want to display counts for dimension values in the results records. Suppose the results include 50 records, and of those 50 records, 15 have color = red. So I want to display the counts with the color refinement, like this:

    Color:
    Red (15)
    Silver (5)

    Likewise for the other dimensions as well. Going by the documentation, I guess the "Drs" parameter is there to achieve this. If this is true, can you guys explain how the required functionality can be obtained with the endeca_jspref application, or, using the presentation API, what kind of code should be written.

    Any input/ideas will be highly appreciated.

    Kind regards
    Hoque

    Hi Hoque,

    If you are using Java, the code below will help you.
    You need to get the list of refinements for a dimension:

    DimValList refs = dimension.getRefinements();

    then

    for (int k = 0; k < refs.size(); k++)
    {
        DimVal ref = refs.getDimValue(k);
        PropertyMap pmap = ref.getProperties();
        String dimensionStats = "";
        if (pmap.get("DGraph.Bins") != null)
        {
            dimensionStats = "(" + pmap.get("DGraph.Bins") + ")";
        }
    }

    Note: please enable the "compute refinement statistics" option for the dimension in Developer Studio.

    Thank you
    Sunil

  • How to avoid NULL records from the second table when using a join

    I have the following two tables:

    Table name: Emp
    EmpNo  EmpName  Salary  Dept  DeptLocation
    1      Lar      1000    1     NULL
    2      Dai      2000    2     NULL
    3      Mar      3000    3     C
    4      Apr      4000    4     NULL

    Table name: Dept
    DeptNo  DeptName  DeptLocation
    1       HR        A
    2       Dev       B
    2       Dev       NULL
    3       Test      NULL
    4       End       NULL


    I am trying to get the following result:
    EmpNo  EmpName  Salary  DeptName  DeptLocation
    1      Lar      1000    HR        A
    2      Dai      2000    Dev       B
    3      Mar      3000    Test      C
    4      Apr      4000    End       NULL


    Rules:
    -Get all records matching Emp and Dept on DeptNo
    -If the Dept table has more than one entry for the same DeptNo, then take the record whose DeptLocation is not NULL and skip the record whose DeptLocation is NULL
    -Also get the matching Emp and Dept records on DeptNo where DeptLocation is NULL and the DeptNo exists only once in Dept


    Thanks in advance for your suggestions.

    Hello

    So when deptlocation is in both tables but not the same, you want to take the value from the emp table, not the dept table.

    In this case, reverse the NVL arguments:

    WITH got_rnk AS
    (
         SELECT deptno, deptname, deptlocation
         ,      DENSE_RANK () OVER ( PARTITION BY deptno
                                     ORDER BY     NVL2 ( deptlocation
                                                       , 1
                                                       , 2
                                                       )
                                   ) AS rnk
         FROM   dept
    )
    SELECT e.empno
    ,      e.empname
    ,      e.salary
    ,      r.deptno
    ,      r.deptname
    ,      NVL ( e.deptlocation          -- Changed
               , r.deptlocation          -- Changed
               ) AS deptlocation
    FROM   emp     e
    JOIN   got_rnk r ON e.dept = r.deptno
    WHERE  r.rnk = 1
    ;
    

    Apart from the two lines marked "Changed", it's the same query I posted earlier.
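    To see why the NVL2 ordering does the job, here is an added illustration using the sample Dept rows above:

    SELECT deptno, deptlocation
    ,      DENSE_RANK () OVER ( PARTITION BY deptno
                                ORDER BY     NVL2 (deptlocation, 1, 2)
                              ) AS rnk
    FROM   dept;

    -- For deptno 2, the row with deptlocation 'B' sorts first (NVL2 returns 1)
    -- and gets rnk = 1; the duplicate row with NULL gets rnk = 2 and is removed
    -- by WHERE rnk = 1. For deptno 3 and 4 the only row is the NULL one, so it
    -- still ranks 1 and survives, exactly as the rules require.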

  • Camileo S10 - how to avoid the information display during video playback on TV

    I just received my Camileo S10... very nice indeed... but how do I stop the track and playback data being shown on the screen during playback on my TV...?

    Go to movie mode, then before you start playback press the OK button. This also works when you record. You must do this before playing or recording.

  • Using BULK COLLECT INTO with LIMIT to avoid running out of TEMP tablespace?

    Hi all

    I want to know if using BULK COLLECT INTO with LIMIT will help avoid running out of TEMP tablespace.

    We use Oracle 11g R1.

    I am assigned the task of creating logging for all the tables in a query of the APEX application.

    I create procedures that execute some SQL statements to do a CTAS (Create Table As Select), and then create triggers on those tables.

    We have about three tables with more than 26 million records.

    It seemed to run fine until we reached a table with more than 15 million records, when we got an error saying we had run out of TEMP tablespace.

    I googled the topic and found these tips:

    Use NOLOGGING

    Use PARALLEL

    Use BULK COLLECT INTO with LIMIT

    However, the questions those tips address are usually about running out of memory rather than running out of TEMP tablespace.

    I'm just a junior developer and have not dealt with tables of more than 10 million records at a time like this before.

    The database support is outsourced, so we try to keep contact with the DBA as minimal as possible. My manager asked me to find a solution without asking the DBA to extend the TEMP tablespace.

    I wrote a few BULK COLLECT INTO loops to insert about 300,000 rows at a time on the development environment. It seems to work.

    But the code has only been run against a 4,000,000-record table. I tried to add more data to the test table, but yet again we ran out of tablespace on DEV (this time it's a data tablespace, not TEMP).

    I'll give it a go against the 26-million-record table in Production this weekend. I just want to know if it is worth trying.

    Thanks for reading this.

    Ann

    I really needed to check that you did not have huge row sizes (like several KB per row); they are not too bad at all, which is good!

    A good rule of thumb for maximizing the value of the LIMIT clause is to see how much memory you can afford to consume in the PGA (to reduce the number of fetch and FORALL calls, and therefore the context switches) and adjust the limit to be as close to that amount as possible.

    Use the routines below to check which threshold value would be best suited to your system, because it depends on your memory allocation and CPU consumption. Flex it, based on your PGA limits, as row lengths vary, but this method will get you a good order of magnitude.

    CREATE OR REPLACE PROCEDURE show_pga_memory (context_in IN VARCHAR2 DEFAULT NULL)
    IS
       l_memory NUMBER;
    BEGIN
       SELECT st.value
         INTO l_memory
         FROM SYS.v_$session se, SYS.v_$sesstat st, SYS.v_$statname nm
        WHERE se.audsid = USERENV ('SESSIONID')
          AND st.statistic# = nm.statistic#
          AND se.sid = st.sid
          AND nm.name = 'session pga memory';

       DBMS_OUTPUT.put_line (CASE
                                WHEN context_in IS NULL
                                   THEN NULL
                                ELSE context_in || ' - '
                             END
                             || 'PGA memory used in session = '
                             || TO_CHAR (l_memory)
                            );
    END show_pga_memory;
    /

    DECLARE
       PROCEDURE fetch_all_rows (limit_in IN PLS_INTEGER)
       IS
          CURSOR source_cur
          IS
             SELECT *
               FROM your_table;

          TYPE source_aat IS TABLE OF source_cur%ROWTYPE
             INDEX BY PLS_INTEGER;

          l_source source_aat;
          l_start  PLS_INTEGER;
          l_end    PLS_INTEGER;
       BEGIN
          DBMS_SESSION.free_unused_user_memory;
          show_pga_memory (limit_in || ' - BEFORE');

          l_start := DBMS_UTILITY.get_cpu_time;

          OPEN source_cur;

          LOOP
             FETCH source_cur BULK COLLECT INTO l_source LIMIT limit_in;
             EXIT WHEN l_source.COUNT = 0;
          END LOOP;

          CLOSE source_cur;

          l_end := DBMS_UTILITY.get_cpu_time;

          DBMS_OUTPUT.put_line ('Elapsed CPU time for limit of '
                                || limit_in
                                || ' = '
                                || TO_CHAR (l_end - l_start)
                               );

          show_pga_memory (limit_in || ' - AFTER');
       END fetch_all_rows;
    BEGIN
       fetch_all_rows (20000);
       fetch_all_rows (40000);
       fetch_all_rows (60000);
       fetch_all_rows (80000);
       fetch_all_rows (100000);
       fetch_all_rows (150000);
       fetch_all_rows (250000);
       -- etc.
    END;
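
    As an added follow-up sketch (not from the original reply): once the harness above has suggested a workable LIMIT, the batched load itself can look like the block below. The names your_table and your_table_log are assumptions, and committing once at the end (rather than per batch) is a deliberate choice to keep the load atomic.

    DECLARE
       CURSOR source_cur IS
          SELECT * FROM your_table;                 -- assumed source table

       TYPE source_aat IS TABLE OF source_cur%ROWTYPE
          INDEX BY PLS_INTEGER;

       l_source source_aat;
       c_limit  CONSTANT PLS_INTEGER := 100000;     -- value chosen with the harness above
    BEGIN
       OPEN source_cur;

       LOOP
          FETCH source_cur BULK COLLECT INTO l_source LIMIT c_limit;
          EXIT WHEN l_source.COUNT = 0;

          -- One bulk insert per fetched batch: only c_limit rows are ever held in the PGA.
          FORALL i IN 1 .. l_source.COUNT
             INSERT INTO your_table_log VALUES l_source (i);   -- assumed target table
       END LOOP;

       CLOSE source_cur;
       COMMIT;
    END;
    /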

  • Any way to avoid pre-rendering?

    Hello community,

    I have been working with Adobe Audition since 5.5 and I've installed the updates as they came out.

    The music I make usually consists of 10 to 15 tracks, 32-bit (float) at 192000 Hz.

    The only way for me to edit without hearing pops and crackles is to pre-render everything. I have Task Manager up while I work and it doesn't look like my CPU is working that hard. I have an i7-4930K overclocked to 4.5 GHz, 32 GB of RAM at 2400 MHz, the operating system is on a RAID-0 configuration, Audition is on a separate SSD and I have the media files on another hard drive.

    I was wondering if there is a way for me to avoid pre-rendering and still not hear the pops and crackles.

    Do I need a sound card to avoid this problem?

    Have I set up my audio with bad settings?

    Is my drive configuration not optimized correctly?

    Thanks for the help,

    Arren

    TheCheeseling wrote:

    The music I make usually consists of 10 to 15 tracks, 32-bit (float) at 192000 Hz.

    Is this music for bats? You're asking the streaming system to do at least twice as much work as even a desperate optimist working at 96k would, and even they record nothing but noise on half of their disk space!

    The only way for me to edit without hearing pops and crackles is to pre-render everything. I have Task Manager up while I work and it doesn't look like my CPU is working that hard. I have an i7-4930K overclocked to 4.5 GHz, 32 GB of RAM at 2400 MHz, the operating system is on a RAID-0 configuration, Audition is on a separate SSD and I have the media files on another hard drive.

    I was wondering if there is a way for me to avoid pre-rendering and still not hear the pops and crackles.

    Do I need a sound card to avoid this problem?

    Have I set up my audio with bad settings?

    Is my drive configuration not optimized correctly?

    The bit I put in bold frightens me most. Are you seriously trying to use onboard sound at 192k sampling? That's positively crazy! Onboard sound devices are of very poor quality and almost certainly don't work natively at that rate anyway, so there will be on-the-fly resampling going on. What this does to the sound quality is beyond comprehension.

    Humans can hear (if they are really young) up to about 20 kHz, and this figure falls considerably in adults due to presbycusis. Sampling audio past the highest point humans can hear requires a sampling frequency of no more than 44.1 kHz. There were old arguments about higher sampling rates "sounding" better, but with modern audio devices that oversample instead of using a brick-wall filter, there is absolutely no benefit to anyone in going beyond that. And the sample rate only affects the highest frequency that can be recorded (half the sampling frequency, known as the Nyquist frequency); it doesn't really have anything to do with "quality" or anything else.

    What this means is that you can save your drives a lot of hard work. The reason you hear clicks even though the computer checks out fine is simply that you're trying to stream four times more data than you need. The other thing I discovered recently is that overclocking is sometimes detrimental to system performance in strange ways. If the processor isn't working hard, then you simply have no need to overclock it; you're just making it run hotter!

    If you want to do serious work in Audition at all, then yes, you need an external audio device. The quality of these, even the least expensive, is significantly better than any onboard device.

  • How to force a return, even if the SQL cannot find a record

    Hello
    I'll try to explain what I'm looking for:
    I have a SQL query (Oracle 10g) like

    SELECT SysT, a, b FROM t WHERE SysT = 0

    In table t there is no record with SysT = 0, so I don't get anything back from the SQL. That's OK, but I need to get something like 0, '', '' as a return. I tried with NVL(SysT,0)... but that seems not to be the solution.

    Does anyone has an idea how to solve this problem?

    Best regards
    Carsten

    Hi, Carsten,

    Another way to do this is to use an outer join:

    SELECT  0 AS SysT, a, b
    FROM               dual
    LEFT OUTER JOIN  t    ON  t.SysT  = 0
    ;
    

    UNION, as Saurabh suggested, is a way to do it. If the combination (SysT, a, b) is not unique in table t, then you need to use UNION ALL , which is more efficient anyway, to avoid losing rows.
    If the original query is longer than what you have posted, you will not want to repeat it. Write it only once, in a WITH clause, like this:

    WITH       original_query     AS
    (
         SELECT  SysT, a, b
         FROM      t
         WHERE      SysT     = 0
    )
    SELECT     *
    FROM     original_query
        UNION ALL
    SELECT     0 AS SysT, a, b
    FROM     dual
    WHERE     NOT EXISTS  (
                          SELECT     1
                   FROM     original_query
                      )
    ;
    

    Regardless of how complicated the original query is, the EXISTS subquery is just 'SELECT 1 FROM original_query'.

    Edited by: Frank Kulash, December 19, 2011 06:21

  • accumulation at the quarter level

    Hello

    In the web form, I've entered data in the first 3 months (January, February, March)... I saved... but the accumulation does not happen at Q1. Q1 only takes the values of March.

    Is this a Classic Planning application? If so, here are the steps for version 11.1.1.3: http://download.oracle.com/docs/cd/E12825_01/epm.111/hp_admin/mem_prop.html

  • Governor limit exceeded in cube generation (Maximum data records exceeded)

    Hello

    I have a pivot table that contains about 30 measures, and the rows are also measures. The columns are the last 12 months. I get the following error when trying to view the results:

    Governor limit exceeded in cube generation (Maximum data records exceeded).
    Error details
    Error codes: QBVC92JY


    I checked instanceconfig.xml for the pivot view and these settings are already set to high values:

    <CubeMaxRecords>100000</CubeMaxRecords>
    <CubeMaxPopulatedCells>100000</CubeMaxPopulatedCells>
    <PivotView>
       <MaxVisibleColumns>5000</MaxVisibleColumns>
       <MaxVisiblePages>5000</MaxVisiblePages>
       <MaxVisibleRows>100000</MaxVisibleRows>
       <MaxVisibleSections>5000</MaxVisibleSections>
    </PivotView>

    I do not know why this error pops up, as the data set is not large: there are 30 rows and 12 columns.

    I followed http://obiee101.blogspot.com/2008/02/obiee-controling-pivot-view-behavior.html

    Can anyone help?

    Thank you

    Hello

    Increase the values of CubeMaxRecords and CubeMaxPopulatedCells and check it out:

    <CubeMaxRecords>600000</CubeMaxRecords>
    <CubeMaxPopulatedCells>600000</CubeMaxPopulatedCells>

    Cheers,
    Aravind

  • Load the page with a selected record

    I have a PHP page that is linked to a MySQL database with 2 recordsets. The first presents a simple list of records, each with a link that presents the details of that record in a table below. Everything works fine, but on load the details table is, of course, an empty shell until the user clicks a URL link in the list, which loads the record into the table.

    I'd like the page to load with a sample record in it so that it looks better; preferably a record at random, but if not, the first in the list. What is the easiest way to achieve this?

    Ah OK, so it seems that the first recordset is filtered on a certain parameter? You display several records from the first recordset in a repeat region? And the second table shows the details of a single record, but there should always be a record in the first recordset. Hmmmm. The first thing that comes to mind is to set a variable and fill it with the ID value of the first record from the first recordset, then use this variable as the default in the second recordset. What I am not sure of is this: when you fill the variable, whether it will just use the first record in the set, or whether, if the recordset actually returns several records with different values for each ID, it may raise an error.

    I don't usually use the result of one recordset to filter a second, but I have done it; only, though, when the first recordset returns a single record.
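
    As an added illustration of the fallback idea (the table and column names here are made up): in MySQL you can seed the detail recordset with a random record whenever no ID is passed in the URL, for example:

    -- Pick one record at random to show by default. ORDER BY RAND() scans the
    -- whole table, which is fine for small lists.
    SELECT id
    FROM   records
    ORDER BY RAND()
    LIMIT  1;

    The id this returns can then be used as the default value of the detail recordset's filter parameter, so the table is never empty on first load.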

  • Initialize the dashboard with no results

    Hi all

    I have a number of dashboards that, if left to run without parameters, will return too much data and look a bit messy overall; users will almost never run the report with absolutely no filtering/prompts.

    I remember reading a blog post somewhere (but I can't find it now) that detailed a method to initialize the dashboard with no data returned until the user presses the 'Go' button for the first time. Does anyone know an approach similar to this, or have a link to a place that I can reference?

    Thank you

    K

    (1) Have a default value for the prompt which has no chance of appearing in your report.

    This will produce a report with no records. From here you have two options for your dashboard.

    First option:

    (2) Use guided navigation on the section in which your report is located, and navigate to the report to see if it returns rows.

    (3) Add a text object in a separate section just below the one above. Use guided navigation on this section to display it if the report returns NO rows.

    (4) In the text window, type the following. Make sure that you select the HTML checkbox.

    Please enter something in the prompt

    Second option:

    (2) Set up the "No Results" view of your report, so that when no record is found, the message is displayed.

  • How to find the number of search results

    Hello world

    Suppose you have a form on the web with 2 tab pages, one for the dept table and the other for the emp table, and a non-database text item in the emp tab page. I need, when I query on the dept tab page, to go to the emp page and display the number of search results in that text item.

    Any help is appreciated.

    Thank you

    Hello
    What happens if you create a non-database text item with the Number data type in the emp block and set its Number of Records Displayed property to 1.
    Then set Calculation Mode to Summary.
    Then assign the Count summary function.
    Then choose the EMP block in the Summarized Block list and choose any column in the Summarized Item list.

    Then run the form.

    -Clément

  • How to avoid using UNION to combine several SQLs into one?

    Hi all

    I wrote the SQL below to get a list of statement audit options which do not have auditing defined on them.

    The SQL works fine and gives the right output, but it's not efficient because I am hitting dba_stmt_audit_opts several times.

    Is there any way I can avoid the UNION and get the same result by hitting dba_stmt_audit_opts once or twice?

    Thanks for your help,
    Hozy

    ------------------------------------------
    SELECT * FROM
    (
    SELECT 'audit_alter_any_table' AS property, 'ALTER ANY TABLE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'ALTER ANY TABLE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_alter_user' AS property, 'ALTER USER' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'ALTER USER'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_create_role' AS property, 'CREATE ROLE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'CREATE ROLE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_create_user' AS property, 'CREATE USER' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'CREATE USER'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_create_session' AS property, 'CREATE SESSION' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'CREATE SESSION'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_drop_any_proc' AS property, 'DROP ANY PROCEDURE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'DROP ANY PROCEDURE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_drop_any_table' AS property, 'DROP ANY TABLE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'DROP ANY TABLE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_grant_any_priv' AS property, 'GRANT ANY PRIVILEGE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'GRANT ANY PRIVILEGE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_drop_any_role' AS property, 'DROP ANY ROLE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'DROP ANY ROLE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_execute_proc' AS property, 'EXECUTE PROCEDURE' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'PROCEDURE'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_select_any_dic' AS property, 'SELECT ANY DICTIONARY' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'SELECT ANY DICTIONARY'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_grant_any_obj' AS property, 'GRANT ANY OBJECT' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'GRANT ANY OBJECT'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_create_any_lib' AS property, 'CREATE ANY LIBRARY' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'CREATE ANY LIBRARY'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    UNION
    SELECT 'audit_create_lib' AS property, 'CREATE LIBRARY' AS value, 'NA' AS value2
    FROM   dba_stmt_audit_opts
    WHERE  audit_option = 'CREATE LIBRARY'
    AND    success    = 'BY ACCESS'
    AND    failure    = 'BY ACCESS'
    AND    user_name  IS NULL
    AND    proxy_name IS NULL
    HAVING COUNT (*) = 0
    );

    Maybe something like that is what you are looking for:

    with options as (select 'ALTER ANY TABLE' col1 from dual union all
                     select 'ALTER USER' col1 from dual union all
                     .... -- all your audit_options
                     )
    select col1
    from   options
    minus
    select audit_option
    from   dba_stmt_audit_opts;
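
    One added note: the MINUS sketch above drops the original checks on success, failure, user_name and proxy_name. If those conditions still matter, they can be folded into the subtracted side, along the same lines:

    with options as (select 'ALTER ANY TABLE' audit_option from dual union all
                     select 'ALTER USER' audit_option from dual
                     -- ... one row per audit option, as above
                    )
    select audit_option
    from   options
    minus
    select audit_option
    from   dba_stmt_audit_opts
    where  success    = 'BY ACCESS'
    and    failure    = 'BY ACCESS'
    and    user_name  is null
    and    proxy_name is null;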
    
  • avoiding multiple logging of the same error in WHEN OTHERS handlers

    If I have nested proc calls, such as JAVA calling p1 calling p2 calling p3 calling p4, and each of them calls a stored procedure in its error handler to log the error and re-raise so that the Java layer is notified of the exception, then the error is logged several times, once in each nested proc, as the exception propagates upward to the Java layer and is trapped and re-raised by the WHEN OTHERS in each proc. Even if I take Tom Kyte's advice and limit the WHEN OTHERS to p1 alone, the outermost proc, I would still need to log the exception exactly where it happens, because I have all the context information, say, in p3. But the WHEN OTHERS in p1 would then log the exception a second time.

    Any suggestions / real successful implementations for this problem?

    user4900730 wrote:

    That's exactly my question. I want to immediately log the error, say in p3. But how do I inform p2 and p1 that it has already been logged?

    You can use a global variable (defined in a package spec and usable by any PL/SQL code in the session), for example a boolean.

    If TRUE, then the LogMessage() procedure called from an exception handler logs the exception. If FALSE, it does not.

    You set it to TRUE by default. The first time LogMessage() is called in a session, it logs the exception and then sets the variable to FALSE, preventing any subsequent handler from logging it again.
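
    A minimal sketch of that log-once flag (the package, procedure and table names are illustrative, not from this thread):

    CREATE OR REPLACE PACKAGE error_pkg IS
       g_log_enabled BOOLEAN := TRUE;   -- session state: TRUE until the first log
       PROCEDURE log_message (p_msg IN VARCHAR2);
    END error_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY error_pkg IS
       PROCEDURE log_message (p_msg IN VARCHAR2) IS
          PRAGMA AUTONOMOUS_TRANSACTION;
       BEGIN
          IF g_log_enabled THEN
             -- error_log is an assumed table (logged_at TIMESTAMP, msg VARCHAR2)
             INSERT INTO error_log (logged_at, msg) VALUES (SYSTIMESTAMP, p_msg);
             COMMIT;
             g_log_enabled := FALSE;    -- the deepest handler logs; later ones skip
          END IF;
       END log_message;
    END error_pkg;
    /

    Each WHEN OTHERS handler would call error_pkg.log_message(DBMS_UTILITY.FORMAT_ERROR_STACK) and then re-raise; only the first (deepest) call actually writes.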

    So what you want to do is doable. I question, however, whether this approach is sensible, for the reasons I gave previously.

    The previous answer by Billy Verreyne indicates that multiple logging of the error should not be considered a problem. Maybe I'll settle for that.

    You can always report on the exceptions instead. Let's say your LogMessage() procedure logs the exception with a system timestamp (using an autonomous-transaction insert) and with a unique session or process identifier, into a log table.

    P3 raises an exception. It logs it. P2 catches it, logs it, and re-raises. P1 catches it and logs it.

    You now have 3 exceptions for the process, each with a different timestamp. Your report for the people dealing with exceptions can select the first exception (using the timestamp) that occurred for this process.

    At the same time, you have a complete call stack of procedures that handled the exception, which may shed more light on who called p3 when the p3 exception occurred.

    Consistency is an important concept in software engineering. Consistency in design. Consistency in code.

    If the p1 exception handler must act differently from the p2 and p3 exception handlers (because of the logging requirement), you now have an inconsistency in the design and the code. This makes the design more complex and the code more complex. Increased complexity makes the code harder to read, understand and maintain, and increases the likelihood of bugs.

    This is the fundamental reason why I don't like the approach you are planning, of logging an exception only once to avoid a kind of duplication. It makes the behavior of the code and its exception handlers inconsistent.

    Rather, deal with this no-duplicate-exceptions requirement when reporting on the log, as it is much more logical to apply filtering and reporting logic there to transform the exception data into useful information for users.
