Archiving of audit data

Hi all

I have to archive the audit data. I went through the document Oracle provided, along with the queries in it.

Everything is fine, but I have a doubt: the new UPA table created after the ALTER TABLE command holds the rows whose eff_to_date is set to null, yet for this customer about 90% of the data in the UPA table has eff_to_date set to null.

Is it possible to remove this data on the Oracle side?

Thank you

I solved this problem by partitioning the table across both tablespaces, putting the old data in the UPA_LATEST1 partition and the remaining rows with eff_to_date = null in the UPA_LATEST partition.

After that, I took an export of the UPA_LATEST1 partition and dropped that partition.
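
For reference, a rough sketch of that export-and-drop step (the archive table and tablespace names here are only placeholders, and the actual steps in the Oracle note should be followed; UPA and UPA_LATEST1 are the names from above):

    -- Keep a copy of the archived rows before dropping the partition
    -- (UPA_ARCHIVE and ARCHIVE_TBS are hypothetical names)
    CREATE TABLE UPA_ARCHIVE TABLESPACE ARCHIVE_TBS AS
      SELECT * FROM UPA PARTITION (UPA_LATEST1);

    -- Then drop the archived partition
    ALTER TABLE UPA DROP PARTITION UPA_LATEST1 UPDATE GLOBAL INDEXES;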

After that, I ran GenerateSnapshot.bat and the "Issue Audit Messages" scheduled task. Now everything works fine.

Tags: Fusion Middleware

Similar Questions

  • Analyzing task audit, data audit and process flow history

    Hello

    Our Internal Audit department has asked for a bunch of information that we need to compile from the task audit logs, data audit and process flow history. We have all the information available, but not in a format that allows proper "reporting" of the log information. What is the best way to manage the HFM logs so that we can quickly filter and export the required audit information?

    We have housekeeping in place; the logs are partly "live" db tables and partly purged tables that have been exported to Excel to archive the historical log information.

    Thank you very much.

    I thought I posted this Friday, but I just noticed that I never hit the "Post Message" button, ha ha.

    The info below will help you translate some of the information in the tables, etc. You can report on the audit tables directly or move them to another data table for later analysis. The consensus, even if I disagree with it, is that you will suffer performance issues if your audit tables get too big, so you may want to move them periodically. You can do that with a manual process, a scheduled task, etc.

    I personally just dump them into another table and report on them there. As mentioned above, you will need to translate some of the information, as it is not "readable" in the database.

    For example, if I wanted to pull metadata loads, rules loads and member list loads, you could run a query like this. (NOTE: @strAppName must be the name of your application...)

    The main tricks to know, at least for the task audit table, are how to convert the times and how to determine which activity code matches which friendly name.

    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    
    exec sp_executesql @strSQL
    

    With regard to the activity codes, here's a quick breakdown of those...

    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
    
  • Identity Manager: how to archive the audit tables?

    Hello

    The internal audit tables (e.g., UPA, UPA_USR, ...) in Identity Manager consume a lot of space in the database. I know they can be disabled, but we need them for testing purposes.
    Is there a script to archive the data in these tables and truncate them?

    Kind regards
    ECE

    Hello!

    You can follow Oracle Support notes 431429.1 and 431420.1.
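
    In outline, those notes come down to copying the closed-out audit rows into an archive table and then trimming the live table. A minimal sketch, assuming the UPA table and its eff_to_date column (the archive table name is hypothetical; follow the exact steps in the notes for your OIM version):

    -- Copy completed audit snapshots into an archive table (UPA_ARCH is a placeholder name)
    CREATE TABLE UPA_ARCH AS
      SELECT * FROM UPA WHERE EFF_TO_DATE IS NOT NULL;

    -- Remove the archived rows from the live table
    DELETE FROM UPA WHERE EFF_TO_DATE IS NOT NULL;
    COMMIT;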

  • Which data dictionary view shows the AUDIT data?

    Hi all

    10.2.0.4
    I read the docs, but they only tell me about information related to sys.aud$.
    I am auditing our table named ARBITRATION using:

    AUDIT ALTER, DELETE, INSERT, UPDATE ON HR.ARBITRATION BY ACCESS;

    Which data dictionary view should I query to see that the action above has been performed?
    And will it keep auditing without me having to run the statement again?

    Thank you very much

    zxy

    Hello

    You can also consult this view:
    http://docs.Oracle.com/CD/B13789_01/server.101/b10755/statviews_2047.htm
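
    As a quick sketch, assuming the statement above was issued against a table called ARBITRATION in the HR schema and that the AUDIT_TRAIL parameter is set to DB, these are the two standard views to look at:

    -- Which audit options are currently set on the table
    SELECT * FROM DBA_OBJ_AUDIT_OPTS
     WHERE OWNER = 'HR' AND OBJECT_NAME = 'ARBITRATION';

    -- Which audited actions have actually been recorded
    SELECT USERNAME, ACTION_NAME, TIMESTAMP
      FROM DBA_AUDIT_TRAIL
     WHERE OWNER = 'HR' AND OBJ_NAME = 'ARBITRATION'
     ORDER BY TIMESTAMP;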

  • Purge OAM/OIM 11.1.1.3 schema audit data

    All,

    Does anyone know how to archive/purge the OAM audit data (IAU_BASE tables, etc.) and the OIM audit data (UPA tables, etc.)?

    Thanks in advance,

    Please refer to Article ID 431420.1 for the UPA tables.

    HTH,
    BB

  • Path for the audit archives

    Where can I change the path to the audit archives for the Audit Log Management job?

    In addition, how do I delete old job logs?

    Thank you

    -Lmal

    Change the archive network file path in the Portal Administration tool > Audit Manager.

    Also see the following KB article on Metalink:
    Audit Log Management Agent fails
    Article no.: 857242.1

    This article explains how to manage the database space used for audit messages. Reducing the number of job logs is a different process, but the first step would be to make sure that you run the weekly cleanup job on a regular basis. I will work on an article for those steps, but you can open a support case in the meantime.

  • Trigger for audit data

    Hi all

    I would like to write a trigger that maintains auditing information.

    We have two tables: 1. EMP, 2. EMP_AUDIT.

    If I perform any DML operation (or any other predefined action) on the EMP table, the audit information must be inserted into the EMP_AUDIT table.

    Can you please let me know how to write a trigger for this.

    Thank you.

    This is called re-inventing the wheel. Why opt for it?

    Have you considered the maintenance effort while suggesting this solution? How easy is it to replicate when another audit table is needed?

    You already have audit functionality provided by Oracle to take care of this. Why not reuse that functionality rather than rebuild it?

    This link gives a step-by-step demonstration of how to enable the feature.
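
    For reference, a minimal sketch of what enabling the built-in auditing looks like, assuming the AUDIT_TRAIL initialization parameter is set to DB and using the EMP table from the question (SCOTT is just a placeholder schema):

    -- Audit DML on the table; captured actions show up in DBA_AUDIT_TRAIL
    AUDIT INSERT, UPDATE, DELETE ON SCOTT.EMP BY ACCESS;

    Note that statement auditing records who did what and when, not the old and new column values; for value-level history you would still need a trigger (as below) or Fine-Grained Auditing.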

    In addition, the example does not do what the OP asked for. There could be update or insert operations that the OP is interested in. With your approach, he would need to create three different triggers.

    Here is an example that does it with only one trigger:

    drop table test_table;
    drop table test_table_hist;

    create table test_Table (pk_col number primary key, col1 number, col2 varchar2(5));

    create table test_table_hist (pk_col number, old_col1 number, new_col1 number, old_col2 varchar2(5), new_col2 varchar2(5), operation varchar2(10), mod_time timestamp);

    create or replace trigger trg_test_table_history
    before insert or delete or update of col1, col2
    on test_table
    for each row
    begin
      if inserting then
        insert into test_table_hist (pk_col, new_col1, new_col2, operation, mod_time)
        values (:new.pk_col, :new.col1, :new.col2, 'Insert', systimestamp);
      elsif updating then
        insert into test_table_hist (pk_col, old_col1, new_col1, old_col2, new_col2, operation, mod_time)
        values (:new.pk_col, :old.col1, :new.col1, :old.col2, :new.col2, 'Update', systimestamp);
      else
        insert into test_table_hist (pk_col, old_col1, old_col2, operation, mod_time)
        values (:old.pk_col, :old.col1, :old.col2, 'Delete', systimestamp);
      end if;
    end;
    /

    insert into test_table values (1, 1001, 'ABCD');
    update test_table set col2 = 'ABCDZ' where pk_col = 1;
    insert into test_table values (2, 1002, 'PQRS');
    delete from test_table where pk_col = 2;

    commit;

    select *
      from test_table_hist;

    PK_COL                 OLD_COL1               NEW_COL1               OLD_COL2 NEW_COL2 OPERATION  MOD_TIME                 
    ---------------------- ---------------------- ---------------------- -------- -------- ---------- -------------------------
    1                                             1001                            ABCD     Insert     04-FEB-13 06.50.58.926695000 AM
    1                      1001                   1001                   ABCD     ABCDZ    Update     04-FEB-13 06.50.59.099346000 AM
    2                                             1002                            PQRS     Insert     04-FEB-13 06.50.59.264155000 AM
    2                      1002                                          PQRS              Delete     04-FEB-13 06.50.59.427643000 AM

    Post edited by: PurveshK - added audit trigger example

  • Copying files between datastores

    Good afternoon, everyone.

    I am starting out with VMware and I have the following issue:

    I need to copy some directories containing several files (VM snapshots) from datastore A to datastore B, but this can only be done in the early morning hours, because the disk I/O makes the VMs very slow.

    Does anyone know how this can be done?

    The goal is to copy a VM together with its snapshots to another datastore and in this way multiply the VMs.

    I have already tried cloning, but it does not bring the snapshots along.

    Thank you.

    Peron

    Hello Peron,

    If you enable SSH, you can use Veeam FastSCP for free: http://www.veeam.com/vmware-esxi-fastscp.html

    Regards

    ---
    Diego Quintana


  • Speeding up an audit date query

    Hi all
    Today my colleague complained that a date range check in the WHERE clause slows his application down a lot. I wonder if it is better to replace a regular date range test:
    ...
    where trunc(tabname.mydate) between to_date('01-JAN-08','DD-MON-YY') and to_date('01-MAY-09','DD-MON-YY')
    ...
    by the following:
    select ..... ,
    (case when trunc(tabname.mydate) >= to_date('01-JAN-08','DD-MON-YY') then 1 else 0 end) +
    (case when trunc(tabname.mydate) <= to_date('01-MAY-09','DD-MON-YY') then 1 else 0 end) Date_Range,
    ...
    where Date_Range = 2
    ...
    Which of them would be faster? Thank you.

    Hello

    Change this:

    where trunc(tabname.mydate) between to_date('01-JAN-08','DD-MON-YY') and to_date('01-MAY-09','DD-MON-YY')
    

    To this:

    where tabname.mydate >= to_date('01-JAN-08','DD-MON-YY') and tabname.mydate < to_date('01-MAY-09','DD-MON-YY') + 1
    

    Do not apply a function to the column you are filtering on unless you have a function-based index on that exact expression; i.e., unless you have an index on TRUNC(tabname.mydate), do not use TRUNC - it is not necessary.

    Then run EXPLAIN PLAN on both versions, time them, and look at the costs.
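
    A minimal sketch of that comparison, using the table and column names from the question:

    EXPLAIN PLAN FOR
      SELECT *
        FROM tabname
       WHERE mydate >= to_date('01-JAN-08','DD-MON-YY')
         AND mydate <  to_date('01-MAY-09','DD-MON-YY') + 1;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    Repeat it with the TRUNC version and compare the costs and whether an index range scan on mydate is used.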

  • Historical data in the audit tables

    We are facing the following requirement from our current customer.

    They have been running a number of workflows manually for a few years, and they have retained the historical data in Excel spreadsheets. Now they want to start using the BPM Suite.

    They would like to be able to query and display the BPM Suite audit trail of all workflows, i.e. not only those that will be executed with BPM Suite in the future, but also those that have been stored in Excel spreadsheets over the years.

    I have a few questions about this:

    Is this a reasonable demand?
    My guess is that we need to load the historical data into the BPM audit tables (BPM_AUDIT_QUERY, etc.); is that correct?
    Any suggestions on how to get there?

    I have been reading the docs, Googling the web and searching this forum, and I have not yet found any relevant material on this subject.

    Regards

    Juan

    Published by: Juan Algaba Colera on October 11, 2012 04:14

    It would be very difficult to load their data directly into the audit tables. Also, the audit tables are stored in the soa-infra schema, which is usually configured to be purged after a certain period of time so that it does not grow indefinitely, because it contains all the runtime information as well. You will need to determine how long they need this historical data, and configure the purging and the size of the DB correctly for that amount of data.

    If the Excel data truly maps one-to-one onto the new BPM process, the best way to get the audit data into the system may be to use the APIs to create process instances and then automate the execution of each task as defined in the spreadsheets. That way these past instances will actually have been run through the system.

    In most cases this is not done, however. Most customers archive these worksheets somewhere in case they need to refer back to them, but do not try to make that old data visible from the BPM workspace as an audit trail.

    Thank you
    Adam DesJardin

  • Purge Signon Audit Data scheduling

    Hi all

    I have R12.1.3. I want to schedule 'Purge Signon Audit Data' to run once a month. When submitting Purge Signon Audit Data, the default audit date displayed is the current date, but I want to replace it with the current date minus 2 months. That is easy to do when I run it manually, but when scheduling it, the only option I can think of is to replace the FND_STANDARD_DATE value set with a custom value set on the audit date parameter. Is there a better/easier way to change the default date to the current date minus 2 months?

    Thank you

    DBA_EBiz_EBS wrote:
    Hi all

    I have R12.1.3. I want to schedule 'Purge Signon Audit Data' to run once a month. When submitting Purge Signon Audit Data, the default audit date displayed is the current date, but I want to replace it with the current date minus 2 months. That is easy to do when I run it manually, but when scheduling it, the only option I can think of is to replace the FND_STANDARD_DATE value set with a custom value set on the audit date parameter. Is there a better/easier way to change the default date to the current date minus 2 months?

    Thank you

    Schedule the request to run once a month (resubmitting each month), starting from two months ago, and enable "Increment date parameters each run".

    If your request contains date parameters, you can select "Increment date parameters each run" so that the value of the parameter is adjusted to match the resubmission interval. For example, if the value of the parameter is 25-JUL-1997 07:00 and your interval is monthly, the parameter is set to 25-AUG-1997 07:00 for the next submission.

    http://docs.Oracle.com/CD/A60725_05/HTML/comnls/us/FND/10gch606.htm

    Thank you
    Hussein

  • Configuring the data audit trail

    Hello

    I am brand new to Hyperion Planning/Essbase and I am trying to implement a data audit trail. I have already turned on data auditing in the workspace: Navigate -> Applications -> Planning -> DBName -> Administration -> Application reports. I have also downloaded SQL Developer on my laptop and have it operational. However, I cannot figure out how to connect it to Hyperion Planning/Essbase. Is there documentation on how to set up the connection? Any help will be greatly appreciated.

    Thank you

    If you go into the Planning administration area and edit the data source for your application, it should provide all the connection information.
    In SQL Developer, add a new connection and enter the details of that data source to connect to the schema, then expand the tables and view or query the data in HSP_AUDIT_RECORDS.
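
    Once connected, a quick sketch of a query against that table (HSP_AUDIT_RECORDS is the Planning audit table mentioned above; the TIME_POSTED column name is an assumption and may differ between Planning releases):

    -- Most recent data audit entries first
    SELECT *
      FROM HSP_AUDIT_RECORDS
     ORDER BY TIME_POSTED DESC;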

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Can you collect audit data from non-database data sources?

    Can you collect audit data from non-database data sources? For example, security-related events in Windows and/or Linux?

    In the current production version of the Audit Vault, you can only collect audit data from Oracle, Sybase, SQL Server and DB2 LUW databases.

  • AVDF 12.1.2 integration with the DBMS_AUDIT_MGMT package to automate the purging of audit records

    I have a question about this part of the Oracle Audit Vault and Database Firewall Administrator's Guide Release 12.1.2 documentation:

    -Start quote-

    Scheduling an Automatic Purge Job

    Oracle AVDF is integrated with the DBMS_AUDIT_MGMT package on an Oracle database. This integration automates the purging of the AUD$ and FGA_LOG$ audit records, and of the operating system .aud and .xml files, after they have been successfully transferred to the Audit Vault Server repository.

    Once the purge is complete, the Audit Vault Agent automatically sets a timestamp on the audit data that has been collected. Therefore, you must set the USE_LAST_ARCH_TIMESTAMP property to TRUE to ensure that the right set of audit records is purged. You do not need to manually set a purge job interval.

    -End quote-

    Based on the documentation above, how does AVDF provide this integration and the resulting automation?

    Hello

    When you configure an audit trail in the AV server, say an AUD$ table trail, once it collects the audit data it automatically sets the last archive timestamp on the secured target database (you can check this in the DBA_AUDIT_MGMT_LAST_ARCH_TS view).
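
    For example, a quick way to verify this on the secured target database:

    -- Shows the last archive timestamp the Audit Vault agent has set for each trail
    SELECT * FROM DBA_AUDIT_MGMT_LAST_ARCH_TS;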

    However, the trail (or the AV server itself) does not purge the audit data that has already been collected.

    You have to clean up this data with the DBMS_AUDIT_MGMT.CLEAN_AUDIT_TRAIL procedure; here is an example for the AUD$ table only:

    BEGIN
      DBMS_AUDIT_MGMT.CLEAN_AUDIT_TRAIL(
        audit_trail_type        => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
        use_last_arch_timestamp => TRUE);
    END;
    /

    You can simply run this procedure via a job, depending on how often and at what time you want to clean up the audit records. You do not need to worry about the last archive timestamp.
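
    A minimal sketch of scheduling it with the built-in purge job facility instead of a hand-written job (this assumes audit management cleanup has already been initialized with DBMS_AUDIT_MGMT.INIT_CLEANUP; the job name and the 24-hour interval are just examples):

    BEGIN
      DBMS_AUDIT_MGMT.CREATE_PURGE_JOB(
        audit_trail_type           => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
        audit_trail_purge_interval => 24,               -- hours between runs
        audit_trail_purge_name     => 'AUD_STD_PURGE',  -- example job name
        use_last_arch_timestamp    => TRUE);
    END;
    /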

  • Audit trail - on tables

    Hello, I have a scenario where I have to track the changes a user makes.

    It was easy for me to track this on regular forms. But when it comes to a tabular form, there is no associated page process I could hook into, so I had to use a trigger on the table.

    But when using the trigger on the table, I noticed that the trigger fires when you delete records and runs into the mutating table problem, because even the audit trail data in the table gets deleted right there. So is there a way I can track the removal of records on the APEX side instead of with a trigger on the table behind the tabular form?

    Thank you.

    So is there a way I can track the removal of records on the APEX side instead of with a trigger on the table behind the tabular form?

    I recommend you implement the auditing on the database side, rather than on the "client" side (APEX), as this will work regardless of which client (SQL*Plus, an ODP.NET program, etc.) is used to update the data.

    Depending on your database version and license, you can use the "Total Recall" feature, which in fact manages this for you:

    http://www.Oracle.com/technetwork/issue-archive/2008/08-Jul/o48totalrecall-092147.html

    + "Oracle Total Recall provides many advantages over traditional archival methods based on demand and trigger.» First of all, archiving data are inviolable. No database user, not even privileged users, cannot change the historical record. This level of security is increasingly required by government regulation. No programming effort is necessary. There is no table archive design, no code to write and no trigger to debug. The result has much less risk than any which can damage inadvertently archive. The performance impact is minimal. There is no direct impact on the flow of transactions, because the processes involved in archiving run asynchronously. Oracle Total Recall is implemented natively in the Oracle database and application-transparent. You can implement it under a running application without making changes to existing code. » +

    Or you can use the trigger approach as you tried.

    But when using the trigger on the table, I noticed that the trigger fires when you delete records and runs into the mutating table problem, because even the audit trail data in the table gets deleted right there.

    This should not happen if you use a separate table to save the audit trail information in.

    For example, if you have the EMP table, you create a table named (for example) AUD$EMP, which is identical in structure except that it has additional columns to track the audit date, user name and operation (insert, update, delete).

    Then you put a "for each row" trigger on EMP to save the changes into AUD$EMP. You can delete all you want from EMP; it does not give a mutating table error.
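
    A minimal sketch of that layout, assuming for illustration that EMP has just EMPNO, ENAME and SAL (along the same lines as the trigger example earlier on this page):

    -- Audit copy of EMP with extra tracking columns (names are illustrative)
    CREATE TABLE aud$emp AS SELECT * FROM emp WHERE 1 = 0;
    ALTER TABLE aud$emp ADD (audit_date DATE,
                             audit_user VARCHAR2(30),
                             audit_op   VARCHAR2(10));

    CREATE OR REPLACE TRIGGER trg_emp_audit
      AFTER INSERT OR UPDATE OR DELETE ON emp
      FOR EACH ROW
    DECLARE
      v_op VARCHAR2(10);
    BEGIN
      v_op := CASE WHEN INSERTING THEN 'INSERT'
                   WHEN UPDATING  THEN 'UPDATE'
                   ELSE                'DELETE' END;
      IF DELETING THEN
        INSERT INTO aud$emp VALUES (:old.empno, :old.ename, :old.sal, SYSDATE, USER, v_op);
      ELSE
        INSERT INTO aud$emp VALUES (:new.empno, :new.ename, :new.sal, SYSDATE, USER, v_op);
      END IF;
    END;
    /

    Because the trigger writes into a different table (aud$emp), deleting rows from EMP does not raise a mutating table error.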

    -Morten

    http://ORA-00001.blogspot.com
