Gathering schema statistics for a huge 10-hour batch

Hello

We are on 11gR2 (11.2.0.4) on Oracle Linux. The automatic schema statistics collection job is currently configured in production to run only at night.

We have a huge batch that moves a very large amount of data into almost 90% of the tables in the database (only a few tables are not affected). It takes about 10 hours and is scheduled over the weekend. It is a big ETL process that moves a huge amount of data from one (source) database to another (target) database. How can I ensure that schema statistics are kept up to date while this heavy DML is constantly occurring during this 10-hour period? What is the best practice to adopt in such a scenario? Here is what I read in the Oracle documentation:

"For tables that are being bulk-loaded, the statistics-gathering procedures should be run on those tables immediately following the load process, preferably as part of the same script or job that is running the bulk load."

How should the batch be split? One option is to divide the batch into, say, 5 equal pieces and, after each piece finishes, gather the stats for it. What is the best way to do it?

I would be grateful for any input on this.

Thank you

OrauserN

We have a huge batch that moves a very large amount of data into almost 90% of the tables in the database (only a few tables are not affected). It takes about 10 hours and is scheduled over the weekend. It is a big ETL process that moves a huge amount of data from one (source) database to another (target) database.

OK - but this does not explain why you have a question about which stats should be gathered. And you do not mention which system, the source or the target, your stats question is about.

How can I ensure that schema statistics are kept up to date while this heavy DML is constantly occurring during this 10-hour period? What is the best practice to adopt in such a scenario? Here is what I read in the Oracle documentation:

"For tables that are being bulk-loaded, the statistics-gathering procedures should be run on those tables immediately following the load process, preferably as part of the same script or job that is running the bulk load."

That recommendation is ONLY for the use case where there is concurrent use of those bulk-loaded tables.

Yet again, you have not mentioned ANY use of the data in your source or target systems. Without knowing how the data is used DURING THE MOVE, we can only give general advice:

1. What does 'moving' data mean to you? Are you 'deleting' the data from the source system and then 'inserting' the same data into the target system? Or are you just 'copying' the data from source to target?

2. Do users perform DML on the source tables? If data is only SELECTed from those tables and nothing is deleted, then you do not need to gather statistics on the source system.

3. Do users perform DML on the target tables? What DML is performed while the new data is loading? If so, the load might affect the performance of that DML.

How should the batch be split? One option is to divide the batch into, say, 5 equal pieces and, after each piece finishes, gather the stats for it. What is the best way to do it?

Who knows? You have not even told us:

1. which system (source, target, or both) you are asking about

2. what the batch processing does

3. what DML is carried out at the same time as the batch processing

Much depends on the type of DML performed, the tables it uses, and the structure of the batch. It could be that a step-by-step process would work. That would be somewhat similar to what you proposed, except that it would use table stats, not schema stats:

1. run the part of the batch (call it part 1) that updates certain tables with certain DML

2. gather table statistics on the tables affected by part 1 - this minimizes the stale-statistics effect of that part of the load

3. run part 2 of the batch, which updates some different tables with different DML

4. gather table statistics on the tables affected by part 2

5. and so on
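Steps 2 and 4 above might be sketched like this in PL/SQL. This is a sketch only; the schema name (ETL_TGT) and table names (T1, T2) are hypothetical placeholders for whichever tables a given batch part touches:

```sql
-- Sketch only: gather table stats immediately after part 1 of the batch.
-- ETL_TGT, T1 and T2 are placeholder names, not from the thread.
BEGIN
  FOR tab IN (SELECT column_value AS table_name
                FROM TABLE(sys.odcivarchar2list('T1', 'T2')))
  LOOP
    DBMS_STATS.GATHER_TABLE_STATS(
      ownname          => 'ETL_TGT',
      tabname          => tab.table_name,
      estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,  -- 11g default sampling
      cascade          => TRUE);                        -- also refresh index stats
  END LOOP;
END;
/
```

Running this as the final step of each batch part keeps the optimizer's statistics representative for the very next part of the load.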

Tags: Database

Similar Questions

  • How to gather table statistics for tables in a different schema

    Hi all

    I have a table in one schema, and I want to gather statistics for that table from a different schema.

    I issued GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;

    And when I tried to run the command to gather statistics using

    DBMS_STATS.GATHER_TABLE_STATS(OWNNAME => 'SCHEMA1', TABNAME => 'T1');

    the call fails.

    Is there a way to gather table statistics for tables in one schema from another schema?

    Thank you
    MK.

    You must grant ANALYZE ANY to schema2.

    SY.
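    Putting the suggested fix together with the object names from the thread, a short sketch:

    ```sql
    -- As a privileged user: let SCHEMA2 gather statistics
    -- on objects it does not own.
    GRANT ANALYZE ANY TO schema2;

    -- Then, connected as SCHEMA2:
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA1', tabname => 'T1');
    END;
    /
    ```

    Note that GRANT ALL ON a single table does not help here; cross-schema DBMS_STATS calls need the ANALYZE ANY system privilege.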

  • Inaccurate statistics calculated by the automatic statistics collection task

    Hello

    I have a table with about 300,000 rows, and I manually gathered statistics for the table and its index with the command "EXEC DBMS_STATS.gather_table_stats('SCOTT', 'TEST_TABLE');". The statistics for both the table and its index were accurate, and SQL statements also performed very well. Then, overnight, Enterprise Manager recalculated the statistics, but the results were far from accurate:

    OWNER: SCOTT
    INDEX_NAME: I_TEST_INDEX
    INDEX_TYPE: NORMAL
    TABLE_OWNER: SCOTT
    TABLE_NAME: TEST_TABLE
    TABLE_TYPE: TABLE
    UNIQUENESS: UNIQUE
    COMPRESSION: DISABLED
    PREFIX_LENGTH:
    TABLESPACE_NAME: USERS
    INI_TRANS: 2
    MAX_TRANS: 255
    INITIAL_EXTENT: 65536
    NEXT_EXTENT:
    MIN_EXTENTS: 1
    MAX_EXTENTS: 2147483645
    PCT_INCREASE:
    PCT_THRESHOLD:
    INCLUDE_COLUMN:
    FREELISTS:
    FREELIST_GROUPS:
    PCT_FREE: 10
    LOGGING: YES
    BLEVEL: 2
    LEAF_BLOCKS: 16
    DISTINCT_KEYS: 141
    AVG_LEAF_BLOCKS_PER_KEY: 1
    AVG_DATA_BLOCKS_PER_KEY: 1
    CLUSTERING_FACTOR: 19
    STATUS: VALID
    NUM_ROWS: 141
    SAMPLE_SIZE: 141
    LAST_ANALYZED: November 4, 2008 22:03
    DEGREE: 1
    INSTANCES: 1
    PARTITIONED: NO
    TEMPORARY: N
    GENERATED: N
    SECONDARY: N
    BUFFER_POOL: DEFAULT
    USER_STATS: NO
    DURATION:
    PCT_DIRECT_ACCESS: 100
    ITYP_OWNER:
    ITYP_NAME:
    PARAMETERS:
    GLOBAL_STATS: YES
    DOMIDX_STATUS:
    DOMIDX_OPSTATUS:
    FUNCIDX_STATUS:
    JOIN_INDEX: NO
    IOT_REDUNDANT_PKEY_ELIM: NO
    DROPPED: NO


    The new statistics report that there are only 141 distinct keys in the table although there are actually 300,000. The result is that SQL execution degrades considerably. Does anyone know why the statistics became so inaccurate? I'm on 10.2.0.1.0.

    Kind regards
    Swear

    user633661 wrote:
    The only thing I'm not sure about is whether I should include the statistics-gathering step in the batch (it is a PL/SQL procedure) - I mean, is it normal practice to mix code that gathers statistics with business-logic code?

    Swear,

    I think Justin has already answered this pretty well; it is quite common in data-warehouse environments or any other batch-driven environments. You need accurate (or let's say representative) statistics if you want the CBO to do a good job.

    BTW, do you know if gathering statistics on an index-organized table is in any way different from gathering on heap-organized tables?

    No, there is no difference. DBMS_STATS (and, surprisingly, even ANALYZE) know about IOTs and gather the table, column, and any potential overflow-segment statistics, plus those of the (mandatory) primary key index (which represents the IOT), even if you do not specify "cascade => true" in the DBMS_STATS call.

    Kind regards
    Randolf

    Oracle related blog stuff:
    http://Oracle-Randolf.blogspot.com/

    SQLTools ++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.NET/projects/SQLT-pp/

  • Extending the AD schema for a second Unity installation

    A CCM 4.1.3 and a Unity 4.0.4 server will be installed in an AD forest that already has CCM (3.x) and Unity (3.x) servers serving users on the East coast. The new CCM and Unity servers will serve users on the West Coast, with the mailboxes also resident on Exchange 2003 servers on the West Coast.

    When installing the Unity server, what will be the consequences of extending the AD schema again for the new Unity version? The schema was already extended during the installation of the 3.x version of Unity. Will the additional schema extension for the upcoming 4.x Unity installation have any effect on the existing Unity? Or does the schema even need to be extended again at all?

    Thank you

    Michael.

    I see a problem that you may or may not have lived:

    "Cisco Unity 3.1 does not support Exchange 2003. This means that you cannot run the Exchange 2003 ForestPrep tool to begin an upgrade to Exchange 2003, you cannot have an Exchange 2003 server in the same Active Directory forest as the Cisco Unity server, and you cannot home Cisco Unity subscribers in Exchange 2003. Before you run Exchange 2003 ForestPrep, you must first upgrade Cisco Unity to version 4.0(3) or later. Otherwise, the changes that ForestPrep makes to Active Directory will cause Cisco Unity to stop working."

    See http://www.cisco.com/en/US/products/sw/voicesw/ps2237/prod_system_requirements_hardware09186a008010a339.html

    If you had an Exchange 2000 environment, you would just run the Unity 4.x ADSchemaSetup, and the schema extension is backward compatible with 3.x. You can run the schema extension for Unity 4.x at any time before installing Unity 4.x.

    If you want Unity 4.x installed, you must extend the schema (even though the schema was already extended for Unity 3.x).

  • Where can I get the URL scheme for adding/deleting/searching a contact in the People app

    Original title: Windows 8 Metro People application...

    Where can I get the URL scheme for adding/deleting/searching a contact in the People app?

    There is no URI API for this application.

  • Oracle schema for a Planning application

    Hi Experts,

    What is the advantage of having a new schema for a new Planning application? Is it advisable to use an existing schema that already contains two Planning applications? What are the benefits of using the same schema for a new Planning application?

    Kind regards

    SG

    You must have a separate schema per Planning application, otherwise the application will be overwritten each time.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • Cannot create schema for ATG Publishing

    I am so close to finishing my first install! Help, please.

    I'm stuck on creating the schema for Publishing. Everything else went fine. But now:



    -----
    -SELECT A DATA SOURCE TO CONFIGURE-
    Enter [h]elp, [m]ain menu, [q]uit to exit



    [R] Reporting data warehouse - done
    [L] Reporting Loader - done
    * [P] Publishing
    [C] Production Core - done
    [D]

    >

    -CONFIGURE DATASOURCE PUBLISHING-
    Enter [h]elp, [m]ain menu, [q]uit to exit



    [C] Connection Details - done
    [T] Test Connection - done
    * [S] Create Schema
    Import Initial Data
    [D] Drop Schema
    [O] Configure another data source

    >

    -------CREATE SCHEMA------------------------------------------------------------
    Enter [h]elp, [m]ain menu, [q]uit to exit


    Editing


    * [C] create a schema
    [S] Skip

    >

    Beginning schema creation. View the log file at /home/oracle/ATG/ATG10.0.3/home/
    ../CIM/log/CIM.log
    |. . . . . . . . |
    |
    -CREATE SCHEMA FAILED-
    Enter [h]elp, [m]ain menu, [q]uit to exit


    ORA-00904: "CHECKIN_DATE": invalid identifier



    * [E] modify connection details
    [D] drop Schema
    [C] cancel

    >

    Can you check the ATG/home/./... /CIM/log/CIM.log and post the error here?

    Or maybe just drop the schema and try creating it again.

    -Kiss

  • Assigning a default schema for a user.

    In Oracle, can I assign a default schema to a user without using the ALTER SESSION command?

    Is there an ALTER USER setting with which I can assign a default schema to a user?

    user9229690 wrote:
    Why don't you create a role and assign that role to this user - specifying what you want in the role?

    hope this has helped

    That would give the user permission to access the tables in the app's schema, but it would not eliminate the need to qualify those tables with the schema name, and that was the stated goal of the OP.

    The solution to the OP's question is synonyms, as Centinul suggested. If it is for a single user, a private synonym would be appropriate; if, as it seems, many users may need it, a public synonym would be more appropriate. If that is considered a security problem, then use either a logon trigger (ALTER SESSION SET CURRENT_SCHEMA ...) or a procedure, run when a new user is created, that builds a set of private synonyms.
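    A minimal sketch of the logon-trigger option mentioned above; the user and schema names (REPORT_USER, APP_OWNER) are hypothetical placeholders:

    ```sql
    -- Sketch: every new REPORT_USER session resolves unqualified
    -- names against the APP_OWNER schema, so no synonyms are needed.
    -- REPORT_USER and APP_OWNER are placeholder names.
    CREATE OR REPLACE TRIGGER report_user_logon
      AFTER LOGON ON report_user.SCHEMA
    BEGIN
      EXECUTE IMMEDIATE 'ALTER SESSION SET CURRENT_SCHEMA = APP_OWNER';
    END;
    /
    ```

    Note this only changes name resolution; it grants no privileges, so the user still needs the appropriate object grants (or a role) to actually query the tables.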

  • Get performance statistics of a VM for the last 2 hours

    I need a script for the VI Toolkit to obtain the performance statistics of a VM for the last 2 hours.

    What metrics (cpu.usage.average, disk.usage.average...) are you looking for?

    Have you tried one of the preconfigured metric collections (CPU, memory, disk, network) that are available in the Get-Stat cmdlet?

    An example with the CPU scope:

    get-vm  | Get-Stat -Cpu -Start (Get-Date).addhours(-2) -IntervalMins 2 -MaxSamples 60
    
  • Granting privileges to a schema for several tables at the same time... any script?

    Hello gurus,

    I have about 25 tables in the ABC schema.

    I want to grant all privileges to the XYZ schema on all 25 tables that are in the ABC schema... So is there a SQL statement or a script I can run to grant privileges on all the tables?

    Something similar to this...

      SELECT 'create synonym ' || table_name || ' for ' || table_name
        FROM user_tables;

    So I get all the table names... then I can run it as a script... Your help is greatly appreciated, gurus!!!

    Thank you!!!

    Administrator:

    set head off
    set pages 0
    set feed off
    spool myscript.sql
    Select 'grant select, insert, update, delete on abc.'||table_name||' to xyz;'
      from dba_tables
     where owner = 'ABC';
    
    Select 'create synonym xyz.'||table_name||' for abc.'||table_name||';'
      from dba_tables
     where owner = 'ABC';
    
    spool off;
    

    Obviously this does not cover any new tables that will be created in the ABC schema in the future...

    Max
    [My Italian Oracle blog | http://oracleitalia.wordpress.com/2010/02/07/aggiornare-una-tabella-con-listruzione-merge/]
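    The spooled script above can also be written as a single anonymous PL/SQL block; a sketch only, which assumes you run it while connected as ABC (the grantor must own the tables or hold the privileges WITH GRANT OPTION):

    ```sql
    -- Sketch: same idea as the spooled script, done directly in PL/SQL.
    -- Grants DML privileges on every table owned by the current user (ABC) to XYZ.
    BEGIN
      FOR t IN (SELECT table_name FROM user_tables)
      LOOP
        EXECUTE IMMEDIATE
          'GRANT SELECT, INSERT, UPDATE, DELETE ON ' || t.table_name || ' TO xyz';
      END LOOP;
    END;
    /
    ```

    Like the spool approach, this is a one-shot operation: rerun it (or schedule it) whenever new tables are added to ABC.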

  • What is the schema name for the Service Parts Planning (SPP) module

    Hi Experts,

    I need the info

    1. What is the name of the schema for the SPP module?
    For example, ASCP is known as MSC; in a similar fashion, what is the name given to SPP?

    2. I searched for the table details on the eTRM site (for R12.1.1) but they are not there; can anyone help me get the SPP table details?

    Thank you for your valuable time.

    Good bye
    Badin

    SPP uses the same tables as MSC, with additional columns as appropriate. There is therefore no separate schema name for SPP.

  • New to iCloud, but "stuck" for more than 12 hours

    I am new to iCloud and am trying to upload my pictures to iCloud.

    In Photos on my iMac, it says it is "uploading 11,983 items", but it has been stuck like that for more than 12 hours.

    If I click on iCloud in System Preferences, it is also stuck and will not let me click on anything at all.

    What should I do now?

    The first part, in Photos, is not uncommon; the initial upload may take days and can seem to stall for long periods of time. The second part, about System Preferences, is unusual; have you tried restarting the Mac yet?

  • I tried to erase and reset my iPhone 6s. But why is it taking so long? More than 24 hours. Is this normal? The phone just displays the Apple logo and won't start. Please help. Thank you in advance.

    I tried to erase and reset my iPhone 6s. But why is it taking so long? More than 24 hours. Is this normal? The phone just displays the Apple logo and won't start. Please help. Thank you in advance.

    Hello

    Follow the instructions here:

    If you can't update or restore your iPhone, iPad or iPod touch - Apple Support

  • 10.11.5 has been stalled for more than 2 hours; is it possible to cancel the install?

    The 10.11.5 installation on my MacBook Air has been running for more than two hours; is there any way to stop the installation?

    Try shutting down the computer using the power switch and restarting. This may cause the installation to end.

  • Frozen (for more than an hour) when installing Win7 updates

    During the installation of updates during a restart, my IQ506 froze, although the little 'circle' is still spinning, for more than an hour so far. The message says not to power off, but nothing is happening. Where should I go from here, if I'm not supposed to power it down?

    A few times I've had to just give up and power off. I look at the HD indicator: if it goes 30 minutes without blinking, I give it a hard power-off; if the HD is flashing, I will usually let it run all night.
