Heap-organized table vs global temporary table for staging data

We use Oracle 10g on the Linux platform.

We have an intermediate table into which the data read from a file (containing 1 million rows per day) is loaded, enriched (insert/update/delete) and then finally loaded into another table. I want to know whether this table should be a global temporary table or a normal heap-organized table in order to reduce undo/redo generation.

I'm not in favor of the temporary table because:
1. any additional pressure on the temporary tablespace can cause ORA-01652: unable to extend temp segment problems
2. they are mainly intended for manipulating session-specific data.
3. statistics do not exist for these Oracle work tables. To deal with this, we would have to use dynamic sampling in the query.

The problem with heap-organized tables is that they generate more undo/redo than temporary tables.

Please guide me.

Some of your concerns can easily be mitigated.

1 - temp tablespace

A common practice for ETL processing is to create a custom temporary tablespace for the GTT to use. This prevents the GTT from putting pressure on the standard temp space and possibly interfering with the rest of the DB.

See "Creating a temporary Table" in the DBA Guide. This article has sample code that illustrates this.
http://docs.Oracle.com/CD/B28359_01/server.111/b28310/tables003.htm#i1006400
>
By default, rows in a temporary table are stored in the default temporary tablespace of the user who creates it. However, you can assign a temporary table to another tablespace upon creation of the temporary table by using the TABLESPACE clause of CREATE GLOBAL TEMPORARY TABLE. You can use this feature to conserve space used by temporary tables. For example, if you need to perform many small temporary table operations and the default temporary tablespace is configured for sort operations and thus uses a large extent size, these small operations will consume lots of unnecessary disk space. In this case it is better to allocate a second temporary tablespace with a smaller extent size.
>
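A minimal sketch of that setup (the tablespace name, file path, sizes, and table definition are hypothetical, not from the original post):

    -- Hypothetical temp tablespace dedicated to ETL work.
    CREATE TEMPORARY TABLESPACE etl_temp
      TEMPFILE '/u01/oradata/etl_temp01.dbf' SIZE 2G
      EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1M;

    -- GTT assigned to it; rows survive commits within the session.
    CREATE GLOBAL TEMPORARY TABLE stage_work
    (
      id      NUMBER,
      payload VARCHAR2(4000)
    )
    ON COMMIT PRESERVE ROWS
    TABLESPACE etl_temp;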
Point #2 (intended for session-specific data processing) is correct in that a GTT holds session-specific data. But a GTT can also be used simply to reduce the amount of redo from normal DML operations while processing data that is already isolated from other users and does not need to be shared.
>
3. statistics do not exist for these Oracle work tables. To deal with this, we would have to use dynamic sampling in the query.
>
Sure, but Oracle generally does not collect statistics every time you do DML on your staging table anyway. And if you rely on stats, new TRUNCATE/LOAD operations should gather them again once the new data is loaded.
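For example, you can seed representative statistics on the work table yourself instead of relying on dynamic sampling (the owner/table names and counts below are hypothetical):

    -- Hypothetical owner/table; seed optimizer stats once, based on a typical load.
    BEGIN
      DBMS_STATS.SET_TABLE_STATS(
        ownname => 'ETL_USER',
        tabname => 'STAGE_WORK',
        numrows => 1000000,
        numblks => 20000);
    END;
    /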

One factor you may not have considered is that you should design your architecture to be scalable. It is quite possible that you have only a SINGLE step and your current processing is very simple.

If so, AND you do not need the process to evolve, then the solution suggested by knani may be the best one for you. But that solution is not very scalable.

Complex ETL implementations have several steps. And the data often does not move between those steps in one nice, easy operation. It is a common requirement that after each stage of processing the data must be reviewed or reported on, to make sure it satisfies all of the business requirements. Data problems (e.g. missing or incorrect data, etc.) must then be resolved before the data can move on to the next step. Depending on the severity of the problems, that step may need to be rerun.

If ONLY GTTs are used there is no way to "review" the data. And if ONLY external tables are used, then you cannot process several tables efficiently in parallel and asynchronously.

Complex implementations that I have worked on usually consisted of a mix of external tables, normal tables, and GTTs.

External tables and a simple first-stage "data cleansing" step are used to load the data into normal tables as soon as possible. That allows multiple processes to run asynchronously and in parallel, so that data problems are detected as early as possible. Any table can be reloaded/reprocessed without affecting the other processes. A minimal external table sketch follows this paragraph.
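Here is that sketch (the directory object, file name, and columns are hypothetical):

    -- Hypothetical directory object pointing at the incoming file area.
    CREATE OR REPLACE DIRECTORY etl_dir AS '/u01/etl/incoming';

    CREATE TABLE stage_ext
    (
      id      NUMBER,
      payload VARCHAR2(4000)
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY etl_dir
      ACCESS PARAMETERS
      (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ';'
      )
      LOCATION ('daily_feed.dat')
    );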

This suggests that your first step should put processes in place that do the minimal "serial" cleansing possible.

The second stage of ETL, when necessary, can perform more complex cleansing of the data in one table or across several tables. GTTs can be used effectively here to hold intermediate results that may have large amounts of "temporary" DML performed on them. At the end of this stage the GTT data would be transferred to a permanent result or staging table.

Start with a simple one-step process (maybe Karthick's). The key is to avoid complicating the process in a way that makes it unmanageable.

Tags: Database

Similar Questions

  • Fill a Table for a Date Dimension

I would be grateful to any guru/developer who has populated a date dimension table containing dates as in the sample data I will give you.
And if you could share your knowledge on how such a dimension table is populated.

Here is the table structure you need, and I have provided a SQL*Loader control file that loads the data into the dim_recnc_dte table, if you don't mind.

The sample I provided is a small sample of 2006 data, the first 20 rows extracted from a production table which I believe runs through to the year 2037.

I find this very difficult, and I'm keen to learn how it is possible for a SQL statement to populate this table.

    CREATE TABLE DIM_RECNCL_DTE
    (
    DTE_ID NUMBER(12) NOT NULL,
    DAY_DTE DATE,
    DAY_NME VARCHAR2(100 BYTE) NOT NULL,
    DAY_DESC VARCHAR2(200 BYTE),
    BUSS_DAY_IND VARCHAR2(20 BYTE) NOT NULL,
    WK_DAY_IND VARCHAR2(20 BYTE) NOT NULL,
    WK_DAY_NBR NUMBER(10),
    WK_START_DTE DATE,
    WK_END_DTE DATE,
    WK_LST_BUSS_DAY_DTE DATE,
    MTH_DAY_NBR NUMBER(10),
    MTH_NME VARCHAR2(100 BYTE) NOT NULL,
    MTH_DESC VARCHAR2(200 BYTE) NOT NULL,
    MTH_START_DTE DATE,
    MTH_END_DTE DATE,
    MTH_LST_BUSS_DAY_DTE DATE,
    MTH_DAY NUMBER(10),
    MTH_BUSS_DAY NUMBER(10),
    QTR_DAY_NBR NUMBER(10),
    QTR_START_DTE DATE,
    QTR_END_DTE DATE NOT NULL,
    QTR_LST_BUSS_DAY_DTE DATE,
    QTR_DAY NUMBER(10),
    QTR_BUSS_DAY NUMBER(10),
    CLNDR_WK_NBR NUMBER(10),
    CLNDR_MTH_NBR NUMBER(10),
    CLNDR_QTR_NBR NUMBER(10),
    CLNDR_YR NUMBER(12),
    CLNDR_YR_DAY_NBR NUMBER(10),
    CLNDR_YR_START_DTE DATE,
    CLNDR_YR_END_DTE DATE,
    CLNDR_YR_LST_BUSS_DAY_DTE DATE,
    CLNDR_YR_DAY NUMBER(10),
    CLNDR_YR_BUSS_DAY NUMBER(10),
    FISCAL_WK_NBR NUMBER(10),
    FISCAL_MTH_NBR NUMBER(10),
    FISCAL_QTR_NBR NUMBER(10),
    FISCAL_YR NUMBER(10),
    FISCAL_YR_DAY_NBR NUMBER(10),
    FISCAL_YR_START_DTE DATE,
    FISCAL_YR_END_DTE DATE,
    FISCAL_YR_LST_BUSS_DAY_DTE DATE,
    FISCAL_YR_DAY NUMBER(10),
    FISCAL_YR_BUSS_DAY NUMBER(10),
    ROLLING_QTR_DAY_NBR NUMBER(10) NOT NULL,
    ROLLING_QTR_START_DTE DATE NOT NULL,
    ROLLING_QTR_END_DTE DATE NOT NULL,
    ROLLING_QTR_LST_BUSS_DAY_DTE DATE NOT NULL,
    ROLLING_QTR_DAY NUMBER(10) NOT NULL,
    ROLLING_QTR_BUSS_DAY NUMBER(10) NOT NULL,
    PRSN_TAX_YR VARCHAR2(9 BYTE),
    DTE_PRT_DEFND_IND VARCHAR2(20 BYTE) NOT NULL,
    AUDIT_ID NUMBER(12) NOT NULL
    );


    LOAD DATA
    INFILE *
    BADFILE './DIM_RECNCL_DTE.BAD'
    DISCARDFILE './DIM_RECNCL_DTE.DSC'
    INSERT INTO TABLE DIM_RECNCL_DTE
    FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
    (
    DTE_ID NULLIF (DTE_ID = 'NULL'),
    DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (DAY_DTE = 'NULL'),
    DAY_NME,
    DAY_DESC,
    BUSS_DAY_IND,
    WK_DAY_IND,
    WK_DAY_NBR NULLIF (WK_DAY_NBR = 'NULL'),
    WK_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (WK_START_DTE = 'NULL'),
    WK_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (WK_END_DTE = 'NULL'),
    WK_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (WK_LST_BUSS_DAY_DTE = 'NULL'),
    MTH_DAY_NBR NULLIF (MTH_DAY_NBR = 'NULL'),
    MTH_NME,
    MTH_DESC,
    MTH_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (MTH_START_DTE = 'NULL'),
    MTH_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (MTH_END_DTE = 'NULL'),
    MTH_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (MTH_LST_BUSS_DAY_DTE = 'NULL'),
    MTH_DAY NULLIF (MTH_DAY = 'NULL'),
    MTH_BUSS_DAY NULLIF (MTH_BUSS_DAY = 'NULL'),
    QTR_DAY_NBR NULLIF (QTR_DAY_NBR = 'NULL'),
    QTR_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (QTR_START_DTE = 'NULL'),
    QTR_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (QTR_END_DTE = 'NULL'),
    QTR_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (QTR_LST_BUSS_DAY_DTE = 'NULL'),
    QTR_DAY NULLIF (QTR_DAY = 'NULL'),
    QTR_BUSS_DAY NULLIF (QTR_BUSS_DAY = 'NULL'),
    CLNDR_WK_NBR NULLIF (CLNDR_WK_NBR = 'NULL'),
    CLNDR_MTH_NBR NULLIF (CLNDR_MTH_NBR = 'NULL'),
    CLNDR_QTR_NBR NULLIF (CLNDR_QTR_NBR = 'NULL'),
    CLNDR_YR NULLIF (CLNDR_YR = 'NULL'),
    CLNDR_YR_DAY_NBR NULLIF (CLNDR_YR_DAY_NBR = 'NULL'),
    CLNDR_YR_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_START_DTE = 'NULL'),
    CLNDR_YR_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_END_DTE = 'NULL'),
    CLNDR_YR_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_LST_BUSS_DAY_DTE = 'NULL'),
    CLNDR_YR_DAY NULLIF (CLNDR_YR_DAY = 'NULL'),
    CLNDR_YR_BUSS_DAY NULLIF (CLNDR_YR_BUSS_DAY = 'NULL'),
    FISCAL_WK_NBR NULLIF (FISCAL_WK_NBR = 'NULL'),
    FISCAL_MTH_NBR NULLIF (FISCAL_MTH_NBR = 'NULL'),
    FISCAL_QTR_NBR NULLIF (FISCAL_QTR_NBR = 'NULL'),
    FISCAL_YR NULLIF (FISCAL_YR = 'NULL'),
    FISCAL_YR_DAY_NBR NULLIF (FISCAL_YR_DAY_NBR = 'NULL'),
    FISCAL_YR_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_START_DTE = 'NULL'),
    FISCAL_YR_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_END_DTE = 'NULL'),
    FISCAL_YR_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_LST_BUSS_DAY_DTE = 'NULL'),
    FISCAL_YR_DAY NULLIF (FISCAL_YR_DAY = 'NULL'),
    FISCAL_YR_BUSS_DAY NULLIF (FISCAL_YR_BUSS_DAY = 'NULL'),
    ROLLING_QTR_DAY_NBR NULLIF (ROLLING_QTR_DAY_NBR = 'NULL'),
    ROLLING_QTR_START_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_START_DTE = 'NULL'),
    ROLLING_QTR_END_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_END_DTE = 'NULL'),
    ROLLING_QTR_LST_BUSS_DAY_DTE DATE "MM/DD/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_LST_BUSS_DAY_DTE = 'NULL'),
    ROLLING_QTR_DAY NULLIF (ROLLING_QTR_DAY = 'NULL'),
    ROLLING_QTR_BUSS_DAY NULLIF (ROLLING_QTR_BUSS_DAY = 'NULL'),
    PRSN_TAX_YR,
    DTE_PRT_DEFND_IND,
    AUDIT_ID NULLIF (AUDIT_ID = 'NULL')
    )
    BEGINDATA
    20060131;"31/01/2006 00:00:00";"TUE";"Tuesday";"Y";"Y";4;"28/01/2006 00:00:00";"03/02/2006 00:00:00";"03/02/2006 00:00:00";31;"JAN";"January";"01/01/2006 00:00:00";"31/01/2006 00:00:00";"31/01/2006 00:00:00";31;20;31;"01/01/2006 00:00:00";"31/03/2006 00:00:00";"31/03/2006 00:00:00";90;63;5;1;1;2006;31;"01/01/2006 00:00:00";"31/12/2006 00:00:00";"29/12/2006 00:00:00";365;255;18;4;2;2005;123;"01/10/2005 00:00:00";"30/09/2006 00:00:00";"29/09/2006 00:00:00";365;253;92;"01/11/2005 00:00:00";"31/01/2006 00:00:00";"31/01/2006 00:00:00";92;62;"2005/06";"FULL";664858
    (The remaining 19 sample rows, for 01/02/2006 through 20/04/2006, follow the same layout.)

Edited by: user531731 on August 5, 2009 06:39

Here is an example which will populate 2 rows per day, with the Y and N combinations, for all dates between 1/1/1980 and 1/1/2099.

    Create table emp3 (F_date date, cd varchar2 (1));

    DECLARE
      v_date_key    DATE;
      v_future_date DATE;
    BEGIN
      v_date_key := to_date('01-JAN-1980', 'DD-MON-YYYY');
      v_future_date := to_date('01-JAN-2099', 'DD-MON-YYYY');
      WHILE (v_date_key <= v_future_date) LOOP
        INSERT INTO emp3 (f_date, cd) VALUES (v_date_key, 'Y');
        INSERT INTO emp3 (f_date, cd) VALUES (v_date_key, 'N');
        v_date_key := v_date_key + 1;
      END LOOP;
      COMMIT;
    END;
    /
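    A set-based alternative (a sketch against the same emp3 table; row-by-row inserts of roughly 87,000 rows are slow in a loop):

    INSERT INTO emp3 (f_date, cd)
    SELECT d.f_date, c.cd
      FROM (SELECT DATE '1980-01-01' + LEVEL - 1 AS f_date
              FROM dual
            CONNECT BY LEVEL <= DATE '2099-01-01' - DATE '1980-01-01' + 1) d
     CROSS JOIN (SELECT 'Y' AS cd FROM dual
                 UNION ALL
                 SELECT 'N' FROM dual) c;
    COMMIT;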
      
    
  • Table for the data in my VI

    Please add the table to my VI.

    Please see the attachment.

    A probe inside the loop shows a lot of activity. Are you stopping the VI with the toolbar's abort button? If so, you will never see data outside the While loop. Data only passes out of your While loop when you stop the loop programmatically.

    Your STOP button is hidden behind the chart. Move it out from behind the chart and use it to stop the loop. Then you will see the table become populated.

    Your spreadsheet column headings indicate that you want to save the average, square and square root of each sample. If this is the case, then these values must be calculated at acquisition time, before the data is written to the spreadsheet file.

    JohnCS

  • How can I, during data collection, start a new column in my table every 100 data points?

    Hello! I have a problem with my data - it comes in as one long 1 x 1000 array, but it contains repeated measurements, each taking about 500 data points. I want to break up this long string of data and start a new column in my table every 500 data points. I don't know how to do it - please help!

    datacompiler100 wrote:

    Hey, thanks for the reply, and first off I must apologize for the state of the VI I am attaching. I have included the part of the VI that I am working on (my team has access, so I didn't post everything here) and also attached the data file (just written to a spreadsheet file, not through the attached VI). I want to convert the long row of data and then start a new column every 50, 100, or 500 points (user-defined).

    Using the data from the file, you can simply reshape it (as you already do!), followed by a transpose (since you want columns instead of rows). 2D arrays must always be rectangular, so the last column is padded with zeros if necessary. Is that what you want?

    Of course, if you try to append a new column to a file, that will not work. You can only append rows to an existing file because of the way the data is organized. To add columns, the entire file must be read, the new data interleaved, and everything re-written to the file.

  • expdp for all dates earlier than or equal to 31/03/2009

    Good morning gurus,
    I created a copy of my production database for migration to a different server. We have Windows 2003 and Oracle 10.2.0.4. I created a schema ivs2_prod (metadata_only) through expdp/impdp and now want to expdp/impdp some table data for all dates prior to or equal to 31/03/2009 11:59:59. I don't see any option in this utility to do this. Can you gurus help me with a solution for this?
    Kind regards
    Deepa

    Deepa,

    Use a parameter file instead of the query option on the command line; if you want to use the query option on the command line, you have to escape with '\' to format the query correctly. Try this and it should work:

    #mytest.par

    query="where collectiondate <= to_date('2009-03-31 11:59:59','YYYY-MM-DD HH24:MI:SS')"
    

    Data Pump export:

    expdp username/pwd@prod directory=DATADIR1 dumpfile=expcollections_tab3 tables=collections_tab3 logfile=collection_tab3.log content=data_only parfile=mytest.par
    

    Regards

  • Will there be more charge time in bedside table mode for the iWatch?

    Will there be more charge time in bedside table mode for the iWatch?

    Hello

    In normal operation, the watch cannot be overcharged, and the battery won't suffer harm from the regular top-ups done every day.

    Charging stops automatically when the watch is fully charged. It also starts again automatically as and when required by the battery's current state of charge.

  • Is there a 'table' for the object_type values?

    I wonder if there is a system view or a table listing all the values that can appear in the all_objects.object_type column? Or is the only way to get this list of values to query dba_objects.object_type with the DISTINCT keyword?

    Thank you

    Look at the source of the view:

    DECODE (o.type#, 0, 'NEXT OBJECT', 1, 'INDEX', 2, 'TABLE', 3, 'CLUSTER',
            4, 'VIEW', 5, 'SYNONYM', 6, 'SEQUENCE',
            7, 'PROCEDURE', 8, 'FUNCTION', 9, 'PACKAGE',
            11, 'PACKAGE BODY', 12, 'TRIGGER',
            13, 'TYPE', 14, 'TYPE BODY',
            19, 'TABLE PARTITION', 20, 'INDEX PARTITION', 21, 'LOB',
            22, 'LIBRARY', 23, 'DIRECTORY', 24, 'QUEUE',
            28, 'JAVA SOURCE', 29, 'JAVA CLASS', 30, 'JAVA RESOURCE',
            32, 'INDEXTYPE', 33, 'OPERATOR',
            34, 'TABLE SUBPARTITION', 35, 'INDEX SUBPARTITION',
            40, 'LOB PARTITION', 41, 'LOB SUBPARTITION',
            42, NVL((SELECT 'REWRITE EQUIVALENCE'
                       FROM sum$ s
                      WHERE s.obj# = o.obj#
                        AND bitand(s.xpflags, 8388608) = 8388608),
                    'MATERIALIZED VIEW'),
            43, 'DIMENSION',
            44, 'CONTEXT', 46, 'RULE SET', 47, 'RESOURCE PLAN',
            48, 'CONSUMER GROUP',
            55, 'XML SCHEMA', 56, 'JAVA DATA',
            57, 'EDITION', 59, 'RULE',
            60, 'CAPTURE', 61, 'APPLY',
            62, 'EVALUATION CONTEXT',
            66, 'JOB', 67, 'PROGRAM', 68, 'JOB CLASS', 69, 'WINDOW',
            72, 'SCHEDULER GROUP', 74, 'SCHEDULE', 79, 'CHAIN',
            81, 'FILE GROUP', 82, 'MINING MODEL', 87, 'ASSEMBLY',
            90, 'CREDENTIAL', 92, 'CUBE DIMENSION', 93, 'CUBE',
            94, 'MEASURE FOLDER', 95, 'CUBE BUILD PROCESS',
            100, 'FILE WATCHER', 101, 'DESTINATION',
            'UNDEFINED')

  • How to create the table column for a master-detail relationship form

    Apex 4.1

    Oracle 11g

    I created a master-detail form; see the master table hotel_list and detail table hotel_mapping below.

    Hotel_list

    ID  HOTEL_NAME
    1   Holiday Inn
    2   Hilton Hotel

    Hotel_mapping

    ID  HOTEL_NAME    MAPPING_NAME
    1   Holiday Inn   Holiday Inn Select hotel
    2   Holiday Inn   Holiday Inn Select
    3   Holiday Inn   Holiday Inn Hotel
    4   Hilton Hotel  Hilton Hotel chain
    5   Hilton Hotel  HiltonHotel

    The Hotel_name column links the Hotel_list table to the Hotel_mapping table.

    When I add a row to the Hotel_mapping table for the selected row in the hotel_list table, the mapping_name column is null, so it is impossible to create the relationship between the master table and the detail table.

    I would like to know how to create the relationship.

    Thank you very much

    Best regards

    Yong Huang,

    as a simple first step, see creating a master-detail form with the APEX Master Detail wizard,

    and to check how to maintain the relationship between two tables,

    simply refer to the packaged application «Sample Master Details»

    and try to understand the concept...

    In your example, use Hotel_list.ID as a foreign key in the Hotel_mapping table

    and maintain the relationship with the ID column (see the sketch after this reply)...

    and choose the display type of the Hotel_list.ID column in the Hotel_mapping table as Select List (Query Based LOV).

    Otherwise, the best way is to create a sample on apex.oracle.com.

    I hope this helps...

    Regards.
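    A hypothetical sketch of the foreign key being described (the column and constraint names are made up):

    -- Add an ID-based link from the detail table to the master table.
    ALTER TABLE hotel_mapping ADD (hotel_id NUMBER);

    ALTER TABLE hotel_mapping
      ADD CONSTRAINT hotel_mapping_hotel_fk
      FOREIGN KEY (hotel_id) REFERENCES hotel_list (id);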

  • source table for employee benefits

    Good day,

    I have a requirement to create a report of employee information, including benefits and dependents.

    Any idea where I can query the data, or what the source tables are behind this navigation?

    NAVIGATION: Employee Self Service > Employee Legislative Information


    Thank you in advance,


    Rey

    per_assignment_extra_info is used for custom data, so it depends on how your company has customized it.

    Please close the thread if you're done with it.

    Cheers,

    Vignesh

  • How to gather table statistics for tables in a different schema

    Hi all

    I have a table in one schema, and I want to gather statistics for that table from a different schema.

    I granted: GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;

    And when I tried to run the command to gather statistics using

    DBMS_STATS.GATHER_TABLE_STATS (OWNNAME => 'SCHEMA1', TABNAME => 'T1');

    the call fails.

    Is there a way we can gather table statistics for tables in one schema from another schema?

    Thank you
    MK.

    You must grant the ANALYZE ANY system privilege to schema2 (the object-level grant is not enough). A sketch of the full sequence follows.
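    A minimal sketch (assuming SCHEMA1.T1 exists and a DBA performs the grant):

    -- System privilege, granted by a DBA:
    GRANT ANALYZE ANY TO schema2;

    -- Then, connected as SCHEMA2:
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA1', tabname => 'T1');
    END;
    /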

    SY.

  • Is it possible to have 2 multi-tables for one descriptor

    Hello

    Is it possible to have 2 multi-tables for one descriptor... If so, can you explain how to implement it?


    If not, what is the other way to do it?

    Thanks in advance.

    Kind regards
    Mark

    For example, a Student descriptor can map to as many tables as SenVideo, Topics, Address, etc...
    -RMishra

  • Best practices for migrating data tables - please comment.

    I have 5 new tables stocked with data that must be promoted through an evolution to a production environment.
    Instead of just having the DBA use a data migration tool, they insist that I record and provide scripts, in the proper order, for each commit needed both to build the tables and to insert the data from scratch.

    I'm not at all used to such an environment, and it looks much riskier to me to try to reconstruct the objects from scratch when I already have a perfect model, tested and ready.

    They require a lot of documentation, where each step is recorded in a document and used for deployment.
    I think their purpose is that they do not want to rely on backups but would rather rely on a document that specifies each step needed to recreate everything.

    Please comment on your view of this practice. Thank you!

    I'm not a DBA. I can't even hold a candle to forum regulars like Srini/Justin/sb.
    So take this as a slightly different opinion.

    It is great to have, and I paraphrase from various posts:
    Deployment documents,
    sign off of groups
    recovery steps
    Source code control
    repositories
    "The production environment is sacred. All the risks that must be reduced to a minimum at any price. In my opinion a DBA should NEVER move anything from a development environment directly in a production environment. 'NEVER.'
    etc etc.

    But we cannot generalize that each production system must have these. Each customer is different; everyone has different levels of fault tolerance.

    You wouldn't expect a product design change at a cabinetmaker's shop to go through rigour "as at NASA". Why would it be different for software changes?

    How much rigour you put in is a bit subjective - it depends on company policies, the company's experience with migration disasters (if any) and the corporate culture.

    The OP may come from a customer with lax controls. And it can be questioned whether the rigour at the level of the new customer is worth it.

    At one client (and I don't kid), the prod password is apps/apps (after 12 years of being live!). I was appalled at first. But it's a very small business. They got saddled with EBS during the .com boom heyday. They use just 2 modules in EBS. If I spoke of creating (and charging for) these documents/processes, I would lose the customer.
    My point is that not all places need or want these controls. By trial and error, you arrive at what is best for the company.

    OP:
    You said that you're not used to this type of environment. I recommend that you go with the flow at first. Spend time understanding the value/use/history of these processes. Then ask whether they still seem too much to you. Keep in mind: this is subjective. And if it comes down to your view versus the existing one, you need either authority OR some means of persuasion (money / sox (intentional typo here) / severed horse heads... whatever works!)

    Sandeep Gandhi

    Edit: fixed a typo

    Edited by: Sandeep Gandhi, Independent Consultant on 25 June 2012 23:37

  • By comparing the two tables for the integrity of the data

    Hi all,
    I need to compare two tables for data integrity via a SQL query.

    If you need to compare all the columns of t1 to t2:

    (SELECT * FROM t1
    MINUS
    SELECT * FROM t2)
    UNION ALL
    (SELECT * FROM t2
    MINUS
    SELECT * FROM t1);
    

    Kind regards
    Ankit Rouault
    http://theoraclelog.blogspot.in

  • Base table for customers in R12

    Hello

    What is the name of the base table for customers in R12?

    Thank you
    GSM

    Please see (Absorption of Projects of the TCA Architecture in Release 12 [ID 417511.1]).

    Thank you
    Hussein

  • Unable to display data for dates where there is no entry in the table

    Hello

    I need urgent help, as described below:

    I have a table named 'sale', consisting of three columns: empno, sale_amt and sale_date.
    (Please refer to the table-with-data script shown below.)

    Now, if I run the query:
    select trunc(sale_date) sale_date, sum(sale_amt) total_sale from sale group by trunc(sale_date) order by 1;
    it displays the data for the dates that have an entry in this table. But it displays no data for the
    dates that have no entry in this table.

    If you run the table script with data in your schema, you will see that there is no entry for 28 November 2009 in the
    sale table. The above query displays data for all the other dates in the sale table, with the exception of 28 November 2009.
    But I need it present in the query result, with a 'sale_date' of '28 November 2009' and a 'total_sale' of
    0.

    Is there a way to get the result I need?

    Please help as soon as possible.

    Thanks in advance.

    Create the table script that contains data:
    ------------------------------------------

    CREATE TABLE SALE
    (
    EMPNO NUMBER,
    SALE_AMT NUMBER,
    SALE_DATE DATE
    );
    SET DEFINE OFF;
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('01/12/2009 10:20:10', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('30/11/2009 10:21:04', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('29/11/2009 10:21:05', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('26/11/2009 10:21:06', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('25/11/2009 10:21:07', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 5000, TO_DATE('27/11/2009 10:23:06', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 4000, TO_DATE('29/11/2009 10:23:08', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 3000, TO_DATE('24/11/2009 10:23:09', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 2000, TO_DATE('30/11/2009 10:23:10', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 7000, TO_DATE('24/11/2009 10:24:19', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 5000, TO_DATE('25/11/2009 10:24:20', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 3000, TO_DATE('27/11/2009 10:24:21', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 2000, TO_DATE('29/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 1000, TO_DATE('30/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
    COMMIT;

    Any help will be much appreciated.


    Kind regards
    WITH tab AS
      (SELECT TRUNC(sale_date) sale_date,
        SUM(sale_amt) total_sale
         FROM sale
       GROUP BY TRUNC(sale_date)
       ORDER BY 1
      )
     SELECT sale_date,
      NVL(total_sale,0) total_sale
       FROM tab
       model
       REFERENCE refmodel ON (SELECT 1 indx, MAX(sale_date)-MIN(sale_date) AS daysdiff , MIN(sale_date) minsaledate FROM tab)
         dimension BY (indx)
         measures(daysdiff,minsaledate)
       main main_model
       dimension BY (sale_date)
       measures(total_sale)
       RULES upsert SEQUENTIAL ORDER ITERATE(1000) until (iteration_number>refmodel.daysdiff[1]-1)
       ( total_sale[refmodel.minsaledate[1]+iteration_number]=total_sale[cv()] )
    ORDER BY sale_date
    

    using the MODEL clause

    Ravi Kumar
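    An alternative sketch, if you prefer to avoid the MODEL clause: generate the calendar with CONNECT BY and outer-join it to the aggregated sales (same sale table as above):

    WITH bounds AS (
      SELECT TRUNC(MIN(sale_date)) AS min_d,
             TRUNC(MAX(sale_date)) AS max_d
        FROM sale
    ),
    days AS (
      SELECT min_d + LEVEL - 1 AS sale_date
        FROM bounds
     CONNECT BY LEVEL <= max_d - min_d + 1
    )
    SELECT d.sale_date,
           NVL(SUM(s.sale_amt), 0) AS total_sale
      FROM days d
      LEFT JOIN sale s
        ON TRUNC(s.sale_date) = d.sale_date
     GROUP BY d.sale_date
     ORDER BY d.sale_date;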
