Fill a Table for a Date Dimension

I would be grateful if any of the gurus/developers who have populated a date dimension table could help me with the sample data I am going to give you.
And if you could share your knowledge on how such a dimension table is filled.

Here is the table structure you need, and I have also included a SQL*Loader control file in case you don't mind loading the data into the DIM_RECNCL_DTE table.

The sample I am providing is a small extract of 2006 data, the first 20 rows taken from a production table, which I would like to complete all the way to the year 2037.

I find this very difficult and I am keen to understand the approach. How could a SQL statement complete this table?

CREATE TABLE DIM_RECNCL_DTE
(
DTE_ID NUMBER(12) NOT NULL,
DAY_DTE DATE,
DAY_NME VARCHAR2(100 BYTE) NOT NULL,
DAY_DESC VARCHAR2(200 BYTE),
BUSS_DAY_IND VARCHAR2(20 BYTE) NOT NULL,
WK_DAY_IND VARCHAR2(20 BYTE) NOT NULL,
WK_DAY_NBR NUMBER(10),
WK_START_DTE DATE,
WK_END_DTE DATE,
WK_LST_BUSS_DAY_DTE DATE,
MTH_DAY_NBR NUMBER(10),
MTH_NME VARCHAR2(100 BYTE) NOT NULL,
MTH_DESC VARCHAR2(200 BYTE) NOT NULL,
MTH_START_DTE DATE,
MTH_END_DTE DATE,
MTH_LST_BUSS_DAY_DTE DATE,
MTH_DAY NUMBER(10),
MTH_BUSS_DAY NUMBER(10),
QTR_DAY_NBR NUMBER(10),
QTR_START_DTE DATE,
QTR_END_DTE DATE NOT NULL,
QTR_LST_BUSS_DAY_DTE DATE,
QTR_DAY NUMBER(10),
QTR_BUSS_DAY NUMBER(10),
CLNDR_WK_NBR NUMBER(10),
CLNDR_MTH_NBR NUMBER(10),
CLNDR_QTR_NBR NUMBER(10),
CLNDR_YR NUMBER(12),
CLNDR_YR_DAY_NBR NUMBER(10),
CLNDR_YR_START_DTE DATE,
CLNDR_YR_END_DTE DATE,
CLNDR_YR_LST_BUSS_DAY_DTE DATE,
CLNDR_YR_DAY NUMBER(10),
CLNDR_YR_BUSS_DAY NUMBER(10),
FISCAL_WK_NBR NUMBER(10),
FISCAL_MTH_NBR NUMBER(10),
FISCAL_QTR_NBR NUMBER(10),
FISCAL_YR NUMBER(10),
FISCAL_YR_DAY_NBR NUMBER(10),
FISCAL_YR_START_DTE DATE,
FISCAL_YR_END_DTE DATE,
FISCAL_YR_LST_BUSS_DAY_DTE DATE,
FISCAL_YR_DAY NUMBER(10),
FISCAL_YR_BUSS_DAY NUMBER(10),
ROLLING_QTR_DAY_NBR NUMBER(10) NOT NULL,
ROLLING_QTR_START_DTE DATE NOT NULL,
ROLLING_QTR_END_DTE DATE NOT NULL,
ROLLING_QTR_LST_BUSS_DAY_DTE DATE NOT NULL,
ROLLING_QTR_DAY NUMBER(10) NOT NULL,
ROLLING_QTR_BUSS_DAY NUMBER(10) NOT NULL,
PRSN_TAX_YR VARCHAR2(9 BYTE),
DTE_PRT_DEFND_IND VARCHAR2(20 BYTE) NOT NULL,
AUDIT_ID NUMBER(12) NOT NULL
);


LOAD DATA
INFILE *
BADFILE './DIM_RECNCL_DTE.BAD'
DISCARDFILE './DIM_RECNCL_DTE.DSC'
INSERT INTO TABLE DIM_RECNCL_DTE
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
(
DTE_ID NULLIF (DTE_ID = 'NULL'),
DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (DAY_DTE = 'NULL'),
DAY_NME,
DAY_DESC,
BUSS_DAY_IND,
WK_DAY_IND,
WK_DAY_NBR NULLIF (WK_DAY_NBR = 'NULL'),
WK_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (WK_START_DTE = 'NULL'),
WK_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (WK_END_DTE = 'NULL'),
WK_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (WK_LST_BUSS_DAY_DTE = 'NULL'),
MTH_DAY_NBR NULLIF (MTH_DAY_NBR = 'NULL'),
MTH_NME,
MTH_DESC,
MTH_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (MTH_START_DTE = 'NULL'),
MTH_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (MTH_END_DTE = 'NULL'),
MTH_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (MTH_LST_BUSS_DAY_DTE = 'NULL'),
MTH_DAY NULLIF (MTH_DAY = 'NULL'),
MTH_BUSS_DAY NULLIF (MTH_BUSS_DAY = 'NULL'),
QTR_DAY_NBR NULLIF (QTR_DAY_NBR = 'NULL'),
QTR_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (QTR_START_DTE = 'NULL'),
QTR_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (QTR_END_DTE = 'NULL'),
QTR_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (QTR_LST_BUSS_DAY_DTE = 'NULL'),
QTR_DAY NULLIF (QTR_DAY = 'NULL'),
QTR_BUSS_DAY NULLIF (QTR_BUSS_DAY = 'NULL'),
CLNDR_WK_NBR NULLIF (CLNDR_WK_NBR = 'NULL'),
CLNDR_MTH_NBR NULLIF (CLNDR_MTH_NBR = 'NULL'),
CLNDR_QTR_NBR NULLIF (CLNDR_QTR_NBR = 'NULL'),
CLNDR_YR NULLIF (CLNDR_YR = 'NULL'),
CLNDR_YR_DAY_NBR NULLIF (CLNDR_YR_DAY_NBR = 'NULL'),
CLNDR_YR_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_START_DTE = 'NULL'),
CLNDR_YR_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_END_DTE = 'NULL'),
CLNDR_YR_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (CLNDR_YR_LST_BUSS_DAY_DTE = 'NULL'),
CLNDR_YR_DAY NULLIF (CLNDR_YR_DAY = 'NULL'),
CLNDR_YR_BUSS_DAY NULLIF (CLNDR_YR_BUSS_DAY = 'NULL'),
FISCAL_WK_NBR NULLIF (FISCAL_WK_NBR = 'NULL'),
FISCAL_MTH_NBR NULLIF (FISCAL_MTH_NBR = 'NULL'),
FISCAL_QTR_NBR NULLIF (FISCAL_QTR_NBR = 'NULL'),
FISCAL_YR NULLIF (FISCAL_YR = 'NULL'),
FISCAL_YR_DAY_NBR NULLIF (FISCAL_YR_DAY_NBR = 'NULL'),
FISCAL_YR_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_START_DTE = 'NULL'),
FISCAL_YR_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_END_DTE = 'NULL'),
FISCAL_YR_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (FISCAL_YR_LST_BUSS_DAY_DTE = 'NULL'),
FISCAL_YR_DAY NULLIF (FISCAL_YR_DAY = 'NULL'),
FISCAL_YR_BUSS_DAY NULLIF (FISCAL_YR_BUSS_DAY = 'NULL'),
ROLLING_QTR_DAY_NBR NULLIF (ROLLING_QTR_DAY_NBR = 'NULL'),
ROLLING_QTR_START_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_START_DTE = 'NULL'),
ROLLING_QTR_END_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_END_DTE = 'NULL'),
ROLLING_QTR_LST_BUSS_DAY_DTE DATE "DD/MM/YYYY HH24:MI:SS" NULLIF (ROLLING_QTR_LST_BUSS_DAY_DTE = 'NULL'),
ROLLING_QTR_DAY NULLIF (ROLLING_QTR_DAY = 'NULL'),
ROLLING_QTR_BUSS_DAY NULLIF (ROLLING_QTR_BUSS_DAY = 'NULL'),
PRSN_TAX_YR,
DTE_PRT_DEFND_IND,
AUDIT_ID NULLIF (AUDIT_ID = 'NULL')
)
BEGINDATA
20060131; "31/01/2006 00:00:00"; "TUE"; "Tuesday"; "Y"; "Y"; 4; "28/01/2006 00:00:00"; "03/02/2006 00:00:00"; "03/02/2006 00:00:00"; 31; "JAN"; "January"; "01/01/2006 00:00:00"; "31/01/2006 00:00:00"; "31/01/2006 00:00:00"; 31; 20; 31; "01/01/2006 00:00:00"; "31/03/2006 00:00:00"; "31/03/2006 00:00:00"; 90; 63; 5; 1; 1; 2006; 31; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 18; 4; 2; 2005; 123; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 92; "01/11/2005 00:00:00"; "31/01/2006 00:00:00"; "31/01/2006 00:00:00"; 92; 62; "2005/06"; "FULL"; 664858
20060201; "01/02/2006 00:00:00"; "WED"; "Wednesday"; "Y"; "Y"; 5; "28/01/2006 00:00:00"; "03/02/2006 00:00:00"; "03/02/2006 00:00:00"; 1; "FEB"; "February"; "01/02/2006 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 28; 20; 32; "01/01/2006 00:00:00"; "31/03/2006 00:00:00"; "31/03/2006 00:00:00"; 90; 63; 5; 2; 1; 2006; 32; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 18; 5; 2; 2005; 124; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 63; "01/12/2005 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 90; 60; "2005/06"; "FULL"; 664858
20060202; "02/02/2006 00:00:00"; "THU"; "Thursday"; "Y"; "Y"; 6; "28/01/2006 00:00:00"; "03/02/2006 00:00:00"; "03/02/2006 00:00:00"; 2; "FEB"; "February"; "01/02/2006 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 28; 20; 33; "01/01/2006 00:00:00"; "31/03/2006 00:00:00"; "31/03/2006 00:00:00"; 90; 63; 5; 2; 1; 2006; 33; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 18; 5; 2; 2005; 125; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 64; "01/12/2005 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 90; 60; "2005/06"; "FULL"; 664858
20060203; "03/02/2006 00:00:00"; "FRI"; "Friday"; "Y"; "Y"; 7; "28/01/2006 00:00:00"; "03/02/2006 00:00:00"; "03/02/2006 00:00:00"; 3; "FEB"; "February"; "01/02/2006 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 28; 20; 34; "01/01/2006 00:00:00"; "31/03/2006 00:00:00"; "31/03/2006 00:00:00"; 90; 63; 5; 2; 1; 2006; 34; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 18; 5; 2; 2005; 126; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 65; "01/12/2005 00:00:00"; "28/02/2006 00:00:00"; "28/02/2006 00:00:00"; 90; 60; "2005/06"; "FULL"; 664858
20060405; "05/04/2006 00:00:00"; "WED"; "Wednesday"; "Y"; "Y"; 5; "01/04/2006 00:00:00"; "07/04/2006 00:00:00"; "07/04/2006 00:00:00"; 5; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 5; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 14; 4; 2; 2006; 95; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 27; 7; 3; 2005; 187; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 64; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060406; "06/04/2006 00:00:00"; "THU"; "Thursday"; "Y"; "Y"; 6; "01/04/2006 00:00:00"; "07/04/2006 00:00:00"; "07/04/2006 00:00:00"; 6; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 6; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 14; 4; 2; 2006; 96; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 27; 7; 3; 2005; 188; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 65; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060407; "07/04/2006 00:00:00"; "FRI"; "Friday"; "Y"; "Y"; 7; "01/04/2006 00:00:00"; "07/04/2006 00:00:00"; "07/04/2006 00:00:00"; 7; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 7; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 14; 4; 2; 2006; 97; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 27; 7; 3; 2005; 189; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 66; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060408; "08/04/2006 00:00:00"; "SAT"; "Saturday"; "N"; "N"; 1; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 8; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 8; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 14; 4; 2; 2006; 98; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 190; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 67; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060409; "09/04/2006 00:00:00"; "SUN"; "Sunday"; "N"; "N"; 2; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 9; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 9; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 99; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 191; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 68; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060410; "10/04/2006 00:00:00"; "MON"; "Monday"; "Y"; "Y"; 3; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 10; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 10; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 100; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 192; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 69; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060411; "11/04/2006 00:00:00"; "TUE"; "Tuesday"; "Y"; "Y"; 4; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 11; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 11; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 101; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 193; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 70; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060412; "12/04/2006 00:00:00"; "WED"; "Wednesday"; "Y"; "Y"; 5; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 12; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 12; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 102; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 194; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 71; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060413; "13/04/2006 00:00:00"; "THU"; "Thursday"; "Y"; "Y"; 6; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 13; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 13; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 103; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 195; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 72; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060414; "14/04/2006 00:00:00"; "FRI"; "Friday"; "N"; "Y"; 7; "08/04/2006 00:00:00"; "14/04/2006 00:00:00"; "13/04/2006 00:00:00"; 14; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 14; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 104; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 28; 7; 3; 2005; 196; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 73; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060415; "15/04/2006 00:00:00"; "SAT"; "Saturday"; "N"; "N"; 1; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 15; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 15; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 15; 4; 2; 2006; 105; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 197; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 74; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060416; "16/04/2006 00:00:00"; "SUN"; "Sunday"; "N"; "N"; 2; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 16; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 16; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 16; 4; 2; 2006; 106; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 198; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 75; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060417; "17/04/2006 00:00:00"; "MON"; "Monday"; "N"; "Y"; 3; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 17; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 17; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 16; 4; 2; 2006; 107; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 199; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 76; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060418; "18/04/2006 00:00:00"; "TUE"; "Tuesday"; "Y"; "Y"; 4; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 18; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 18; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 16; 4; 2; 2006; 108; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 200; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 77; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060419; "19/04/2006 00:00:00"; "WED"; "Wednesday"; "Y"; "Y"; 5; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 19; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 19; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 16; 4; 2; 2006; 109; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 201; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 78; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
20060420; "20/04/2006 00:00:00"; "THU"; "Thursday"; "Y"; "Y"; 6; "15/04/2006 00:00:00"; "21/04/2006 00:00:00"; "21/04/2006 00:00:00"; 20; "APR"; "April"; "01/04/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 30; 17; 20; "01/04/2006 00:00:00"; "30/06/2006 00:00:00"; "30/06/2006 00:00:00"; 91; 62; 16; 4; 2; 2006; 110; "01/01/2006 00:00:00"; "31/12/2006 00:00:00"; "29/12/2006 00:00:00"; 365; 255; 29; 7; 3; 2005; 202; "01/10/2005 00:00:00"; "30/09/2006 00:00:00"; "29/09/2006 00:00:00"; 365; 253; 79; "01/02/2006 00:00:00"; "30/04/2006 00:00:00"; "28/04/2006 00:00:00"; 89; 60; "2005/06"; "FULL"; 664858
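
If the control file above (together with its BEGINDATA section) is saved as, say, dim_recncl_dte.ctl (the file name is just an example), it can be run from the command line with SQL*Loader; rejected and discarded records go to the .BAD and .DSC files named in the control file:

sqlldr userid=scott/tiger control=dim_recncl_dte.ctl log=dim_recncl_dte.log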

Published by: user531731 on August 5, 2009 06:39

This is an example which will insert two records per day, with the combination Y and N, for all dates between
1/1/1980 and 1/1/2099.

Create table emp3 (F_date date, cd varchar2 (1));

declare
  v_date_key    DATE;
  v_future_date DATE;
BEGIN
  v_date_key := to_date('01-JAN-1980', 'DD-MON-YYYY');
  v_future_date := to_date('01-JAN-2099', 'DD-MON-YYYY');
  WHILE (v_date_key <= v_future_date) LOOP
    INSERT INTO emp3 (f_date, cd) VALUES (v_date_key, 'Y');
    INSERT INTO emp3 (f_date, cd) VALUES (v_date_key, 'N');
    v_date_key := v_date_key + 1;
  END LOOP;
  COMMIT;
END;
/
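
A set-based INSERT ... SELECT is usually much faster than a row-by-row loop like the one above, and the same row-generator idea extends to the original DIM_RECNCL_DTE question. Below is a minimal sketch (not the original poster's code): it generates one row per day from 2006 through 2037 with CONNECT BY and derives a few of the calendar columns; the business-day, fiscal-year and rolling-quarter columns would be derived the same way but depend on your holiday calendar and fiscal rules, so they are omitted here.

select to_number(to_char(d, 'YYYYMMDD'))                as dte_id,      -- e.g. 20060131
       d                                                as day_dte,
       to_char(d, 'DY', 'NLS_DATE_LANGUAGE=ENGLISH')    as day_nme,     -- MON, TUE, ...
       to_char(d, 'fmDay', 'NLS_DATE_LANGUAGE=ENGLISH') as day_desc,    -- Monday, ...
       case when to_char(d, 'DY', 'NLS_DATE_LANGUAGE=ENGLISH') in ('SAT', 'SUN')
            then 'N' else 'Y' end                       as wk_day_ind,  -- ignores holidays
       to_number(to_char(d, 'DD'))                      as mth_day_nbr,
       to_char(d, 'MON', 'NLS_DATE_LANGUAGE=ENGLISH')   as mth_nme,
       trunc(d, 'MM')                                   as mth_start_dte,
       last_day(d)                                      as mth_end_dte,
       trunc(d, 'Q')                                    as qtr_start_dte,
       add_months(trunc(d, 'Q'), 3) - 1                 as qtr_end_dte,
       to_number(to_char(d, 'MM'))                      as clndr_mth_nbr,
       to_number(to_char(d, 'Q'))                       as clndr_qtr_nbr,
       to_number(to_char(d, 'YYYY'))                    as clndr_yr,
       to_number(to_char(d, 'DDD'))                     as clndr_yr_day_nbr
  from (select date '2006-01-01' + level - 1 as d       -- one row per day
          from dual
       connect by level <= date '2038-01-01' - date '2006-01-01');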
  

Tags: Database

Similar Questions

• Global temporary table vs heap-organized table for staging data

We use Oracle 10g on the Linux platform.

    We have a staging table into which the data read from a file (containing 1 million rows per day) is loaded, enriched (insert/update/delete) and then finally loaded into another table. I want to know whether this table should be a global temporary table or a normal heap-organized one, to reduce the generation of undo/redo.

    I'm not in favor of the temporary table because:
    1. any additional pressure on the temporary tablespace can cause ORA-01652: unable to extend temp segment problems
    2. they are mainly intended for manipulating session-specific data.
    3. statistics do not exist for these Oracle work tables; to compensate, we would have to use dynamic sampling in the query.

    The problem with the heap-organized table is that it generates more undo/redo than a temporary table.

    Please guide me.

>
    We have a staging table into which the data read from a file (containing 1 million rows per day) is loaded, enriched (insert/update/delete) and then finally loaded into another table. I want to know whether this table should be a global temporary table or a normal heap-organized one, to reduce the generation of undo/redo.

    I'm not in favor of the temporary table because:
    1. any additional pressure on the temporary tablespace can cause ORA-01652: unable to extend temp segment problems
    2. they are mainly intended for manipulating session-specific data.
    3. statistics do not exist for these Oracle work tables; to compensate, we would have to use dynamic sampling in the query.

    The problem with the heap-organized table is that it generates more undo/redo than a temporary table.
    >
    Some of your concerns can easily be mitigated.

1 - temp tablespace

    A common practice for ETL processing is to create a dedicated temporary tablespace for the GTTs to use. This keeps the GTTs from impacting the standard temp space and possibly interfering with the rest of the DB.

    See "Creating a Temporary Table" in the DBA Guide. That section has sample code that illustrates this.
    http://docs.oracle.com/cd/B28359_01/server.111/b28310/tables003.htm#i1006400
    >
    By default, rows in a temporary table are stored in the default temporary tablespace of the user who creates it. However, you can assign a temporary table to another tablespace upon creation by using the TABLESPACE clause of CREATE GLOBAL TEMPORARY TABLE. You can use this feature to conserve the space used by temporary tables. For example, if you need to perform many small temporary table operations and the default temporary tablespace is configured for sort operations and therefore uses a large extent size, these small operations will consume a lot of unnecessary disk space. In this case it is better to assign a second temporary tablespace with a smaller extent size.
    >
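
    A minimal sketch of that approach (the tablespace name, tempfile path and sizes below are made up for illustration):

    -- dedicated temporary tablespace with small uniform extents for GTT work
    CREATE TEMPORARY TABLESPACE etl_temp
      TEMPFILE '/u01/oradata/etl_temp01.dbf' SIZE 512M
      EXTENT MANAGEMENT LOCAL UNIFORM SIZE 64K;

    -- staging GTT assigned to it; rows are kept only until commit
    CREATE GLOBAL TEMPORARY TABLE stg_daily_feed (
      feed_id   NUMBER,
      feed_date DATE,
      payload   VARCHAR2(4000)
    ) ON COMMIT DELETE ROWS
      TABLESPACE etl_temp;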
#2 (intended for session-specific data processing) is correct in that they hold session-specific data. But GTTs can also be used simply to reduce the amount of REDO from normal DML operations when processing data that is already isolated from other users and does not need to be shared.
    >
    3 - statistics do not exist for these Oracle work tables; to compensate, we would have to use dynamic sampling in the query.
    >
    Sure - but a work table generally does not need statistics gathered every time you do DML on your staging table. And if you do TRUNCATE/LOAD operations, new stats should be gathered anyway once you load the new data.

One factor you may not have considered is that you should design your architecture to be scalable. It is quite possible that you have only a SINGLE step and your current processing is very simple.

    If so AND you do not need an evolving process, then the solution suggested by knani may be the best solution for you. But this solution is not very scalable.

    Complex ETL implementations have several steps. And the data rarely moves between these steps in one nice, easy hop. It is a common requirement that after each stage of processing the data must be reviewed or reported on, to make sure it satisfies all the business requirements. Then data problems (e.g. missing data, incorrect data, etc.) must be resolved before the data can proceed to the next step. Depending on the severity of the problems, a step may need to be rerun.

    If only GTTs are used there is no way to "review" the data. And if ONLY external tables are used, then you cannot process several tables efficiently in parallel and asynchronously.

    Complex implementations that I have worked on usually consisted of a mix of external tables, normal tables and GTTs.

    External tables and a simple first-pass "data cleansing" step are used to load data into normal tables as soon as possible. That allows multiple processes to run asynchronously and in parallel, and lets data problems be detected as close to the source as possible. Any table can be reloaded/reprocessed without affecting other processes.

    This suggests that your first step should keep the in-place processing as simple and "serial" as possible.

    The second stage of ETL, when necessary, can perform more complex cleansing of data in one table or several tables. GTTs can be used effectively here to store intermediate results that may have large amounts of "temporary" DML performed on them. At the end of this stage the GTT data would be transferred to another permanent result or staging table.

    Start with a simple one-step process (maybe Karthick's). The key is to avoid complicating the process in a way that makes it unmanageable.

• Table for the data in a VI

Please add the table to my VI.

    Please see the attachment.

    A probe inside the loop shows a lot of activity.  Are you stopping the VI with the abort command on the toolbar? If so, you will never see data outside the while loop. Data will only pass out of your while loop when you stop the loop programmatically.

    Your STOP button is hidden behind the chart. Move it out from behind the loop and use it to stop the loop. Then you will see the table become populated.

    Your spreadsheet column headings indicate that you want to save the squared and square-root averages of each sample. If this is the case, then these values must be calculated at acquisition time, before the data is written to the spreadsheet file.

    JohnCS

  • Fill a table with date values with a fixed increment

    Hello

I want to fill a table that has a date column with date values incremented by a fixed amount. The start date is selectable, the increment is selectable and the number of records is adjustable as well.
    For example
    start date is 1905-Jan-02, 15:00 (DD-MON-YYYY, HH24:MI:SS)
    increment is 1 hour and 5 minutes
No. of records is 10
    then the dates in the table must be
REC 1  1905-Jan-02, 15:00
    REC 2  1905-Jan-02, 16:05
    REC 3  1905-Jan-02, 17:10
    ....
    REC 9  1905-Jan-02, 23:40
    REC 10 1905-Jan-03, 00:45

We are working on 11gR2; the number of records can be between a few hundred and a few million, and they must be ordered by ascending time (perhaps with an integer id).
    Any ideas how to fill this table (simple and fast?) using sql / plsql are welcome.

    Thanks, Hannes

    Something along the lines of:

    SQL> alter session set nls_date_format = 'YYYY-Mon-DD HH24:MI:SS';
    
    Session altered.
    
    SQL> select trunc(sysdate,'HH')+((rownum-1)*(1/24)*(65/60)) as dt
      2  from dual connect by rownum <= 20;
    
    DT
    --------------------
    2010-Aug-26 16:00:00
    2010-Aug-26 17:05:00
    2010-Aug-26 18:10:00
    2010-Aug-26 19:15:00
    2010-Aug-26 20:20:00
    2010-Aug-26 21:25:00
    2010-Aug-26 22:30:00
    2010-Aug-26 23:35:00
    2010-Aug-27 00:40:00
    2010-Aug-27 01:45:00
    2010-Aug-27 02:50:00
    2010-Aug-27 03:55:00
    2010-Aug-27 05:00:00
    2010-Aug-27 06:05:00
    2010-Aug-27 07:10:00
    2010-Aug-27 08:15:00
    2010-Aug-27 09:20:00
    2010-Aug-27 10:25:00
    2010-Aug-27 11:30:00
    2010-Aug-27 12:35:00
    
    20 rows selected.
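
    To fill an actual table rather than just select the rows, the same row generator can drive an INSERT. A minimal sketch, assuming a hypothetical target table t (id NUMBER, dt DATE) and the numbers from the question (start 1905-Jan-02 15:00, 65-minute increment, 10 records):

    INSERT INTO t (id, dt)
    SELECT LEVEL,
           TO_DATE('1905-01-02 15:00:00', 'YYYY-MM-DD HH24:MI:SS')
             + (LEVEL - 1) * (65 / (24 * 60))   -- 65 minutes as a fraction of a day
    FROM dual
    CONNECT BY LEVEL <= 10;
    COMMIT;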
    
• How can I, during data collection, start a new column in my table every 100 data points?

Hello! I have a problem with my data - it comes in as a wide 1 x 1000 array, but it holds repeated measurements, each taking about 500 data points. I want to break this array up: for this data stream, start a new column in my table every 500 data points. I don't know how to do it - please help!

    datacompiler100 wrote:

Hey, thanks for responding, and first off I must apologize for the state I am attaching my VI in. I put in the part of the VI that I am working on (my team has access, so I didn't post everything here) and also attached the data file (just written to a spreadsheet file, not through the attached VI). I want to convert the long row of data and then start a new column every 50, 100, 5 points (user-defined).

Using the data from the file, you can simply reshape everything (as you already do!), followed by a transpose (since you want columns instead of rows). 2D arrays must always be rectangular, so the last column is padded with zeros if necessary. Is that what you want?

    Of course, if you try to add a new column to a file, that will not work. You can only append rows to an existing file because of the way the data is organized. To add columns, the entire file must be read, the new data interleaved, and everything re-written to the file.

• expdp for all dates earlier than or equal to 31/03/2009

    Good morning gurus,
I created a copy of my production database for migration to a different server. We have Windows 2003 and Oracle 10.2.0.4. I created a schema ivs2_prod (metadata_only) through expdp/impdp and now want to expdp/impdp the data of some tables for all dates prior to or equal to 31/03/2009 11:59:59. I don't see any option in this utility to do this. Can you gurus help me with a solution for this?
    Kind regards
    Deepa

    Deepa,

Use a parameter file instead of putting the query on the command line; if you do want to use the query option on the command line, you have to escape with ' \ ' to format the query correctly. Try this and it should work:

    #mytest.par

    query="where collectiondate <= to_date('2009-03-31 11:59:59','YYYY-MM-DD HH24:MI:SS')"
    

Data Pump export

    expdp username/pwd@prod directory=DATADIR1 dumpfile=expcollections_tab3 tables=collections_tab3 logfile=collection_tab3.log content=data_only parfile=mytest.par
    

Regards

• Unique date dimension when creating aggregation tables

Hi guys,

    I have a single date dimension (D1-D) with date_id as the key; the granularity is at the day level. I have a fact table (F1-D) that holds daily transactions. Now, I have created three aggregate tables: F2-M (aggregated to monthly), F3-Q (aggregated to quarterly) and F4-Y (aggregated to yearly). As I said, I have a single date table with date_id as the dimension key. I have other columns month, quarter, year in the date dimension.


    My question is: is this single dimension table sufficient to create the joins and maintain the BMM layer? I joined the date_id to all the facts in the physical layer. In the BMM layer, I have one logical fact table with 4 sources. I have created the date dimension hierarchy with the logical levels year, quarter, month and day, and also set their respective level keys. After doing this, I also set the logical levels for the 4 logical table sources in the fact table.

Here, I get an error saying:



    WARNINGS:


BUSINESS MODEL financial model:
    [39059] Logical dimension table D04_DIM_DATE has a source D04_DIM_DATE at the detail level D04_DIM_DATE that joins to a higher-level fact source F02_FACT_GL_DLY_TRAN_BAL.F03_FACT_GL_PERIOD_TRAN_BAL




Can someone tell me why I am getting this error?

Reverse - your month aggregate table must have information about the year.

    That is so it can be summarized at the parent hierarchy levels.

    In general, this is so you don't have to create an aggregate table for each situation - your month table can be used for year-level aggregates. Still quite effective (12 times more data than needed, but better than 365 times).

    In your particular situation, where you have both a year AND a month aggregate, you might get away without information from parent levels - but I have not tested this scenario.

    For the second part, let's say you have a month description and a month key field. When you select month description and revenue, OBIEE needs to know where to get the month description from. It can't come from the secondary date dimension table for the reasons mentioned previously. So you tell it to take it from the aggregate table: simply drag the respective physical column from the aggregate table onto the existing logical column for the month description.

    Kind regards

    Robert

• Pulling a date from the date dimension (dimension to dimension)

    Need advice...

1. I have a person dimension table and a sales fact table. Person_id is the foreign key in the fact. I want to be able to show the date of birth, but the person dimension only has dob_sid, which should point at the day dimension to get the equivalent date. That's the problem... I can't join the person dimension to the day dimension; I would need to join through the fact, but the fact table does not carry the dob_sid.
    2. The second question: the person dimension has other associated dates action_date, cancel_date, etc., in addition to the date-of-birth field, that I need to attach to the day dimension.

    How will I be able to do this in the BMM? Physical layer?

The key (no pun intended) here is to alias your date dimension for the several date lookups you need; these aliases then join directly to your person dimension on the appropriate keys.
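
    In SQL terms the aliasing amounts to something like the sketch below (the person table and column names are invented for illustration); in the physical layer, each alias of the day dimension plays the role of one of these joins:

    SELECT p.person_name,
           dob.day_dte AS date_of_birth,
           act.day_dte AS action_date
    FROM   person_dim p
    JOIN   day_dim dob ON dob.dte_id = p.dob_sid          -- alias 1 of the date dimension
    JOIN   day_dim act ON act.dte_id = p.action_date_sid  -- alias 2
    ;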

• Unable to display data for a date when there is no entry in the table

    Hello

I urgently need help with the problem described below:

I have a table named 'sale', consisting of three columns: empno, sale_amt and sale_date.
    (Please refer to the table-with-data script shown below.)

Now, if I run the query:
    "select trunc(sale_date) sale_date, sum(sale_amt) total_sale from sale group by trunc(sale_date) order by 1;"
    it displays data only for the dates that have an entry in this table. It displays no data for a
    date for which there is no entry in this table.

    If you run the table script with data in your schema, then you will see that there is no entry for 28 November 2009 in the
    sale table. The above query displays data for all the other dates that are in the sale table, with the exception of 28 November 2009.
    But I need it present in the query result, with "sale_date" as "28 November 2009" and "total_sale" as
    0.

Is there any way to get the result I need?

Please help as soon as possible.

    Thanks in advance.

Table creation script with data:
    ------------------------------------------

CREATE TABLE SALE
    (
    EMPNO NUMBER,
    SALE_AMT NUMBER,
    SALE_DATE DATE
    );
    SET DEFINE OFF;
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('01/12/2009 10:20:10', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('30/11/2009 10:21:04', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('29/11/2009 10:21:05', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('26/11/2009 10:21:06', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (100, 1000, TO_DATE('25/11/2009 10:21:07', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 5000, TO_DATE('27/11/2009 10:23:06', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 4000, TO_DATE('29/11/2009 10:23:08', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 3000, TO_DATE('24/11/2009 10:23:09', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (200, 2000, TO_DATE('30/11/2009 10:23:10', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 7000, TO_DATE('24/11/2009 10:24:19', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 5000, TO_DATE('25/11/2009 10:24:20', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 3000, TO_DATE('27/11/2009 10:24:21', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 2000, TO_DATE('29/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
    Insert into SALE
    (EMPNO, SALE_AMT, SALE_DATE)
    Values
    (300, 1000, TO_DATE('30/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
    COMMIT;

Any help will be greatly appreciated.

    Kind regards,
    WITH tab AS
      (SELECT TRUNC(sale_date) sale_date,
        SUM(sale_amt) total_sale
         FROM sale
       GROUP BY TRUNC(sale_date)
       ORDER BY 1
      )
     SELECT sale_date,
      NVL(total_sale,0) total_sale
       FROM tab
       model
       REFERENCE refmodel ON (SELECT 1 indx, MAX(sale_date)-MIN(sale_date) AS daysdiff , MIN(sale_date) minsaledate FROM tab)
         dimension BY (indx)
         measures(daysdiff,minsaledate)
       main main_model
       dimension BY (sale_date)
       measures(total_sale)
       RULES upsert SEQUENTIAL ORDER ITERATE(1000) until (iteration_number>refmodel.daysdiff[1]-1)
       ( total_sale[refmodel.minsaledate[1]+iteration_number]=total_sale[cv()] )
    ORDER BY sale_date
    

using the MODEL clause

    Ravi Kumar
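
    An alternative to the MODEL clause (a sketch, not from the original reply): generate the full calendar between the earliest and latest sale_date with CONNECT BY, then outer-join the aggregated sales to it so the missing days show up with 0.

    WITH days AS (
      SELECT mn + LEVEL - 1 AS sale_date
      FROM  (SELECT TRUNC(MIN(sale_date)) mn, TRUNC(MAX(sale_date)) mx FROM sale)
      CONNECT BY LEVEL <= mx - mn + 1          -- one row per calendar day
    )
    SELECT d.sale_date,
           NVL(SUM(s.sale_amt), 0) AS total_sale
    FROM   days d
    LEFT   JOIN sale s ON TRUNC(s.sale_date) = d.sale_date
    GROUP  BY d.sale_date
    ORDER  BY d.sale_date;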

  • How can I find and open/search the 32 GB of data that fills the disk for possible deletion?

Using XP: I have only about 2 GB (2,000 MB) of programs on my 37 GB hard drive and less than 3 GB of space left.

    How can I find and open/search the 32 GB of data that fills the disk for possible deletion?

    Thank you.

    * original title - full hard drive. *


    Download and run JDiskReport.

There are a number of things you can do to temporarily free up disk space.  The only real solution, however, is to get a larger hard drive.

    • You can run Disk Cleanup (start > run > cleanmgr > OK)
    • You can reduce the size allocated for the restoration of the system to about 1 GB (right click on desktop > properties > system restore > settings)
    • You can disable hibernation (if you don't use it) (right click on an empty spot on the desktop > properties > screen saver > power > Hibernate)
    • You can disable the indexing of the drive, which will also speed up your computer a bit (http://lifehacker.com/031440/turn-off-indexing-and-speed-up-windows-xp)
    • You can reduce the size of your Internet browser cache (depending on the browser that you use)
    • You can remove most of the $NTUninstallKBxxxxxx files $ following the directions here: http://windowsxp.mvps.org/Hotfix_backup.htm (read the warnings in the gray box first)

But none of these will gain you really large amounts of space for very long.  The long-term solution is a bigger hard drive.

In the meantime, download and run JDiskReport, which will show you graphically which files take up the most space and thereby suggest which files to delete or which applications to uninstall.

  • Best practices for migrating data tables - please comment.

I have 5 new tables stocked with data that must be promoted through an evolution to a production environment.
    Instead of the DBA just using a data migration tool, they insist that I record and provide scripts, in proper order, for each commit needed both to build the tables and to insert the data from scratch.

    I'm not at all used to such an environment, and it looks much riskier to me to try to reconstruct the objects from scratch when I already have a perfect model, tested and ready.

    They require a lot of documentation, where each step is recorded in a document and used for deployment.
    I think their purpose is that they do not want to rely on backups but would rather rely on a document that specifies each step needed to recreate everything.

    Please comment on your view of this practice. Thank you!

I'm not a DBA. I can't even hold a candle to the forum regulars like Srini/Justin/sb.
    But I'm going to give a slightly different opinion.

It is great to have, and I paraphrase from various posts:
    Deployment documents,
    sign off of groups
    recovery steps
    Source code control
    repositories
    "The production environment is sacred. All the risks that must be reduced to a minimum at any price. In my opinion a DBA should NEVER move anything from a development environment directly in a production environment. 'NEVER.'
    etc etc.

    But we cannot generalize that each production system must have these. Each customer is different; everyone has different levels of fault tolerance.

You wouldn't expect a design change to a cabinetmaker's product to go through rigor "as strict as NASA's". Why should it be any different for software changes?

How much rigor you apply is somewhat subjective - it depends on company policies, the company's experience with migration disasters (if any) and the corporate culture.

The OP may be coming from a customer with lax controls, and may be questioning whether the rigor at the new customer is worth it.

At one client (and I don't kid), the prod password is apps/apps (after 12 years of being live!). I was appalled at first. But it's a very small business. They got saddled with EBS during the .com boom heyday. They use just 2 modules in EBS. If I spoke of creating (and charging for) these documents/processes, I would lose the customer.
    My point is that not all places need or want these controls. By trial and error, you find what is best for the company.

OP:
    You said that you're not used to this type of environment. I recommend that you go with the flow first. Spend time understanding the value/use/history of these processes. Then ask whether they still seem like too much to you. Keep in mind: this is subjective. And if it comes down to your opinion v/s the existing one, you need either authority OR some means of persuasion (money / sox (intentional typo here) / severed horse heads... whatever works!)

    Sandeep Gandhi

Edit: typo fixed

    Published by: Sandeep Gandhi, independent Consultant on 25 June 2012 23:37

  • By comparing the two tables for the integrity of the data

Hi all,
    I need to compare two tables for data integrity using a SQL query.

    If you need to compare all the columns of t1 to t2:

    (SELECT * FROM t1
    MINUS
    SELECT * FROM t2)
    UNION ALL
    (SELECT * FROM t2
    MINUS
    SELECT * FROM t1);
    

    Kind regards
    Ankit Rouault
    http://theoraclelog.blogspot.in

• Oracle table for the effectivity "Date To" field in the Bill of Materials screen

    Hi gurus Apps,

In the Oracle EBS Bill of Materials screen, on the Date Effectivity tab, there is a column date "To" which indicates the end date until which the BOM component is active.

    I want to know where this effectivity "To" date field lies in the base table. I tried Help -> Diagnostics, but this field is not present in the view given there.

    Almost all fields at the BOM component level are present in the bom_inventory_components table, except this effectivity-to-date field.

    Please advise as to which table it is present in.

    Thank you
    RG

Hi RG,

    These two fields EFFECTIVITY_DATE & DISABLE_DATE are available in the base table: BOM_COMPONENTS_B

    SELECT EFFECTIVITY_DATE, DISABLE_DATE FROM BOM_COMPONENTS_B WHERE BILL_SEQUENCE_ID =: p_bill_seq_id

    HTH
    Sanjay

  • joining several tables to generate data

with itemlist as
     (select 1 cid, 'test' listname, 'r' flg from dual union all
      select 2 cid, 'test2' listname, 'n' flg from dual
     ),
    category1 as
     (select 1 cid, '122k' catcode, '123' sub from dual union all
      select 1 cid, '234i' catcode, '124' sub from dual union all
      select 1 cid, '238k' catcode, '124' sub from dual
     ),
    sub as
     (select 1 cid, '123' sub from dual union all
      select 1 cid, '124' sub from dual
     )

I want to write a static query that joins all the tables and produces the following output, assuming
    CID is passed in as a parameter:

    CID  LISTNAME  FLG  SUB  CATCODE
    1    test      r    123  122k
    1    test      r    124  234i
    1    test      r    124  238k

Suppose the data changes as follows, and again cid 1 is passed as the parameter:

    with itemlist as
     (select 1 cid, 'test' listname, 'r' flg from dual union all
      select 2 cid, 'test2' listname, 'n' flg from dual
     ),
    category1 as
     (select 1 cid, '122k' catcode, '123' sub from dual union all
      select 1 cid, '234i' catcode, '124' sub from dual union all
      select 1 cid, '238k' catcode, '124' sub from dual
     ),
    sub as
     (select 2 cid, '123' sub from dual union all
      select 2 cid, '124' sub from dual
     )

then the output should be

    CID  LISTNAME  FLG  SUB  CATCODE
    1    test      r    123
    1    test      r    124
    1    test      r    124

for a final example, suppose the data is now as follows, with 1 passed as the cid:

    with itemlist as
     (select 1 cid, 'test' listname, 'r' flg from dual union all
      select 2 cid, 'test2' listname, 'n' flg from dual
     ),
    category1 as
     (select 2 cid, '122k' catcode, '123' sub from dual union all
      select 2 cid, '234i' catcode, '124' sub from dual union all
      select 2 cid, '238k' catcode, '124' sub from dual
     ),
    sub as
     (select 1 cid, '123' sub from dual union all
      select 2 cid, '124' sub from dual
     )

    output must be

    CID  LISTNAME  FLG  SUB  CATCODE
    1    test      r    123

Basically, the user will pass a cid, in this case 1. I'll read the itemlist where cid = 1, and I want to join all the other tables

    to itemlist in such a way that I can generate the extra lines.  If no entry is found for cid 1 in, for example, category1 or the sub table,

    then the columns belonging to that table must be null in the result.

    Can someone help me write a query to produce the output above?

    Hello

    elmasduro wrote:

    ...

Here is an explanation of the output.  A user will have a front screen where he can enter itemlist info.
    Then in a submenu he may enter a category or a sub or both or neither; the user can select multiple categories or subs.
    When the user presses the Save button, an entry will be made in the itemlist table, say for cid = 1.
    If he chooses a category but no subs, then a row will be entered in the category1 table, say for cid = 1,


    but no entries will be made in sub because he chose no values there.  If he chooses sub values instead of category values,
    then the sub table will be filled and category1 will not. If he chooses both category and sub, then both will be filled.

    What I want to do is join itemlist with the category1 table and the sub table and get the data for these entries the user made.
    Entries not found will show as null columns in the result.
    For example, for cid = 1 in itemlist, if the user entered a category but no sub, itemlist will be joined with category1 and sub,
    but since sub has no entry for cid = 1, the sub column will be empty in the output while catcode will be filled.

    I'm so confused.

    Is this a query (that is, just a SELECT statement, which shows the data already present in the tables), or is this a DML statement that will add data to the tables?  If it's just a query, then do not say things like "entries will be made" or "tables will be filled".

Assuming you want a query, maybe you want something like this:

WITH c_and_s AS
    (
        SELECT NVL(c.cid, s.cid)             AS cid
             , NVL(c.sub, s.sub)             AS sub
             , NVL2(s.cid, c.catcode, NULL)  AS catcode
        FROM   category1 c
        FULL OUTER JOIN sub s ON  c.cid = s.cid
                              AND c.sub = s.sub
    )
    SELECT i.cid, i.listname, i.flg
         , cs.sub, cs.catcode
    FROM   itemlist i
    LEFT OUTER JOIN c_and_s cs ON cs.cid = i.cid
    WHERE  i.cid IN (1)    -- you can do cids 2 or several at a time if you want to
    ORDER BY i.cid, cs.sub, cs.catcode
    ;

This example produces the output you asked for from the sample data you posted.

• Is there a "table" for the object_type values?

I wonder if there is a system view or table for all the values that can appear in the all_objects.object_type column? Or is the only way to get this list of values to query dba_objects.object_type with the DISTINCT keyword?

    Thank you

    Look at the source of the view:

decode(o.type#, 0, 'NEXT OBJECT', 1, 'INDEX', 2, 'TABLE', 3, 'CLUSTER',
           4, 'VIEW', 5, 'SYNONYM', 6, 'SEQUENCE',
           7, 'PROCEDURE', 8, 'FUNCTION', 9, 'PACKAGE',
           11, 'PACKAGE BODY', 12, 'TRIGGER',
           13, 'TYPE', 14, 'TYPE BODY',
           19, 'TABLE PARTITION', 20, 'INDEX PARTITION', 21, 'LOB',
           22, 'LIBRARY', 23, 'DIRECTORY', 24, 'QUEUE',
           28, 'JAVA SOURCE', 29, 'JAVA CLASS', 30, 'JAVA RESOURCE',
           32, 'INDEXTYPE', 33, 'OPERATOR',
           34, 'TABLE SUBPARTITION', 35, 'INDEX SUBPARTITION',
           40, 'LOB PARTITION', 41, 'LOB SUBPARTITION',
           42, NVL((SELECT 'REWRITE EQUIVALENCE'
                    FROM   sum$ s
                    WHERE  s.obj# = o.obj#
                    AND    bitand(s.xpflags, 8388608) = 8388608),
                   'MATERIALIZED VIEW'),
           43, 'DIMENSION',
           44, 'CONTEXT', 46, 'RULE SET', 47, 'RESOURCE PLAN',
           48, 'CONSUMER GROUP',
           55, 'XML SCHEMA', 56, 'JAVA DATA',
           57, 'EDITION', 59, 'RULE',
           60, 'CAPTURE', 61, 'APPLY',
           62, 'EVALUATION CONTEXT',
           66, 'JOB', 67, 'PROGRAM', 68, 'JOB CLASS', 69, 'WINDOW',
           72, 'SCHEDULER GROUP', 74, 'SCHEDULE', 79, 'CHAIN',
           81, 'FILE GROUP', 82, 'MINING MODEL', 87, 'ASSEMBLY',
           90, 'CREDENTIAL', 92, 'CUBE DIMENSION', 93, 'CUBE',
           94, 'MEASURE FOLDER', 95, 'CUBE BUILD PROCESS',
           100, 'FILE WATCHER', 101, 'DESTINATION',
           'UNDEFINED')
