SQL*Loader runs forever
Hello. I use Oracle 10.2.0.4 on RHEL 4u4. I am trying to load a table with data using "Load Data from User Files" in Enterprise Manager (SQL*Loader).
When I run the job, it just runs forever and never ends. I am only trying to load about 6,400 rows, so I can't imagine it should take very long. It created a .BAD file, a .DISCARD file and a .LOG file, but all are empty. I also tailed the alert log, but I don't see anything about the load there either. Might I have something wrong in the control file?
Here is a sample of my data in the flat file:
1*"LAURA,LAURIE,LORA,LAURI,LARA"
2*"SARA,SARAH"
3*"JENNIFER,JENN,JEN,JENNI,JENNY"
...
And the control file generated by Enterprise Manager:
LOAD DATA
APPEND
INTO TABLE NAMES.NAME_VARIANTS
FIELDS TERMINATED BY '*' OPTIONALLY ENCLOSED BY '"'
(
COUNTER INTEGER EXTERNAL,
VARIANT_TEXT CHAR
)
I'm not sure how to troubleshoot this since I get nothing in any of the log files!
Mimi
Try running it from a terminal capture, e.g. script /tmp/capture.log, and running sqlldr under strace ... to see where it is stuck.
Similar Questions
-
SQL*Loader - lines rejected with "Discarded - all null columns"
Hello
Please see the attached log file. Also attached are the table creation script, the data file, and the bad and discard files after execution.
The sqlldr client is the Windows version:
SQL*Loader: Release 11.2.0.1.0 - Production
The CTL file has two INTO TABLE clauses due to the nature of the data. The data presented is a subset of the real-world data file. We are only interested in the lines with the word "Index" in the first column.
The problem we are facing is that, depending on which INTO TABLE clause appears first in the CTL file, only the lines matching its WHEN clause get inserted and the rest get discarded.
1. Create table statement: create table dummy_load (name varchar2(30), rate number, effdate date);
2. The data file to simulate this issue contains the 10 lines below. Save this as name.dat. The intention is to load all of these rows with one CTL file. The actual file would have additional lines, before and after these, that can be discarded.
H15T1Y Index|2|01/19/2016|
H15T2Y Index|2|01/19/2016|
H15T3Y Index|2|01/19/2016|
H15T5Y Index|2|01/19/2016|
H15T7Y Index|2|01/19/2016|
H15T10Y Index|2|01/19/2016|
CPDR9AAC Index|2|01/15/2016|
MOODCAVG Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
3. The CTL file - name.ctl
LOAD DATA
APPEND
INTO TABLE dummy_load
WHEN (09:13) = 'Index'
TRAILING NULLCOLS
(
name TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
INTO TABLE dummy_load
WHEN (08:12) = 'Index'
TRAILING NULLCOLS
(
name TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
4. SQL*Loader is invoked from a .bat file:
C:\Oracle\product\11.2.0\client\bin\sqlldr USERID=myid/[email protected] CONTROL=C:\temp\t\name.ctl BAD=C:\temp\t\name_bad.dat LOG=C:\temp\t\name_log.dat DISCARD=C:\temp\t\name_disc.dat DATA=C:\temp\t\name.dat
Once this is run, the following text appears in the log file (excerpt):
Table DUMMY_LOAD, loaded when 09:13 = 0X496e646578 (character 'Index')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NAME FIRST * | CHARACTER
RATE NEXT * | CHARACTER
EFFDATE NEXT * | CHARACTER
    SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"
Table DUMMY_LOAD, loaded when 08:12 = 0X496e646578 (character 'Index')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NAME NEXT * | CHARACTER
RATE NEXT * | CHARACTER
EFFDATE NEXT * | CHARACTER
    SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"
Record 1: Discarded - all null columns.
Record 2: Discarded - all null columns.
Record 3: Discarded - all null columns.
Record 4: Discarded - all null columns.
Record 5: Discarded - all null columns.
Record 7: Discarded - failed all WHEN clauses.
Record 8: Discarded - failed all WHEN clauses.
Record 9: Discarded - failed all WHEN clauses.
Record 10: Discarded - failed all WHEN clauses.
Table DUMMY_LOAD:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
9 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Table DUMMY_LOAD:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
5 Rows not loaded because all WHEN clauses were failed.
5 Rows not loaded because all fields were null.
The bad file is empty. The discard file contains the following:
H15T1Y Index|2|01/19/2016|
H15T2Y Index|2|01/19/2016|
H15T3Y Index|2|01/19/2016|
H15T5Y Index|2|01/19/2016|
H15T7Y Index|2|01/19/2016|
CPDR9AAC Index|2|01/15/2016|
MOODCAVG Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
Based on my understanding of the instructions in the CTL file, ideally the first 6 rows would have been inserted into the table. Instead the table gets only the 6th row:
NAME            RATE  EFFDATE
H15T10Y Index   2     January 19, 2016
If the INTO TABLE clauses were swapped in the CTL file, then the first 5 rows are inserted and the rest end up in the discard file, while the 6th row gets a "Discarded - all columns null" in the log file.
Could someone please take a look and advise? My apologies that the files cannot be attached.
Unless you tell it otherwise, SQL*Loader assumes that each field after the first, and each INTO TABLE clause after the first, starts at the position where the previous one left off. If you want to start at the beginning of the line every time, then you need to reset the position using POSITION(1) with the first column, as shown below. Using POSITION(1) in the first INTO TABLE clause is optional.
LOAD DATA
APPEND
INTO TABLE dummy_load
WHEN (09:13) = 'Index'
TRAILING NULLCOLS
(
name POSITION(1) TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
INTO TABLE dummy_load
WHEN (08:12) = 'Index'
TRAILING NULLCOLS
(
name POSITION(1) TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
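The position carry-over the answer describes can be mimicked outside SQL*Loader. A Python sketch (not SQL*Loader itself; the record is one line of the sample data):

```python
# Sketch: why a second INTO TABLE clause sees nothing without POSITION(1).
# SQL*Loader keeps a cursor into the record; each terminated field starts
# where the previous one stopped, even across INTO TABLE clauses.
record = "H15T1Y Index|2|01/19/2016|"

def take_field(rec, pos, sep='|'):
    end = rec.find(sep, pos)
    end = len(rec) if end == -1 else end
    return rec[pos:end], end + 1

pos = 0
first_clause = []
for _ in range(3):                  # name, rate, effdate
    field, pos = take_field(record, pos)
    first_clause.append(field)

# Second INTO TABLE clause WITHOUT a position reset: the cursor is already
# at the end of the record, so every field is empty -> "all null columns".
field, _ = take_field(record, pos)
print(first_clause)   # ['H15T1Y Index', '2', '01/19/2016']
print(repr(field))    # ''

# With POSITION(1) the cursor is reset to the start of the record:
field, _ = take_field(record, 0)
print(field)          # 'H15T1Y Index'
```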
-
Question on loading data into staging tables using SQL*Loader, and then into the main tables!
Hello
I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in pipe-separated csv files.
I have developed a shell script to load the data and it works fine except for one thing.
Here are the details of the data to re-create the problem.
Structure of the staging tables into which the data will be loaded using SQL*Loader:
create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));
create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));
create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));
DATA in the csv file-
for stg_cmts_data-
cmts_map_03092015_1.csv
WNLB-CMTS-01-1|10.15.0.1
WNLB-CMTS-02-2|10.15.16.1
WNLB-CMTS-03-3|10.15.48.1
WNLB-CMTS-04-4|10.15.80.1
WNLB-CMTS-05-5|10.15.96.1
for stg_dhcp_data-
dhcp_map_03092015_1.csv
DHCP-1-1-1|10.25.23.10,25.26.14.01
DHCP-1-1-2|56.25.111.25,100.25.2.01
DHCP-1-1-3|25.255.3.01,89.20.147.258
DHCP-1-1-4|10.25.26.36,200.32.58.69
DHCP-1-1-5|80.25.47.369,60.258.14.10
for stg_link_data
cmts_dhcp_link_map_0309151623_1.csv
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3|WNLB-CMTS-01-1
DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13|WNLB-CMTS-02-2
Now, after loading this data into the staging tables, I have to populate the main database tables:
create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));
create table link (link_nm varchar2 (50));
The SQL scripts that I created to load the data are as follows.
spool load_cmts.log
set serveroutput on
DECLARE
CURSOR c_stg_cmts IS SELECT *
FROM stg_cmts_data;
TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
l_stg_cmts t_stg_cmts;
l_cmts_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_cmts;
FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
LOOP
SELECT COUNT (1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
IF l_cmts_cnt < 1 THEN
INSERT
INTO subntwk
(
subntwk_nm
)
VALUES
(
l_stg_cmts(i).cmts_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_cmts.COUNT = 0;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
For dhcp:
spool load_dhcp.log
set serveroutput on
DECLARE
CURSOR c_stg_dhcp IS SELECT *
FROM stg_dhcp_data;
TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
l_stg_dhcp t_stg_dhcp;
l_dhcp_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_dhcp;
FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
LOOP
SELECT COUNT (1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
IF l_dhcp_cnt < 1 THEN
INSERT
INTO subntwk
(
subntwk_nm
)
VALUES
(
l_stg_dhcp(i).dhcp_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_dhcp.COUNT = 0;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
For link:
spool load_link.log
set serveroutput on
DECLARE
l_cmts_1 VARCHAR2(4000 CHAR);
l_cmts_add VARCHAR2(200 CHAR);
l_dhcp_cnt NUMBER;
l_cmts_cnt NUMBER;
l_link_cnt NUMBER;
l_add_link_nm VARCHAR2(200 CHAR);
BEGIN
FOR r IN (
SELECT dhcp_token, cmts_to_add || ',' cmts_add
FROM stg_link_data
)
LOOP
l_cmts_1 := r.cmts_add;
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
SELECT COUNT (1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = r.dhcp_token;
IF l_dhcp_cnt = 0 THEN
DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
ELSE
WHILE l_cmts_add IS NOT NULL
LOOP
l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
SELECT COUNT (1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = TRIM(l_cmts_add);
SELECT COUNT (1)
INTO l_link_cnt
FROM link
WHERE link_nm = l_add_link_nm;
IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
INSERT INTO link (link_nm)
VALUES (l_add_link_nm);
DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
ELSIF l_link_cnt > 0 THEN
DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
ELSIF l_cmts_cnt = 0 THEN
DBMS_OUTPUT.put_line('NO CMTS FOUND for device to create the link: ' || l_cmts_add);
END IF;
l_cmts_1 := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
END LOOP;
END IF;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
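The SUBSTR/INSTR walk in the loop above (over cmts_to_add with a ',' appended by the cursor query) can be sketched in Python, using a sample value from the data file:

```python
# Mirror of the PL/SQL loop: cmts_to_add || ',' is consumed one token at a
# time via SUBSTR(.., 1, INSTR(.., ',') - 1) until the remainder is empty.
def walk_tokens(cmts_to_add):
    rest = cmts_to_add + ','                 # same as cmts_to_add || ','
    tokens = []
    tok = rest[:rest.index(',')].strip()     # first token
    while tok:
        tokens.append(tok)
        rest = rest[rest.index(',') + 1:].strip()   # drop consumed token
        tok = rest[:rest.index(',')].strip() if ',' in rest else ''
    return tokens

print(walk_tokens('WNLB-CMTS-01-1,WNLB-CMTS-02-2'))
# -> ['WNLB-CMTS-01-1', 'WNLB-CMTS-02-2']
```

On clean input every token is produced, so the walk itself is not the bug; that points at the token values, as the accepted answer below confirms.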
output
The control files:
LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE STG_CMTS_DATA
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
 AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cmts_token "RTRIM(LTRIM(:cmts_token))",
 cmts_ip "RTRIM(LTRIM(:cmts_ip))")
For dhcp:
LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE STG_DHCP_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
 dhcp_ip "RTRIM(LTRIM(:dhcp_ip))")
For link:
LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE STG_LINK_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
 cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")
The shell script:
if [ ! -d ./log ]
then
mkdir log
fi
if [ ! -d ./done ]
then
mkdir done
fi
if [ ! -d ./bad ]
then
mkdir bad
fi
nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_link.sql
mv *.log ./log
The problem I encounter is in loading data into the link table: I check whether the DHCP is present in the subntwk table, and if not I log an error and continue; if the CMTS is not present, I log an error instead of creating the link.
Note that multiple CMTS can be associated with a single DHCP.
The issue: when creating links from the comma-separated CMTS list in stg_link_data, the last iteration of the loop - the last CMTS of each list - logs "CMTS NOT FOUND".
For example:
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
Here I expect to link dhcp-1-1-1 with wnlb-cmts-01-1 and wnlb-cmts-02-2.
All of this data is present in the subntwk table, but the log still says wnlb-CMTS-02-2 NOT FOUND, even though it has already been loaded into subntwk.
The same thing happens with the last CMTS of every row in stg_link_data (I think you see what I'm trying to explain).
But when I run the SQL scripts in SQL Developer separately, all valid links are inserted into the link table.
It should create 9 rows in the link table, whereas now it creates only 5.
I use COMMIT in my script as well, but it does not help.
Run these scripts on your machine and let me know if you get the same behavior.
And please give me a solution - I have tried many things since yesterday, but it's always the same.
Here is the link-table log:
link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
NO CMTS FOUND for device to create the link: wnlb-CMTS-02-2
link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-CMTS-05-5
NO CMTS FOUND for device to create the link: wnlb-CMTS-01-1
NO CMTS FOUND for device to create the link: wnlb-CMTS-05-8
NO CMTS FOUND for device to create the link: wnlb-CMTS-05-6
NO CMTS FOUND for device to create the link: wnlb-CMTS-05-0
NO CMTS FOUND for device to create the link: wnlb-CMTS-03-3
link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-CMTS-05-7
device not found: wnlb-dhcp-1-13
IF YOU NEED MORE INFORMATION PLEASE LET ME KNOW
Thank you
I realized later that night that the staging load on the UNIX machine was keeping a carriage return at the end of each line, which is why the last CMTS was never found. I converted the files with dos2unix and it started to work perfectly.
It was a DOS-to-UNIX (dos2unix) line-ending problem!
Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.
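The dos2unix diagnosis is easy to reproduce: a DOS-format file leaves '\r' on the last comma-separated token of every line, so the lookup against subntwk silently fails. A Python sketch using a sample row from above:

```python
# A DOS-format line keeps '\r' before '\n'; stripping only '\n' leaves the
# carriage return glued to the last comma-separated token.
line = 'DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2\r\n'

dhcp, cmts_list = line.rstrip('\n').split('|')
tokens = cmts_list.split(',')
print(tokens[-1] == 'WNLB-CMTS-02-2')   # False: token is 'WNLB-CMTS-02-2\r'

# dos2unix equivalent: strip the carriage return as well.
tokens = cmts_list.rstrip('\r').split(',')
print(tokens[-1] == 'WNLB-CMTS-02-2')   # True
```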
-
SQL*Loader does not import data
Hi all -
I have a very basic package which should load data from a tab-delimited file. My problem is that when I run my package, no data is loaded, although the correct number of records is created (based on a trigger on the table). All of the records contain no data.
OPTIONS (skip=1, errors=10, rows=10000, direct=true)
LOAD DATA
INFILE "C:\ECOMMERCE\VFT\Marin\inbound\DSGSiteCatalystPassbackKeywords.csv" "str '\n'"
BADFILE "C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_KEYWORDS.bad"
DISCARDFILE "C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_KEYWORDS.dsc"
INTO TABLE "ETL_STAGE"."MARIN_KEYWORD"
APPEND
EVALUATE CHECK_CONSTRAINTS
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(
MARIN_KEYWORD_ID,
KEYWORD,
BUSINESS_DATE,
EDITOR,
ACCOUNT,
CAMPAIGN,
AD_GROUP,
TYPE CHAR(100000),
DESTINATION_URL,
UNIQUE_ID,
PUB_ID,
PRINT,
CLICKS,
PUB_COST,
ATTRIBUTED_CONVERSIONS_CONV,
CLICK_PATH_CONV,
LAST_CLICK_CONV,
EMAIL_SIGNUPS_CONV,
SCORECARD_SIGNUPS_CONV,
STORE_LOCATOR_PAGE_CONV
)
My table creation script is:
CREATE TABLE ETL_STAGE.MARIN_KEYWORD
(
MARIN_KEYWORD_ID VARCHAR2 (1000 BYTE),
KEYWORD VARCHAR2 (1000 BYTE),
BUSINESS_DATE VARCHAR2 (200 BYTE),
EDITOR VARCHAR2 (1000 BYTE),
ACCOUNT VARCHAR2 (1000 BYTE),
CAMPAIGN VARCHAR2 (1000 BYTE),
AD_GROUP VARCHAR2 (1000 BYTE),
TYPE VARCHAR2 (1000 BYTE),
DESTINATION_URL VARCHAR2 (1000 BYTE),
UNIQUE_ID VARCHAR2 (1000 BYTE),
PUB_ID VARCHAR2 (1000 BYTE),
PRINT VARCHAR2 (1000 BYTE),
CLICKS VARCHAR2 (1000 BYTE),
PUB_COST VARCHAR2 (1000 BYTE),
ATTRIBUTED_CONVERSIONS_CONV VARCHAR2 (1000 BYTE),
CLICK_PATH_CONV VARCHAR2 (1000 BYTE),
LAST_CLICK_CONV VARCHAR2 (1000 BYTE),
EMAIL_SIGNUPS_CONV VARCHAR2 (1000 BYTE),
SCORECARD_SIGNUPS_CONV VARCHAR2 (1000 BYTE),
STORE_LOCATOR_PAGE_CONV VARCHAR2 (1000 BYTE),
IMEX_LOG_REFERENCE_ID VARCHAR2 (1000 BYTE),
DATE_ADDED DATE DEFAULT SYSDATE NOT NULL,
ADDED_BY VARCHAR2 (50 BYTE) DEFAULT USER NOT NULL,
DATE_LAST_MODIFIED DATE DEFAULT SYSDATE NOT NULL,
MODIFIED_BY VARCHAR2 (50 BYTE) DEFAULT USER NOT NULL,
RECORD_STATUS VARCHAR2 (1 BYTE) DEFAULT 'A' NOT NULL
)
TABLESPACE ECOM_DATA
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
NEXT 1M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
The log displays the following text:
SQL*Loader: Release 11.2.0.2.0 - Production on Thu Jun 25 14:31:35 2015
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Control file: C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\SQLLoaderScripts\MARIN_DSG_CREATIVES.ctl
Data file: C:\ECOMMERCE\VFT\Marin\inbound\DSGSiteCatalystPassbackCreatives.csv
Bad File:     C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_CREATIVES.bad
Discard File: C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_CREATIVES.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 10
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table "ETL_STAGE"."MARIN_CREATIVE", loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
CREATIVE_ID                    FIRST    *  WHT O(") CHARACTER
  (Maximum field length is 100000)
HEADLINE                       NEXT     *  WHT O(") CHARACTER
BUSINESS_DATE                  NEXT     *  WHT O(") CHARACTER
    SQL string for column : "TRIM(:BUSINESS_DATE)"
DESCRIPTION_LINE_1             NEXT     *  WHT O(") CHARACTER
DESCRIPTION_LINE_2             NEXT     *  WHT O(") CHARACTER
DISPLAY_URL                    NEXT     *  WHT O(") CHARACTER
DESTINATION_URL                NEXT     *  WHT O(") CHARACTER
  (Maximum field length is 100000)
EDITOR                         NEXT     *  WHT O(") CHARACTER
CAMPAIGN                       NEXT     *  WHT O(") CHARACTER
AD_GROUP                       NEXT     *  WHT O(") CHARACTER
UNIQUE_ID                      NEXT     *  WHT O(") CHARACTER
PUB_ID                         NEXT     *  WHT O(") CHARACTER
PRINT                          NEXT     *  WHT O(") CHARACTER
    SQL string for column : "replace(:PRINT, ',', '')"
CLICKS                         NEXT     *  WHT O(") CHARACTER
    SQL string for column : "replace(:CLICKS, ',', '')"
ATTRIBUTED_CONVERSIONS_CONV    NEXT     *  WHT O(") CHARACTER
CLICK_PATH_CONV                NEXT     *  WHT O(") CHARACTER
LAST_CLICK_CONV                NEXT     *  WHT O(") CHARACTER
EMAIL_SIGNUPS_CONV             NEXT     *  WHT O(") CHARACTER
SCORECARD_SIGNUPS_CONV         NEXT     *  WHT O(") CHARACTER
STORE_LOCATOR_PAGE_HITS_CONV   NEXT     *  WHT O(") CHARACTER
IMEX_LOG_REFERENCE_ID          NEXT     *  WHT O(") CHARACTER
Why is WHT (whitespace) showing as the terminator when I specified tabs?
Also, why is none of the data from the file being loaded?
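The terminator matters because splitting on generic whitespace collapses adjacent tabs: empty fields disappear and every later field shifts, while X'09' keeps them in place. A quick Python illustration (the sample row is hypothetical):

```python
# A tab-separated row with an intentionally empty second field.
row = 'kw1\t\t2015-06-25\tpub'

# Splitting on generic whitespace (WHT) merges the two tabs:
print(row.split())      # ['kw1', '2015-06-25', 'pub'] - the empty field is lost
# Splitting on the tab character itself (X'09') preserves it:
print(row.split('\t'))  # ['kw1', '', '2015-06-25', 'pub']
```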
NLS_CHARACTERSET is WE8ISO8859P1
-
Dynamic INFILE name for SQL*Loader
Hi all
I am using Oracle 10g on Windows Server.
I have a log file on a remote server. Each line in the log file looks like this:
LogIN Mar 02/09/2014 10:10:48 ss18 N419FS40 1
I created a table for it as below:
Create table free_pc (log_in varchar2(10), log_day varchar2(10), log_date date, log_time date, log_user varchar2(30), log_Lab varchar2(30), log_pc varchar2(30), log_status char(1));
I have two problems:
1. In N419FS40, N419 is the name of the lab, while FS40 is the name of the computer in that lab. I want only the first four characters to be inserted into the Log_lab column and the rest of the characters into the Log_pc column. My control file is below.
LOAD DATA
INFILE '\\remote_location\login02_09_14.txt'
APPEND
INTO TABLE free_pc
FIELDS TERMINATED BY WHITESPACE
(log_in,
log_day,
log_date DATE "DD/MM/YYYY",
log_time DATE "HH24:MI:SS",
log_user,
log_Lab,
log_PC,
log_status
)
How to do this?
2. The log file is generated on the server every day with a different name: it concatenates the current date onto the word login, e.g. login02_09_14.txt, login03_09_14.txt, login04_09_14.txt, etc. In my INFILE clause, how can I build the name so that it automatically picks up the current day's file from the remote location?
Thank you.
Kind regards.
1. You can use the SUBSTR function to separate the two values. Put the fields in the control file in the same order as in the data file, with all the columns that are computed from data in other fields at the end.
2. There are different ways to do this. The file name can be in the control file or on the SQL*Loader command line. You can create either one using SQL/SQL*Plus or operating-system commands. I tend to prefer changing just the SQL*Loader command line instead of the entire control file, and I prefer to do it in a SQL file instead of a Windows batch file or a *ix shell script, so that it is independent of the operating system.
In addition, it is best to store your day, date and time all in one DATE column; then you can use to_char to display it however you want, as one column, or two, or three.
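Point 1 is a fixed-offset split, SUBSTR(:log_lab, 1, 4) and SUBSTR(:log_lab, 5); the same split sketched in Python:

```python
# Fixed-position split of the combined lab/PC field, matching
# SUBSTR(:log_lab, 1, 4) and SUBSTR(:log_lab, 5).
value = 'N419FS40'
log_lab, log_pc = value[:4], value[4:]
print(log_lab, log_pc)   # N419 FS40
```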
Please see the demonstration below.
Scott@ORCL > -- data file you provided, with the name changed to include today's date:
Scott@ORCL > HOST TYPE c:\remote_location\login27_04_15.txt
LogIN Mar 02/09/2014 10:10:48 ss18 N419FS40 1
Scott@ORCL > -- control file with no INFILE, fields in the order of the data file, using FILLER for unused columns,
Scott@ORCL > -- with computed columns at the end, using BOUNDFILLER for fields needed by the computed columns:
Scott@ORCL > HOST TYPE test.ctl
LOAD DATA
APPEND
INTO TABLE free_pc
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(log_in,
log_day FILLER,
log_date BOUNDFILLER,
log_time BOUNDFILLER,
log_user,
log_Lab "SUBSTR(:log_lab, 1, 4)",
log_status,
log_date_time "TO_DATE(:log_date || :log_time, 'DD/MM/YYYYHH24:MI:SS')",
log_PC "SUBSTR(:log_lab, 5)")
Scott@ORCL > -- SQL script that concatenates the directory path, the file-name prefix, today's date and the extension,
Scott@ORCL > -- then loads the data by running SQL*Loader from SQL*Plus, using the HOST command:
Scott@ORCL > HOST TYPE test.sql
COLUMN data_file NEW_VALUE data_file
SELECT 'c:\remote_location\login' || TO_CHAR (SYSDATE, 'DD_MM_YY') || '.txt' AS data_file FROM DUAL
/
HOST SQLLDR scott/tiger CONTROL=test.ctl DATA='&data_file' LOG=test.log
CLEAR COLUMN
Scott@ORCL > -- table with the date and time in one column, from which the day of the week can be extracted:
Scott@ORCL > create table free_pc
  2  (log_in varchar2 (10)
  3  , log_date_time date
  4  , log_user varchar2 (30)
  5  , log_Lab varchar2 (30)
  6  , log_pc varchar2 (30)
  7  , log_status char (1))
  8  /
Table created.
Scott@ORCL > -- load the data by running the SQL script, which runs SQL*Loader using the control file:
Scott@ORCL > START test.sql
Scott@ORCL > COLUMN data_file NEW_VALUE data_file
Scott@ORCL > SELECT 'c:\remote_location\login' || TO_CHAR (SYSDATE, 'DD_MM_YY') || '.txt' AS data_file FROM DUAL
  2  /
DATA_FILE
------------------------------------
c:\remote_location\login27_04_15.txt
1 row selected.
Scott@ORCL > HOST SQLLDR scott/tiger CONTROL=test.ctl DATA='&data_file' LOG=test.log
SQL*Loader: Release 11.2.0.1.0 - Production on Mon Apr 27 12:00:53 2015
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
Scott@ORCL > CLEAR COLUMN
Scott@ORCL > -- results, with the date-time field displayed as day, date and time:
Scott@ORCL > COLUMN log_in FORMAT A6
Scott@ORCL > COLUMN log_day FORMAT A7
Scott@ORCL > COLUMN log_user FORMAT A8
Scott@ORCL > COLUMN log_lab FORMAT A8
Scott@ORCL > COLUMN log_pc FORMAT A6
Scott@ORCL > COLUMN log_status FORMAT A10
Scott@ORCL > SELECT log_in,
  2  TO_CHAR (log_date_time, 'Dy DD-Mon-YYYY HH24:MI:SS') "DAY DATE TIME",
  3  log_user, log_lab, log_pc, log_status
  4  FROM free_pc
  5  /
LOG_IN DAY DATE TIME             LOG_USER LOG_LAB  LOG_PC LOG_STATUS
------ ------------------------- -------- -------- ------ ----------
LogIN  Mar 02-Sep-2014 10:10:48  ss18     N419     FS40   1
1 row selected.
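For comparison, the dated-filename concatenation from the demo is the same idea as strftime in a general-purpose language. A Python sketch using the demo's path and date format:

```python
from datetime import date

# Equivalent of: 'c:\remote_location\login' || TO_CHAR(SYSDATE, 'DD_MM_YY') || '.txt'
def data_file_name(d):
    return r'c:\remote_location\login' + d.strftime('%d_%m_%y') + '.txt'

print(data_file_name(date(2015, 4, 27)))   # c:\remote_location\login27_04_15.txt
```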
-
Hi, I'm using Oracle version 11.2.0.3.0, and I'm trying to load a file into my DB table using the SQL*Loader utility. Below is my table structure; two of the columns will be constants, as noted in the table structure. The File_DATE column must be built from the combination of two fields (Date and Time) of the flat file so that the column has the proper date format. I get an error and am not able to load the data using the control file below, so I need help.
Sample data file:
OrderDate,Name,Size,Records,Date,Time
06202014,authlogfile06202014.txt,40777214,198915,Jun 21,03:51
06202014,transferfile06202014.txt,372144,2255,Jun 21,01:34
06202014,balancefile06202014.txt,651075,10343,Jun 21,03:28
The table structure:
Create table file_stats
(Systemtypecode VARCHAR2(4000), -- this will be a hardcoded value 'CBD'
 odate DATE,
 FILENAME VARCHAR2(4000),
 filesize NUMBER(20,0),
 Noofrecords NUMBER(20,0),
 File_DATE VARCHAR2(4000),
 CREATED_DATE DATE -- this will be filled with the value SYSDATE
);
Here's my control file:
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'files.csv'
APPEND
INTO TABLE file_stats
FIELDS TERMINATED BY ','
(Systemtypecode CONSTANT 'CBD',
 odate DATE 'MMDDYYYY',
 filename CHAR,
 filesize INTEGER,
 noofrecords INTEGER,
 file_DATE_DDmon BOUNDFILLER CHAR
, file_DATE "to_date('2014' || :file_DATE_DDmon || :file_DATE, 'YYYYMon DDHH24:MI')"
, created_date CONSTANT 'sysdate'
)
When running the command below, all the records error out as shown:
sqlldr schema1/pwd@db1 control=file_stats.ctl log=file_stats.log bad=file_stats.bad
ERROR:
Record 1: Rejected - Error on table FILE_STATS, column FILE_DATE.
ORA-01843: not a valid month
You must add TRAILING NULLCOLS, use INTEGER EXTERNAL instead of INTEGER, and your file_date column must be of DATE data type. Please see the demo below with some additional corrections. Note that since there is no year in the data file for the file_date column, it defaults to the current year. If you want 2014, then you need to concatenate that and add YYYY to the date format.
Scott@orcl12c > HOST TYPE files.csv
OrderDate,Name,Size,Records,Date,Time
06202014,authlogfile06202014.txt,40777214,198915,Jun 21,03:51
06202014,transferfile06202014.txt,372144,2255,Jun 21,01:34
06202014,balancefile06202014.txt,651075,10343,Jun 21,03:28
Scott@orcl12c > HOST TYPE file_stats.ctl
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'files.csv'
APPEND INTO TABLE file_stats
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(systemtypecode CONSTANT 'CBD'
, odate DATE 'MMDDYYYY'
, filename CHAR
, filesize INTEGER EXTERNAL
, noofrecords INTEGER EXTERNAL
, file_date_ddmon BOUNDFILLER CHAR
, file_date DATE 'Mon DDHH24:MI' ":file_date_ddmon || :file_date"
, created_date "SYSDATE"
)
Scott@orcl12c > CREATE TABLE file_stats
  2  (systemtypecode VARCHAR2 (14),
  3   odate DATE,
  4   filename VARCHAR2 (24),
  5   filesize NUMBER (20,0),
  6   noofrecords NUMBER (20,0),
  7   file_date DATE,
  8   created_date DATE)
  9  /
Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL=file_stats.ctl LOG=file_stats.log BAD=file_stats.bad
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Feb 5 15:26:49 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 3
Table FILE_STATS:
3 Rows successfully loaded.
Check the log file:
  file_stats.log
for more information about the load.
Scott@orcl12c > SELECT * FROM file_stats
  2  /
SYSTEMTYPECODE ODATE                 FILENAME                  FILESIZE NOOFRECORDS FILE_DATE             CREATED_DATE
-------------- --------------------- ------------------------ --------- ----------- --------------------- --------------------------
CBD            Friday, June 20, 2014 authlogfile06202014.txt   40777214      198915 Sunday, June 21, 2015 Thursday, February 5, 2015
CBD            Friday, June 20, 2014 transferfile06202014.txt    372144        2255 Sunday, June 21, 2015 Thursday, February 5, 2015
CBD            Friday, June 20, 2014 balancefile06202014.txt     651075       10343 Sunday, June 21, 2015 Thursday, February 5, 2015
3 rows selected.
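The current-year pitfall the answer mentions shows up with any date parser: the mask has no year, so one gets supplied implicitly. A Python sketch over the sample 'Jun 21' / '03:51' fields (note Python's strptime defaults the missing year to 1900, whereas Oracle uses the current year):

```python
from datetime import datetime

# Equivalent of DATE 'Mon DDHH24:MI' applied to :file_date_ddmon || :file_date.
parsed = datetime.strptime('Jun 21' + '03:51', '%b %d%H:%M')
print(parsed.month, parsed.day, parsed.hour, parsed.minute)  # 6 21 3 51
print(parsed.year)         # 1900 - the missing year is defaulted

# To force 2014, concatenate the year and extend the mask ('YYYY' in Oracle):
parsed_2014 = datetime.strptime('2014' + 'Jun 21' + '03:51', '%Y%b %d%H:%M')
print(parsed_2014.year)    # 2014
```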
-
How can I use the sqlldr command in SQL Developer version 4.0.2.15?
Right now I am able to use this command via the command prompt, but my requirement is to use it from SQL Developer.
SQL*Loader is a standalone external tool, not a SQL or PL/SQL command.
Asking how to run it through SQL Developer is like asking how to run MS Word through SQL Developer, i.e. it is not really meaningful.
SQL Developer, to the best of my knowledge, incorporates some built-in options for loading data from, or extracting data to, CSV files, so maybe you just want to use those?
-
SQL Loader issue - CSV with commas and quotes IN the data
Hello, I have a dataset for a simple 2-column table like this:
Column 1 data,"This is data for "Column 2", with commas and quotes."
The data is comma-delimited and may be enclosed in double quotes. In ADDITION, it may include commas and quotation marks inside the data fields. I CANNOT manipulate the data before sending it to SQL*Loader.
I set up my control file like this:
LOAD DATA
INFILE './TEST.dat'
BADFILE './TEST.BAD'
DISCARDFILE './TEST.DSC'
REPLACE INTO TABLE TEST
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
Col1 char(50),
Col2 char(500)
)
Now when I run this via SQLLDR, I get the following error in the log file:
Record 1: Rejected - Error on table TEST, column COL2.
no terminator found after TERMINATED and ENCLOSED field
What are my options to get this data loaded as presented above? I'm working on Oracle 11g (11.2.0.3.0) 64-bit on AIX 6.1.0.0.
Thank you!
In this case, there is no way it can tell what is a delimiter or an enclosure and what is part of the data. As far as I know, there is no way you can load it into the appropriate columns.
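Python's csv module hits the same ambiguity, and also shows what escaping would make the row parseable (doubling inner quotes, per RFC 4180) if the producer of the file could be changed. The column values here are hypothetical:

```python
import csv
import io

# Unescaped inner quotes: a parser cannot tell enclosure from data, which
# is what SQL*Loader's "no terminator found after TERMINATED and ENCLOSED
# field" error is complaining about.
expected = ['Col1 value', 'data with "quotes", and, commas']
bad = 'Col1 value,"data with "quotes", and, commas"\n'
rows = list(csv.reader(io.StringIO(bad)))
print(rows[0] == expected)      # False: the row is mis-parsed

# Doubling the inner quotes ("" per RFC 4180) makes the row unambiguous:
good = 'Col1 value,"data with ""quotes"", and, commas"\n'
rows = list(csv.reader(io.StringIO(good)))
print(rows[0] == expected)      # True
```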
-
Trouble with the SQL Loader adapter
Hi all
I'm trying to use the SQL Loader adapter to collect performance statistics from my SQL Server instance. I followed the instructions in Appendix A of this document: https://www.vmware.com/files/pdf/solutions/Monitoring-Business-Critical-Applications-VMware-vCenter-Operations-Manager-white-paper.pdf
Our environment is vCOps 5.8 with a vCloud Suite Enterprise license. The SQL Server is SQL 2008 R2 and is reasonably up to date on patches, etc. (CU8, I think).
I've gotten to the point where the database is logged in and my query is running, but it seems the values returned to vCOps are the default values (0). I don't know why this is happening, because the logs seem to show the actual values from the database:
2014-02-10 17:29:50,438 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - query #SELECT 'SQL LOADER' ADAPTERKIND,
'SQL Server PERFMON STATS' RESOURCEKIND,
GETUTCDATE() TIMESTAMP,
'Agent' RESOURCEKIND,
'SVC-mgmt-sql SQL Server' RESOURCENAME,
(LTRIM(RTRIM(REPLACE([instance_name], ':', '-'))) + '|' + counter_name) METRICNAME1,
SUM(cntr_value) VALUE1
FROM sys.dm_os_performance_counters
WHERE
(GETUTCDATE() >= (CONVERT(datetime, '02/10/2014 17:9:0', 101))) AND
('01/01/2000' < (CONVERT(datetime, '02/10/2014 17:29:50', 101))) AND
counter_name IN (
'User Connections',
'SQL Compilations/sec',
'SQL Re-Compilations/sec',
'Target Server Memory (KB)',
'Total Server Memory (KB)',
'Lazy writes/sec',
'Checkpoint pages/sec',
'Page life expectancy',
'Memory Grants Pending',
'Page IO latch waits',
'Wait for the worker',
'Log write waits',
'Network IO waits')
GROUP BY (LTRIM(RTRIM(REPLACE([instance_name], ':', '-'))) + '|' + counter_name)
ORDER BY TIMESTAMP
2014-02-10 17:29:50,467 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - trying to connect...
2014-02-10 17:29:50,478 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - connected...
2014-02-10 17:29:50,490 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - QueryExecutionTime = 12
2014-02-10 17:29:50,509 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - strMetricCount = 0
2014-02-10 17:29:50, 533 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 538 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 571 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 12921
2014-02-10 17:29:50, 612 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 624 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 624 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 10
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 72869
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 626 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 17
2014-02-10 17:29:50, 627 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 627 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 627 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 72869
2014-02-10 17:29:50, 627 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 628 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 628 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 10644
2014-02-10 17:29:50, 628 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 628 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 628 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 77
2014-02-10 17:29:50, 629 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 629 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 629 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 0
2014-02-10 17:29:50, 629 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 629 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 630 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 425692
2014-02-10 17:29:50, 630 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 630 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 630 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 11747062
2014-02-10 17:29:50, 630 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 631 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 631 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 29360128
2014-02-10 17:29:50, 631 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 631 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 631 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 29360128
2014-02-10 17:29:50, 632 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getResourceKeyFromResultSet (20583) - work on resource: SVC - mgmt - sql SQL Server
2014-02-10 17:29:50, 632 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.isResourceRenameAllowed - rename audit allowed for resource key {resourceName = SVC - mgmt - sql SQL Server & adapterKindKey = SQL LOADER & resourceKindKey = SQL Server PERFORMANCE Monitor STATS}
2014-02-10 17:29:50, 632 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - resource of treatment: SVC-mgmt-sql SQL Server extract the metric values are 0
2014-02-10 17:29:50,632 DEBUG [Collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB (20583) - number of records processed 13 - used memory (MB): 1757, free mem (MB): 1104
2014-02-10 17:29:50,632 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.getDataFromDB - for query # = SELECT 'SQL LOADER' ADAPTERKIND,
'SQL Server PERFMON STATS' RESOURCEKIND,
GETUTCDATE() TIMESTAMP,
'Agent' RESOURCEKIND,
'SVC-mgmt-sql SQL Server' RESOURCENAME,
(LTRIM(RTRIM(REPLACE([instance_name], ':', '-'))) + '|' + counter_name) METRICNAME1,
SUM(cntr_value) VALUE1
FROM sys.dm_os_performance_counters
WHERE
(GETUTCDATE() >= %f) AND
('01/01/2000' < %t) AND
counter_name IN (
'User Connections',
'SQL Compilations/sec',
'SQL Re-Compilations/sec',
'Target Server Memory (KB)',
'Total Server Memory (KB)',
'Lazy writes/sec',
'Checkpoint pages/sec',
'Page life expectancy',
'Memory Grants Pending',
'Page IO latch waits',
'Wait for the worker',
'Log write waits',
'Network IO waits')
GROUP BY (LTRIM(RTRIM(REPLACE([instance_name], ':', '-'))) + '|' + counter_name)
ORDER BY TIMESTAMP
RecordCount = 13 FilteredRecordCount = 0
2014-02-10 17:29:50, 632 DEBUG [Collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.addDefaultData - Default Metrics size after a data loop = 13
2014-02-10 17:29:50, 633 INFO [collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.addDefaultData - number of default values for this piece of data = 13
2014-02-10 17:29:50, 633 INFO [collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.DataReader.read - 26 returned query parameters
2014-02-10 17:29:50,634 INFO [collector worker thread 8] com.integrien.adapter3.generalsqldataloader.DataReader.updateResourceTimeMapAndRemoveDuplicates (20583) - 26 data items, 0 resources that were sent last time.
2014-02-10 17:29:50, 634 INFO [collector worker thread 8] (20583) com.integrien.adapter3.generalsqldataloader.GeneralSQLDataLoaderAdapter.groupByResource - grouping of resources
2014-02-10 17:29:50, 634 DEBUG [Collector worker thread 8] (20583) com.integrien.alive.common.adapter3.AdapterBase.addMetricData - 1 added metrics to collect the result for the "SVC - mgmt - sql SQL Server" resource, resId = 20584, adapter "GeneralSQLDataLoaderAdapter".
Has anyone out there dealt with this or found a way around this kind of problem? There doesn't seem to be a lot of information regarding the use of this (or other) adapters.
Thank you
Jason
I am using the option you gave.
Looks like the time zone is not the same between the two systems, if it only started working after 5 to 6 hours. Make sure you don't need to use a different zone such as GMT.
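A small Python sketch of the suspected mismatch (all values below are assumed, for illustration only): if the adapter builds its collection-window threshold from local time while the query compares it against GETUTCDATE(), the window only opens once the zone offset has elapsed.

```python
from datetime import datetime, timedelta

utc_now = datetime(2014, 2, 10, 17, 29, 50)   # what GETUTCDATE() would return
local_offset = timedelta(hours=6)             # hypothetical zone 6h ahead of UTC

# Local "now" mistakenly used as if it were a UTC threshold in the WHERE clause.
threshold = utc_now + local_offset

matches = utc_now >= threshold   # the query's GETUTCDATE() >= threshold test
delay = threshold - utc_now      # how long until rows start matching

print(matches, delay)  # False 6:00:00
```

This matches the symptom described: nothing is collected until roughly the offset has passed, after which the comparison starts succeeding.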
-
Error loading data using SQL loader
I get an error like "SQL*Loader-350: illegal combination of non-alphanumeric characters" when loading a file using SQL Loader on RHEL. The command used to run SQL*Loader is:
sqlldr userid=<username>/<password> control=data.ctl
The control file is data.ctl:
LOAD DATA
INFILE '/home/oraprod/data.txt'
APPEND INTO TABLE test
{
empid TERMINATED BY ',',
fname TERMINATED BY ',',
lname TERMINATED BY ',',
salary TERMINATED BY WHITESPACE
}
The data.txt file is:
1, Kaushal, Hamad, 5000
2, Chetan, Hamad, 1000
Hopefully my question is clear.
Please get back to me with the answer to my query.
Regards
Replace "{" with "(" in your control file:
LOAD DATA
INFILE 'c:\data.txt'
APPEND INTO TABLE emp_t
(
empid TERMINATED BY ',',
fname TERMINATED BY ',',
lname TERMINATED BY ',',
salary TERMINATED BY WHITESPACE
)
C:\>sqlldr user/pwd@database control = c.ctl
SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from emp_t;
     EMPID FNAME                LNAME                    SALARY
---------- -------------------- -------------------- ----------
1 kone hamadi 5000
2 Chetan Hamad 1000
Best regards
Mohamed Houri
-
Option to enter the name of data file in SQL * Loader
Hello
I have a requirement to capture the name of the data file in the staging table. Is it possible to capture it in SQL*Loader, or is there any other way to do it?
Need suggestions from the experts please.
Thank you
Genoo
Hi Genoo,
Please look at the basename command in Linux to capture the name of the file that is currently being processed. I don't have access to a Linux machine at the moment, so please check at your end.
BTW, as your data file name varies and is not consistent, you need to change the data file to include the file name. Unless it is approved by your management I don't see alternatives, since the data file is the source that fills the tables. If this is approved then you can do the following:
sed -i "s/$/,`basename $0`/" <datafile>
Thus, for example, if you have a data file name 01test.csv, and it contains data such as:
11, AAA, ABC
22, BBB, BCD
The command sed -i "s/$/,01test.csv/" 01test.csv will turn it into:
11, AAA, ABC, 01test.csv
22, BBB, BCD, 01test.csv
Then use SQL Loader to load the file.
This is the closest solution I can think of. There is no way to achieve your goal with SQL Loader features alone; instead, you have to use Linux techniques as a workaround.
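For reference, the same append-the-filename idea can be sketched in Python instead of sed (the file name and contents below are the hypothetical 01test.csv from the example above):

```python
from pathlib import Path

def append_filename(datafile: str) -> list:
    """Return the file's rows with the source file name appended to each,
    mirroring the sed -i "s/$/,name/" approach described above."""
    path = Path(datafile)
    return [f"{line},{path.name}" for line in path.read_text().splitlines()]

# Build the sample data file, then transform it.
Path("01test.csv").write_text("11,AAA,ABC\n22,BBB,BCD\n")
print(append_filename("01test.csv"))
# ['11,AAA,ABC,01test.csv', '22,BBB,BCD,01test.csv']
```

The transformed rows can then be written back out and loaded with SQL*Loader, with the extra trailing field mapped to the staging table's file-name column.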
Best regards
-
Hi all,
I am facing problems when loading dates into the database using sql loader. My data file can have several date formats, so I have a function that interprets the date, removes the time portion and returns the date.
Something like that.
For example:
I have a table-
CREATE TABLE TEMP1234
(
ID NUMBER,
ASOF_DATE DATE
);
Data file
10001172 | 09/12/1945
Control file:
OPTIONS (DIRECT=TRUE, SILENT=(FEEDBACK), SKIP=0)
LOAD DATA
REPLACE
INTO TABLE temp1234
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID,
ASOF_DATE "decode(:ASOF_DATE, null, :ASOF_DATE, conv_date1(:ASOF_DATE))"
)
Function CONV_DATE1:
CREATE OR REPLACE FUNCTION conv_date1 (p_str IN VARCHAR2)
RETURN DATE
IS
return_value DATE;
p_str1 VARCHAR2(15) := NULL;
TYPE fmtArray IS TABLE OF VARCHAR2(30);
g_fmts fmtArray
:= fmtArray ('yyyy-mm-dd',
'yyyy/mm/dd',
'mm/dd/yyyy',
'dd-mm-yyyy',
'dd/mm/yyyy',
'mm-dd-yyyy');
BEGIN
p_str1 := SUBSTR(p_str, 1, 10);
FOR i IN 1 .. g_fmts.COUNT
LOOP
BEGIN
return_value := TO_DATE(p_str1, g_fmts(i));
EXIT;
EXCEPTION
WHEN OTHERS
THEN
NULL;
END;
END LOOP;
IF (return_value IS NULL)
THEN
RAISE PROGRAM_ERROR;
END IF;
RETURN return_value;
END;
/
In this case, if the year in the data file is 1945, the date loaded into the database shows 2045.
But when I run this function in a sql editor, it returns the correct value:
select conv_date1('12/09/1945') from dual;
Please help me understand what is causing the problem.
I think there may be an implicit conversion going on with the combination of decode and the nvl check around the function within SQL*Loader. I put the null check inside the function and simply call the function in the control file, as shown below.
OPTIONS (DIRECT=TRUE, SILENT=(FEEDBACK), SKIP=0)
LOAD DATA
REPLACE
INTO TABLE temp1234
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID,
asof_date "conv_date1(:asof_date)"
)
CREATE OR REPLACE FUNCTION conv_date1 (p_str IN VARCHAR2)
RETURN DATE
IS
return_value DATE;
p_str1 VARCHAR2(15) := NULL;
TYPE fmtArray IS TABLE OF VARCHAR2(30);
g_fmts fmtArray
:= fmtArray ('yyyy-mm-dd',
'yyyy/mm/dd',
'mm/dd/yyyy',
'dd-mm-yyyy',
'dd/mm/yyyy',
'mm-dd-yyyy');
BEGIN
IF p_str IS NULL
THEN
RETURN NULL;
ELSE
p_str1 := SUBSTR(p_str, 1, 10);
FOR i IN 1 .. g_fmts.COUNT
LOOP
BEGIN
return_value := TO_DATE(p_str1, g_fmts(i));
EXIT;
EXCEPTION
WHEN OTHERS
THEN
NULL;
END;
END LOOP;
IF (return_value IS NULL)
THEN
RAISE PROGRAM_ERROR;
END IF;
RETURN return_value;
END IF;
END;
/
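The try-each-mask-in-turn technique used by conv_date1 can be sketched in Python (a sketch of the idea only, not the poster's exact environment). Note that the mask list is order-sensitive: an ambiguous input such as 12/09/1945 is resolved by whichever mask matches first, which is a real risk with this technique.

```python
from datetime import datetime, date
from typing import Optional

# Same masks, in the same order, as g_fmts in conv_date1.
FORMATS = ["%Y-%m-%d", "%Y/%m/%d", "%m/%d/%Y",
           "%d-%m-%Y", "%d/%m/%Y", "%m-%d-%Y"]

def conv_date(p_str: Optional[str]) -> Optional[date]:
    """Try each format in turn, like conv_date1's loop over g_fmts."""
    if p_str is None:
        return None                   # the null check moved into the function
    s = p_str[:10]                    # like SUBSTR(p_str, 1, 10)
    for fmt in FORMATS:
        try:
            return datetime.strptime(s, fmt).date()
        except ValueError:
            continue                  # like the WHEN OTHERS THEN NULL handler
    raise ValueError(f"no format matched {p_str!r}")

print(conv_date("12/09/1945"))  # 1945-12-09: %m/%d/%Y is the first match
```

Here the century survives because every mask uses a 4-digit year; any mask with a 2-digit year (or an implicit default century, as with Oracle's RR/YY behavior) is where a 1945 input can silently turn into 2045.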
-
SQL Loader (how to cut data header)
Hello
[oracle 11g]
I got the following text file:
Now I am interested in the columns TBL and ATR and the following raw data:
mod; DD.MM.YYYY; HH:MM:SS; aligned
src; "ptv "; "15.04.2012"; "10:47:49"
chs; "ISO8859-1"
ver; "V1.0"
ifv; "V1.0"
dve; "V1.0"
fft; "LIO"
tbl; MENGE_FGR
atr; BASIS_VERSION; FGR_NR; FGR_TEXT
frm; num[9.0]; num[5.0]; char[40]
rec; 122; 8; "VVZ"
rec; 123; 18; "VHZ"
rec; 124; 13; "VTZ"
Can you see a way to automatically create the MENGE_FGR table with the columns BASIS_VERSION, FGR_NR, FGR_TEXT and column types num, num, char, and insert the raw data below?
PS: OK, it's mysql... so I need to convert this sql first. So, you should read num as number.
THX in advance Thorsten
Edited by: Thorsten on 16.05.2013 07:30
Edited by: Thorsten on 16.05.2013 07:32
There are various ways you could do this. I have demonstrated one method below. I created a table with two columns, then used SQL*Loader to load the data from the text file into those two columns. Skipping the header lines is optional. You could also use an external table instead, if the text file is located on your server, not your client. I then used some PL/SQL to create and run the "create table" and "insert" statements. It's just starter code. You will need to make changes to handle other data types and things that were not in the example you provided, but it should give you the general idea.
SCOTT@orcl_11gR2> host type text_file.dat
mod; DD.MM.YYYY; HH:MM:SS; aligned
src; "ptv "; "15.04.2012"; "10:47:49"
chs; "ISO8859-1"
ver; "V1.0"
ifv; "V1.0"
dve; "V1.0"
fft; "LIO"
tbl; MENGE_FGR
atr; BASIS_VERSION; FGR_NR; FGR_TEXT
frm; num[9.0]; num[5.0]; char[40]
rec; 122; 8; "VVZ"
rec; 123; 18; "VHZ"
rec; 124; 13; "VTZ"

SCOTT@orcl_11gR2> host type test.ctl
options(skip=7)
load data
infile text_file.dat
into table tbl
(col1 terminated by ';',
 col2 terminated by x'0a')

SCOTT@orcl_11gR2> create table tbl
  2  (col1 varchar2(4),
  3   col2 varchar2(60))
  4  /

Table created.

SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log

SQL*Loader: Release 11.2.0.1.0 - Production on Thu May 16 13:44:24 2013

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Commit point reached - logical record count 6

SCOTT@orcl_11gR2> select * from tbl
  2  /

COL1 COL2
---- ------------------------------------------------------------
tbl  MENGE_FGR
atr  BASIS_VERSION; FGR_NR; FGR_TEXT
frm  num[9.0]; num[5.0]; char[40]
rec  122; 8; "VVZ"
rec  123; 18; "VHZ"
rec  124; 13; "VTZ"

6 rows selected.

SCOTT@orcl_11gR2> declare
  2    v_tab  varchar2(30);
  3    v_atr  varchar2(32767);
  4    v_frm  varchar2(32767);
  5    v_sql  varchar2(32767);
  6    v_cols number;
  7    v_next varchar2(32767);
  8  begin
  9    select col2 into v_tab from tbl where col1 = 'tbl';
 10    select col2 || ';' into v_atr from tbl where col1 = 'atr';
 11    select col2 || ';' into v_frm from tbl where col1 = 'frm';
 12    v_sql := 'CREATE TABLE ' || v_tab || ' (';
 13    select regexp_count (col2, ';') + 1 into v_cols from tbl where col1 = 'atr';
 14    for i in 1 .. v_cols loop
 15      v_sql := v_sql || substr (v_atr, 1, instr (v_atr, ';') - 1) || ' ';
 16      v_next := substr (v_frm, 1, instr (v_frm, ';') - 1);
 17      v_next := replace (v_next, '[', '(');
 18      v_next := replace (v_next, ']', ')');
 19      v_next := replace (v_next, '.', ',');
 20      v_next := replace (v_next, 'num', 'number');
 21      v_next := replace (v_next, 'char', 'varchar2');
 22      v_sql := v_sql || v_next || ',';
 23      v_atr := substr (v_atr, instr (v_atr, ';') + 1);
 24      v_frm := substr (v_frm, instr (v_frm, ';') + 1);
 25    end loop;
 26    v_sql := rtrim (v_sql, ',') || ')';
 27    dbms_output.put_line (v_sql);
 28    execute immediate v_sql;
 29    for r in (select col2 from tbl where col1 = 'rec') loop
 30      v_sql := 'INSERT INTO ' || v_tab || ' VALUES (''';
 31      v_sql := v_sql || replace (replace (r.col2, ';', ''','''), '"', '');
 32      v_sql := v_sql || ''')';
 33      dbms_output.put_line (v_sql);
 34      execute immediate v_sql;
 35    end loop;
 36  end;
 37  /
CREATE TABLE MENGE_FGR ( BASIS_VERSION number(9,0), FGR_NR number(5,0), FGR_TEXT varchar2(40))
INSERT INTO MENGE_FGR VALUES (' 122',' 8',' VVZ')
INSERT INTO MENGE_FGR VALUES (' 123',' 18',' VHZ')
INSERT INTO MENGE_FGR VALUES (' 124',' 13',' VTZ')

PL/SQL procedure successfully completed.

SCOTT@orcl_11gR2> describe menge_fgr
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 BASIS_VERSION                                      NUMBER(9)
 FGR_NR                                             NUMBER(5)
 FGR_TEXT                                           VARCHAR2(40)

SCOTT@orcl_11gR2> select * from menge_fgr
  2  /

BASIS_VERSION     FGR_NR FGR_TEXT
------------- ---------- ----------------------------------------
          122          8 VVZ
          123         18 VHZ
          124         13 VTZ

3 rows selected.
-
I have Oracle 11g R2 running on Windows Server 2008 R2.
When trying to load data (a csv file) with SQL Loader in Enterprise Manager, it tells me the job completed successfully, but there is no data in the table. The bad file is created, as are the ctl and sh files. I verified that sqlldr is in the \bin directory of $ORACLE_HOME and that this path is also in my PATH environment variable. When I try to run it from the command line I get "not recognized as an internal or external command, operable program or batch file". I tried to run it in SQL Developer, and after it executes it says task canceled. Can you help? I'm expecting a 70 million record file next week and I want to use SQL Loader.
Message from Enterprise Manager
State managed
Exit code 0
31327 step ID
Target leads.global
Started May 10, 2013 15:55:14 (UTC-07:00)
Ended May 10, 2013 15:55:22 (UTC-07:00)
Step 8 seconds
WIN Service Management - D1CINRVM11K:1158_Management_Service
Including the job step ADVANCED management service was sent.
Output Log
User name:
SQL*Loader: Release 11.2.0.1.0 - Production on Fri May 10 15:55:15 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Loaded - the number of logical records 51.I usually run SQL * Loader of SQL * Plus, so that it avoids some of the questions of privilege. I also use a log file, so that I can check later to see what happened and why. For example:
host sqlldr scott/tiger control=test.ctl log=test.log
A common problem is lack of privileges on the data file being loaded. To identify or eliminate such problems, you can create an empty file from within SQL*Plus and copy and paste the data into it, then load that instead, and also check the privileges on the original file.
To offer much more than that, we would need more information, such as a few lines of sample data, your SQL*Loader control file, and the structure of your table.
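As a point of comparison, here is a minimal control file sketch of the kind SQL*Loader expects for comma-separated data. The table name MY_NAMES, its two columns, and the file names are hypothetical placeholders, not taken from the poster's setup:

```sql
-- test.ctl: minimal sketch (table MY_NAMES and file test.dat are assumed names)
LOAD DATA
INFILE 'test.dat'
APPEND
INTO TABLE my_names
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  id    INTEGER EXTERNAL,
  name  CHAR
)
```

Run as in the example above (e.g. sqlldr scott/tiger control=test.ctl log=test.log), the log file then reports how many records were read, loaded, and rejected, which is the first place to look when rows go missing.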
-
SQL*Loader control file path...
Hi all
I run the load below through SQL*Loader from a client machine.
I saved the control file as loader.ctl, which I kept at the path D:\loader.ctl:
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'D:\flat.txt'
INTO TABLE GL_INTERFACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(STATUS, LEDGER_ID, USER_JE_SOURCE_NAME, USER_JE_CATEGORY_NAME, ACCOUNTING_DATE, CURRENCY_CODE, DATE_CREATED, CREATED_BY,
SEGMENT1, SEGMENT2, SEGMENT3, SEGMENT4, SEGMENT5, ACTUAL_FLAG, ENTERED_DR, ENTERED_CR, GROUP_ID)
Now my doubt is: to run the above, I need to run the SQL*Loader command below...
sqlldr USERID=apps/apps CONTROL=loader.ctl
In the control file we have given the path of the data file, but how does the SQLLDR command identify the path of the control file? What do I have to give it?
Also, is it possible to run the loader program above on the client machine, or does it have to be run only on the server?
Please clarify my doubt.
Thank you and best regards,
Muthu

Hello,
The preferred method, if you are using Oracle Apps, is to register a concurrent program of type SQL*Loader:
put your ctl file in $/bin and run the concurrent program to load the data. I am just giving you a pointer; if you search on Google you will find several ways to do the same thing.
There are also other options, such as using an external table, UTL_FILE, or a shell script, to do the same thing.
Thank you,
Lenora
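On the control file path question itself: sqlldr resolves a relative CONTROL= value against the directory where the command is run, so the usual fix is to pass the full path explicitly. A sketch using the poster's paths (the LOG path is an assumed example, not from the original post):

```
sqlldr USERID=apps/apps CONTROL=D:\loader.ctl LOG=D:\loader.log
```

And yes, sqlldr can run on the client machine: it reads the control file and data file locally and loads into the remote database over Oracle Net when the connect string includes a TNS alias, e.g. USERID=apps/apps@PRODDB (PRODDB is a hypothetical alias for illustration).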