Problem using TERMINATED BY 'end_responder_Comments~' in SQL*Loader
When I run sqlldr through a concurrent program (host/UNIX) in Oracle Apps 11.5.10.2, I get the following error. I don't understand why it doesn't work. Have I made a mistake in the declaration? Please let me know.
Log file:
SQL*Loader-350: Syntax error at line 44.
Expecting char string, found "end_responder_Comments ~".
RESPONDER_COMMENT terminated by ' end_responder_Comments ~', ^.
SQL*Loader: Release 8.0.6.3.0 - Production on Wed Feb 16 11:38:21 2011
CTL file:
---------
LOAD DATA
APPEND
INTO TABLE NR_SPER_DATA2 WHEN SERVICE_REQUEST_NUMBER <> '0'
FIELDS TERMINATED BY '~'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(SECURITY_ID,
---
---
RESPONDER_COMMENT CHAR(32000) TERMINATED BY 'end_responder_Comments~' OPTIONALLY ENCLOSED BY '"',
COMMENTS CHAR(4000) TERMINATED BY 'end_comments~' OPTIONALLY ENCLOSED BY '"',
---
)
Appreciate any help provided as soon as possible.
Thank you
REDA
Hello,
A CLOB column can take up to 4 GB of data, and the default length for CHAR is 255. For anything above 255, you must specify the length.
Regards,
OrionNet
Published by: OrionNet on February 16, 2011 15:13
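Putting OrionNet's point into practice, the control file from the question would spell those lengths out explicitly, e.g. (a sketch reconstructed from the fields quoted above, not a verified fix for the SQL*Loader-350 itself; the elided columns are kept as placeholders):

```
LOAD DATA
APPEND
INTO TABLE NR_SPER_DATA2 WHEN SERVICE_REQUEST_NUMBER <> '0'
FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(SECURITY_ID,
 -- any field longer than the 255-byte CHAR default needs an explicit length:
 RESPONDER_COMMENT CHAR(32000) TERMINATED BY 'end_responder_Comments~' OPTIONALLY ENCLOSED BY '"',
 COMMENTS CHAR(4000) TERMINATED BY 'end_comments~' OPTIONALLY ENCLOSED BY '"'
)
```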
Similar Questions
-
SQL*Loader Control File - Error 510
Hello!
I am trying to load a TXT file using SQL*Loader; my control file is a little too big, I guess: 83 KB.
The problem is the message I get:
SQL*Loader: Release 10.2.0.3.0 - Production on Qua 16 11:18:58 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-510: Physical record in the data file (C:\Teste\Exe548teste_zanthus.old.ctl) is longer than the maximum allowed (2147473647).
The language is Brazilian Portuguese.
But the size of the file is only 83 KB, not the maximum size. I'm confused.
The data is a TXT file with more than 1000 characters per line.
The problem is that the generated control file is 'bigger' than the maximum. The error is about the control file, not the data file. If you want, I can post the file here and a sample of the data file too.
I guess it is not finding the end-of-line of the control file, but there is an <ENTER> character at the EOF (CR+LF, I think).
I really don't want to split the data file or the control file into multiple files and then orchestrate the import with extra software, which would be tedious...
I don't know if this information helps, but my control file uses multiple INTO TABLE clauses...
Thank you!
I still feel that the problem is the end-of-line character.
André Luis wrote:
The comparison is OK since I have already tested with a small control file.
The problem is not the size of the file; the problem is the size of the line. With a small control file, SQL*Loader reads the file line by line, and in that case it works because it is under the limit for line size. (The small file perhaps has the same problems with the trailing characters, but due to its size it does not report the error.)
I hope I just missed something in the creation of the control file; I think it tries to read data from it, despite the fact that I told it otherwise, because I saw something about a size limit for a control file with data inside...
The control file syntax is OK; I don't think it looks for data in the control file.
I suggest you try creating the control file with another editor, otherwise...
-
Hello
I'm receiving a "sqlldr: not found" error. I'm going to discuss the situation with our system administrators. Before that, I wanted to make sure that SQL*Loader (sqlldr) is an add-on available for the Oracle 11 client. A colleague mentioned sqlldr may not be available as an add-on in the Oracle 11 client and that we should use IMPORT/EXPORT instead. Is that a true statement?
Can someone please clarify these questions for me?
Thanks a bunch!
SQL*Loader is certainly available in an 11.1 or 11.2 full client install. It may or may not be a component that is installed by default, depending on the type of installation you choose, but you can always go back and install this component.
If you mean the Instant Client, I'm not sure whether SQL*Loader or import and export work with the Instant Client.
And just to make the point: if you are on 11g, you usually would be using external tables rather than SQL*Loader.
Justin
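To illustrate Justin's external-table suggestion, a minimal sketch (the directory path, file name, and columns here are hypothetical):

```sql
-- Directory object pointing at the flat file location (hypothetical path)
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/data';

-- The file becomes queryable like a table; no sqlldr client is needed
CREATE TABLE emp_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.csv')
);
```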
-
Problem loading an XML file using SQL*Loader
I am trying to load data into the table test_xml (xmldata XMLType)
I have an XML file and I want the whole file to load into a single column.
When I use the following control file and run it from the command line as follows:
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl direct=true
LOAD DATA
INFILE *
TRUNCATE INTO TABLE test_xml
XMLTYPE (XMLDATA)
FIELDS
(
ext_fname FILLER CHAR(100),
XMLDATA LOBFILE (ext_fname) TERMINATED BY EOF
)
BEGINDATA
/u01/appl/apps/apps_st/appl/XXTop/12.0.0/bin/file.xml
the file is loaded in the table perfectly.
Unfortunately I can't hard-code the name of file as file name will be changed dynamically.
so I removed the block
BEGINDATA
/u01/appl/apps/apps_st/appl/XXTop/12.0.0/bin/file.xml
from the control file and tried to run it by giving the data file path on the command line:
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml direct=true
But strangely, it attempts to load each line of the XML file into the table instead of the whole file.
Please find the log of the program with the error
------------------------------------------------------------------
Loading XML through SQL * Loader begins
------------------------------------------------------------------
SQL*Loader-502: unable to open data file '<?xml version="1.0"?>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<root>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<Type>forms</Type>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<Type>PLL</Type>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: no such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
Please help me: how can I load the full XML into a single column from the command line, without hardcoding the file name in the control file?
Published by: 907010 on January 10, 2012 02:24
"But strangely it attempts to load each line of the xml file in the table instead of the whole file"
Nothing strange: the data parameter specifies the file containing the data to load.
If you use the name of the XML file here, SQL*Loader will try to interpret each line of the XML as a separate file name. The traditional approach to this is to have the name of the file stored in another file, say filelist.txt, and use, for example:
echo "/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml" > filelist.txt
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=filelist.txt direct=true
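When many XML files have to be loaded this way, the filelist itself can be generated; a small sketch (directory and file names are placeholders, and the sqlldr call is shown only as a comment):

```shell
# Create two sample XML files to stand in for the real ones.
mkdir -p xml_in
printf '<root/>\n' > xml_in/a.xml
printf '<root/>\n' > xml_in/b.xml
# Build the filelist that sqlldr's data= parameter will read:
# one LOBFILE name per line.
ls xml_in/*.xml > filelist.txt
cat filelist.txt
# The load itself would then be (not run here):
#   sqlldr user/pwd control=LOAD_XML.ctl data=filelist.txt direct=true
```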
-
Question on loading data using SQL*Loader into staging tables and then into the main tables!
Hello
I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in pipe-delimited csv files.
I have developed a shell script to load the data, and it works fine except for one thing.
Here are the details and sample data to re-create the problem.
Structure of the staging tables into which data will be loaded using SQL*Loader:
create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));
create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));
create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));
DATA in the csv file-
for stg_cmts_data-
cmts_map_03092015_1.csv
WNLB-CMTS-01-1 | 10.15.0.1
WNLB-CMTS-02-2 | 10.15.16.1
WNLB-CMTS-03-3 | 10.15.48.1
WNLB-CMTS-04-4 | 10.15.80.1
WNLB-CMTS-05-5 | 10.15.96.1
for stg_dhcp_data-
dhcp_map_03092015_1.csv
DHCP-1-1-1 | 10.25.23.10, 25.26.14.01
DHCP-1-1-2 | 56.25.111.25, 100.25.2.01
DHCP-1-1-3 | 25.255.3.01, 89.20.147.258
DHCP-1-1-4 | 10.25.26.36, 200.32.58.69
DHCP-1-1-5 | 80.25.47.369, 60.258.14.10
for stg_link_data
cmts_dhcp_link_map_0309151623_1.csv
DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2 | WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3 | WNLB-CMTS-01-1
DHCP-1-1-4 | WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5 | WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13 | WNLB-CMTS-02-2
Now, after loading these data into the staging tables, I have to fill the main database tables:
create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));
create table link (link_nm varchar2 (50));
The SQL scripts that I created to load the data are as follows:
spool load_cmts.log
set serveroutput on
DECLARE
CURSOR c_stg_cmts IS SELECT *
FROM stg_cmts_data;
TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
l_stg_cmts t_stg_cmts;
l_cmts_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_cmts;
FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
LOOP
SELECT COUNT(1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
IF l_cmts_cnt < 1 THEN
INSERT
INTO subntwk
(
subntwk_nm
)
VALUES
(
l_stg_cmts(i).cmts_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_cmts.COUNT = 0;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for dhcp
spool load_dhcp.log
set serveroutput on
DECLARE
CURSOR c_stg_dhcp IS SELECT *
FROM stg_dhcp_data;
TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
l_stg_dhcp t_stg_dhcp;
l_dhcp_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_dhcp;
FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
LOOP
SELECT COUNT(1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
IF l_dhcp_cnt < 1 THEN
INSERT
INTO subntwk
(
subntwk_nm
)
VALUES
(
l_stg_dhcp(i).dhcp_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_dhcp.COUNT = 0;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for link:
spool load_link.log
set serveroutput on
DECLARE
l_cmts_1 VARCHAR2(4000 CHAR);
l_cmts_add VARCHAR2(200 CHAR);
l_dhcp_cnt NUMBER;
l_cmts_cnt NUMBER;
l_link_cnt NUMBER;
l_add_link_nm VARCHAR2(200 CHAR);
BEGIN
FOR r IN (
SELECT dhcp_token, cmts_to_add || ',' cmts_add
FROM stg_link_data
)
LOOP
l_cmts_1 := r.cmts_add;
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
SELECT COUNT(1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = r.dhcp_token;
IF l_dhcp_cnt = 0 THEN
DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
ELSE
WHILE l_cmts_add IS NOT NULL
LOOP
l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
SELECT COUNT(1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = TRIM(l_cmts_add);
SELECT COUNT(1)
INTO l_link_cnt
FROM link
WHERE link_nm = l_add_link_nm;
IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
INSERT INTO link (link_nm)
VALUES (l_add_link_nm);
DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
ELSIF l_link_cnt > 0 THEN
DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
ELSIF l_cmts_cnt = 0 THEN
DBMS_OUTPUT.put_line('NO CMTS FOUND for device to create the link: ' || l_cmts_add);
END IF;
l_cmts_1 := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
END LOOP;
END IF;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
control files -
LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE STG_CMTS_DATA
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cmts_token "RTRIM(LTRIM(:cmts_token))",
cmts_ip "RTRIM(LTRIM(:cmts_ip))")
for dhcp:
LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE STG_DHCP_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
dhcp_ip "RTRIM(LTRIM(:dhcp_ip))")
for link:
LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE STG_LINK_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")
SHELL SCRIPT -
if [ ! -d log ]
then
mkdir log
fi
if [ ! -d done ]
then
mkdir done
fi
if [ ! -d bad ]
then
mkdir bad
fi
nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_link.sql
mv *.log ./log
The problem I encounter is here, when loading data into the link table: I check whether the DHCP is present in the subntwk table; if not, I log an error. Likewise, if the CMTS is not found, I skip creating the link and log another error.
Note that multiple CMTS can be associated with a single DHCP.
So the script creates the links, but for the last iteration of the loop, for the last of the comma-separated CMTS values from table stg_link_data, it logs "CMTS not found".
for example
DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2
Here, I am supposed to link dhcp-1-1-1 with wnlb-cmts-01-1 and wnlb-cmts-02-2.
These are all present in the subntwk table, but the log still says wnlb-cmts-02-2 could not be FOUND, although we have already loaded it into the subntwk table.
The same thing happens with every CMTS in table stg_link_data that is last in its list (I think you get what I'm trying to explain).
But when I run the SQL scripts in SQL Developer separately, they insert all valid links into the link table.
It should create 9 rows in the link table, whereas now it creates only 5 rows.
I use COMMIT in my script too, but that does not help.
Run these scripts on your machine and let me know if you get the same behavior.
And please give me a solution; I have tried many things since yesterday, but it's always the same.
It is the table of link log
link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
NO CMTS FOUND for device to create the link: wnlb-cmts-02-2
link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-cmts-05-5
NO CMTS FOUND for device to create the link: wnlb-cmts-01-1
NO CMTS FOUND for device to create the link: wnlb-cmts-05-8
NO CMTS FOUND for device to create the link: wnlb-cmts-05-6
NO CMTS FOUND for device to create the link: wnlb-cmts-05-0
NO CMTS FOUND for device to create the link: wnlb-cmts-03-3
link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-cmts-05-7
Device not found: wnlb-dhcp-1-13
IF YOU NEED MORE INFORMATION, PLEASE LET ME KNOW
Thank you
I realized later in the night that when loading into the staging tables on the UNIX machine, a DOS newline was kept at the end of each line. That is why the last CMTS was never found. I ran the DOS-to-UNIX conversion on the files and it started to work perfectly.
It was the dos2unix error!
Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.
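The dos2unix conversion described above can also be done with plain tr when dos2unix is not installed; a sketch with placeholder file names:

```shell
# Simulate a file written with DOS line endings (CR+LF), as on Windows.
printf 'WNLB-CMTS-01-1|10.15.0.1\r\n' > cmts_data.csv
# Strip the carriage returns so the last field no longer ends in a hidden \r.
tr -d '\r' < cmts_data.csv > cmts_data_unix.csv
od -c cmts_data_unix.csv
```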
-
SQL Loader loading data into two Tables using a single CSV file
Dear all,
I have a requirement where I need to load data into 2 tables using a single csv file.
So I wrote the following control file. But it loads only the first table, and there is also nothing about it in the log file.
Please suggest how to achieve this.
Examples of data
Source_system_code,Record_type,Source_System_Vendor_number,$vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Control file script
================
OPTIONS (errors=0, skip=1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
$vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
when 1 = 1
fields terminated by ',' optionally enclosed by '"'
(
$vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)
The problem here is that it loads into only one table, the first (table1).
Please guide me.
Thank you
Kumar
When you do not provide a starting position for the first field in table2, it starts with the field following the last one referenced in table1, so it starts with vendor_site_code instead of $vendor_name. What you need to do instead is specify POSITION(1) for the first field in table2 and use FILLER fields to skip ahead. In addition, it dislikes WHEN 1 = 1, and you don't need that anyway. See the example below, including the corrected control file.
Scott@orcl12c> HOST TYPE test.dat
Source_system_code,Record_type,Source_System_Vendor_number,$vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Scott@orcl12c> HOST TYPE test.ctl
OPTIONS (errors=0, skip=1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
$vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
fields terminated by ',' optionally enclosed by '"'
(
source_system_code FILLER POSITION(1),
record_type FILLER,
source_system_vendor_number FILLER,
$vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)
Scott@orcl12c> CREATE TABLE table1
  2  (Source_system_code VARCHAR2(13),
  3   Record_type VARCHAR2(11),
  4   Source_System_Vendor_number VARCHAR2(27),
  5   $vendor_name VARCHAR2(11))
  6  /
Table created.
Scott@orcl12c> CREATE TABLE table2
  2  ($vendor_name VARCHAR2(11),
  3   Vendor_site_code VARCHAR2(16),
  4   Address_line1 VARCHAR2(13),
  5   Address_line2 VARCHAR2(13),
  6   Address_line3 VARCHAR2(13))
  7  /
Table created.
Scott@orcl12c> HOST SQLLDR scott/tiger CONTROL=test.ctl DATA=test.dat LOG=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Table TABLE1:
1 row loaded successfully.
Table TABLE2:
1 row loaded successfully.
Check the log file:
test.log
for more information about the load.
Scott@orcl12c> SELECT * FROM table1
  2  /
SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER $VENDOR_NAME
------------- ----------- --------------------------- ------------
Victor        New         Ven001                      Vinay
1 row selected.
Scott@orcl12c> SELECT * FROM table2
  2  /
$VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
------------ ---------------- ------------- ------------- -------------
Vinay        Vin001           abc           def           xyz
1 row selected.
Scott@orcl12c>
-
SQL*Loader: problem with POSITION & EXTERNAL
Hi gurus of the Oracle.
I have problem with position and external.
I have a data file with 1 million records.
The data delimiter is '|' and fields are optionally enclosed by '"'.
Some lines are not loaded due to data errors, i.e. the data itself contains some '"' characters.
Now we have decided to use POSITION & EXTERNAL, but I am unable to write the control file.
any help would be much appreciated
The table name is person_memo, 4 columns:
ID_PERSON VARCHAR2(9 BYTE)
TX_MEMO VARCHAR2(1000 BYTE)
ID_USER VARCHAR2(20 BYTE)
TM_STAMP TIMESTAMP(6)
My control file is:
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID_PERSON POSITION(1) "TRIM(:ID_PERSON)"
, TX_MEMO POSITION(10) CHAR(1000) "TRIM(:TX_MEMO)"
, ID_USER POSITION(1012) "TRIM(:ID_USER)"
, TM_STAMP POSITION(1031) EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
Sample of the data file:
"04725813" | "aka "Little Will"" | "095TDEAN" | "2013-02-21-11.13.44.632000"
"05599076" | "FIRST NAME - ADDED A 'T' AS ON THE REG MAP" | "016DDEAL" | "2014-04-11-10.06.35.598000"
Thanks and greetings
REDA
In your control file, EXTERNAL (26) must be INTEGER EXTERNAL (26).
Unfortunately, this forum destroys the spacing, so I can't say whether or not your data is positional. If it is positional, you can use positions, but you need start and end positions. The positions that you posted have nothing to do with the data you posted. If you use positions, you can eliminate the delimiters and the beginning and ending quotes using those positions.
If your data is not positional and you have quotes within your quoted data, but you don't have pipe delimiters inside your data, then you can only use the delimiters and trim the starting and ending quotes from the data.
I have demonstrated the two methods below, using test1.ctl for the positional method and test2.ctl for the delimited method.
Scott@orcl12c> host type test.dat
"04725813" | "aka "Little Will"" | "095TDEAN" | "2013-02-21-11.13.44.632000"
"05599076" | "FIRST NAME - ADDED A 'T' AS ON THE REG MAP" | "016DDEAL" | "2014-04-11-10.06.35.598000"
Scott@orcl12c> host type test1.ctl
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(ID_PERSON POSITION(02:10)
, TX_MEMO POSITION(14:59)
, ID_USER POSITION(63:82)
, TM_STAMP POSITION(85:110) INTEGER EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL,
  TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
Scott@orcl12c> host type test2.ctl
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(ID_PERSON "TRIM('\"' FROM :ID_PERSON)"
, TX_MEMO CHAR(1000) "TRIM('\"' FROM :TX_MEMO)"
, ID_USER "TRIM('\"' FROM :ID_USER)"
, TM_STAMP INTEGER EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL,
  TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
Scott@orcl12c> create table person_memo
  2  (ID_PERSON VARCHAR2(9 BYTE)
  3  , TX_MEMO VARCHAR2(1000 BYTE)
  4  , ID_USER VARCHAR2(20 BYTE)
  5  , TM_STAMP TIMESTAMP(6))
  6  /
Table created.
Scott@orcl12c> host sqlldr scott/tiger control=test1.ctl data=test.dat log=test1.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table PERSON_MEMO:
2 rows loaded successfully.
Check the log file:
test1.log
for more information about the load.
Scott@orcl12c> select * from person_memo
  2  /
ID_PERSON
---------
TX_MEMO
--------------------------------------------------------------------------------
ID_USER
--------------------
TM_STAMP
---------------------------------------------------------------------------
04725813
aka "Little Will"
095TDEAN
21-FEB-13 11.13.44.632000 AM
05599076
FIRST NAME - ADDED A 'T' AS ON THE REG MAP
016DDEAL
11-APR-14 10.06.35.598000 AM
2 rows selected.
Scott@orcl12c> truncate table person_memo
  2  /
Table truncated.
Scott@orcl12c> host sqlldr scott/tiger control=test2.ctl data=test.dat log=test2.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table PERSON_MEMO:
2 rows loaded successfully.
Check the log file:
test2.log
for more information about the load.
Scott@orcl12c> select * from person_memo
  2  /
ID_PERSON
---------
TX_MEMO
--------------------------------------------------------------------------------
ID_USER
--------------------
TM_STAMP
---------------------------------------------------------------------------
04725813
aka "Little Will"
095TDEAN
21-FEB-13 11.13.44.632000 AM
05599076
FIRST NAME - ADDED A 'T' AS ON THE REG MAP
016DDEAL
11-APR-14 10.06.35.598000 AM
2 rows selected.
-
Reg: SQL*Loader problem
Hi Experts,
I am trying to load data from a flat file into an Oracle table, but am facing some problems.
Concern (1)
I have a directory where there are about 20 similarly named files: 'rb_1', 'rb_2', 'rb_3', etc...
All these data should be loaded into one single table 'X'.
Is it possible for just 1 CTL file to loop and load all the files into X?
Concern (2)
The field delimiter is the Ctrl-X (CAN: cancel) and Ctrl-M (EM: em) characters.
The line delimiter is the Ctrl-Z (SUB: substitute) and Ctrl-T (DC4: device control) characters.
Is there a way I can specify this in my CTL file?
(I've only worked with field delimiters like the comma ',', not special characters like that.)
Please let me know if any additional information is required.
Help much appreciated.
Thank you
-Nordine
I agree with Hoek: you'd be better off using external tables, where you can specify multiple filenames at once. Otherwise you will need a script that provides the name of the input file when calling SQL*Loader, and make this script loop over each file.
Regarding the ctrl characters in your delimiters, the control file supports hexadecimal versions of them, for example:
fields terminated by x'09'
where 09 is the hex of the ASCII value of the character (in this case a tab character).
Ctrl-M would be x'0d', etc.
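The hex value for any control character can be checked with od before writing it into the control file; a small illustration (Ctrl-I, the tab from the example above, is shown alongside the delimiters asked about):

```shell
# Print the hex byte SQL*Loader needs for each control character.
printf '\011' | od -An -tx1   # Ctrl-I, tab             -> 09, i.e. x'09'
printf '\015' | od -An -tx1   # Ctrl-M, carriage return -> 0d, i.e. x'0d'
printf '\030' | od -An -tx1   # Ctrl-X, CAN             -> 18, i.e. x'18'
printf '\032' | od -An -tx1   # Ctrl-Z, SUB             -> 1a, i.e. x'1a'
```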
-
Hi all,
I am facing problems when loading dates into the database using SQL*Loader. My data file can have several date formats, so I have a function that interprets the date, strips the time portion, and returns the date.
Something like this:
For example:
I have a table-
CREATE TABLE TEMP1234
(
ID NUMBER,
ASOF_DATE DATE
);
Data file
10001172 | 09/12/1945
Control file:
OPTIONS (DIRECT=TRUE, SILENT=(FEEDBACK), skip=0)
LOAD DATA
REPLACE
INTO TABLE temp1234
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID,
ASOF_DATE "decode(:ASOF_DATE, null, :ASOF_DATE, conv_date1(:ASOF_DATE))"
)
Function CONV_DATE1:
CREATE OR REPLACE FUNCTION conv_date1 (p_str IN VARCHAR2)
RETURN DATE
IS
return_value DATE;
p_str1 VARCHAR2(15) := NULL;
TYPE fmtArray IS TABLE OF VARCHAR2(30);
g_fmts fmtArray
  := fmtArray('yyyy-mm-dd',
              'yyyy/mm/dd',
              'mm/dd/yyyy',
              'dd-mm-yyyy',
              'dd/mm/yyyy',
              'mm-dd-yyyy');
BEGIN
p_str1 := SUBSTR(p_str, 1, 10);
FOR i IN 1 .. g_fmts.COUNT
LOOP
BEGIN
return_value := TO_DATE(p_str1, g_fmts(i));
EXIT;
EXCEPTION
WHEN OTHERS
THEN
NULL;
END;
END LOOP;
IF (return_value IS NULL)
THEN
RAISE PROGRAM_ERROR;
END IF;
RETURN return_value;
END;
/
In this case, if the year in the data file is 1945, the date that gets loaded into the database shows 2045.
But when I run this function from a SQL editor, it returns the correct value:
select conv_date1('12/09/1945') from dual;
Please help me understand what is causing the problem.
I think there may be an implicit conversion going on with the combination of decode, the nvl check, and the function within SQL*Loader. I put the null check into the function and simply use the function in the control file, as shown below.
OPTIONS (DIRECT=TRUE, SILENT=(FEEDBACK), skip=0)
LOAD DATA
REPLACE
INTO TABLE temp1234
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID,
asof_date "conv_date1(:asof_date)"
)
CREATE OR REPLACE FUNCTION conv_date1 (p_str IN VARCHAR2)
RETURN DATE
IS
return_value DATE;
p_str1 VARCHAR2(15) := NULL;
TYPE fmtArray IS TABLE OF VARCHAR2(30);
g_fmts fmtArray
  := fmtArray('yyyy-mm-dd',
              'yyyy/mm/dd',
              'mm/dd/yyyy',
              'dd-mm-yyyy',
              'dd/mm/yyyy',
              'mm-dd-yyyy');
BEGIN
IF p_str IS NULL
THEN
RETURN NULL;
ELSE
p_str1 := SUBSTR(p_str, 1, 10);
FOR i IN 1 .. g_fmts.COUNT
LOOP
BEGIN
return_value := TO_DATE(p_str1, g_fmts(i));
EXIT;
EXCEPTION
WHEN OTHERS
THEN
NULL;
END;
END LOOP;
IF (return_value IS NULL)
THEN
RAISE PROGRAM_ERROR;
END IF;
RETURN return_value;
END IF;
END;
/
-
Hello
I am having problems using SQL*Loader with ODI. I am filling an Oracle table with data from a txt file. Initially I used the 'File to SQL' LKM, but due to the size of the source file (700 MB), I decided to use the 'File to Oracle (SQLLDR)' LKM.
The error that appears in myFile.txt.log is: "SQL*Loader-101: invalid argument for username/password".
I think the problem might be in the definition of the data server (physical architecture in Topology), because I left Host, user, and password empty.
What is the problem? Which host and user should I use? With 'File to SQL' it works fine leaving those blank, but it takes too much time.
Thanks in advance
Note the last *@*. It expects a connection string after it.
So go to the Oracle data server in Topology and specify Instance/Dblink (Data Server) as the SID of the Oracle database you want to connect to.
-
Calling SQL*Loader from a PL/SQL program
Is it possible to develop a PL/SQL program that calls the SQL*Loader (sqlldr userid=...) command?
We want to package a process that loads data from a text file into a table, and we are investigating whether it is possible to use SQL*Loader for the data loading step.
You can launch sqlldr like any other OS command. There are several methods of calling OS commands, such as:
1. external procedure PL/SQL: CHMOD FROM A PLSQL?
Johan's blog: how to call PL/SQL kernel32.dll.
2 Java Stored Procedure: Blog of Johan: using JAVA in PL/SQL - PART - I list files with timestamp , Blog of Johan: using JAVA in PL/SQL - PART - II operating system information obtaining
3. external PREPROCESSOR Table function: no response on java not call windows in oracle command
4. using DBMS_SCHEDULER,
job_type => 'executable' and
job_action => '/bin/sh' (may be)
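Option 4 can be sketched as below. This is a minimal, untested outline in which the job name, credentials and file paths are all hypothetical placeholders:

```sql
-- Hedged sketch of a DBMS_SCHEDULER job that shells out to sqlldr.
-- Job name, credentials and paths are hypothetical placeholders.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name            => 'LOAD_TEXT_FILE',
    job_type            => 'EXECUTABLE',
    job_action          => '/bin/sh',
    number_of_arguments => 2,
    enabled             => FALSE);

  DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE ('LOAD_TEXT_FILE', 1, '-c');
  DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE ('LOAD_TEXT_FILE', 2,
    'sqlldr userid=scott/tiger control=/home/oraprod/data.ctl log=/home/oraprod/data.log');

  -- Enabling a job that has no schedule causes it to run once immediately.
  DBMS_SCHEDULER.ENABLE ('LOAD_TEXT_FILE');
END;
/
```

In practice you would avoid hard-coding the password in the job argument; a parameter file or an external password store is preferable.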
-
Hi, I'm using Oracle version 11.2.0.3.0, and I'm trying to load a file into my DB table using the SQL*Loader utility. Below is my table structure; two of the columns will be constants, as noted in the table structure. And the FILE_DATE column must be the combination of two fields (Date and Time) of the flat file so that the column has the proper date format. I get the error below and am not able to load the data using the control file shown, so I need help.
sample data file
OrderDate, name, size, Records, Date, Time
06202014, authlogfile06202014.txt, 40777214, 198915, Jun21, 03:51
06202014, transferfile06202014.txt, 372144, 2255, Jun21, 01:34
06202014, balancefile06202014.txt, 651075, 10343, Jun21, 03:28
The table structure
CREATE TABLE file_stats
(systemtypecode VARCHAR2(4000), -- this will be hardcoded to the value 'CBD'
 odate          DATE,
 filename       VARCHAR2(4000),
 filesize       NUMBER(20,0),
 noofrecords    NUMBER(20,0),
 file_date      VARCHAR2(4000),
 created_date   DATE -- this will be populated with SYSDATE
);
Here's my control file
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'files.csv'
APPEND
INTO TABLE file_stats
FIELDS TERMINATED BY ','
(systemtypecode CONSTANT 'CBD'
, odate DATE 'MMDDYYYY'
, filename CHAR
, filesize INTEGER
, noofrecords INTEGER
, file_date_ddmon BOUNDFILLER CHAR
, file_date "to_date('2014' || :file_date_ddmon || :file_date, 'YYYYMonDDHH24:MI')"
, created_date CONSTANT 'sysdate'
)
When running the command below, all the records error out as shown:
sqlldr schema1/pwd@db1 control=file_stats.ctl log=file_stats.log bad=file_stats.bad
ERROR:
Record 1: Rejected - Error on table FILE_STATS, column FILE_DATE.
ORA-01843: not a valid month
You must add TRAILING NULLCOLS and use INTEGER EXTERNAL instead of INTEGER, and your file_date column must be of DATE data type. Please see the demo below with some additional corrections. Note that since there is no year in the data file for the file_date column, it defaults to the current year. If you want 2014 then you need to concatenate that and add YYYY to the date format.
Scott@orcl12c > HOST TYPE files.csv
OrderDate, name, size, Records, Date, Time
06202014, authlogfile06202014.txt, 40777214, 198915, Jun21, 03:51
06202014, transferfile06202014.txt, 372144, 2255, Jun21, 01:34
06202014, balancefile06202014.txt, 651075, 10343, Jun21, 03:28
Scott@orcl12c > HOST TYPE file_stats.ctl
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'files.csv'
APPEND INTO TABLE file_stats
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(systemtypecode CONSTANT 'CBD'
, odate DATE 'MMDDYYYY'
, filename CHAR
, filesize INTEGER EXTERNAL
, noofrecords INTEGER EXTERNAL
, file_date_ddmon BOUNDFILLER CHAR
, file_date DATE 'MonDDHH24:MI' ":file_date_ddmon || :file_date"
, created_date "SYSDATE"
)
Scott@orcl12c > CREATE TABLE file_stats
  2  (systemtypecode VARCHAR2(14),
  3   odate DATE,
  4   filename VARCHAR2(24),
  5   filesize NUMBER(20,0),
  6   noofrecords NUMBER(20,0),
  7   file_date DATE,
  8   created_date DATE)
  9  /
Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL=file_stats.ctl LOG=file_stats.log BAD=file_stats.bad
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Feb 5 15:26:49 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 3
Table FILE_STATS:
  3 Rows successfully loaded.
Check the log file:
  file_stats.log
for more information about the load.
Scott@orcl12c > SELECT * FROM file_stats
  2  /
SYSTEMTYPECODE  ODATE        FILENAME                  FILESIZE  NOOFRECORDS  FILE_DATE    CREATED_DATE
--------------  -----------  ------------------------  --------  -----------  -----------  ------------
CBD             20-Jun-2014  authlogfile06202014.txt   40777214  198915       21-Jun-2015  05-Feb-2015
CBD             20-Jun-2014  transferfile06202014.txt  372144    2255         21-Jun-2015  05-Feb-2015
CBD             20-Jun-2014  balancefile06202014.txt   651075    10343        21-Jun-2015  05-Feb-2015
3 rows selected.
-
Can I use a user-defined function in the SQL*Loader control file?
Hi Master,
Can I use a user-defined function in the SQL*Loader control file? Please advise!
If you can provide an example, it would be very kind of you.
Regards
AR
Here are a few examples that should give you clues: Sql loader
I seriously wonder why you would use the previous-century SQL*Loader instead of an external table:
ORACLE-BASE - external Tables: querying data from flat files in Oracle
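The external-table alternative can be sketched as below. This is a minimal outline in which the directory object, file name, and column list are hypothetical, chosen only to mirror the temp1234 example earlier in the thread:

```sql
-- Hedged sketch: reading a pipe-delimited flat file through an external table
-- instead of SQL*Loader. Directory, file name and columns are hypothetical.
CREATE OR REPLACE DIRECTORY load_dir AS '/home/oraprod';

CREATE TABLE temp1234_ext (
  id        NUMBER,
  asof_date VARCHAR2(15)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.txt')
)
REJECT LIMIT UNLIMITED;

-- A user-defined function is then just ordinary SQL against the external table:
-- INSERT INTO temp1234 (id, asof_date)
-- SELECT id, conv_date1(asof_date) FROM temp1234_ext;
```

Unlike a control-file expression, this keeps the conversion logic in one reusable function and lets you query, filter, and audit the raw rows before loading them.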
-
Error loading data using SQL loader
I get an error like "SQL*Loader-350: illegal combination of non-alphanumeric characters" while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
sqlldr userid=<username>/<password> control=data.ctl
The control file data.ctl is:
LOAD DATA
INFILE '/home/oraprod/data.txt'
APPEND INTO TABLE test
{
empid terminated by ",",
fname terminated by ",",
lname terminated by ",",
salary terminated by whitespace
}
The data.txt file is:
1, Kaushal, Hamad, 5000
2, Chetan, Hamad, 1000
Hopefully my question is clear.
Please get back with the answer to my query.
Regards
Replace "{" with "(" in your control file:
LOAD DATA
INFILE 'c:\data.txt'
APPEND INTO TABLE emp_t
(
empid terminated by ",",
fname terminated by ",",
lname terminated by ",",
salary terminated by whitespace
)
C:\>sqlldr user/pwd@database control=c.ctl
SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from emp_t;
     EMPID FNAME                LNAME                    SALARY
---------- -------------------- -------------------- ----------
         1 Kaushal              Hamad                      5000
         2 Chetan               Hamad                      1000
Best regards
Mohamed Houri
-
Date formats loading using SQL*Loader
Hello
I have the data to load into an Oracle DB using sqlldr below
333|789|6|||01-08-2013|2014-08-01|
334|789|6|||01-08-2013|2014-08-01|
335|789|6|||01-08-2013|2014-08-01|
It fails while loading on the date format fields. How can I fix this in the LOAD DATA control file?
Thank you
Sylvie
Works for me; you must validate your CTL file.
Here is an example:
Control file:
LOAD DATA
INFILE *
INTO TABLE test_table
REPLACE
FIELDS TERMINATED BY ','
(
col DATE 'yyyy-mm-dd'
)
BEGINDATA
2013-08-01
2014-09-01
2015-10-01
2016-11-01
2017-12-01
-- Create table
SQL> create table test_table (col date);
Table created.
-- SQL*Loader
C:\Users\43729434>sqlldr user/password@db_alias control=C:\fakepath\test_ctl.ctl
SQL*Loader: Release 11.2.0.1.0 - Production on Tue Oct 24 10:26:40 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 4
Commit point reached - logical record count 5
-- Check the data in the database
SQL> alter session set nls_date_format = 'DD-Mon-YYYY';
Session altered.
SQL> select col from test_table;
COL
-----------
01-Aug-2013
01-Sep-2014
01-Oct-2015
01-Nov-2016
01-Dec-2017
SQL>