Reg - Sql Loader
Hello
I am using SQL*Loader to load data into a table.
For example:
my file has
CREATED_DATE ID
FEBRUARY 21, 2012 121
FEBRUARY 22, 2012 122
ABDELLAH 123
MARCH 16, 2012 124
and my table data type is
DESC Test_Table
CREATED_DATE DATE
ID NUMBER
In this case, inserting the 3rd row will raise an exception, so the 4th row will not be inserted either. My requirement is: if the 3rd row throws an exception, I want to skip the 3rd row and insert the 4th row.
This results in the table
CREATED_DATE ID
FEBRUARY 21, 2012 121
FEBRUARY 22, 2012 122
MARCH 16, 2012 124
Is it possible to handle this exception in SQL*Loader? Please help.
For this you must create a function to check the date format; then you can load the data based on its return value. Try the below.
- Function to check the format:
CREATE OR REPLACE FUNCTION test_fun (p_indate VARCHAR2)
RETURN NUMBER
AS
v_indtchk DATE;
BEGIN
v_indtchk := TO_DATE(p_indate, 'DD-MON-YYYY');
RETURN 0;
EXCEPTION
WHEN OTHERS THEN
RETURN 1;
END;
- Control file:
LOAD DATA
INFILE 'F:\ANNFOL\sample2.csv'
TRUNCATE
INTO TABLE test_ldr
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
CREATED_DATE "DECODE(test_fun(:CREATED_DATE), 0, TO_DATE(:CREATED_DATE, 'DD-MON-YYYY'), NULL)",
ID
)
Tags: Database
Similar Questions
-
Reg: SQL Loader Problem -
Hi Experts,
I am trying to load data from a flat file in an Oracle table, but face some problems.
Concern 1:
I have a directory where there are about 20 similar files named as 'rb_1', 'rb_2', 'rb_3', etc...
All of this data should be loaded into only one table, 'X'.
Is it possible that only 1 CTL file will make a loop and load all the files in X?
Concern 2:
The field delimiters are the Ctrl-X (CAN: cancel) and Ctrl-M (CR: carriage return) characters.
The line delimiters are the Ctrl-Z (SUB: substitute) and Ctrl-T (DC4: device control 4) characters.
Is there a way I can specify this in my CTL file?
(I've only worked with field delimiters like the comma ',', not special characters like these.)
Please let me know if any additional information is required.
Help much appreciated.
Thank you
-Nordine
Agree with Hoek; you'd better use external tables, since you can specify multiple filenames at once. Otherwise you will need a script that provides the name of the input file when calling SQL*Loader, and have that script loop over each file.
Regarding the ctrl characters in your delimiters, the control file supports hexadecimal versions of them, for example
fields terminated by X'09'
where 09 is the hex of the ASCII value of the character (in this case a tab character).
Ctrl-M would be X'0D', etc.
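The looping approach above can be sketched in the shell. This is a minimal sketch, assuming a single control file `X.ctl` and `scott/tiger` credentials (both placeholders, not from the original post); it only builds and echoes each sqlldr command so the loop logic can be checked, and you would replace the `echo` with the real invocation.

```shell
# Build one sqlldr invocation per data file; X.ctl and scott/tiger are
# hypothetical placeholders. The loop echoes commands instead of running them.
build_sqlldr_cmd() {
  # $1 = data file name
  echo "sqlldr userid=scott/tiger control=X.ctl data=$1 log=$1.log"
}

for f in rb_1 rb_2 rb_3; do
  build_sqlldr_cmd "$f"
done
```

In a real script you would glob `rb_*` in the source directory instead of listing the files by hand.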
-
SQL Loader, problem with POSITION & INTEGER EXTERNAL
Hi gurus of the Oracle.
I have problem with position and external.
I have a data file with 1 million records.
The field delimiter is | and fields are optionally enclosed by double quotes.
Some lines are not loaded due to data errors, i.e. the data contains embedded double quotes.
Now we decided to use POSITION & INTEGER EXTERNAL; I am unable to write the control file.
any help would be much appreciated
The table name is person_memo, 4 columns
ID_PERSON VARCHAR2(9 BYTE), TX_MEMO VARCHAR2(1000 BYTE), ID_USER VARCHAR2(20 BYTE), TM_STAMP TIMESTAMP(6). My control file is
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID_PERSON POSITION(1) "TRIM(:ID_PERSON)"
, TX_MEMO POSITION(10) CHAR(1000) "TRIM(:TX_MEMO)"
, ID_USER POSITION(1012) "TRIM(:ID_USER)"
, TM_STAMP POSITION(1031) EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
specimen of the data file:
"04725813"|"aka "Little Will""|"095TDEAN"|"2013-02-21-11.13.44.632000"
"05599076"|"FIRST NAME - ADDED A 'T' AS ON THE REG MAP"|"016DDEAL"|"2014-04-11-10.06.35.598000"
Thanks and regards,
REDA
In your control file, EXTERNAL (26) must be INTEGER EXTERNAL (26).
Unfortunately, this forum destroys the spacing, so I can't say whether or not your data is positional. If it is positional, you can use positions, but you need start and end positions. The positions that you posted have nothing to do with the data you posted. If you use positions, you can eliminate the delimiters and the leading and trailing quotes by means of those positions.
If your data is not positional, and you have quotes inside your quoted data but no pipe delimiters inside your data, then you can only use the delimiters and trim the leading and trailing quotes from the data.
I have demonstrated the two methods below, using test1.ctl for the positional method and test2.ctl for the delimited method.
Scott@orcl12c > host type test.dat
"04725813"|"aka "Little Will""|"095TDEAN"|"2013-02-21-11.13.44.632000"
"05599076"|"FIRST NAME - ADDED A 'T' AS ON THE REG MAP"|"016DDEAL"|"2014-04-11-10.06.35.598000"
Scott@orcl12c > host type test1.ctl
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( ID_PERSON POSITION(02:10)
, TX_MEMO POSITION(14:59)
, ID_USER POSITION(63:82)
, TM_STAMP POSITION(85:110) INTEGER EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
Scott@orcl12c > host type test2.ctl
LOAD DATA
APPEND
INTO TABLE PERSON_MEMO
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( ID_PERSON "TRIM('\"' FROM :ID_PERSON)"
, TX_MEMO CHAR(1000) "TRIM('\"' FROM :TX_MEMO)"
, ID_USER "TRIM('\"' FROM :ID_USER)"
, TM_STAMP INTEGER EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
)
Scott@orcl12c > create table person_memo
2 (ID_PERSON VARCHAR2 (9 bytes)
3, TX_MEMO VARCHAR2 (1000 bytes)
4, ID_USER VARCHAR2 (20 bytes)
5, TM_STAMP TIMESTAMP (6))
6 /
Table created.
Scott@orcl12c > host sqlldr scott/tiger control = test1.ctl data = test.dat log = test1.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table PERSON_MEMO:
2 rows loaded successfully.
Check the log file:
test1.log
for more information about the load.
Scott@orcl12c > select * from person_memo
2 /
ID_PERSON
---------
TX_MEMO
--------------------------------------------------------------------------------
ID_USER
--------------------
TM_STAMP
---------------------------------------------------------------------------
04725813
aka "Little Will"
095TDEAN
21-FEB-13 11.13.44.632000 AM
05599076
FIRST NAME - ADDED A 'T' AS ON THE REG MAP
016DDEAL
11-APR-14 10.06.35.598000 AM
2 rows selected.
Scott@orcl12c > truncate table person_memo
2 /
Table truncated.
Scott@orcl12c > host sqlldr scott/tiger control = test2.ctl data = test.dat log = test2.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table PERSON_MEMO:
2 rows loaded successfully.
Check the log file:
test2.log
for more information about the load.
Scott@orcl12c > select * from person_memo
2 /
ID_PERSON
---------
TX_MEMO
--------------------------------------------------------------------------------
ID_USER
--------------------
TM_STAMP
---------------------------------------------------------------------------
04725813
aka "Little Will"
095TDEAN
21-FEB-13 11.13.44.632000 AM
05599076
FIRST NAME - ADDED A 'T' AS ON THE REG MAP
016DDEAL
11-APR-14 10.06.35.598000 AM
2 rows selected.
-
Import Data Wizard creates the SQL Loader ctl file with columns out of order
SQL Developer version 4.1.1.19. Connected to Oracle XE to test this.
I was trying to understand what the problem was with the data in my import file when I finally realized that the Data Import Wizard did not care how I mapped the columns at all. The SQL*Loader ctl file generated by the wizard expects the columns of data in my file to match the order in which they appear in the table definition, not how they were mapped in the wizard. Manually editing the ctl file is a workaround. Has anyone else seen this?
I see that this is a bug.
-
Hi all
I am loading a flat file into a custom staging table using SQL*Loader, but I'm getting the error
Record 2720: Rejected - error on the table XXPLL_SALES_FCST_BGA_TMPL, column MKTG_ASP.
ORA-01722: invalid number
XXPLL_SALES_FCST_BGA_TMPL - custom staging table
MKTG_ASP - contains decimal values
I created this column with the NUMBER data type.
I've searched a lot, and it was proposed to use 'DECIMAL EXTERNAL', but it still does not work. Any suggestions will be helpful.
Thank you
Sharath Chander M
Hi Barbara Boehmer,
Now I changed the ctl file to 'Mktg_ASP DECIMAL EXTERNAL TERMINATED BY WHITESPACE', and now all the data gets loaded. Is this the right way?
I will also try it as you say. Also, if the data file has any line with a null value for this column, will it cause a problem for the change that I made in the ctl file?
Thank you
Sharath Chander M
-
SQL Loader - ignore the lines with "rejected - all null columns."
Hello
Please see the attached log file. Also attached are the table creation script, the data file, and the bad and discard files after execution.
sqlldr client on Windows -
SQL * Loader: release 11.2.0.1.0 - Production
The CTL file has two INTO TABLE clauses due to the nature of the data. The data presented here is a subset of the real-world data file. We are only interested in the lines with the word "Index" in the first column.
The problem we are facing is: whichever INTO TABLE clause appears first in the CTL file, the lines matching its WHEN clause are inserted and the rest get discarded.
1. create table statement: create table dummy_load (name varchar2(30), rate number, effdate date);
2. the data file to simulate this issue contains the 10 lines below. Save this as name.dat. The intention is to load all of these rows via the CTL file. The actual file would have additional lines before and after these lines that can be discarded.
H15T1Y Index|2|01/19/2016|
H15T2Y Index|2|01/19/2016|
H15T3Y Index|2|01/19/2016|
H15T5Y Index|2|01/19/2016|
H15T7Y Index|2|01/19/2016|
H15T10Y Index|2|01/19/2016|
CPDR9AAC Index|2|01/15/2016|
MOODCAVG Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
3. the CTL file - name.ctl
LOAD DATA
APPEND
INTO TABLE dummy_load
WHEN (09:13) = 'Index'
TRAILING NULLCOLS
(
name TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
INTO TABLE dummy_load
WHEN (08:12) = 'Index'
TRAILING NULLCOLS
(
name TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
Batch file to invoke SQL*Loader:
C:\Oracle\product\11.2.0\client\bin\sqlldr USERID = myid/[email protected] CONTROL=C:\temp\t\name.ctl BAD=C:\temp\t\name_bad.dat LOG=C:\temp\t\name_log.dat DISCARD=C:\temp\t\name_disc.dat DATA=C:\temp\t\name.dat
Once this is run, the following text appears in the log file (excerpt):
Table DUMMY_LOAD, loaded when 09:13 = 0X496e646578 (character 'Index')
Insert the option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Position Len term Encl. Datatype name
------------------------------ ---------- ----- ---- ---- ---------------------
NAME FIRST * | CHARACTER
RATE NEXT * | CHARACTER
EFFDATE NEXT * | CHARACTER
SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"
Table DUMMY_LOAD, loaded when 08:12 = 0X496e646578 (character 'Index')
Insert the option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Position Len term Encl. Datatype name
------------------------------ ---------- ----- ---- ---- ---------------------
NAME NEXT * | CHARACTER
RATE NEXT * | CHARACTER
EFFDATE NEXT * | CHARACTER
SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"
Record 1: Discarded - all columns null.
Record 2: Discarded - all columns null.
Record 3: Discarded - all columns null.
Record 4: Discarded - all columns null.
Record 5: Discarded - all columns null.
Record 7: Discarded - failed all WHEN clauses.
Record 8: Discarded - failed all WHEN clauses.
Record 9: Discarded - failed all WHEN clauses.
Record 10: Discarded - failed all WHEN clauses.
Table DUMMY_LOAD:
1 row loaded successfully.
0 rows not loaded due to data errors.
9 rows not loaded because all WHEN clauses were failed.
0 rows not loaded because all fields were null.
Table DUMMY_LOAD:
0 rows successfully loaded.
0 rows not loaded due to data errors.
5 rows not loaded because all WHEN clauses were failed.
5 rows not loaded because all fields were null.
The bad file is empty. The discard file has the following
H15T1Y Index|2|01/19/2016|
H15T2Y Index|2|01/19/2016|
H15T3Y Index|2|01/19/2016|
H15T5Y Index|2|01/19/2016|
H15T7Y Index|2|01/19/2016|
CPDR9AAC Index|2|01/15/2016|
MOODCAVG Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
H15TXXX Index|2|01/15/2016|
Based on my understanding of the instructions in the CTL file, ideally the first 6 rows would have been inserted into the table. Instead the table contains only the 6th line:
NAME RATE EFFDATE
H15T10Y Index 2 19-JAN-16
If the INTO TABLE clauses were swapped in the CTL file, then the first 5 rows would be inserted and the rest would be in the discard file, and the 6th line would get a "Discarded - all columns null." in the log file.
Could someone please take a look and advise? My apologies that the files cannot be attached.
Unless you tell it otherwise, SQL*Loader assumes that each INTO TABLE clause after the first starts at the position where the previous one left off. If you want to start at the beginning of the line every time, then you need to reset the position using POSITION(1) with the first column, as shown below. Using POSITION(1) in the first INTO TABLE clause is optional.
LOAD DATA
APPEND
INTO TABLE dummy_load
WHEN (09:13) = 'Index'
TRAILING NULLCOLS
(
name POSITION(1) TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
INTO TABLE dummy_load
WHEN (08:12) = 'Index'
TRAILING NULLCOLS
(
name POSITION(1) TERMINATED BY '|',
rate TERMINATED BY '|',
effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
)
-
Condition of licence for SQL Loader?
Hello
I have an Oracle client installation that allows me to use SQL*Plus to connect to our Oracle 11g databases. SQL*Loader is optional in the installation. I would like to make use of this utility. Can anyone confirm whether there is an additional license requirement to install SQL*Loader?
TIA
Oracle allows free use of the tools that are usually used to interact with the database: SQL*Plus, SQL Developer, and SQL*Loader are all free. It is using an optionally licensed database feature that makes the difference. For example, Oracle Partitioning is an extra-cost option. You can use SQL*Loader to load data into an Oracle database without it costing you anything more. But if you use SQL*Loader to load into a partitioned table, you will have to pay for Partitioning, though not for SQL*Loader.
Cheers,
Brian -
Hi people,
Please find the table with the records below
CREATE TABLE EMPLOYEE_LOAD (ID NUMBER, NAME VARCHAR2 (100), UNIVERSITY_NAME VARCHAR2 (100));
File to load: employee.txt
ID|NAME|UNIVERSITY_NAME
1|JAMES|UNIVERSITY OF MIT
2|LISA|UNIVERSITY OF CAMBRIDGE
3|MINDY|"IT UNIVERSITY
4|'ALLEN J|'IT UNIVERSITY
5|'MIKE'ALLEN|"IT UNIVERSITY
I've written the SQL*Loader control file as below
OPTIONS(SKIP=1,DIRECT=TRUE)
LOAD DATA
INFILE '/mnt/employee.txt'
BADFILE '/mnt/employee.bad'
DISCARDFILE '/mnt/employee.dsc'
APPEND INTO TABLE EMPLOYEE_LOAD
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY "'" TRAILING NULLCOLS
(
ID,
NAME,
UNIVERSITY_NAME
)
sqlldr userid=<username>/<password>@database control=/mnt/employee.ctl log=/mnt/employee.log
Only the first 2 records are loaded (excluding the header), and I am facing the errors below:
(1) no terminator found after TERMINATED and ENCLOSED field
(2) second enclosure string does not exist
The 3 records below end up in the bad file; how do I force SQL*Loader to load these records too?
3|MINDY|"IT UNIVERSITY
4|'ALLEN J|'IT UNIVERSITY
5|'MIKE'ALLEN|"IT UNIVERSITY
Try removing the OPTIONALLY ENCLOSED BY "'" clause. It is not correct, according to your sample data.
These quotes seem to be interspersed randomly.
-
Using SQL*Loader from a PL/SQL program
Is it possible to develop a PL/SQL program that calls the SQL*Loader (sqlldr userid=...) command?
We want to package a process that loads data from a text file into a table, and we are researching whether it is possible to use SQL*Loader for the data loading step.
You can launch sqlldr like any other operating system command. There are several methods of invoking an OS command, such as:
1. PL/SQL external procedure: CHMOD FROM A PLSQL?
Johan's blog: how to call kernel32.dll from PL/SQL.
2. Java stored procedure: Johan's blog: using JAVA in PL/SQL - PART I, list files with timestamp; Johan's blog: using JAVA in PL/SQL - PART II, obtaining operating system information
3. external table PREPROCESSOR clause: no need for Java to call a windows command in oracle
4. using DBMS_SCHEDULER, with
job_type => 'executable' and
job_action => '/bin/sh' (maybe)
-
Question to load data using sql loader in staging table, and then in the main tables!
Hello
I'm trying to load data into our main database table using SQL LOADER. data will be provided in separate pipes csv files.
I have develop a shell script to load the data and it works fine except one thing.
Here are the details of a data to re-create the problem.
Structures of the staging tables into which data will be loaded using SQL*Loader:
create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));
create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));
create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));
DATA in the csv file-
for stg_cmts_data-
cmts_map_03092015_1.csv
WNLB-CMTS-01-1|10.15.0.1
WNLB-CMTS-02-2|10.15.16.1
WNLB-CMTS-03-3|10.15.48.1
WNLB-CMTS-04-4|10.15.80.1
WNLB-CMTS-05-5|10.15.96.1
for stg_dhcp_data-
dhcp_map_03092015_1.csv
DHCP-1-1-1|10.25.23.10,25.26.14.01
DHCP-1-1-2|56.25.111.25,100.25.2.01
DHCP-1-1-3|25.255.3.01,89.20.147.258
DHCP-1-1-4|10.25.26.36,200.32.58.69
DHCP-1-1-5|80.25.47.369,60.258.14.10
for stg_link_data
cmts_dhcp_link_map_0309151623_1.csv
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3|WNLB-CMTS-01-1
DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13|WNLB-CMTS-02-2
Now, after loading these data into the staging tables, I have to populate the main database tables:
create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));
create table link (link_nm varchar2 (50));
The SQL scripts that I created to load the data are as follows.
spool load_cmts.log
Set serveroutput on
DECLARE
CURSOR c_stg_cmts IS SELECT *
FROM stg_cmts_data;
TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
l_stg_cmts t_stg_cmts;
l_cmts_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_cmts;
FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
LOOP
SELECT COUNT(1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
IF l_cmts_cnt < 1 THEN
INSERT
INTO SUBNTWK
(
subntwk_nm
)
VALUES
(
l_stg_cmts(i).cmts_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_cmts.COUNT = 0;
END LOOP;
commit;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for dhcp
spool load_dhcp.log
Set serveroutput on
DECLARE
CURSOR c_stg_dhcp IS SELECT *
FROM stg_dhcp_data;
TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
l_stg_dhcp t_stg_dhcp;
l_dhcp_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_dhcp;
FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
LOOP
SELECT COUNT(1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
IF l_dhcp_cnt < 1 THEN
INSERT
INTO SUBNTWK
(
subntwk_nm
)
VALUES
(
l_stg_dhcp(i).dhcp_token
);
DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
ELSE
DBMS_OUTPUT.put_line('token is already present');
END IF;
EXIT WHEN l_stg_dhcp.COUNT = 0;
END LOOP;
commit;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for link -.
spool load_link.log
Set serveroutput on
DECLARE
l_cmts_1 VARCHAR2(4000 CHAR);
l_cmts_add VARCHAR2(200 CHAR);
l_dhcp_cnt NUMBER;
l_cmts_cnt NUMBER;
l_link_cnt NUMBER;
l_add_link_nm VARCHAR2(200 CHAR);
BEGIN
FOR r IN (
SELECT dhcp_token, cmts_to_add || ',' cmts_add
FROM stg_link_data
)
LOOP
l_cmts_1 := r.cmts_add;
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
SELECT COUNT(1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = r.dhcp_token;
IF l_dhcp_cnt = 0 THEN
DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
ELSE
WHILE l_cmts_add IS NOT NULL
LOOP
l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
SELECT COUNT(1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = TRIM(l_cmts_add);
SELECT COUNT(1)
INTO l_link_cnt
FROM link
WHERE link_nm = l_add_link_nm;
IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
INSERT INTO link (link_nm)
VALUES (l_add_link_nm);
DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
ELSIF l_link_cnt > 0 THEN
DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
ELSIF l_cmts_cnt = 0 THEN
DBMS_OUTPUT.put_line('no CMTS found for device to create the link: ' || l_cmts_add);
END IF;
l_cmts_1 := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
END LOOP;
END IF;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
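The SUBSTR/INSTR walk in load_link.sql above can be sketched in the shell as a quick sanity check of the expected link names. This is a minimal sketch, not part of the original scripts; the `build_links` helper is hypothetical and the sample values are taken from the post.

```shell
# Shell analogue of the SUBSTR/INSTR token walk: split the comma-separated
# CMTS list and print one <dhcp>_TO_<cmts> link name per token.
build_links() {
  # $1 = dhcp token, $2 = comma-separated cmts list
  printf '%s\n' "$2" | tr ',' '\n' | while read -r cmts; do
    printf '%s_TO_%s\n' "$1" "$cmts"
  done
}

build_links DHCP-1-1-1 WNLB-CMTS-01-1,WNLB-CMTS-02-2
```

Running it against each stg_link_data row shows which link names the PL/SQL loop should produce, which helps confirm that every CMTS token, including the last one, is being seen.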
control files -
LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE STG_CMTS_DATA
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cmts_token "RTRIM(LTRIM(:cmts_token))",
cmts_ip "RTRIM(LTRIM(:cmts_ip))")
for dhcp -
LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE STG_DHCP_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
dhcp_ip "RTRIM(LTRIM(:dhcp_ip))")
for link -
LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE STG_LINK_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")
SHELL SCRIPT -
if [ ! -d ./log ]
then
mkdir log
fi
if [ ! -d ./done ]
then
mkdir done
fi
if [ ! -d ./bad ]
then
mkdir bad
fi
nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_link.sql
mv *.log ./log
The problem I encounter is while loading data into the link table: I check whether the DHCP is present in the subntwk table, otherwise I log an error. Likewise, if the CMTS is not present, I skip creating the link and log another error.
Note that multiple CMTS can be associated with a single DHCP.
So, while creating links, on the last iteration of the loop, where I take the last comma-separated CMTS from table stg_link_data, the log says the CMTS was not found.
for example
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
Here I am supposed to link dhcp-1-1-1 with wnlb-CMTS-01-1 and wnlb-CMTS-02-2.
All of this data is present in the subntwk table, but the log still says wnlb-CMTS-02-2 could not be FOUND, even though it has already been loaded into the subntwk table.
The same thing happens with the last CMTS of every stg_link_data row (I think you get what I'm trying to explain).
But when I run the SQL scripts in SQL Developer separately, all valid links are inserted into the link table.
It should create 9 rows in the link table, whereas now it creates only 5 rows.
I use COMMIT in my script too, but it does not help.
Run these scripts on your machine and let me know if you also get the same behavior I get.
And please give me a solution; I have tried many things since yesterday, but it's always the same.
This is the link table log:
link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
no CMTS found for device to create the link: wnlb-CMTS-02-2
link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
no CMTS found for device to create the link: wnlb-CMTS-05-5
no CMTS found for device to create the link: wnlb-CMTS-01-1
no CMTS found for device to create the link: wnlb-CMTS-05-8
no CMTS found for device to create the link: wnlb-CMTS-05-6
no CMTS found for device to create the link: wnlb-CMTS-05-0
no CMTS found for device to create the link: wnlb-CMTS-03-3
link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
no CMTS found for device to create the link: wnlb-CMTS-05-7
device not found: wnlb-dhcp-1-13
IF YOU NEED MORE INFORMATION, PLEASE LET ME KNOW
Thank you
I realized later in the night that during the load into the staging tables on the UNIX machine, each line ended with a carriage return. That is why the last CMTS is never found. I ran a DOS-to-UNIX conversion and it started to work perfectly.
It was a dos2unix error!
Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in (PLSQL, SQL).
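The dos2unix fix described above can be sketched directly in the shell: a DOS-format file leaves a trailing carriage return on the last field of every line, so a value like 'WNLB-CMTS-02-2' followed by \r never matches the clean value stored in subntwk. Where dos2unix is not installed, `tr` does the same job; the `strip_cr` wrapper is just an illustrative name.

```shell
# Strip DOS carriage returns so the last field of each line compares equal
# to the clean value in the database.
strip_cr() {
  tr -d '\r'
}

printf 'DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2\r\n' | strip_cr
```

Running the data files through this filter (or dos2unix) before sqlldr avoids the spurious "CMTS not found" messages entirely.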
-
SQL * Loader does not import data
Hi all -
I have a very basic package which should load data from a tab-delimited file. My problem is that when I run my package, no data is loaded. Although the correct number of records is created (based on the trigger on the table), all the records contain no data.
OPTIONS (skip = 1, errors = 10, rows = 10000, direct = true)
LOAD DATA
INFILE "C:\ECOMMERCE\VFT\Marin\inbound\DSGSiteCatalystPassbackKeywords.csv" "str '\n'"
BADFILE "C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_KEYWORDS.bad"
DISCARDFILE "C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_KEYWORDS.dsc"
INTO TABLE "ETL_STAGE"."MARIN_KEYWORD"
APPEND
EVALUATE CHECK_CONSTRAINTS
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(
MARIN_KEYWORD_ID,
KEYWORD,
BUSINESS_DATE,
EDITOR,
ACCOUNT,
CAMPAIGN,
AD_GROUP,
TYPE CHAR (100000),
DESTINATION_URL,
UNIQUE_ID,
PUB_ID,
PRINT,
CLICKS,
PUB_COST,
ATTRIBUTED_CONVERSIONS_CONV,
CLICK_PATH_CONV,
LAST_CLICK_CONV,
EMAIL_SIGNUPS_CONV,
SCORECARD_SIGNUPS_CONV,
STORE_LOCATOR_PAGE_CONV
)
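Since the problem here revolves around whether the delimiter really matches the file, a quick shell check can confirm that X'09' in the control file is in fact the tab character. This is an illustrative one-liner, not from the original post; `od -tx1` dumps bytes as hex.

```shell
# Print the hex code of a literal tab; X'09' in the ctl file should match it.
tab_hex=$(printf '\t' | od -An -tx1 | tr -d ' \n')
echo "tab is X'$tab_hex'"
```

Running the same `od` dump on the first line of the data file shows which byte actually separates the fields there, so a mismatch (e.g. spaces instead of tabs) becomes obvious.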
My table creation script is:
CREATE TABLE ETL_STAGE.MARIN_KEYWORD
(
MARIN_KEYWORD_ID VARCHAR2 (1000 BYTE),
KEYWORD VARCHAR2 (1000 BYTE),
BUSINESS_DATE VARCHAR2 (200 BYTE),
EDITOR VARCHAR2 (1000 BYTE),
ACCOUNT VARCHAR2 (1000 BYTE),
CAMPAIGN VARCHAR2 (1000 BYTE),
AD_GROUP VARCHAR2 (1000 BYTE),
TYPE VARCHAR2 (1000 BYTE),
DESTINATION_URL VARCHAR2 (1000 BYTE),
UNIQUE_ID VARCHAR2 (1000 BYTE),
PUB_ID VARCHAR2 (1000 BYTE),
PRINT VARCHAR2 (1000 BYTE),
CLICKS VARCHAR2 (1000 BYTE),
PUB_COST VARCHAR2 (1000 BYTE),
ATTRIBUTED_CONVERSIONS_CONV VARCHAR2 (1000 BYTE),
CLICK_PATH_CONV VARCHAR2 (1000 BYTE),
LAST_CLICK_CONV VARCHAR2 (1000 BYTE),
EMAIL_SIGNUPS_CONV VARCHAR2 (1000 BYTE),
SCORECARD_SIGNUPS_CONV VARCHAR2 (1000 BYTE),
STORE_LOCATOR_PAGE_CONV VARCHAR2 (1000 BYTE),
IMEX_LOG_REFERENCE_ID VARCHAR2 (1000 BYTE),
DATE_ADDED DATE DEFAULT SYSDATE NOT NULL,
ADDED_BY VARCHAR2 (50 BYTE) DEFAULT USER NOT NULL,
DATE_LAST_MODIFIED DATE DEFAULT SYSDATE NOT NULL,
MODIFIED_BY VARCHAR2 (50 BYTE) DEFAULT USER NOT NULL,
RECORD_STATUS VARCHAR2 (1 BYTE) DEFAULT 'A' NOT NULL
)
TABLESPACE ECOM_DATA
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
NEXT 1M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
The log displays the following text:
SQL * Loader: release 11.2.0.2.0 - Production on Thu Jun 25 14:31:35 2015
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Control file: C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\SQLLoaderScripts\MARIN_DSG_CREATIVES.ctl
Data file: C:\ECOMMERCE\VFT\Marin\inbound\DSGSiteCatalystPassbackCreatives.csv
Bad File: C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_CREATIVES.bad
Discard File: C:\ECOMMERCE\MFT\NxtGen_Catalog_Int\ErrorFiles\MARIN_DSG_CREATIVES.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 10
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table "ETL_STAGE"."MARIN_CREATIVE", loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Position Len term Encl. Datatype name
------------------------------ ---------- ----- ---- ---- ---------------------
CREATIVE_ID FIRST * WHT O(") CHARACTER
Maximum field length is 100000
HEADLINE NEXT * WHT O(") CHARACTER
BUSINESS_DATE NEXT * WHT O(") CHARACTER
SQL string for column : "TRIM(:BUSINESS_DATE)"
DESCRIPTION_LINE_1 NEXT * WHT O(") CHARACTER
DESCRIPTION_LINE_2 NEXT * WHT O(") CHARACTER
DISPLAY_URL NEXT * WHT O(") CHARACTER
DESTINATION_URL NEXT * WHT O(") CHARACTER
Maximum field length is 100000
EDITOR NEXT * WHT O(") CHARACTER
CAMPAIGN NEXT * WHT O(") CHARACTER
AD_GROUP NEXT * WHT O(") CHARACTER
UNIQUE_ID NEXT * WHT O(") CHARACTER
PUB_ID NEXT * WHT O(") CHARACTER
PRINT NEXT * WHT O(") CHARACTER
SQL string for column : "replace(:PRINT, ',', '')"
CLICKS NEXT * WHT O(") CHARACTER
SQL string for column : "replace(:CLICKS, ',', '')"
ATTRIBUTED_CONVERSIONS_CONV NEXT * WHT O(") CHARACTER
CLICK_PATH_CONV NEXT * WHT O(") CHARACTER
LAST_CLICK_CONV NEXT * WHT O(") CHARACTER
EMAIL_SIGNUPS_CONV NEXT * WHT O(") CHARACTER
SCORECARD_SIGNUPS_CONV NEXT * WHT O(") CHARACTER
STORE_LOCATOR_PAGE_HITS_CONV NEXT * WHT O(") CHARACTER
IMEX_LOG_REFERENCE_ID NEXT * WHT O(") CHARACTER
Why is WHT showing when I am using tabs?
Also, why is it not loading any of the data in the files?
NLS_CHARACTERSET WE8ISO8859P1 -
SQL Loader - null is not recognized
Hi, I have a very strange issue that I would really appreciate help with.
We have the following SQL*Loader ctl file that works very well.
OPTIONS (DIRECT = TRUE, PARALLEL = FALSE, ERRORS = 0, BINDSIZE = 50000, LINES = 10000, READ)
SIZE = 65536)
LOAD DATA
CHARACTERSET WE8MSWIN1252
INFILE 'file1' "STR X'0a'"
READBUFFERS 4
INTO TABLE test.tabl1
TRUNCATE
REENABLE DISABLED_CONSTRAINTS
FIELDS
(
"DT_TM_ADDED" POSITION(01:25) CHAR "DECODE(SUBSTR(:DT_TM_ADDED,13,2), null,
SUBSTR(:DT_TM_ADDED,1,11) || ' 00:00:00', :DT_TM_ADDED)", etc.
We must now change our ctl files so the fields are completed by |
It now looks like this.
FIELDS TERMINATED BY '|'
(
"DT_TM_ADDED" CHAR "DECODE(SUBSTR(:DT_TM_ADDED,13,2), null,
SUBSTR(:DT_TM_ADDED,1,11) || ' 00:00:00', TRIM(:DT_TM_ADDED))", etc.
This works great except when SUBSTR(:DT_TM_ADDED,13,2) has a null value. It's not picking up that it is null. If I change the statement to use a ' ' instead of null, i.e. DECODE(SUBSTR(:DT_TM_ADDED,13,2), ' ', substr..., it works fine.
Am I missing something really obvious here?
Any help would be really appreciated.
Morgan Library has great demos on SQL*Loader.
Demo 6:
"Use of the NULLIF and BLANKS keywords to handle zero-length strings loaded into numeric columns. Also note the use of the direct path load in the control file (DIRECT=TRUE)."
Link: Oracle 12c SQL*Loader
Hope it helps...
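Applied to the case above, NULLIF ... BLANKS forces an all-blank field to NULL before the column's SQL string runs, so the DECODE null branch is taken. This is only a sketch, under the assumption that the field arrives entirely blank when the time part is missing:

```
-- Hypothetical control file fragment: map an all-blank DT_TM_ADDED to
-- NULL first, so DECODE(..., null, ...) behaves the same as it did
-- with the positional (POSITION) version of the field.
LOAD DATA
INFILE 'file1'
INTO TABLE test.tabl1
FIELDS TERMINATED BY '|'
(
"DT_TM_ADDED" CHAR NULLIF ("DT_TM_ADDED" = BLANKS)
  "DECODE(SUBSTR(:DT_TM_ADDED,13,2), null,
          SUBSTR(:DT_TM_ADDED,1,11) || ' 00:00:00',
          TRIM(:DT_TM_ADDED))"
)
```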
-
Hello
I have a requirement to load a file and exclude lines based on a column value, where ID NOT LIKE '%No Data%'
OPTIONS (SKIP=3)
LOAD DATA
INFILE '/home/test.csv'
INTO TABLE test
TRUNCATE
WHEN ID NOT LIKE '%No Data%'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID,
STARTED,
TIME,
DATE_ADDED SYSDATE
)
Some examples of ID:
AVDFGHKL:No Data
THGEIUJD:No Data
The value will always start at position 10 (after the colon).
I tried the following, but no luck:
WHEN ((1:1) = '1') AND ((10:2) != 'No')
Yes, the external table can be a better option (the file must be on the database server). Given your data (i.e. the colon (:) always comes in 9th position), this will work.
[oracle@localhost saubhik]$ cat my_data.txt
AVDFGHKL:No Data,09/11/2009
JFDJFUJK,09/11/2009
AGHFJDKK:No Data,09/11/2009
TRUTIRUT,09/11/2009
[oracle@localhost saubhik]$ cat test_emp_ctl.ctl
LOAD DATA
INFILE 'my_data.txt'
INTO TABLE test
TRUNCATE
when (9:9) <> ':'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id,
 start_date DATE(19) "DD-MM-YYYY")
[oracle@localhost saubhik]$ sqlldr scott/tiger control=test_emp_ctl.ctl

SQL*Loader: Release 11.2.0.1.0 - Production on Tue Jun 2 16:20:22 2015

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Commit point reached - logical record count 4
[oracle@localhost saubhik]$
Verification:
SQL> SELECT * FROM test;

ID                             START_DAT
------------------------------ ---------
JFDJFUJK                       09-NOV-09
TRUTIRUT                       09-NOV-09

SQL>
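A sketch of the external-table alternative mentioned above, with the same positional filter expressed as a LOAD WHEN clause. The directory object MY_DIR and the table name are assumptions for illustration; the directory must point at the server folder holding my_data.txt:

```
-- Hypothetical external table over the same data file. Rows whose 9th
-- character is a colon (the "No Data" ids) are skipped at load time.
CREATE TABLE test_ext (
  id         VARCHAR2(30),
  start_date DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    LOAD WHEN ((9:9) != ':')
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (id         CHAR(30),
     start_date CHAR(19) DATE_FORMAT DATE MASK "DD-MM-YYYY")
  )
  LOCATION ('my_data.txt')
)
REJECT LIMIT UNLIMITED;
```

Once created, the table can be queried directly (SELECT * FROM test_ext) or used to insert into the real table, which avoids re-running sqlldr for each file.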
-
SQL loader is loading only one record
I use SQL*Loader to load a CSV file into the database:
sqlldr con CONTROL='test.ctl' LOG='TEST.log' BAD='bad.bad' DATA='test.DAT'
However, I always get only one record.
CTL file
OPTIONS (ERRORS=50)
LOAD DATA
APPEND
CONTINUEIF NEXT(1:1) = '#'
INTO TABLE test
FIELDS TERMINATED BY ' '
OPTIONALLY ENCLOSED BY '"' AND '"'
TRAILING NULLCOLS
("Etest_ID" CHAR(27),
"test_IND" CHAR(8),
"test_SUB_IND" CHAR(12),
"test_GIND1" CHAR(9),
"test_GIND2" CHAR(9),
"test_STATUS" CHAR(11),
"test_STATUS" CHAR(11),
"test_AMOUNT1" CHAR(14),
"test_AMOUNT2" CHAR(14),
"test_AMOUNT3" CHAR(14),
"test_AMOUNT4" CHAR(14),
"test_GIND21" CHAR(9),
"tet_GIND3" CHAR(9),
"test_STATUS1" CHAR(11),
"test_STATUS2" CHAR(11),
"test_AMOUNT1" CHAR(14),
"test_AMOUNT2" CHAR(14),
"test_FLAG" CHAR(9),
"test_USED_FLAG" CHAR(15),
"test_FLAG" CHAR(9),
"TtestL_TRF_AMOUNT" CHAR(16),
"testF_DATE" CHAR(8),
"test_STATUS" CHAR(14))
data file:
"AB00431MT00377_110915_00000" "PP" " " "Y" "Y" "TRAN" "FAILURE" "00000008667.15" "00000000000.00" "00000000000.00" "00000000000.00" "Y" "Y" "ZERO" "ZERO" "00000000000.00" "00000000000.00" "Y" "Y" " " "00000008667.15" "2111014" "S"
"AB00431MT00377_110915_00000" "PP" " " "Y" " " "ZERO" " " "00000000000.00" "00000000000.00" "00000000000.00" "00000000000.00" "Y" " " "ZERO" " " "00000000000.00" "00000000000.00" "Y" " " " " "00000000000.00" "0000000" "R"
Help, please.
I tried generating the CTL file in various ways, but none of these options worked. It is urgent, please help.
-
Dynamic INFILE name in SQL*Loader
Hi all
I use oracle 10g on windows server.
I have a log file on a remote server. Each line in the log file is as below:
LogIN Tue 02/09/2014 10:10:48 ss18 N419FS40 1
I create a table for that as below;
Create table free_pc (log_in varchar2(10),log_day varchar2(10),log_date date,log_time date,log_user varchar2(30),log_Lab varchar2(30),log_pc varchar2(30), log_status char(1));
I have two problems;
1. In N419FS40, N419 is the name of the lab, while FS40 is the name of the computer in this lab. I want only the first four characters to be inserted into the Log_lab column and the rest of the characters into the Log_pc column. My control file is below:
LOAD DATA
INFILE '\\remote_location\login02_09_14.txt'
APPEND
INTO TABLE free_pc
FIELDS TERMINATED BY WHITESPACE
(log_in,
log_day,
log_date DATE "DD/MM/YYYY",
log_time DATE "HH24:MI:SS",
log_user,
log_lab,
log_pc,
log_status
)
How to do this?
2. The log file is generated on the server every day with a different name; it concatenates the current date with the word LOGIN, for example login02_09_14.txt, login03_09_14.txt, login04_09_14.txt, etc. In my INFILE clause, how can I put the name so that it automatically picks up the file from the remote location?
Thank you.
Kind regards.
1. You can use the SUBSTR function to separate the two values. Put the fields in the control file in the same order as in the data file, with all the columns that are formulated from data in other fields at the end.
2. There are different ways to do this. The file name can be in the control file or on the SQL*Loader command line. You can create either one using SQL/SQL*Plus or operating system commands. I tend to prefer changing just the SQL*Loader command line, instead of the entire control file. I prefer to do this in a SQL file instead of a Windows batch file or *ix shell script or something, so that it is operating system independent.
In addition, it is best to store your day, date, and time all in one column; then you can use TO_CHAR to display it however you want, in one column, or two, or three.
Please see the example below, which demonstrates the above.
Scott@ORCL > -- data file you provided, with the name changed to include today's date:
Scott@ORCL > HOST TYPE 'c:\remote_location\login27_04_15.txt'
LogIN Tue 02/09/2014 10:10:48 ss18 N419FS40 1
Scott@ORCL > -- control file with no infile, fields in the order of the data file, using FILLER for the unused columns,
Scott@ORCL > -- with formulated columns, using BOUNDFILLER for the columns needed by the formulated columns:
Scott@ORCL > HOST TYPE test.ctl
LOAD DATA
APPEND
INTO TABLE free_pc
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(log_in,
log_day FILLER,
log_date BOUNDFILLER,
log_time BOUNDFILLER,
log_user,
log_lab "SUBSTR(:log_lab, 1, 4)",
log_status,
log_date_time "TO_DATE(:log_date || :log_time, 'DD/MM/YYYYHH24:MI:SS')",
log_pc "SUBSTR(:log_lab, 5)")
Scott@ORCL > -- sql script that gets today's date, concatenates the file name with the directory path,
Scott@ORCL > -- and loads the data by running SQL*Loader from SQL*Plus, using the HOST command:
Scott@ORCL > HOST TYPE test.sql
COLUMN data_file NEW_VALUE data_file
SELECT 'c:\remote_location\login' || TO_CHAR (SYSDATE, 'DD_MM_YY') || '.txt' AS data_file FROM DUAL
/
HOST SQLLDR scott/tiger CONTROL=test.ctl DATA='&data_file' LOG=test.log
CLEAR COLUMN
Scott@ORCL > -- table which stores the date and time in one column, so the day of the week can be extracted:
Scott@ORCL > create table free_pc
  2  (log_in varchar2(10)
  3  , log_date_time date
  4  , log_user varchar2(30)
  5  , log_lab varchar2(30)
  6  , log_pc varchar2(30)
  7  , log_status char(1))
  8  /

Table created.
Scott@ORCL > -- load the data by running the sql script that runs SQL*Loader using the control file:
Scott@ORCL > START test.sql
Scott@ORCL > COLUMN data_file NEW_VALUE data_file
Scott@ORCL > SELECT 'c:\remote_location\login' || TO_CHAR (SYSDATE, 'DD_MM_YY') || '.txt' AS data_file FROM DUAL
  2  /

DATA_FILE
------------------------------------
c:\remote_location\login27_04_15.txt

1 row selected.

Scott@ORCL > HOST SQLLDR scott/tiger CONTROL=test.ctl DATA='&data_file' LOG=test.log

SQL*Loader: Release 11.2.0.1.0 - Production on Mon Apr 27 12:00:53 2015

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Commit point reached - logical record count 1
Scott@ORCL > CLEAR COLUMN
Scott@ORCL > -- results, with the date_time field displayed as day, date, and time:
Scott@ORCL > COLUMN log_in FORMAT A6
Scott@ORCL > COLUMN log_day FORMAT A7
Scott@ORCL > COLUMN log_user FORMAT A8
Scott@ORCL > COLUMN log_lab FORMAT A8
Scott@ORCL > COLUMN log_pc FORMAT A6
Scott@ORCL > COLUMN log_status FORMAT A10
Scott@ORCL > SELECT log_in,
  2  TO_CHAR (log_date_time, 'Dy DD-Mon-YYYY HH24:MI:SS') "DAY DATE TIME",
  3  log_user, log_lab, log_pc, log_status
  4  FROM free_pc
  5  /

LOG_IN DAY DATE TIME                              LOG_USER LOG_LAB  LOG_PC LOG_STATUS
------ ------------------------------------------ -------- -------- ------ ----------
LogIN  Tue 02-Sep-2014 10:10:48                   ss18     N419     FS40   1

1 row selected.