Bulk insert in an external table
Hi, I get the errors ORA-29913 and ORA-01410 when trying to do a bulk insert from an external table:
INSERT
INTO CORE_TB_LOG
(SELECT 'MODEL', 'ARCH_BH_MODEL', ROWID, 'MODEL-D-000000001', -45, 'A', SYSDATE, 'R'
 FROM ARCH_BH_MODEL1
 WHERE length(MOD_XCODIGO) > 10)
INSERT
*
ERROR on line 1:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-01410: invalid ROWID
ARCH_BH_MODEL1 is the external table.
What's wrong?
Thank you.
Hello
There is no ROWID in external tables.
It makes sense: ROWID identifies where a row is stored in the database; it encodes the data file and the block number within that file.
External tables are not stored in the database. They exist independently of any data file in the database. The concept of an Oracle block does not apply to them.
Why would you copy the ROWID, even if you could?
Apart from ROWID and SYSDATE, you select only literals. Don't you want to select the actual data from the external table?
What is the big picture here? Post a small example of data (a CREATE TABLE statement and a small data file for the external table) and the desired results from that sample data (in other words, what core_tb_log must contain after the INSERT completes). Explain how you get those results from the data provided.
Check out the Forum FAQ: Re: 2. How can I ask a question on the forums?
Tags: Database
Similar Questions
-
Direct load in external tables
Can we use direct load in external tables? or set DIRECTLY in the external table script?
Thank you.
polasa wrote:
Can we use direct load in external tables? or set DIRECTLY in the external table script?
Thank you.
No. Why? Because an external table does not load data. It is more like a pointer plus an instruction for how to read a file.
The big difference between SQL*Loader and an external table is that SQL*Loader actually does two things:
(a) it reads a file from the file system
(b) it inserts those values into a table in the database.
An external table does only (a).
However, you can do a fast insert from the external table into an actual database table, which is what you are after:
insert /*+append */ into myRealTable (colA, ColB, colC) select * from myExternalTable
The APPEND hint, and perhaps also the PARALLEL hint, will get you close to a direct path insert.
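A hedged sketch of that, spelled out (the degree of parallelism is illustrative only; table and column names reuse the ones from the insert above):

```sql
-- parallel DML must be enabled per session before a parallel insert can run
ALTER SESSION ENABLE PARALLEL DML;

-- direct-path (APPEND) parallel insert from the external table
INSERT /*+ append parallel(myRealTable, 4) */ INTO myRealTable (colA, ColB, colC)
SELECT * FROM myExternalTable;

-- direct-path loaded data cannot be queried until the transaction commits
COMMIT;
```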
-
With the help of external Tables
Hello - I have a few questions regarding the use of external tables, as I have not worked with them before.
From what I read, it looks like they are mostly intended as a replacement for SQL*Loader, simply to load data. But I was wondering if they are also used for queries. I ask because we usually create temporary tables to load the data provided by the company, and we then merge this information into our main tables. The file we just received is more than 3 million lines, so I was wondering if it might be appropriate to use an external table, and also whether I can query the external table directly or will I still need my temporary table as well.
In addition, this is the syntax that I found to create an external table, and I was wondering if something escapes me.
I appreciate all the comments...
SQL> create table xtern_empl_rpt
  2  ( empl_id varchar2(3),
  3    last_name varchar2(50),
  4    first_name varchar2(50),
  5    ssn varchar2(9),
  6    email_addr varchar2(100),
  7    years_of_service number(2,0)
  8  )
  9  organization external
 10  ( default directory xtern_data_dir
 11    access parameters
 12    ( records delimited by newline
 13      fields terminated by ','
 14    )
 15    location ('employee_report.csv')
 16  );
Thank you!
Christine
Hello
Based on what I read
It would be good to know what you have read so far and where...
I can query this external table directly
Yes, once you have created it.
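On the temp-table question: since an external table can be queried like any other table, you can often skip the staging step and merge straight into the main table. A hedged sketch (main_employees and its column list are invented for illustration; xtern_empl_rpt is the table from the post above):

```sql
-- read the external table directly and upsert into the main table in one pass
MERGE INTO main_employees m
USING xtern_empl_rpt x
   ON (m.empl_id = x.empl_id)
WHEN MATCHED THEN UPDATE
   SET m.email_addr = x.email_addr
WHEN NOT MATCHED THEN INSERT (empl_id, last_name, first_name, email_addr)
   VALUES (x.empl_id, x.last_name, x.first_name, x.email_addr);
```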
An external table allows you to process a file as if it were a table.
Don't forget that you can only query it; you cannot update/delete/insert into an external table.
You can find many examples by searching this forum,
or http://asktom.oracle.com
(for example: http://asktom.oracle.com/pls/asktom/f?p=100:11:0:::P11_QUESTION_ID:6611962171229)
or http://www.oracle-base.com/articles/9i/ExternalTables9i.php
or the Oracle Documentation @ http://tahiti.oracle.com -
Using BULK COLLECT with FORALL to insert records into multiple tables
Hello
I have a PL/SQL record type with columns from several tables. When I use BULK COLLECT with a FORALL statement, I want to insert the records into multiple tables.
Please give me suggestions.
FORALL is designed to be used with a single DML statement, which may be a dynamic SQL statement. However, I do not know what advantage this will give you over iterating your list several times, once for each table - especially since there is an overhead with dynamic SQL.
Example 1 (dynamic SQL):
begin
...
forall i in vRecList.First..vRecList.Last
execute immediate '
begin
insert into Table1 (Col1, Col2, Col3) values (:1, :2, :3);
insert into Table2 (Col1, Col2, Col3) values (:1, :2, :3);
end;' using vRecList(i).Col1, vRecList(i).Col2, vRecList(i).Col3;
end;
Another approach that should work (but is untested) is using INSERT ALL with record-based inserts, but you need to test it on your version of Oracle; FORALL has changed between versions. In this case vRecList must be compatible with both the Table1%ROWTYPE and Table2%ROWTYPE types.
Example 2 (insert all):
begin
...
forall i in vRecList.First..vRecList.Last
insert all
into Table1 values vRecList(i)
into Table2 values vRecList(i)
select 1 from dual;
end; -
Performance issue Bulk Insert PL/SQL table type
Hi all
I am implementing a batch job to fill a table with a large number of records (>3,000,000). To reduce execution time, I use PL/SQL collections to temporarily store the data that must be written to the destination table. Once all rows have accumulated in the collection, I use a FORALL statement to bulk insert the records into the physical table.
Currently, I have two approaches for implementing the process described above (please see the code segments below). I need to choose which of the two performs best. I would really appreciate any expert comments about the runtime of the two approaches.
(I don't see much difference in time consumption in my test environment, which has a limited data set. This process is involved in building a complex set of product structures once deployed in the production environment.)
Approach I:
DECLARE
   TYPE test_type IS TABLE OF test_tab%ROWTYPE INDEX BY BINARY_INTEGER;
   test_type_ test_type;
   ins_idx_   NUMBER;
BEGIN
   ins_idx_ := 1;
   LOOP
      test_type_(ins_idx_).column1 := value1;
      test_type_(ins_idx_).column2 := value2;
      test_type_(ins_idx_).column3 := value3;
      ins_idx_ := ins_idx_ + 1;
   END LOOP;
   FORALL i_ IN 1..test_type_.COUNT
      INSERT INTO test_tab VALUES test_type_(i_);
END;
/
Approach II:
DECLARE
   TYPE Column1 IS TABLE OF test_tab.column1%TYPE INDEX BY BINARY_INTEGER;
   TYPE Column2 IS TABLE OF test_tab.column2%TYPE INDEX BY BINARY_INTEGER;
   TYPE Column3 IS TABLE OF test_tab.column3%TYPE INDEX BY BINARY_INTEGER;
   column1_ Column1;
   column2_ Column2;
   column3_ Column3;
   ins_idx_ NUMBER;
BEGIN
   ins_idx_ := 1;
   LOOP
      column1_(ins_idx_) := value1;
      column2_(ins_idx_) := value2;
      column3_(ins_idx_) := value3;
      ins_idx_ := ins_idx_ + 1;
   END LOOP;
   FORALL idx_ IN 1..column1_.COUNT
      INSERT INTO n_part_cost_bucket_tab (
         column1,
         column2,
         column3)
      VALUES (
         column1_(idx_),
         column2_(idx_),
         column3_(idx_));
END;
/
Best regards
Lorenzo
Published by: nipuna86 on January 3, 2013 22:23
nipuna86 wrote:
I am implementing a batch job to fill a table with a large number of records (>3,000,000). To reduce execution time, I use PL/SQL collections to temporarily store the data that must be written to the destination table. Once all rows have accumulated in the collection, I use a FORALL statement to bulk insert the records into the physical table.
Performance is about more than just reducing the execution time.
Just as stopping a car is about more than bringing the car to a halt in the fastest possible time.
If it were (simply stopping the car), then a reinforced-concrete brick wall would be the perfect way to stop motor vehicles of all types at all speeds.
The only problem (well, more than one actually) is that stopping a vehicle this way is bad for the car, the engine, the driver, the passengers and any other contents inside.
And pushing 3 million records into a PL/SQL 'table' (btw, that is the wrong terminology - there is no PL/SQL table structure) in order to run a SQL INSERT cursor 3 million times, to reduce execution time, is no different from using a brick wall to stop a car.
Both approaches are badly flawed. Both place an unreasonable demand on PGA memory. Both are still row-by-row (aka slow-by-slow) processing.
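For contrast, a bounded-memory version of the same idea (a sketch only - source_tab is a placeholder for wherever the 3 million rows come from; when no row-by-row logic is needed at all, a single INSERT ... SELECT is better still):

```sql
DECLARE
   CURSOR src_cur IS SELECT * FROM source_tab;   -- placeholder source
   TYPE row_tab IS TABLE OF source_tab%ROWTYPE INDEX BY BINARY_INTEGER;
   rows_ row_tab;
BEGIN
   OPEN src_cur;
   LOOP
      -- fetch at most 1000 rows at a time, keeping PGA usage bounded
      FETCH src_cur BULK COLLECT INTO rows_ LIMIT 1000;
      FORALL i IN 1..rows_.COUNT
         INSERT INTO test_tab VALUES rows_(i);
      EXIT WHEN src_cur%NOTFOUND;
   END LOOP;
   CLOSE src_cur;
   COMMIT;
END;
/
```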
-
Hello
Version 10g
I need to pass an array of 100 records to a PL/SQL procedure.
The array is built from table columns.
Then I need to bulk insert it into the table.
Question: is it possible to pass an array to a stored procedure?
Thank you
Yes, you can. Check the code below.
SQL> create type trec is object (a number, b number);
  2  /
Type created.
SQL> create type tlist is table of trec;
  2  /
Type created.
SQL> create table coltab(col1 number, col2 number, entry_date date);
Table created.
SQL> ed
Wrote file afiedt.buf
  1  create or replace procedure arraytest (p tlist) is
  2  begin
  3    for i in p.first..p.last
  4    loop
  5      insert into coltab values
  6      (p(i).a, p(i).b, sysdate);
  7    end loop;
  8* end;
SQL> /
Procedure created.
-----------Testing--------------------
SQL> declare
  2    l tlist := tlist(trec(1,2), trec(4,3));
  3  begin
  4    arraytest(l);
  5  end;
  6  /
PL/SQL procedure successfully completed.
SQL> select * from coltab;

      COL1       COL2 ENTRY_DAT
---------- ---------- ---------
         1          2 04-AUG-10
         4          3 04-AUG-10
If you do not want to process the rows one by one, try this. It works in a similar way.
create or replace procedure arraytest (p tlist) is
begin
  insert into coltab
  select t1.*, sysdate from table(p) t1;
end;
/
-
Bulk Insert for more than 50 columns from one table to another
Hi all
I have two tables, Source and Target, that are identical in structure, and I need to insert the Source table's rows into the Target table.
Source and Target each have about 60 columns, and the volume of data in the Source table is about 1 crore (10 million rows).
Previously, someone wrote a simple insert such as:
INSERT INTO TARGET_TBL (C1, C2, C3, C4 ... C60)
(SELECT C1, C2, C3, C4 ... C60 FROM SOURCE_TBL);
And it does not work for this much data. Could you please suggest the best approach to do this?
Thanks in advance.
Hello..
This will not work for that much of the data
Use hints such as APPEND... see the link below about direct loading.
Try the below:
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(6) */ INTO TARGET_TBL (C1, C2, C3, C4 ... C60)
(SELECT C1, C2, C3, C4 ... C60 FROM SOURCE_TBL);
-
Bulk insert fired through forall
Hi all
in the following statement:
declare
  cursor C is select id, PEOPLE_ID from CN_ITEMS;
  type T_A is table of cn_items%rowtype;
  V_A T_A;
begin
  open c;
  LOOP
    fetch c bulk collect into v_a;
    forall I in V_A.first..V_A.last
      insert into CN_TAXES(id, CREATION_DATE, TAX_PRICE, ITEM_ID, PEOPLE_ID)
      values (CN_TAX_S.NEXTVAL, sysdate, 10.5, v_a.id(i), v_a.people_id(i));
    exit when c%notfound;
  end loop;
end;
/
I get this message:
ORA-06550: line 13, column 2:
PLS-00394: wrong number of values in the INTO list of a FETCH statement
ORA-06550: line 13, column 2:
PL/SQL: SQL Statement ignored
ORA-06550: line 20, column 61:
PLS-00302: component 'ID' must be declared
ORA-06550: line 20, column 71:
PLS-00302: component 'PEOPLE_ID' must be declared
ORA-06550: line 20, column 71:
PLS-00302: component 'PEOPLE_ID' must be declared
ORA-06550: line 20, column 67:
PL/SQL: ORA-00904: "V_A"."PEOPLE_ID": invalid identifier
ORA-06550: line 19, column 5:
PL/SQL: SQL Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
*Cause: Usually a PL/SQL compilation error.
*Action:
Any ideas how to use FORALL in this situation? If I select all the values from one table and then use FORALL to insert them into another table with the same columns it is OK, but here I want to use just some of the values and insert them into 2 columns of another table.
Version: 11g
Thanks in advance,
Bahchevanov.>
Any ideas how to use FORALL in this situation? If I select all the values from one table and then use FORALL to insert them into another table with the same columns it is OK
>
You answered your own question. The solution is exactly what you just said above
>
select all the values from one table and then use FORALL to insert them into another table with the same columns
>
The first error you receive,
PLS-00394: wrong number of values in the INTO list of a FETCH statement
is because of this code:
cursor C is select id, PEOPLE_ID from CN_ITEMS;
type T_A is table of cn_items%rowtype;
Your T_A type is based on the CN_ITEMS table, but since you use a cursor you should base it on the cursor:
cursor C is select id, PEOPLE_ID from CN_ITEMS;
type T_A is table of C%rowtype;
You are also selecting ID but never using it. And you have an outer loop, but did not use a LIMIT clause. A plain BULK COLLECT collects all rows at once, so there is no purpose to the outer loop. But you should always use an explicit LIMIT clause rather than an implied one.
So if you must use BULK COLLECT (even though your specific case should use pure SQL), then to solve your problem change the code to this:
cursor C is select CN_TAX_S.NEXTVAL, sysdate, 10.5, PEOPLE_ID from CN_ITEMS;
type T_A is table of C%rowtype;
In other words, just build a row that corresponds to your target table. Then you can use FORALL to insert the entire row at a time (note the LIMIT clause):
LOOP
  fetch c bulk collect into v_a LIMIT 1000;
  forall I in V_A.first..V_A.last
    insert into CN_TAXES(id, CREATION_DATE, TAX_PRICE, ITEM_ID, PEOPLE_ID)
    values V_A(I);
  exit when c%notfound;
end loop;
-
Load the XML file into Oracle external Table
I am loading data from an XML file into an intermediate Oracle table using external tables. Let's say below is my XML file:
<Header>
  <A_CNT>10</A_CNT>
  <E_CNT>10</E_CNT>
  <AF_CNT>10</AF_CNT>
</Header>
<Student>
  <Student-details>
    <Student_info>
      <Single_Info>
        <ID>18</ID>
        <City>New York</City>
        <Country>United States</Country>
        <Name_lst>
          <Student_name>
            <Name>Samuel</Name>
            <Last_name>Paul</Last_name>
            <DOB>19871208</DOB>
            <RecordStatus>Current</RecordStatus>
          </Student_name>
          <Student_name>
            <Name>Samuel</Name>
            <Last_name>Paul</Last_name>
            <DOB>19871208</DOB>
            <TerminationDt>20050812</TerminationDt>
            <RecordStatus>History</RecordStatus>
          </Student_name>
        </Name_lst>
        <Personal_Info>
          <Type>Male</Type>
          <Age>27</Age>
        </Personal_Info>
      </Single_Info>
    </Student_info>
    <Student-register>
      <Class-A>
        <Info>
          <Detail>
            <Student-ID>18</Student-ID>
            <Major>EE</Major>
            <Course-Grades>
              <Course>VLSI</Course>
              <Grade>3.0</Grade>
            </Course-Grades>
            <Course-Grades>
              <Course>Nanotechnology</Course>
              <Grade>4.0</Grade>
            </Course-Grades>
          </Detail>
          <Detail>
            <Student-ID>18</Student-ID>
            <Major>CE</Major>
          </Detail>
        </Info>
      </Class-A>
    </Student-register>
  </Student-details>
</Student>
I load this XML data file into a single table using an external table. Could someone please help me with the coding?
Thank you
Reva
Could you please help me with how to insert my XML content into that.
Same as before, try a plain old INSERT:
insert into xml_pecos
values (
  XMLType(bfilename('XML_DIR', 'test.xml'), nls_charset_id('AL32UTF8'))
);
But you'll probably hit the same limitation as with the binary XMLType table.
In this case, you can use FTP to load the file as a resource in the XML DB repository.
If the XML schema has been registered with the hierarchy enabled then the file will be automatically inserted into the table.
Could you post the exact statement that you used to register the schema?
In the meantime, you can also read this article I wrote a few years ago; it covers the XML DB features that may be useful here, including details on how to load the file via FTP:
https://odieweblog.WordPress.com/2011/11/23/Oracle-XML-DB-a-practical-example/
And of course the documentation: http://docs.oracle.com/cd/E11882_01/appdev.112/e23094/xdb06stt.htm#ADXDB4672
-
Loading external Table with quotes
I have a file whose fields are delimited by TAB ~ TAB.
Example as below:
QM ~ CD ~ Exzm ~ BMW
DM ~ BD ~ Exzm ~ BMW
CREATE TABLE test
(
  Col_1 VARCHAR2 (100),
  Col_2 VARCHAR2 (100),
  Col_3 VARCHAR2 (100),
  Col_4 VARCHAR2 (100)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY Test_Report
  ACCESS PARAMETERS
  ( records delimited by '\n'
    CHARACTERSET 'UTF8'
    fields terminated by '\t~\t'
    missing field values are null
  )
  LOCATION ('test.asc')
)
REJECT LIMIT UNLIMITED;
OUTPUT:
----------------
Data is loaded into the DB, but the col_4 data comes with quotation marks as below
col_4
-------
"BMW"
"BMW"
Note: Col_1 - Col_3 data arrives correctly.
2807064 wrote:
A finding on my side.
I found that the col_4 values, after inserting into the DB, have a "carriage return character" (CHR(13)) at the end of each value, as shown below when I copy-paste the value into Notepad++
Example:
----------
"BMW
"
But if I look at the file, I just see BMW.
My question is: in this case shouldn't the external table load fail? Why is it loading the data into the DB?
Did this file begin life on Windows and then get transferred to *nix to serve the external table? If so, that explains a lot. The standard Windows record delimiter is x'0D0A' (chr(13)||chr(10)); on *nix it is just x'0A' (chr(10)). When the loader process scans for the record delimiter, it is looking only for the x'0A', so the x'0D' gets included in the data.
Two solutions -
1. Make sure that the data file is transferred so that the record delimiters are converted. That is supposed to happen with ascii ftp mode, but just this week I saw several in-house examples where it did not.
2. Amend your external table definition to look for the actual record delimiter instead of the operating system default: RECORDS DELIMITED BY X'0D0A'.
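Applied to the table DDL above, option 2 would mean changing just the access parameters (a sketch, keeping the other settings from the original definition):

```sql
ACCESS PARAMETERS
( RECORDS DELIMITED BY X'0D0A'   -- consume the chr(13) as part of the delimiter
  CHARACTERSET 'UTF8'
  fields terminated by '\t~\t'
  missing field values are null
)
```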
-
How to select the csv data stored in a BLOB column as if it were an external table?
Hi all
(Happy to be back after a while! )
Currently I am working on a site where users should be able to load csv data (comma is the separator) from their client machines (APEX 3.2 application) into the Oracle 11.2.0.4.0 EE database.
My problem is:
I can't use an external table (for the first time in my life) so I'm a little clueless what to do, as the csv data is stored by the APEX application in a BLOB column, and I'm looking for an elegant way (SQL or at most PL/SQL) to insert the data into the destination table (running validations via a MERGE would be the most effective way to do the job).
I found a few examples, but I think they are too heavyweight and there could be a more elegant way in Oracle DB 11.2.
Simple unit test:
drop table src purge;
drop table dst purge;
create table src
( myblob blob
);
create table dst
( num number
, str varchar2 (6)
);
insert into src
select utl_raw.cast_to_raw('1;AAAAAA;'||chr(10)||
                           '2;BBBBBB;')
from dual;
Desired output (of course) based on the data in table SRC:
SQL> select * from dst;

       NUM STR
---------- ------
         1 AAAAAA
         2 BBBBBB
Does anybody know a solution for this?
Any ideas/pointers/links/examples are welcome!
/* WARNING: I was 'off' for about 3 months, so the Oracle part of my brain has become a bit rusty, and I feel it should not be as complicated as the examples I've found so far */
Haha, wonder about regexp is like the blind leading the blind!
However, it's my mistake: I forgot to set the starting-position parameter (so 1, 2, 3, ... was in fact the starting position, not the nth occurrence. Duh!)
So, it should actually be:
select x.*
     , regexp_substr(x.col1, '[^;]+', 1, 1)
     , regexp_substr(x.col1, '[^;]+', 1, 2)
     , regexp_substr(x.col1, '[^;]+', 1, 3)
     , regexp_substr(x.col1, '[^;]+', 1, 4)
     , regexp_substr(x.col1, '[^;]+', 1, 5)
     , regexp_substr(x.col1, '[^;]+', 1, 6)
  from src
     , xmltable('/a/b'
         passing xmltype('<a><b>'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>')||'</b></a>')
         columns col1 varchar2(100) path '.') x;
Note: that's assuming that all the "columns" passed in the string won't be lame.
If one of them might be null, then:
select x.*
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 1)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 2)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 3)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 4)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 5)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 6)
  from src
     , xmltable('/a/b'
         passing xmltype(replace('<a><b>;'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>;')||'</b></a>', ';;', '; ;'))
         columns col1 varchar2(100) path '.') x;
-
I have a sample data file (we will get a 'real' one at a later date, and updates after that) which includes a header, a footer, and 5 types of records that have different columns and lengths, indicated by the first two characters. The different record types are not all grouped together; rather, some (in particular, two of the types in this example) are interleaved. I was working on a SQL*Loader control file when it was suggested that I use external tables. I know very little about either, so I would ask which is best to use.
Scott@orcl12c > host type test.dat
header line
AB, 123, efg
CD, hij, 456
Scott@orcl12c > host type test.ctl
options (skip=1)
load data
into table ab truncate when table_name = 'ab'
fields terminated by ',' trailing nullcols
(table_name filler position (1), col1, col2)
into table cd append when table_name = 'cd'
fields terminated by ',' trailing nullcols
(table_name filler position (1), col3, col4)
Scott@orcl12c > create table ab
  2  (col1 number,
  3  col2 varchar2 (8))
  4  /
Table created.
Scott@orcl12c > insert into ab values (1, 'old data')
  2  /
1 row created.
Scott@orcl12c > create table cd
  2  (col3 varchar2 (8),
  3  col4 number)
  4  /
Table created.
Scott@orcl12c > insert into cd values ('old data', 1)
  2  /
1 row created.
Scott@orcl12c > commit
  2  /
Commit complete.
Scott@orcl12c > host sqlldr scott/tiger control = test.ctl data = test.dat log = test.log
SQL * Loader: release 12.1.0.1.0 - Production on Thu Mar 27 13:11:47 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table AB:
1 row loaded successfully.
Table CD:
1 row loaded successfully.
Check the log file:
test.log
for more information about the load.
Scott@orcl12c > select * from ab
  2  /
      COL1 COL2
---------- --------
       123 efg
1 row selected.
Scott@orcl12c > select * from cd
  2  /
COL3           COL4
-------- ----------
old data          1
hij             456
2 rows selected.
-
Create an external table to import csv to dynamic action
Hello
I'm trying to import data from a csv file into a table. The csv is located on the server, and I am trying to create an external table to copy its contents into the master data table.
I can't even save the code without getting this error (the statement itself works fine in SQL Developer):
PLS-00103: Encountered the symbol "CREATE" when expecting one of the following: ( begin case declare exit for goto if loop mod null pragma raise return select update while with <an identifier> <a double-quoted delimited-identifier> <a bind variable> << continue close current delete fetch lock insert open rollback savepoint set sql execute commit forall merge pipe purge
CREATE TABLE "DATA_CSV"
( "C1"  VARCHAR2(255),
  "C2"  VARCHAR2(255),
  "C3"  VARCHAR2(255),
  "C4"  VARCHAR2(255),
  "C5"  VARCHAR2(255),
  "C6"  VARCHAR2(255),
  "C7"  VARCHAR2(255),
  "C8"  VARCHAR2(255),
  "C9"  VARCHAR2(255),
  "C10" VARCHAR2(255),
  "C11" VARCHAR2(255),
  "C12" VARCHAR2(255),
  "C13" VARCHAR2(255),
  "C14" VARCHAR2(255),
  "C15" VARCHAR2(255),
  "C16" VARCHAR2(255),
  "C17" VARCHAR2(255),
  "C18" VARCHAR2(255),
  "C19" VARCHAR2(255),
  "C20" VARCHAR2(255)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY FTP_FOLDER
  ACCESS PARAMETERS
  ( records delimited BY newline
    fields terminated BY ';'
    optionally enclosed BY '"'
    lrtrim
    missing field VALUES are NULL
  )
  LOCATION ('foo.csv')
);
The server I work on runs Apex 4.2.2.
Thanks in advance for your help
Skirnir
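Not the thread's posted resolution, but for what it's worth: PLS-00103 at "CREATE" usually means the statement is being compiled as a PL/SQL block (which a dynamic action's source is), and DDL cannot appear directly inside PL/SQL. A hedged sketch of the usual workaround is to wrap the DDL in EXECUTE IMMEDIATE (columns abbreviated here):

```sql
begin
  -- DDL is not allowed directly in a PL/SQL block; wrap it in dynamic SQL
  execute immediate q'[
    CREATE TABLE "DATA_CSV"
    ( "C1" VARCHAR2(255),
      "C2" VARCHAR2(255)
      -- ... remaining columns and the ORGANIZATION EXTERNAL clause ...
    )
  ]';
end;
/
```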
-
LTRIM in an external Table definition.
Hello
We are using Oracle 11.2.
Because there is little documentation on the web about general external table definitions, I would like to put this question out there.
How can I use an LTRIM function in an external table definition?
Hello
Here is an example that works on my system:
CREATE TABLE table_x
( num_id    NUMBER (2)
, string_id VARCHAR2 (2)
, txt       VARCHAR2 (12)
)
ORGANIZATION EXTERNAL
( TYPE oracle_loader
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS
    ( num_id    CHAR (1)
    , string_id CHAR (2)
    , txt       CHAR (12) LTRIM
    )
  )
  LOCATION ('table_x.txt')
);
Where the data file looks like this:
1 A abcdefgh
1 A ijklmnop
1 A pqrstuv
1 B abcdefgh
1 B ijklmn
2 A abcdefgh
3 C hi
3 C world
With this example data, the LENGTH (string_id) is always 2 (in other words, each string_id begins with a space), but LENGTH (txt) varies from 2 to 8.
I hope that answers your question.
If not, post your CREATE TABLE statement (simplified as much as possible, but still showing the same problem) and a few lines of the corresponding data file. -
Logging what data was inserted from the external table
Hello
We load a large amount of data using external tables and insert it into tables partitioned by day or by hour.
Other parts of the system want to know which partitions have been updated and when, so before we load all the data, we do a
insert into new_data(date_of_event) (select distinct trunc(date_of_event) from external_table)
That is, when inserting into day partitions, we get all the distinct days in that specific file.
These files, however, can be very large, and after this first scan we do the full insert into the correct data table; that is, we read the entire file again.
So my question is: is there another (better) way to retrieve the distinct days/hours? A way to do the actual insert into the data table and then get information about which partitions Oracle inserted data into from some dictionary view, etc.?
I haven't tried creating a proper temp table, inserting the data into it, extracting the dates, and then inserting into the actual data table. But then, which is better: creating and dropping a table, or reading the file twice?
Thanks in advance
You don't say how you are querying and inserting, but if you have control over that, a few possibilities come to mind.
1. Use INSERT ALL to make a multi-table insert. Insert the "date_of_event" into a new "log" table and insert the data into the table that you insert into now.
Then you can do the select distinct trunc(date_of_event) on the new log table. Just truncate the log table before each load, or add a column that records which file was imported.
2. Create a VIEW that includes all the columns of your current query plus another copy of "date_of_event", and split the event date off into another table.
Option #1 is the simplest.
See ALL INSERT in SQL Reference: http://docs.oracle.com/cd/B28359_01/server.111/b28286/statements_9014.htm
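A sketch of option #1 (the payload column and data_tab are invented for illustration; date_of_event, new_data and external_table are the names from the post):

```sql
-- single pass over the external table: load the data and log the days at once
INSERT ALL
  INTO data_tab (date_of_event, payload)
    VALUES (date_of_event, payload)
  INTO new_data (date_of_event)
    VALUES (TRUNC(date_of_event))
SELECT date_of_event, payload
FROM   external_table;

-- the touched day partitions are then simply:
-- SELECT DISTINCT date_of_event FROM new_data;
```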