VARCHAR2(4000 BYTE) field of external table only storing 255 bytes
Hi all, I was wondering if someone can tell me what I'm missing here.
I have an external table with one column defined as VARCHAR2(4000 BYTE). The file contains a line with 255 characters (all the digit 2, for simplicity). When I query the table, everything is fine. If I add one more character (256 characters) it fails. I'm sure it's something stupidly simple, but what am I missing? Shouldn't it load fine up to 4000 characters without question?
Thank you
Dave
I ran your testcase, thanks for that.
Make sure you read the SQL and PL/SQL FAQ as well (the first sticky thread on this forum); it explains how to post formatted code and many other things.
In any case, the .log file gave me:
LOG file opened at 07/18/11 20:05:33
Field Definitions for table DAVEP2
Record format DELIMITED, delimited by 0A
Data in file has same endianness as the platform
Rows with all null fields are accepted
Fields in Data Source:
MY_STRING CHAR (255)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
So, what happens if you create the table as follows:
CREATE TABLE davep2 (
my_string VARCHAR2(4000 BYTE) NULL
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY FILE_TEST
ACCESS PARAMETERS
( RECORDS DELIMITED BY 0x'0A' BADFILE FILE_TEST:'davep2.bad'
LOGFILE FILE_TEST:'davep2.log'
FIELDS TERMINATED BY ',' optionally enclosed by '"' and '"'
missing field values are null
(
my_string char(4000)
) )
LOCATION (FILE_TEST:'DaveP.csv')) REJECT LIMIT 0 NOPARALLEL
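The difference is the explicit field list: with no field list, ORACLE_LOADER sizes every delimited field at its CHAR(255) default, which is exactly what the .log file above shows and why the 256-character line was rejected. A quick check after recreating the table as suggested (a sketch, assuming the poster's table and data file):

```sql
-- With my_string declared as char(4000) in the access parameters,
-- a line longer than 255 characters now loads instead of being rejected:
SELECT LENGTH(my_string) AS len
FROM   davep2;
```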
Tags: Database
Similar Questions
-
How can a missing field be tolerated in external tables
I'm new to Oracle. It's 10gR2.
I created an external table using the following syntax:
create table ext_table
(a number(5),
b number(5),
c varchar2(1000))
organization external
(type ORACLE_LOADER
default directory FLAISTD
access parameters (records delimited by newline
fields terminated by '#'
(a char(5),
b char(5),
c char(1000)))
location ('file.csv')
);
My problem is the following. I have a .XLS file that I save as a .CSV file. Sometimes a line of the .XLS file is missing the last column, so in my .CSV file I can have something like this:
123#123#xxx
456#456
and when I try to execute a select * from ext_table, I get an error because it expects the missing field.
How can I 'say' in the create table above something to allow the last field to be missing?
Thanks in advance!
Solomon Yakobson says:
Use TRAILING NULLCOLS... Oops, this is an external table, not SQL*Loader. So it should be MISSING FIELD VALUES ARE NULL:
create table ext_table
(a number(5), b number(5), c varchar2(1000))
organization external
(type ORACLE_LOADER
default directory TEMP
access parameters
(records delimited by newline
fields terminated by '#'
missing field values are null
(a char(5), b char(5), c char(1000)))
location ('file.csv')
);

Table created.

SQL> select *
  2  from ext_table
  3  /

         A          B C
---------- ---------- ----------
       123        123 xxx
       456        456
SQL>
SY.
-
Loading only the common columns of several flat files through an external table
Hi, I have 50 flat files, and each of them has some common columns (fields); I need to load only the fields that are common into an external table. Is there any chance of doing it with a single external table? Or do I need to load all the flat files into separate tables and then load them into just one table with a UNION. / If the record layout of the files differs, I think your only option would be to define different external tables and create a view that combines them all.
HTH
Srini -
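Srini's suggestion can be sketched as follows, assuming just two of the fifty files, with different layouts but common columns a and b (all names here are invented for illustration):

```sql
-- One external table per file layout; only the layout differs:
CREATE TABLE ext_file1 (a NUMBER, b VARCHAR2(100), x VARCHAR2(100))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('file1.csv'));

CREATE TABLE ext_file2 (y VARCHAR2(100), a NUMBER, b VARCHAR2(100))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('file2.csv'));

-- A view that exposes only the common columns:
CREATE OR REPLACE VIEW common_cols AS
SELECT a, b FROM ext_file1
UNION ALL
SELECT a, b FROM ext_file2;
```

Loading then becomes a single `INSERT INTO target SELECT a, b FROM common_cols;`.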
Oracle external table via the Tables API
Hello everyone
I have been experimenting with the Oracle NoSQL database recently and have become a bit stuck with the new Tables API. I have so far successfully created external tables on data entered using 'vanilla' KV storage techniques and an avro schema (using generic and specific bindings), but the new Tables API seems to be another matter entirely.
My question arises in the Formatter interface, which takes a KeyValueVersion and a KVStore. I can't translate a KeyValueVersion created with the Tables API into a primary key for retrieval (since I don't know what the key generated by the API actually looks like!) or map it onto an avro schema. The problem seems to be that the Tables API writes data in some format that cannot easily be translated into a string or an integer (the external table rejects lines due to unknown characters if I try to retrieve all the values in the database to see what it looks like), and trying to map it to an avro map results in the error message 'the data are not as AVRO'.
Scenario:
I created a very simple table in the KV administration tool, which consists of a personId column (integer) that is the PK, plus firstName, lastName, emailAddr (all strings), and entered 5 rows successfully. What I want to do is create an external table called person that returns just those 5 rows (and any new ones I add to the table, of course). This means that I first have to understand what the parentKey value in the .dat file must be set to, and how to take this key and turn it into a primary key for retrieving the row.
Faithful old Google could not find information on how to do this (there was only one thread similar to this one, with the answer "we'll add it soon"), so I hope that someone here has managed to do it!
Thank you
Richard
Hi Richard
I understand the issue you are facing. In the current version (12.1.3.0.9) the external tables feature only works with K/V records, not with the Table model; however, in the next version (which will be GA very soon) we will support integration of external tables with Table model data as well. Please make sure that you have signed up for the release announcements so that we can inform you of the release. I apologize for the inconvenience this has caused you.
Best
Anuj
Sign up for NoSQL database announcements, so we can notify you of future releases and other NoSQL database product updates
-
External table with tab field delimiter
Using Oracle 11g Release 2.
Here is my external table create statement:
CREATE TABLE global.ext_a_attrib_cmt
( tag   VARCHAR2(255)
, from$ VARCHAR2(255)
, to$   VARCHAR2(255)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY EXT_DATA_DIR
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    SKIP 1
    BADFILE EXT_BAD_DIR:'a_attrib_cmt.bad'
    LOGFILE EXT_LOG_DIR:'a_attrib_cmt.log'
    -- FIELDS TERMINATED BY 0X'09' -- TAB delimited
    FIELDS TERMINATED BY '\t'
    OPTIONALLY ENCLOSED BY "'"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
  )
  LOCATION ('a_attrib_cmt.txt')
)
REJECT LIMIT UNLIMITED NOMONITORING
/
Here is the text file, a_attrib_cmt.txt:
tag	from	to
FrontSpringType_id	Coil	w/FRONT COIL SPRINGS
FrontSpringType_id	Leaf	w/FRONT LEAF SPRINGS
Aspiration_id	Naturally aspirated	w/o TURBO
Aspiration_id	Turbocharged	w/TURBO
Aspiration_id	Supercharged	w/SUPERCHARGER
SteeringType_id	Rack	w/RACK AND PINION STEERING
SteeringType_id	Gear	w/GEAR STEERING
FuelDeliveryType_id	CARB	w/o FUEL INJ
FuelDeliveryType_id	FI	w/FUEL INJ
BedLength_id		?" BED
BodyNumDoors_id		? DR
BrakeSystem_id		w/? BRAKES
FrontBrakeType_id		w/FRONT ? BRAKES
PUBLIC has privileges to write to the directory EXT_DATA_DIR.
Here is the error I get:
Globall@ORA1 > select count(*) from ext_a_attrib_cmt;
select count(*) from ext_a_attrib_cmt
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found 'minussign': expecting one of: "badfile, bigendian, characterset, column, data, delimited, discardfile,
disable_directory_link_check, fields, fixed, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, preprocessor, readsize, string, skip,
territory, variable"
KUP-01007: at line 5 column 8
Just get rid of the comment line. You cannot have comments in a create external table statement's access parameters. SY.
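For reference, the same statement with the comment lines removed and the tab delimiter expressed as a hex string (a sketch based on the poster's original DDL; the access parameters are parsed by the loader, not by SQL, which is why `--` comments are rejected there):

```sql
CREATE TABLE global.ext_a_attrib_cmt
( tag VARCHAR2(255), from$ VARCHAR2(255), to$ VARCHAR2(255) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY EXT_DATA_DIR
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    SKIP 1
    BADFILE EXT_BAD_DIR:'a_attrib_cmt.bad'
    LOGFILE EXT_LOG_DIR:'a_attrib_cmt.log'
    FIELDS TERMINATED BY 0X'09'
    OPTIONALLY ENCLOSED BY "'"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
  )
  LOCATION ('a_attrib_cmt.txt')
)
REJECT LIMIT UNLIMITED NOMONITORING;
```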
-
How to load a tab-delimited file whose fields contain commas into an external table
I am trying to create an external table in Oracle 11gR2. The script is below. It fails if a field contains commas.
for example, the following data will not be loaded.
ERT 123, poipoipoi, yutio 567
Please suggest how to solve this problem. Thank you!
CREATE TABLE external_interaction
(
TAX_ID NUMBER(8),
pubmed_id_list varchar2(36),
interaction_id_type varchar2(36)
)
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
DEFAULT DIRECTORY ET_NCBI_DIR
ACCESS PARAMETERS
(records delimited by 0x'0A'
skip 1
BADFILE et_ncbi_log_dir:'interactions.bad'
LOGFILE et_ncbi_log_dir:'interactions.log'
fields terminated by '\t'
missing field values are null
REJECT ROWS WITH ALL NULL FIELDS
(
TAX_ID, pubmed_id_list,
interaction_id_type
)
)
LOCATION (ET_NCBI_DIR:'interactions')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
Thanks for the reply.
Should I use
column_name char(2000) in the table creation script, because the field is larger than the default char(255)?
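Yes - a delimited field defaults to CHAR(255), so anything wider needs an explicit override in the field list. A sketch of just the relevant access-parameters clause, reusing the poster's columns (the CHAR(2000) sizes are illustrative):

```sql
ACCESS PARAMETERS
( records delimited by 0x'0A'
  skip 1
  fields terminated by '\t'
  missing field values are null
  ( tax_id,
    pubmed_id_list       CHAR(2000),  -- overrides the CHAR(255) default
    interaction_id_type  CHAR(2000)
  )
)
```

Note the table columns themselves must also be wide enough to hold the loaded values.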
-
How to select csv data stored in a BLOB column as if it were an external table?
Hi all
(Happy to be back after a while! )
Currently I am working on a site where users should be able to load csv data (comma is the separator) from their client machines (APEX 3.2 application) into the Oracle 11.2.0.4.0 EE database.
My problem is:
I can't use an external table (for the first time in my life) so I'm a little clueless what to do, as the csv data is stored by the APEX application in a BLOB column, and I'm looking for an elegant way (maximising SQL, minimising PL/SQL) to insert the data into the destination table (running validations via a MERGE would be the most efficient way to do the job).
I found a few examples, but I think they are too heavyweight and there could be a more elegant way in Oracle DB 11.2.
Simple unit test:
drop table src purge;
drop table dst purge;
create table src
(myblob blob
);
create table dst
(num number
, str varchar2(6)
);
insert into src
select utl_raw.cast_to_raw('1;AAAAAA;'||chr(10)||
'2;BATH;')
from dual;
The desired output (of course), based on the data in table SRC:
SQL> select * from dst;

       NUM STR
---------- ------
         1 AAAAAA
         2 BATH
Does anybody know a solution for this?
Any ideas/pointers/links/examples are welcome!
/* WARNING: I was 'off' for about 3 months, so the Oracle part of my brain has become a bit rusty, and I feel it should not be as complicated as the examples I've found so far */
Haha, wondering about regexps is like the blind leading the blind!
However, it's my mistake: I forgot to pass the starting position parameter (so 1, 2, 3, ... was in fact being taken as the starting position, not the nth occurrence. duh!)
So, it should actually be:
select x.*
     , regexp_substr(x.col1, '[^;]+', 1, 1)
     , regexp_substr(x.col1, '[^;]+', 1, 2)
     , regexp_substr(x.col1, '[^;]+', 1, 3)
     , regexp_substr(x.col1, '[^;]+', 1, 4)
     , regexp_substr(x.col1, '[^;]+', 1, 5)
     , regexp_substr(x.col1, '[^;]+', 1, 6)
from src
   , xmltable('/a/b'
       passing xmltype('<a><b>'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>')||'</b></a>')
       columns col1 varchar2(100) path '.') x;
Note: that's assuming that none of the "columns" passed in the string will be null.
If one of them might be null, then:
select x.*
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 1)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 2)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 3)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 4)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 5)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 6)
from src
   , xmltable('/a/b'
       passing xmltype(replace('<a><b>;'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>;')||'</b></a>', ';;', '; ;'))
       columns col1 varchar2(100) path '.') x;
-
Struggling with an external table: files that each contain only a picture
Hello
Does anyone know if this is possible: using an external table to load files that each contain only one photo into a BLOB column?
So: just a photo, no other columns and not many lines.
I tried, along the lines of this article: ORACLE-BASE - External Tables Containing LOB Data.
I'm starting to believe it is not possible due to the 'records delimited by' and 'fields terminated by' clauses, which are not really applicable to such files (just a picture), but maybe (hopefully) I'm wrong.
Any pointer is more than welcome!
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for HP-UX: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
5 rows selected.
SQL> drop table photo_ext purge;
Table dropped.
SQL> create table photo_ext
  2  (blob_content blob
  3  )
  4  organization external
  5  (type oracle_loader
  6   default directory tmp
  7   access parameters
  8   (records delimited by newline
  9    nobadfile
 10    nologfile
 11    fields terminated by ','
 12    missing field values are null
 13    (
 14     blob_filename char(100)
 15    )
 16    column transforms (blob_content from lobfile (blob_filename) from (tmp) blob)
 17   )
 18   location
 19   ('54618645837_vp3.jpg',
 20    '54618645837_vp4.jpg',
 21    '54618645837_vp2.jpg',
 22    '54618645837_vp1.jpg',
 23    '54618645837.jpg',
 24    '54618636860_vp6.jpg',
 25    '54618636860_vp5.jpg',
 26    '54618636860_vp4.jpg',
 27    '54618636860_vp3.jpg',
 28    '54618636860_vp2.jpg'
 29   )
 30  )
 31  reject limit unlimited;
Table created.
SQL> select dbms_lob.getlength (blob_content) as blob_length from photo_ext;
select dbms_lob.getlength (blob_content) as blob_length from photo_ext
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-29400: data cartridge error
KUP-04001: error opening file /tmp/.
ORA-06512: at "SYS.ORACLE_LOADER", line 52
You need to put your jpg file names in a text file, one file name per line, then use the name of that text file as the location in your external table. In the example below, I've listed just one jpg file, bridge.jpg, in the text file test.dat.
Scott@orcl12c > host type test.dat
bridge.jpg
Scott@orcl12c > create or replace directory tmp as 'c:\my_oracle_files'
  2  /
Directory created.
Scott@orcl12c > create table photo_ext
  2  (blob_content blob)
  3  organization external
  4  (type oracle_loader
  5   default directory tmp
  6   access parameters
  7   (records delimited by newline
  8    nobadfile
  9    nologfile
 10    fields terminated by ','
 11    missing field values are null
 12    (blob_filename char(100))
 13    column transforms
 14    (blob_content from lobfile (blob_filename) from (tmp) blob))
 15   location ('test.dat'))
 16  reject limit unlimited
 17  /
Table created.
Scott@orcl12c > select dbms_lob.getlength (blob_content) as blob_length from photo_ext
  2  /
BLOB_LENGTH
-----------
     511500
1 row selected.
-
External table: ORA-29913 only through TNS connection alias
Hello world
I have a problem on an Oracle 11.2.0.3 database. They have an external table MYTABLE that points to an Oracle directory MYDIR. This directory contains a text file myfile.txt, read by the external table.
If I connect to the database using "sqlplus user/pass" directly from the DB server, the statement 'select count(*) from MYTABLE' works very well.
If I connect using a TNS alias, "sqlplus user/pass@DB", the statement fails:
SQL> select count (*) from MYTABLE;
select count (*) from MYTABLE
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file myfile.txt in MYDIR not found
The application user has READ and WRITE on the Oracle directory MYDIR.
The OS user 'oracle' has permission to write to the OS directory.
The problem occurs with the application user, but also with sys and system.
Any idea? What could make the statement work over a direct connection and fail over remote access?
Thanks for your help.
Michael
I finally found the solution! The oracle user had recently been added to a group. The database was restarted after this change, but the listener was only reloaded. All processes spawned by the listener had therefore been started with the wrong privileges, and that's why my local connections could access the file system while remote sessions could not open the file.
The following note helped me find the solution: ORA-29283 "invalid file operation" with OS group that the Oracle user is a member of (Doc ID 832424.1).
The note only talks about the Oracle database itself, but I tried restarting the listener and it solved my problem.
Thanks for your help
-
the external table DDL contains owbunusedX in the field list
Hello, we defined an external table in the OWB 11.1 Design Center, and we defined columns such as 'field1, field2, ... field50', but when it was generated, we found owbunused0, owbunused1, ... in the field list of the access parameters. Why is the field name not the same as the column name in the external table? Thank you.
Hi Gary
OK, now I see. For an external table, right click on it in the Designer tree and click the synchronize option; you will then get a dialog box to sync, and when you regenerate you should be good to go.
Like other objects, an external table may be related to another object, in this case the file. So you have to synchronize the external table if you change the file (in the same way as with a mapping: when you modify a table, you can synchronize the table into and out of the mapping).
Cheers
David -
updating a field in a number of forms from an external table
Good day all.
This one I could not find in the help section.
I designed a number of forms that use the same "rate of pay" (there are 4 categories) for a calculation. I was told to expect that there will be more than a few forms, and these new forms will also use the same "rate of pay".
Currently I use a 'switch' to insert the "rate of pay" when the user selects a category from a dropdown list.
Is there a way I can update all the forms from an external 'table' instead of having to update each form individually?
Thank you all
Chomp
Hello
If the form has to be compatible with Acrobat Reader, that rules out data connections.
You should take a look at John Brinkman's blog: http://blogs.adobe.com/formfeed/2010/07/shared_data_in_packages_part_2.html. Two parts.
There is also an example of inter-form communication here, but it may be too clumsy: http://assure.ly/qQivbm.
Good luck
Niall
-
Need to change a column datatype on an external table
Hi - I need to change the data type of a column on an external table and I have a few questions.
1. I've read that I can do a regular alter table modify column statement, but I also see the following in the script behind the external table:
FIELDS MISSING FIELD VALUES ARE NULL
( claim_number POSITION(1-12) INTEGER EXTERNAL(12),
receipt_date POSITION(13-8) INTEGER EXTERNAL(8),
........
When I run the modify column statement it alters the column but does nothing in this clause. Is it possible to update this info as well?
I also wonder (currently working in dev with an invalid directory specification - soon to be solved) - will I encounter a problem if I try to make my change if a file is located in the related directory? I ask because I know that to change the data type of an existing column in a standard table you must first set the column to null, so I wonder how this rule applies when it is an external table.
Thank you!
Christine
An external table does not store any data. The external file that contains the data is only read at query time. You can drop the external table and re-create it to change both the data type of the column in the table and the type of the field in the access parameters. Using the ALTER TABLE ... MODIFY statement, you can modify the data type of a table column. You can also change the definition of the external table while the file is stored in the directory and ready to be read through the external table.
In an ordinary table, you cannot decrease the length of a column if the column holds a value longer than the new column length:
SQL> desc test
 Name              Null?    Type
 ----------------- -------- ------------
 A                 NOT NULL NUMBER
 B                          VARCHAR2(30)
 C                 NOT NULL NUMBER
 D                          DATE
 E                          DATE

SQL> alter table test modify b varchar2(10);
Table altered.
SQL> alter table test modify b varchar2(1);
alter table test modify b varchar2(1)
*
ERROR at line 1:
ORA-01441: cannot decrease column length because some value is too big
Max
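Max's drop-and-recreate advice for the external table itself, sketched end to end (all names, positions, and formats here are hypothetical, loosely modelled on the snippet in the question):

```sql
-- Drop and recreate the external table, changing both the column datatype
-- and the matching field definition in the access parameters:
DROP TABLE claims_ext;

CREATE TABLE claims_ext
( claim_number VARCHAR2(12),   -- changed datatype
  receipt_date DATE
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY claims_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS MISSING FIELD VALUES ARE NULL
    ( claim_number POSITION(1:12) CHAR(12),
      receipt_date POSITION(13:20) CHAR(8) DATE_FORMAT DATE MASK "yyyymmdd"
    )
  )
  LOCATION ('claims.dat')
);
```

Since no data is stored in the table itself, dropping and recreating it is cheap, and the data file is untouched.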
-
Preprocessing for an external table - table is empty at the end
Hello
Oracle Database 11 g Enterprise Edition Release 11.2.0.1.0 - 64 bit Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for Solaris: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
Solaris 10 x86
I'm trying to create an external table that first pre-processes a file and then reads it. The problem is that, in the end, the external table is empty even though the file read is not empty.
First the table runs gunzip on the 'employees.csv.gz' file located in /export/home/oracle/archives, and then reads it. The file employees.csv.gz exists in the specified location. This is the complete script:
-- This directory keeps the archived files
CREATE OR REPLACE DIRECTORY arch_dir AS '/export/home/oracle/archives';
-- This directory shows where the gunzip command is
CREATE OR REPLACE DIRECTORY bin_dir AS '/usr/bin';
CREATE TABLE emp_exadata
(
employee_id NUMBER(22,0),
first_name VARCHAR2(20),
last_name VARCHAR2(25),
email VARCHAR2(25),
phone_number VARCHAR2(20),
hire_date DATE,
job_id VARCHAR2(10),
salary NUMBER(22,2),
commission_pct NUMBER(22,2),
manager_id NUMBER(22,0),
department_id NUMBER(22,0)
)
ORGANIZATION EXTERNAL
(
TYPE oracle_loader
DEFAULT DIRECTORY arch_dir
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
PREPROCESSOR bin_dir:'gunzip'
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(
employee_id,
first_name,
last_name,
email,
phone_number,
hire_date CHAR date_format DATE mask "dd-mm-yyyy hh24:mi:ss",
job_id,
salary,
commission_pct,
manager_id,
department_id
)
)
LOCATION ('employees_exp.csv.gz')
)
REJECT LIMIT UNLIMITED;
When I select from emp_exadata the result set is empty!
SELECT * FROM emp_exadata;
no rows selected
When I look on the db server in the directory /export/home/oracle/archives, I see the unzipped file employees_exp.csv. Here are its first three lines:
bash-3.2$ head -3 employees_exp.csv
198,Donald,OConnell,DOCONNEL,650.507.9833,21/06/2007 00:00:00,SH_CLERK,2600,,124,50
199,Douglas,Grant,DGRANT,650.507.9844,2008-01-13 00:00:00,SH_CLERK,2600,,124,50
200,Jennifer,Whalen,JWHALEN,515.123.4444,17/09/2003 00:00:00,AD_ASST,4400,,101,10
The line endings in the file are LF (unix style). The encoding is ANSI.
I have tried experimenting, but cannot view any records when I select from the external table. Please help me solve it.
I also attach the generated log file:
LOG file opened at 01/06/15 16:40:11
Field Definitions for table EMP_EXADATA
Record format DELIMITED BY NEWLINE
Data in file has same endianness as the platform
Rows with all null fields are accepted
Fields in Data Source:
EMPLOYEE_ID CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
FIRST_NAME CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
LAST_NAME CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
EMAIL CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
PHONE_NUMBER CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
HIRE_DATE CHAR (19)
Date datatype DATE, date mask dd-mm-yyyy hh24:mi:ss
Terminated by ","
Trim whitespace same as SQL Loader
JOB_ID CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
SALARY CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
COMMISSION_PCT CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
MANAGER_ID CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
DEPARTMENT_ID CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader
You need to carefully examine the description of the PREPROCESSOR option in the external tables chapter of the Utilities manual.
The first point that applies to your question is that the preprocessor must write its transformed data to stdout. To do this with gunzip, you use the -c command line parameter. The second point that applies to your case, in view of the answer to the first point, is that you must write a shell script if your preprocessor requires command line parameters.
Kind regards
Bob
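Putting Bob's two points together: run without -c, gunzip decompresses the file in place (which also explains why the unzipped employees_exp.csv was left behind in the directory), and since the PREPROCESSOR clause cannot pass command-line arguments, the -c flag has to live in a small wrapper script. A sketch, assuming a wrapper named gunzip_wrap.sh placed in the bin_dir directory and made executable:

```sql
-- Contents of the wrapper script /usr/bin/gunzip_wrap.sh (two lines of shell,
-- created outside the database; -c writes the uncompressed data to stdout):
--   #!/bin/sh
--   /usr/bin/gunzip -c "$1"

-- The only change to the external table is the PREPROCESSOR line:
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  PREPROCESSOR bin_dir:'gunzip_wrap.sh'
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
  ...
)
```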
-
Reg: external table question
Hi Experts,
I am trying to create and read from an external table, but it raises an error. Please advise.
Scenario-
I'm uploading a file from my APEX application, which is stored in a BLOB field. Then comes the following:
DBMS_LOB.CREATETEMPORARY (v_clob, true);
-- // Convert BLOB to CLOB
DBMS_LOB.CONVERTTOCLOB
( v_clob, v_blob,
DBMS_LOB.LOBMAXSIZE,
v_dest_offset, v_src_offset,
v_blob_csid, v_lang_context, g_msg
);
-- // create a csv file name
v_temp_filename := 'apex_' || TO_CHAR (sysdate, 'yyyymmddhh24miss') || '.csv';
-- // Put the csv file in the database directory 'APEX_DIR'
dbms_xslprocessor.clob2file (v_clob, 'APEX_DIR', v_temp_filename);
-- // create an external table
v_ext_table := q'[create table apex_temp_data_ext
(
c001 varchar2(4000), c002 varchar2(4000), c003 varchar2(4000), c004 varchar2(4000), c005 varchar2(4000),
c006 varchar2(4000), c007 varchar2(4000), c008 varchar2(4000), c009 varchar2(4000), c010 varchar2(4000),
c011 varchar2(4000), c012 varchar2(4000), c013 varchar2(4000), c014 varchar2(4000), c015 varchar2(4000),
c016 varchar2(4000), c017 varchar2(4000), c018 varchar2(4000), c019 varchar2(4000), c020 varchar2(4000),
c021 varchar2(4000), c022 varchar2(4000), c023 varchar2(4000), c024 varchar2(4000), c025 varchar2(4000),
c026 varchar2(4000), c027 varchar2(4000), c028 varchar2(4000), c029 varchar2(4000), c030 varchar2(4000),
c031 varchar2(4000), c032 varchar2(4000), c033 varchar2(4000), c034 varchar2(4000), c035 varchar2(4000),
c036 varchar2(4000), c037 varchar2(4000), c038 varchar2(4000), c039 varchar2(4000), c040 varchar2(4000),
c041 varchar2(4000), c042 varchar2(4000), c043 varchar2(4000), c044 varchar2(4000), c045 varchar2(4000),
c046 varchar2(4000), c047 varchar2(4000), c048 varchar2(4000), c049 varchar2(4000), c050 varchar2(4000)
)
organization external
(
type oracle_loader
default directory apex_dir
access parameters
(
records delimited by newline
fields terminated by ','
optionally enclosed by '"' and '"' NOTRIM
missing field values are null
(
c001 varchar2(4000), c002 varchar2(4000), c003 varchar2(4000), c004 varchar2(4000), c005 varchar2(4000),
c006 varchar2(4000), c007 varchar2(4000), c008 varchar2(4000), c009 varchar2(4000), c010 varchar2(4000),
c011 varchar2(4000), c012 varchar2(4000), c013 varchar2(4000), c014 varchar2(4000), c015 varchar2(4000),
c016 varchar2(4000), c017 varchar2(4000), c018 varchar2(4000), c019 varchar2(4000), c020 varchar2(4000),
c021 varchar2(4000), c022 varchar2(4000), c023 varchar2(4000), c024 varchar2(4000), c025 varchar2(4000),
c026 varchar2(4000), c027 varchar2(4000), c028 varchar2(4000), c029 varchar2(4000), c030 varchar2(4000),
c031 varchar2(4000), c032 varchar2(4000), c033 varchar2(4000), c034 varchar2(4000), c035 varchar2(4000),
c036 varchar2(4000), c037 varchar2(4000), c038 varchar2(4000), c039 varchar2(4000), c040 varchar2(4000),
c041 varchar2(4000), c042 varchar2(4000), c043 varchar2(4000), c044 varchar2(4000), c045 varchar2(4000),
c046 varchar2(4000), c047 varchar2(4000), c048 varchar2(4000), c049 varchar2(4000), c050 varchar2(4000)
)
)
location (']' || v_temp_filename || q'[')
)
parallel 3
reject limit unlimited]';
execute immediate v_ext_table;
It gives me a generic error on the front end. But when I create this external table manually with the generated "v_temp_filename", it is created; when the SELECT is fired, though, it raises the error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found 'identifier': expecting one of: "binary_double, binary_float, comma, char, date, defaultif, decimal, double, float, integer, (, nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned, varchar, varraw, varrawc, varcharc, zoned"
KUP-01008: the bad identifier was: varchar2
KUP-01007: at line 6 column 15
The privilege is already granted - GRANT READ, WRITE on APEX_DIR to APEX_DEV;
But you should check with the DBA on the rwx permissions for the generated 'v_temp_filename'.
Pointers?
Thank you and best regards,
Nordine
(on Oracle 11.2.0.3.0, Apex 4.2.5)
Try this:
...etc...
-- // create an external table
v_ext_table := 'CREATE TABLE apex_temp_data_ext
(
C001 VARCHAR2(4000), C002 VARCHAR2(4000), C003 VARCHAR2(4000), C004 VARCHAR2(4000), C005 VARCHAR2(4000),
C006 VARCHAR2(4000), C007 VARCHAR2(4000), C008 VARCHAR2(4000), C009 VARCHAR2(4000), C010 VARCHAR2(4000),
C011 VARCHAR2(4000), C012 VARCHAR2(4000), C013 VARCHAR2(4000), C014 VARCHAR2(4000), C015 VARCHAR2(4000),
C016 VARCHAR2(4000), C017 VARCHAR2(4000), C018 VARCHAR2(4000), C019 VARCHAR2(4000), C020 VARCHAR2(4000),
C021 VARCHAR2(4000), C022 VARCHAR2(4000), C023 VARCHAR2(4000), C024 VARCHAR2(4000), C025 VARCHAR2(4000),
C026 VARCHAR2(4000), C027 VARCHAR2(4000), C028 VARCHAR2(4000), C029 VARCHAR2(4000), C030 VARCHAR2(4000),
C031 VARCHAR2(4000), C032 VARCHAR2(4000), C033 VARCHAR2(4000), C034 VARCHAR2(4000), C035 VARCHAR2(4000),
C036 VARCHAR2(4000), C037 VARCHAR2(4000), C038 VARCHAR2(4000), C039 VARCHAR2(4000), C040 VARCHAR2(4000),
C041 VARCHAR2(4000), C042 VARCHAR2(4000), C043 VARCHAR2(4000), C044 VARCHAR2(4000), C045 VARCHAR2(4000),
C046 VARCHAR2(4000), C047 VARCHAR2(4000), C048 VARCHAR2(4000), C049 VARCHAR2(4000), C050 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL
(
TYPE oracle_loader
DEFAULT DIRECTORY apex_dir
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY '',''
OPTIONALLY ENCLOSED BY ''"'' AND ''"'' NOTRIM
MISSING FIELD VALUES ARE NULL
(
C001 CHAR(4000), C002 CHAR(4000), C003 CHAR(4000), C004 CHAR(4000), C005 CHAR(4000),
C006 CHAR(4000), C007 CHAR(4000), C008 CHAR(4000), C009 CHAR(4000), C010 CHAR(4000),
C011 CHAR(4000), C012 CHAR(4000), C013 CHAR(4000), C014 CHAR(4000), C015 CHAR(4000),
C016 CHAR(4000), C017 CHAR(4000), C018 CHAR(4000), C019 CHAR(4000), C020 CHAR(4000),
C021 CHAR(4000), C022 CHAR(4000), C023 CHAR(4000), C024 CHAR(4000), C025 CHAR(4000),
C026 CHAR(4000), C027 CHAR(4000), C028 CHAR(4000), C029 CHAR(4000), C030 CHAR(4000),
C031 CHAR(4000), C032 CHAR(4000), C033 CHAR(4000), C034 CHAR(4000), C035 CHAR(4000),
C036 CHAR(4000), C037 CHAR(4000), C038 CHAR(4000), C039 CHAR(4000), C040 CHAR(4000),
C041 CHAR(4000), C042 CHAR(4000), C043 CHAR(4000), C044 CHAR(4000), C045 CHAR(4000),
C046 CHAR(4000), C047 CHAR(4000), C048 CHAR(4000), C049 CHAR(4000), C050 CHAR(4000)
)
)
LOCATION (''' || v_temp_filename || ''')
)
PARALLEL 3
REJECT LIMIT UNLIMITED';
...etc...
-
How do I get the number of bad records when using external tables
Hi all, I have an external table DEPT.
DEPT.DAT:
20|ELECTRONICS
10|SHOES
30|CAMERA
On select * from dept, only dept 10 and 30 will be loaded; the deptdescr for 20 is more than 10 characters long, so that record will go into the bad file.
Is there any query to display the rejected records, or to count how many were rejected, rather than having to go and look at the bad file entries?
Table:
CREATE TABLE DEPT
( DEPT NUMBER,
  DEPTDESCR VARCHAR2 (10 CHAR)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY BATCH_INBOX
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY '\r\n'
    BADFILE BATCH_BAD:'UPS_DEPT_LOAD_%p.bad'
    LOGFILE BATCH_LOG:'UPS_DEPT_%p.log'
    NODISCARDFILE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    ( DEPT, DEPTDESCR )
  )
  LOCATION (BATCH_INBOX:'DEPT.DAT')
)
REJECT LIMIT UNLIMITED NOPARALLEL NOMONITORING;
You can use the bad file as the data file for another external table, with the entire line in a single field. Please see the demo below.
Scott@orcl12c > CREATE OR REPLACE DIRECTORY batch_inbox AS 'c:\my_oracle_files'
  2  /
Directory created.
Scott@orcl12c > CREATE OR REPLACE DIRECTORY batch_bad AS 'c:\my_oracle_files'
  2  /
Directory created.
Scott@orcl12c > CREATE OR REPLACE DIRECTORY batch_log AS 'c:\my_oracle_files'
  2  /
Directory created.
Scott@orcl12c > CREATE TABLE DEPT
  2  (
  3   DEPT NUMBER,
  4   DEPTDESCR VARCHAR2 (10 CHAR)
  5  )
  6  ORGANIZATION EXTERNAL
  7  (TYPE ORACLE_LOADER
  8   DEFAULT DIRECTORY BATCH_INBOX
  9   ACCESS PARAMETERS
 10   (RECORDS DELIMITED BY '\r\n'
 11    BADFILE BATCH_BAD:'UPS_DEPT_LOAD.bad'
 12    LOGFILE BATCH_LOG:'UPS_DEPT_%p.log'
 13    NODISCARDFILE
 14    FIELDS TERMINATED BY '|'
 15    MISSING FIELD VALUES ARE NULL
 16    (
 17     DEPT,
 18     DEPTDESCR
 19    )
 20   )
 21   LOCATION (BATCH_INBOX:'DEPT.DAT')
 22  )
 23  REJECT LIMIT UNLIMITED
 24  NOPARALLEL
 25  NOMONITORING;
Table created.
Scott@orcl12c > SELECT * FROM dept
  2  /
      DEPT DEPTDESCR
---------- ----------
        10 SHOES
        30 CAMERA
2 rows selected.
Scott@orcl12c > CREATE TABLE DEPT_bad
  2  (
  3   the_whole_row VARCHAR2 (4000)
  4  )
  5  ORGANIZATION EXTERNAL
  6  (TYPE ORACLE_LOADER
  7   DEFAULT DIRECTORY BATCH_INBOX
  8   ACCESS PARAMETERS
  9   (RECORDS DELIMITED BY '\r\n'
 10    NOLOGFILE
 11    FIELDS TERMINATED BY '\r\n'
 12    MISSING FIELD VALUES ARE NULL
 13    (
 14     the_whole_row CHAR (4000)
 15    )
 16   )
 17   LOCATION (BATCH_BAD:'UPS_DEPT_LOAD.bad')
 18  )
 19  REJECT LIMIT UNLIMITED
 20  NOPARALLEL
 21  NOMONITORING
 22  /
Table created.
Scott@orcl12c > SELECT * FROM dept_bad
  2  /
THE_WHOLE_ROW
--------------------------------------------------------------------------------
20|ELECTRONICS
1 row selected.
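With the second external table in place, the count the original poster asked for becomes an ordinary query (a sketch against the demo tables above):

```sql
-- Number of rejected records:
SELECT COUNT(*) FROM dept_bad;

-- Or loaded vs. rejected side by side:
SELECT (SELECT COUNT(*) FROM dept)     AS loaded_rows,
       (SELECT COUNT(*) FROM dept_bad) AS rejected_rows
FROM dual;
```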