Doubt about external tables
Hi Master,
Using SQL*Loader I can load dat/csv/txt data into Oracle tables, and I can use a WHEN clause on the contents of the data file to load into different tables from a single control file.
e.g. WHEN deptno = 20 THEN table1, WHEN deptno = 10 THEN table2, etc.
But using an external table, how can I load data into different tables as in the example above? I googled but did not find appropriate information. Please advise!
Regards,
AR
Bad code. You are using a positional WHEN clause while defining variable-length fields. As a result, your code will also load departments 201, 2090, ...:
SQL> create table t_deptno_20
(
  deptno NUMBER,
  dname  VARCHAR2(14),
  loc    VARCHAR2(13)
)
organization external
(
  type oracle_loader
  default directory temp
  access parameters
  (
    records delimited by newline
    load when (1:2) = '20'
    logfile 'MyFile.log'
    badfile 'MyFile.bad'
    nodiscardfile
    fields terminated by ','
    optionally enclosed by '"'
    rtrim
    missing field values are null
    (
      deptno,
      dname,
      loc
    )
  )
  location ('MyFile.csv')
)
reject limit unlimited
/

Table created.

SQL> select * from t_deptno_20
/

    DEPTNO DNAME          LOC
---------- -------------- -------------
        20 RESEARCH       DALLAS
       201 DONOTLOAD      ARMPIT

SQL> drop table t_deptno_20
/

Table dropped.
SQL> create table t_deptno_20
(
  deptno NUMBER,
  dname  VARCHAR2(14),
  loc    VARCHAR2(13)
)
organization external
(
  type oracle_loader
  default directory temp
  access parameters
  (
    records delimited by newline
    load when deptno = '20'
    logfile 'MyFile.log'
    badfile 'MyFile.bad'
    nodiscardfile
    fields terminated by ','
    optionally enclosed by '"'
    rtrim
    missing field values are null
    (
      deptno,
      dname,
      loc
    )
  )
  location ('MyFile.csv')
)
reject limit unlimited
/

Table created.

SQL> select * from t_deptno_20
/

    DEPTNO DNAME          LOC
---------- -------------- -------------
        20 RESEARCH       DALLAS

SQL>
SY.
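Worth adding for the original question (one file, several target tables): an external table does not need a WHEN clause at all. Because it can be queried like any other table, a single multi-table INSERT ALL reproduces what the SQL*Loader control file did in one pass. A sketch only - the heap tables dept10/dept20 and the external table dept_ext (with the same three columns) are assumed names, not from the thread:

```sql
-- One scan of the external table, rows routed by deptno.
INSERT ALL
  WHEN deptno = 10 THEN
    INTO dept10 (deptno, dname, loc) VALUES (deptno, dname, loc)
  WHEN deptno = 20 THEN
    INTO dept20 (deptno, dname, loc) VALUES (deptno, dname, loc)
SELECT deptno, dname, loc
FROM   dept_ext;   -- external table reads MyFile.csv at query time
```

Unlike per-table LOAD WHEN clauses, this reads the data file once no matter how many targets there are.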
Tags: Database
Similar Questions
-
Hello
I am trying to load data by using an external table as follows:

create table nfs_acq
( Participant_ID char(3), Transaction_Type char(2), From_Account_Type char(2), To_Account_Type char(2),
  Trans_srno char(12), Response_Code char(2), PAN_Number char(19), Member_Number char(1),
  Approval_Number char(6), System_Trace_Number char(12), Transaction_Date char(6), Transaction_Time char(6),
  Merchant_Category_Code char(4), Card_Acceptor_S_Date char(6), Card_Acceptor_ID char(15),
  Card_Acceptor_t_ID char(8), Card_Acceptor_t_Location char(40), Acquirer_ID char(11),
  Acquirer_Settlement_Date char(6), Transaction_Currency_code char(3), Transaction_Amount char(15),
  Actual_Transaction_Amount char(15), Transaction_Acitivity_fee char(15), Acquirer_Cur_Code char(3),
  Acquirer_s_Amount char(15), Acquirer_Settlement_Fee char(15), Acquirer_settl_proc_fee char(15),
  Tran_Acq_Conv_Rate char(15)
)
--
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY HRK_NEW
  ACCESS PARAMETERS
  ( RECORDS fixed 274
    skip 1
    FIELDS
    ( --
      Participant_ID char(3), Transaction_Type char(2), From_Account_Type char(2), To_Account_Type char(2),
      Trans_srno char(12), Response_Code char(2), PAN_Number char(19), Member_Number char(1),
      Approval_Number char(6), System_Trace_Number char(12), Transaction_Date char(6), Transaction_Time char(6),
      Merchant_Category_Code char(4), Card_Acceptor_S_Date char(6), Card_Acceptor_ID char(15),
      Card_Acceptor_t_ID char(8), Card_Acceptor_t_Location char(40), Acquirer_ID char(11),
      Acquirer_Settlement_Date char(6), Transaction_Currency_code char(3), Transaction_Amount char(15),
      Actual_Transaction_Amount char(15), Transaction_Acitivity_fee char(15), Acquirer_Cur_Code char(3),
      Acquirer_s_Amount char(15), Acquirer_Settlement_Fee char(15), Acquirer_settl_proc_fee char(15),
      Tran_Acq_Conv_Rate char(15)
    )
  )
  LOCATION (' ACQRPKJB110116.mkjb')
)
reject limit unlimited
/
However, when I select data from the table, I get the error below:

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found 'minussign': expecting one of: 'double-quoted-string, identifier, single-quoted-string'
KUP-01007: at line 4 column 2
29913. 00000 - "error in executing %s callout"
*Cause:    The execution of the specified callout caused an error.
*Action:   Examine the error messages and take appropriate action.

This is the sequential file I am trying to import:
KJB0403 4643470817234643471601110731336011160111ATM00201 001107001303004213191476009651 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000070000000000000070000000000000000000356000000000070000000000000000000000000000000000000001000000000 CHOWK
KJB0503 0174520817337281611601110956246011160111ATM00201 001109001320004293932011713114 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000000000000000000000000000000000000000356000000000000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 0176940817339062491601110957576011160111ATM00201 001109001321004293932011713114 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000050000000000000050000000000000000000356000000000050000000000000000000000000000000000000001000000000 CHOWK
KJB0402 8653240817340573261601110959156011160111ATM00201 001109001322004722554560104882 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000320000000000000320000000000000000000356000000000320000000000000000000000000000000000000001000000000 CHOWK
KJB0403 4459000817360340861601111015066011160111ATM00201 001110001326005044372611412016 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000001000000000000001000000000000000000000356000000001000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 6468820817392369431601111037426011160111ATM00201 001110001335004214090575031872 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000100000000000000100000000000000000000356000000000100000000000000000000000000000000000000001000000000 CHOWK
KJB0402 9133280817459788951601111119146011160111ATM00201 001111001359006074194560106006 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000600000000000000600000000000000000000356000000000600000000000000000000000000000000000000001000000000 CHOWK
KJB0402 1985030817481985031601111131596011160111ATM00201 001111001362005360160500127597 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000001000000000000001000000000000000000000356000000001000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 4400820817484400821601111133216011160111ATM00201 001111001363004213371215519505 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000200000000000000200000000000000000000356000000000200000000000000000000000000000000000000001000000000 CHOWK
KJB0402 0817491361901601111137166011160111ATM00201 001111001366044135083790012559 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000300000000000000000000000000000000000356000000000000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 6891010817496562841601111140096011160111ATM00201 001111001368004214090322036455 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000050000000000000050000000000000000000356000000000050000000000000000000000000000000000000001000000000 CHOWK
KJB0403 1310430817509973121601111147346011160111ATM00201 001111001371004689680028170636 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000001000000000000001000000000000000000000356000000001000000000000000000000000000000000000000001000000000 CHOWK
KJB0503 0309320817544106971601111206046011160111ATM00201 001112001375005296160001593353 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000000000000000000000000000000000000000356000000000000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 0322130817546216041601111207116011160111ATM00201 001112001376005296160001593353 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000700000000000000700000000000000000000356000000000700000000000000000000000000000000000000001000000000 CHOWK
KJB0502 8851180817548851181601111208376011160111ATM00201 001112001377005087530341448208 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000000000000000000000000000000000000000356000000000000000000000000000000000000000000000000001000000000 CHOWK
KJB0402 0566530817550566531601111209336011160111ATM00201 001112001378005087530341448208 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000420000000000000420000000000000000000356000000000420000000000000000000000000000000000000001000000000 CHOWK
KJB0403 6943770817556943771601111213006011160111ATM00201 001112001379004214920342912709 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000050000000000000050000000000000000000356000000000050000000000000000000000000000000000000001000000000 CHOWK
KJB0402 7864780817559532151601111214256011160111ATM00201 001112001380006220180039900075266 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000330000000000000330000000000000000000356000000000330000000000000000000000000000000000000001000000000 CHOWK
KJB0402 1083880817581341641601111226086011160111ATM00201 001112001382004378990101766440 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000800000000000000800000000000000000000356000000000800000000000000000000000000000000000000001000000000 CHOWK
KJB0403 4830820817582764451601111226546011160111ATM00201 001112001383006071600100041514 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000700000000000000700000000000000000000356000000000700000000000000000000000000000000000000001000000000 CHOWK
KJB0403 6286600817597510771601111234376011160111ATM00201 001112001386005326760302103432 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000100000000000000100000000000000000000356000000000100000000000000000000000000000000000000001000000000 CHOWK
KJB0402 0029180817604353121601111238126011160111ATM00201 001112001389004704560203009224 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000001000000000000001000000000000000000000356000000001000000000000000000000000000000000000000001000000000 CHOWK
KJB0403 3000000817608080441601111240116011160111ATM00201 001112001390005044339052564987192 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000400000000000000400000000000000000000356000000000400000000000000000000000000000000000000001000000000 CHOWK
KJB0403 0139290817626145271601111249456011160111ATM00201 001112001396005346800000069646 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000900000000000000900000000000000000000356000000000900000000000000000000000000000000000000001000000000 CHOWK
KJB0403 1126320817671162161601111313226011160111ATM00201 001113001406005196190081887558 ATM00201TILAK KALYAN THANE MHIN800044 160111356000000000020000000000000020000000000000000000356000000000020000000000000000000000000000000000000001000000000 CHOWK
Help, please
External table reference...
Comments are lines that start with two dashes followed by text, and they must be placed before all access parameters. From the documentation of the access_parameters clause:

The access parameters clause contains comments, record formatting, and field formatting information.

Comments are lines that begin with two dashes followed by text. Comments must be placed before any access parameters, for example:

--This is a comment
--This is another comment
RECORDS DELIMITED BY NEWLINE
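Applied to the posted table, that means the two bare "--" lines have to move above RECORDS (or be deleted); a "--" after the first access parameter is exactly what produces KUP-01005 'minussign'. A sketch of the corrected fragment, with the field list abbreviated:

```sql
ACCESS PARAMETERS
(
  -- comments are legal only here, before any access parameter;
  -- a "--" line after "RECORDS fixed 274" raises KUP-01005
  RECORDS fixed 274
  skip 1
  FIELDS
  ( Participant_ID char(3),
    Transaction_Type char(2)
    -- ... remaining fields exactly as in the original post ...
  )
)
```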
-
"Since the data is stored outside and the definition is stored inside, does that mean the structure of the external table is stored in the database as well (or is it a virtual table based on the definition of the external table)?"

I'll be hitting the ORA-DOCS again; I just got back into external tables tonight, read up on them from 2 or 3 sources, and it is not quite clear whether actual data with structure exists outside the database, or whether an internally defined (and stored) table retrieves data from an outside source.

"Since the data is stored outside and the definition is stored inside, does that mean the structure of the external table is stored in the database as well (or is it a virtual table based on the definition of the external table)?"
The 'definition' you refer to IS the structure of the table; they are one and the same. There is no 'table' stored in the database using space or storage. When a query on an external table is executed, the data source is read.
I'll be hitting the ORA-DOCS up again; just got back into external tables tonight, reading up on them from 2-3 sources, and it is not quite clear if actual data with structure exists outside the database, or an internally defined (and stored?) table draws data from an outside source.

I suggest that you start with the Oracle documentation - including the link provided above.

Whether any "metadata" is stored outside the database depends on whether the file outside the database was produced by Oracle using the UNLOAD operation discussed in this doc:
Unloading data using the ORACLE_DATAPUMP access driver

To unload data, you use the ORACLE_DATAPUMP access driver. The stream that is unloaded is in a proprietary format and contains all of the column data for each row being unloaded. An unload operation also creates a metadata stream that describes the contents of the data stream. The information in the metadata stream is required for loading the stream. Therefore, the metadata stream is written to the data file and placed before the data stream.

If YOU provide the data/files, then you MUST provide them in the format expected by the external table definition. You can, if you wish, use a preprocessor to convert ANY file (zip, encrypted, etc.) into the required format.

For YOUR files you can have metadata stored in the same file, or elsewhere, if you choose, but Oracle will have NO knowledge of that fact and will NOT be involved in transforming or reading any of the metadata you provide. Your preprocessor must remove all such metadata and provide ONLY data in the format appropriate for Oracle to use.

If the file was produced by Oracle's UNLOAD process, then it will include metadata that Oracle WILL read, use, and remove, as the quote from the doc above says. That external metadata is in addition to the actual external table definition/metadata stored in the dictionary.
-
ORA-29913 for external table on Windows 7
Hello
Please help with the following problem.
I installed Oracle 11g on a Windows 7 Pro workstation and have created an external table.
I'm getting ORA-29913 when I select from it.
What I did is given below.
create directory imp_files as 'C:\app\user\admin\orcl\dpdump\'; --'C:\orcl_work'

drop table people;

create table people
(
  first_name varchar2(250),
  last_name  varchar2(250),
  hire_date  date,
  salary     number
)
organization external
(
  type oracle_loader
  default directory data_pump_dir
  access parameters
  (
    records delimited by newline
    badfile data_pump_dir:'pers1%a_%p.bad' /*'pers.bad'*/
    logfile data_pump_dir:'pers1%a_%p.log' /*'pers.log'*/
    fields termintated by '|'
    missing field values are null
    (first_name, last_name, hire_date char date_format date mask 'dd.mm.yyyy', salary)
  )
  location ('pers_table1.txt')
)
reject limit unlimited;

Hello
You have a spelling mistake: 'fields termintated by' should be 'fields terminated by'.
Fix that and try again.
See you soon,
rich
-
My question is - is it possible for me to fix this error at the level of the external table definition? Please advise.
Here is the data file I am trying to load...
KSEA | 2015-08-10 - 17.00.00 | 83.000000 | 32.000000 | 5.800000
KBFI | 2015-08-06 - 15.00.00 | 78.000000 | 35.000000 | 0.000000
KSEA | 2015-08-10 - 11.00.00 | 73.000000 | 55.000000 | 5.800000
KSEA | 2015-08-08 - 05.00.00 | 61.000000 | 90.000000 | 5.800000
KBFI | 2015-08-06 - 16.00.00 | 78.000000 | 36.000000 | 5.800000
KSEA | 2015-08-07 - 18.00.00 | 82.000000 | 31.000000 | 10.400000
KSEA | 2015-08-10 - 00.00.00 | 65.000000 | 61.000000 | 4.600000
KBFI | 2015-08-08 - 07.00.00 | 63.000000 | 84.000000 | 4.600000
KSEA | 2015-08-10 - 15.00.00 | 81.000000 | 34.000000 | 8.100000
This is the external table script
CREATE TABLE MWATCH."WEATHER_EXT"
(
  LOCATION_SAN     VARCHAR2(120 BYTE),
  WEATHER_DATETIME DATE,
  TEMPERATURE      NUMBER(16),
  MOISTURE         NUMBER(16),
  WIND_SPEED       NUMBER(16)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY METERWATCH
  ACCESS PARAMETERS
  ( records delimited by newline
    badfile METERWATCH:'weather_bad'
    logfile METERWATCH:'weather_log'
    fields terminated by '|'
    missing field values are null
    ( location_san,
      weather_datetime CHAR date_format DATE mask "YYYY-mm-dd - hh.mi.ss",
      temperature,
      moisture,
      wind_speed
    )
  )
  LOCATION (METERWATCH:'weather.dat')
)
REJECT LIMIT UNLIMITED
PARALLEL (DEGREE 5 INSTANCES 1)
NOMONITORING;
Here are the errors in the log/bad files which get generated...

error processing column WEATHER_DATETIME in row 1 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
error processing column WEATHER_DATETIME in row 2 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
error processing column WEATHER_DATETIME in row 5 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
error processing column WEATHER_DATETIME in row 6 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
error processing column WEATHER_DATETIME in row 7 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
error processing column WEATHER_DATETIME in row 9 for datafile /export/home/camsdocd/meterwatch/weather.dat
ORA-01849: hour must be between 1 and 12
Yes, it is possible. Look at your date mask. You're masking for 12-hour format when your data is in 24-hour format. Change your date mask to "YYYY-mm-dd - hh24.mi.ss" (the change is hh becoming hh24).
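And since the question was whether this can be fixed at the level of the external table definition: the access parameters of an existing external table can be replaced in place with ALTER TABLE, so no drop/recreate is needed. A sketch reusing the names from the post; the only functional change is the hh24 mask:

```sql
ALTER TABLE mwatch.weather_ext
  ACCESS PARAMETERS
  ( records delimited by newline
    badfile METERWATCH:'weather_bad'
    logfile METERWATCH:'weather_log'
    fields terminated by '|'
    missing field values are null
    ( location_san,
      weather_datetime CHAR date_format DATE mask "YYYY-mm-dd - hh24.mi.ss",  -- was hh (12-hour)
      temperature,
      moisture,
      wind_speed
    )
  );
```

Note that ALTER TABLE ... ACCESS PARAMETERS replaces the whole clause, so everything has to be restated, not just the changed line.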
-
External table with mixed columns
Hello everyone. We use Oracle 11gR1. We have an external table pointing to a CSV file with 7 columns in it. The file always has 7 columns, but the column order differs from time to time. Each column has a header that remains consistent. Is there a way to map the column names to the column headers, so that we don't have to change the definition of the external table every time a new file comes in?
Thank you.
Hello
As John said, you must assign column names when the table is created. I guess you could have a dynamic SQL solution that reads the header you mentioned, uses this information to write a CREATE TABLE, drops the table, and then recreates it with the new column order.
Alternatively, you could have a view that maps the 7 columns of your table to 7 columns in the view. Which column gets mapped to which could depend on the header.
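A minimal sketch of the dynamic-DDL idea described above. All names here are assumptions (a directory object DATA_DIR, a file data.csv, a table csv_ext), and it presumes every header token is already a valid column name:

```sql
DECLARE
  f    UTL_FILE.FILE_TYPE;
  hdr  VARCHAR2(4000);
  cols VARCHAR2(4000);
BEGIN
  -- Read only the header line of the incoming file.
  f := UTL_FILE.FOPEN('DATA_DIR', 'data.csv', 'r');
  UTL_FILE.GET_LINE(f, hdr);
  UTL_FILE.FCLOSE(f);

  -- Turn "colA,colB,..." into "colA VARCHAR2(4000), colB VARCHAR2(4000), ..."
  cols := REPLACE(hdr, ',', ' VARCHAR2(4000), ') || ' VARCHAR2(4000)';

  BEGIN
    EXECUTE IMMEDIATE 'DROP TABLE csv_ext';
  EXCEPTION WHEN OTHERS THEN NULL;  -- first run: table does not exist yet
  END;

  -- Recreate the external table with the columns in the file's current order,
  -- skipping the header row itself.
  EXECUTE IMMEDIATE
    'CREATE TABLE csv_ext (' || cols || ')
     ORGANIZATION EXTERNAL
     ( TYPE ORACLE_LOADER
       DEFAULT DIRECTORY data_dir
       ACCESS PARAMETERS
       ( RECORDS DELIMITED BY NEWLINE SKIP 1
         FIELDS TERMINATED BY '','' MISSING FIELD VALUES ARE NULL )
       LOCATION (''data.csv'') )
     REJECT LIMIT UNLIMITED';
END;
/
```

In production the header tokens should be validated (e.g. with DBMS_ASSERT) before being concatenated into DDL.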
-
create an external table based on local drive
Hello world
Is it possible to create an external table that points to a file on my local drive?
It is perhaps a silly question. I'm sorry.
Thank you
Lydiea
In addition to William Robertson's response (which is correct): if you do not have a way to give the server access to your local disk, you can use SQL*Loader on your local machine to load the data into an actual table.
-
Load the XML file into Oracle external Table
I am loading data from an XML file into an Oracle staging table using external tables. Let's say below is my XML file:
<Header>
  <A_CNT>10</A_CNT>
  <E_CNT>10</E_CNT>
  <AF_CNT>10</AF_CNT>
</Header>
<Student>
  <Students-Details>
    <Student_Info>
      <Single_Info>
        <ID>18</ID>
        <City>New York</City>
        <Country>United States</Country>
        <Name_lst>
          <Student_name>
            <Name>Samuel</Name>
            <Last_name>Paul</Last_name>
            <DOB>19871208</DOB>
            <RecordStatus>Current</RecordStatus>
          </Student_name>
          <Student_name>
            <Name>Samuel</Name>
            <Last_name>Paul</Last_name>
            <DOB>19871208</DOB>
            <TerminationDt>20050812</TerminationDt>
            <RecordStatus>History</RecordStatus>
          </Student_name>
        </Name_lst>
        <Personal_Info>
          <Type>Male</Type>
          <Age>27</Age>
        </Personal_Info>
      </Single_Info>
    </Student_Info>
    <Student-Register>
      <Class>A</Class>
      <Info>
        <Detail>
          <Student_ID>18</Student_ID>
          <Major>EE</Major>
          <Course-Grades>
            <Course>VLSI</Course>
            <Grade>3.0</Grade>
          </Course-Grades>
          <Course-Grades>
            <Course>Nanotechnology</Course>
            <Grade>4.0</Grade>
          </Course-Grades>
        </Detail>
        <Detail>
          <Student_ID>18</Student_ID>
          <Major>CE</Major>
        </Detail>
      </Info>
    </Student-Register>
  </Students-Details>
</Student>

I want to load this XML data file into a single table using an external table. Could someone please help me with the coding?
Thank you
Reva
Could you please help me with how to insert my XML content into it?
Same as before, try a plain old INSERT:
insert into xml_pecos
values (
  xmltype( bfilename('XML_DIR', 'test.xml'), nls_charset_id('AL32UTF8') )
);
But you'll probably hit the same limitation as with the binary XMLType table.
In this case, you can use FTP to load the file as a resource in the XML DB repository.
If the XML schema has been registered with the hierarchy enabled then the file will be automatically inserted into the table.
Could you post the exact statement that you used to register the schema?
In the meantime, you can also read this article I wrote a few years ago; it covers the XML DB features that may be useful here, including details on how to load the file via FTP:
https://odieweblog.WordPress.com/2011/11/23/Oracle-XML-DB-a-practical-example/
And of course the documentation: http://docs.oracle.com/cd/E11882_01/appdev.112/e23094/xdb06stt.htm#ADXDB4672
-
Preprocessing an external table - table is empty at the end
Hello
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for Solaris: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
Solaris 10 x86

I'm trying to create an external table that first pre-processes a file and then reads it. The problem is that in the end the external table is empty, even though the file being read is not.
First, the table runs gunzip on the 'employees.csv.gz' file located in /export/home/oracle/archives, and then reads it. The file employees.csv.gz exists in the specified location. This is the complete script:
-- This directory keeps the archived files
CREATE OR REPLACE DIRECTORY arch_dir AS '/export/home/oracle/archives';

-- This directory is where the gunzip command resides
CREATE OR REPLACE DIRECTORY bin_dir AS '/usr/bin';

CREATE TABLE emp_exadata
(
  employee_id    NUMBER(22,0),
  first_name     VARCHAR2(20),
  last_name      VARCHAR2(25),
  email          VARCHAR2(25),
  phone_number   VARCHAR2(20),
  hire_date      DATE,
  job_id         VARCHAR2(10),
  salary         NUMBER(22,2),
  commission_pct NUMBER(22,2),
  manager_id     NUMBER(22,0),
  department_id  NUMBER(22,0)
)
ORGANIZATION EXTERNAL
(
  TYPE oracle_loader
  DEFAULT DIRECTORY arch_dir
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR bin_dir:'gunzip'
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (
      employee_id,
      first_name,
      last_name,
      email,
      phone_number,
      hire_date CHAR date_format DATE mask "dd-mm-yyyy hh24:mi:ss",
      job_id,
      salary,
      commission_pct,
      manager_id,
      department_id
    )
  )
  LOCATION ('employees_exp.csv.gz')
)
REJECT LIMIT UNLIMITED;
When I select from emp_exadata the result set is empty!

SELECT * FROM emp_exadata;

no rows selected

When I look on the db server in the directory /export/home/oracle/archives I see the unarchived file employees_exp.csv. Here are its first three lines:

bash-3.2$ head -3 employees_exp.csv
198,Donald,OConnell,DOCONNEL,650.507.9833,21/06/2007 00:00:00,SH_CLERK,2600,,124,50,
199,Douglas,Grant,DGRANT,650.507.9844,2008-01-13 00:00:00,SH_CLERK,2600,,124,50,
200,Jennifer,Whalen,JWHALEN,515.123.4444,17/09/2003 00:00:00,AD_ASST,4400,,101,10,

The line endings in the file are LF (unix style). The encoding is ANSI.
I tried to experiment around, but cannot get any records when I select from the external table. Please help me solve it.
I also include the generated log file:
LOG file opened at 01/06/15 16:40:11

Field Definitions for table EMP_EXADATA
  Record format DELIMITED BY NEWLINE
  Data in file has same endianness as platform
  Rows with all null fields are accepted

  Fields in Data Source:

    EMPLOYEE_ID                     CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    FIRST_NAME                      CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    LAST_NAME                       CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    EMAIL                           CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    PHONE_NUMBER                    CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    HIRE_DATE                       CHAR (19)
      Date datatype DATE, date mask dd-mm-yyyy hh24:mi:ss
      Terminated by ","
      Trim whitespace same as SQL Loader
    JOB_ID                          CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    SALARY                          CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    COMMISSION_PCT                  CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    MANAGER_ID                      CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
    DEPARTMENT_ID                   CHAR (255)
      Terminated by ","
      Trim whitespace same as SQL Loader
You need to carefully examine the description of the PREPROCESSOR option in the external tables chapter of the Utilities manual.
The first point that applies to your problem is that the preprocessor must write its transformed data to stdout. To do that with gunzip, you use the -c command line parameter. The second point that applies to your case, in view of the answer to the first point, is that you must write a shell script if your preprocessor requires command line parameters.
Kind regards
Bob
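Putting Bob's two points together, the usual pattern looks like the sketch below. The script name and its placement are assumptions; the essential detail is that gunzip -c streams the decompressed data to stdout without touching the .gz file:

```sql
-- Wrapper script, e.g. uncompress.sh, placed in the bin_dir directory
-- and made executable (chmod +x):
--   #!/bin/sh
--   /usr/bin/gunzip -c "$1"

-- Then point the external table at the wrapper instead of plain gunzip:
ALTER TABLE emp_exadata
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR bin_dir:'uncompress.sh'
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  );
```

The full field list from the original CREATE TABLE (including the hire_date mask) would be restated after MISSING FIELD VALUES ARE NULL, since ALTER TABLE replaces the whole access parameters clause.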
-
Problem accessing an external table via a view in the UCM Configuration Manager applet
Hi all
I know this question was already asked here: accessing external tables in the DB in the UCM Configuration Manager applet - but that conversation has already been archived and I can't comment there. I'm following the approach given by William Phelps in the above-mentioned archived thread.
I am able to create a dblink to the remote data and have created the view using the linked table in my UCM schema. The problem I face is that this view, as William described, is not visible in Configuration Manager. I tried granting the UCM schema user system privileges (create view and create any view), but I'm not sure that will provide the access rights either. I am using SQL Developer and working on UCM 11g.
All comment or suggestion will be a great help.
Stéphane yapi
Set the value of this configuration variable: EBRIncludeViewsInTableList=1
https://jonathanhult.com/blog/2013/11/use-database-view-WebCenter-content-schema-view/
Jonathan
-
Loading external Table with quotes
I have a file whose fields are delimited by TAB ~ TAB.
Example as below:
QM ~ CD ~ Exzm ~ BMW
DM ~ BD ~ Exzm ~ BMW
CREATE TABLE test
(
  col_1 VARCHAR2(100),
  col_2 VARCHAR2(100),
  col_3 VARCHAR2(100),
  col_4 VARCHAR2(100)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY Test_Report
  ACCESS PARAMETERS
  ( records delimited by '\n'
    CHARACTERSET 'UTF8'
    fields terminated by '\t~\t'
    missing field values are null
  )
  LOCATION ('test.asc')
)
REJECT LIMIT UNLIMITED;
OUTPUT:
----------------
The data loads into the DB, but the col_4 data comes in with quotes, as below:

col_4
-------
"BMW"
"BMW"

Note: col_1 - col_3 data comes in correctly.
2807064 wrote:
A finding on my side.
I found that the Col_4 values, after inserting into the DB, have a carriage return character (CHR(13)) at the end of each value, as shown below when I copy-paste the value into Notepad++:

Example:
----------
"BMW
"

But if I view the file I just see BMW.
My question is: in this case shouldn't the external table load fail? Why is it loading the data into the DB?
Did this file begin life on Windows, and then get transferred to *nix to serve an external table? If so, that explains a lot. On Windows the standard record delimiter is x'0D0A' (chr(13)||chr(10)). On *nix it's just x'0A' (chr(10)). When the loader process scans for the record delimiter it is looking only for the x'0A', and the x'0D' gets included in the data.
Two solutions:
1. Make sure that the data file is transferred in such a way that the record delimiters are converted. That's supposed to happen with ascii ftp mode, but just this week I saw several in-house examples of it not doing so.
2. Change your external table definition to look for the actual record delimiter instead of the operating system default: RECORDS DELIMITED BY X'0D0A'.
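Applied to the table above, option 2 is a one-line change to the access parameters; a sketch using the original table's names:

```sql
ALTER TABLE test
  ACCESS PARAMETERS
  ( records delimited by X'0D0A'   -- CR+LF: match the Windows line endings explicitly
    CHARACTERSET 'UTF8'
    fields terminated by '\t~\t'
    missing field values are null
  );
```

With the delimiter stated explicitly, the trailing chr(13) no longer ends up inside col_4.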
-
Bulk insert in an external table
Hi, I get errors ORA-29913 and ORA-01410 trying to do a bulk insert from an external table:

INSERT
INTO core_tb_log
(SELECT 'MODEL', 'ARCH_BH_MODEL', ROWID, 'MODEL-D-000000001', -45, 'A', SYSDATE, 'R'
 FROM arch_bh_model1
 WHERE length(mod_xcodigo) > 10);

INSERT
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-01410: invalid ROWID
ARCH_BH_MODEL1 is the external table.
What's wrong?
Thank you.
Hello
There is no ROWID in external tables.
It makes sense: ROWID identifies where a row is stored in the database; it encodes the data file and the block number within that file.
External tables are not stored in the database. They exist independently of any data file in the database. The concept of an Oracle block does not apply to them.
Why would you copy the ROWID, even if you could?
Apart from ROWID and SYSDATE, you select only literals. Don't you want to select any of the real data from the external table?
What is the big picture here? Post a small example of data (a CREATE TABLE statement and a small data file for the external table) and the desired results from that sample data (in other words, what core_tb_log must contain after the INSERT completes). Explain how you get those results from the data provided.
Check out the Forum FAQ: Re: 2. How can I ask a question on the forums?
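For what it's worth, the posted statement runs once the ROWID reference is removed. A sketch that substitutes NULL; whether NULL (or some other key) is the right value for that column of core_tb_log is an assumption only the original poster can confirm:

```sql
INSERT INTO core_tb_log
SELECT 'MODEL', 'ARCH_BH_MODEL', NULL,            -- external rows have no ROWID
       'MODEL-D-000000001', -45, 'A', SYSDATE, 'R'
FROM   arch_bh_model1
WHERE  length(mod_xcodigo) > 10;
```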
-
How to select the csv data stored in a BLOB column as if it were an external table?
Hi all
(Happy to be back after a while! )
Currently I am working on a site where users should be able to load csv data (semicolon-separated) from their client machines (APEX 3.2 application) into an Oracle 11.2.0.4.0 EE database.
My problem is:
I can't use an external table (for the first time in my life), so I'm a little clueless what to do, as the csv data is stored by the APEX application in a BLOB column, and I'm looking for an elegant way (SQL or PL/SQL at most) to insert the data into the destination table (running validations via a MERGE would be the most efficient way to do the job).
I found a few examples, but I think they are too heavy and there could be a more elegant way in Oracle DB 11.2.
Simple unit test:
drop table src purge;
drop table dst purge;

create table src
( myblob blob
);

create table dst
( num number
, str varchar2(6)
);

insert into src
select utl_raw.cast_to_raw('1;AAAAAA;'||chr(10)||
                           '2;BB;')
from dual;
Desired output (of course) based on the data in table SRC:

SQL> select * from dst;

       NUM STR
---------- ------
         1 AAAAAA
         2 BB
Does anybody know a solution for this?
Any ideas/pointers/links/examples are welcome!
/* WARNING: I was "off" for about 3 months, so the Oracle part of my brain has become a bit rusty, and I feel it should not be as complicated as the examples I've found so far */
Haha, my wondering about regexp was like the blind leading the blind!
However, it's my mistake: I forgot the starting-position parameter (so 1, 2, 3, ... was in fact the starting position, not the nth occurrence. duh!)
So, it should actually be:
select x.*
     , regexp_substr(x.col1, '[^;]+', 1, 1)
     , regexp_substr(x.col1, '[^;]+', 1, 2)
     , regexp_substr(x.col1, '[^;]+', 1, 3)
     , regexp_substr(x.col1, '[^;]+', 1, 4)
     , regexp_substr(x.col1, '[^;]+', 1, 5)
     , regexp_substr(x.col1, '[^;]+', 1, 6)
  from src
     , xmltable('/a/b'
                passing xmltype('<a><b>'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>')||'</b></a>')
                columns col1 varchar2(100) path '.') x;
Note: that's assuming that all the "columns" passed in the string won't be lame.
If one of them might be null, then:
select x.*
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 1)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 2)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 3)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 4)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 5)
     , regexp_substr(ltrim(x.col1, ';'), '[^;]+', 1, 6)
  from src
     , xmltable('/a/b'
                passing xmltype(replace('<a><b>;'||replace(conv_to_clob(src.myblob), chr(10), '</b><b>;')||'</b></a>', ';;', '; ;'))
                columns col1 varchar2(100) path '.') x;
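The '; ;' padding trick above can be sketched outside the database. Here is a minimal Python stand-in (the function names and sample rows are illustrative, not from the thread) showing why counting occurrences of `[^;]+` silently skips empty fields, and how padding each empty field with a space keeps the field positions aligned:

```python
import re

def split_naive(line):
    # Analogue of regexp_substr(line, '[^;]+', 1, k): the k-th run of
    # non-';' characters. An empty field produces no run, so every
    # field after it shifts one position to the left.
    return re.findall(r"[^;]+", line)

def split_padded(line):
    # The thread's workaround: prefix a ';' and turn each empty field
    # (';;') into '; ;' so it still yields a match, then strip the
    # padding space again. (Several adjacent empty fields would need
    # repeated replacement; a single empty field is enough here.)
    padded = (";" + line).replace(";;", "; ;")
    return [f.strip() for f in re.findall(r"[^;]+", padded)]

print(split_naive("2;;x"))   # the empty middle field is silently dropped
print(split_padded("2;;x"))  # the empty middle field survives as ''
```

The same shift is what makes the naive occurrence counting return the wrong column once any field in the record is null.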
-
Reg: question of external table-
Hi Experts,
I am trying to create and read from an external table, but it raises an error. Please advise.
Scenario -
I'm uploading a file from my APEX application; it is stored in a BLOB column. Then comes the following:
DBMS_LOB.CREATETEMPORARY (v_clob, true);
-- convert the BLOB to a CLOB
DBMS_LOB.CONVERTTOCLOB (
  v_clob, v_blob,
  DBMS_LOB.LOBMAXSIZE,
  v_dest_offset, v_src_offset,
  v_blob_csid, v_lang_context, g_msg
);
-- create a csv file name
v_temp_filename := 'apex_' || TO_CHAR (sysdate, 'yyyymmddhh24miss') || '.csv';
-- put the csv file in the database directory 'APEX_DIR'
dbms_xslprocessor.clob2file (v_clob, 'APEX_DIR', v_temp_filename);
-- create the external table
v_ext_table := q'[create table apex_temp_data_ext
 (
  c001 varchar2 (4000), c002 varchar2 (4000), c003 varchar2 (4000), c004 varchar2 (4000), c005 varchar2 (4000),
  c006 varchar2 (4000), c007 varchar2 (4000), c008 varchar2 (4000), c009 varchar2 (4000), c010 varchar2 (4000),
  c011 varchar2 (4000), c012 varchar2 (4000), c013 varchar2 (4000), c014 varchar2 (4000), c015 varchar2 (4000),
  c016 varchar2 (4000), c017 varchar2 (4000), c018 varchar2 (4000), c019 varchar2 (4000), c020 varchar2 (4000),
  c021 varchar2 (4000), c022 varchar2 (4000), c023 varchar2 (4000), c024 varchar2 (4000), c025 varchar2 (4000),
  c026 varchar2 (4000), c027 varchar2 (4000), c028 varchar2 (4000), c029 varchar2 (4000), c030 varchar2 (4000),
  c031 varchar2 (4000), c032 varchar2 (4000), c033 varchar2 (4000), c034 varchar2 (4000), c035 varchar2 (4000),
  c036 varchar2 (4000), c037 varchar2 (4000), c038 varchar2 (4000), c039 varchar2 (4000), c040 varchar2 (4000),
  c041 varchar2 (4000), c042 varchar2 (4000), c043 varchar2 (4000), c044 varchar2 (4000), c045 varchar2 (4000),
  c046 varchar2 (4000), c047 varchar2 (4000), c048 varchar2 (4000), c049 varchar2 (4000), c050 varchar2 (4000)
 )
 organization external
 (
  type oracle_loader
  default directory apex_dir
  access parameters
  (
   records delimited by newline
   fields terminated by ','
   optionally enclosed by '"' and '"' notrim
   missing field values are null
   (
    c001 varchar2 (4000), c002 varchar2 (4000), c003 varchar2 (4000), c004 varchar2 (4000), c005 varchar2 (4000),
    c006 varchar2 (4000), c007 varchar2 (4000), c008 varchar2 (4000), c009 varchar2 (4000), c010 varchar2 (4000),
    c011 varchar2 (4000), c012 varchar2 (4000), c013 varchar2 (4000), c014 varchar2 (4000), c015 varchar2 (4000),
    c016 varchar2 (4000), c017 varchar2 (4000), c018 varchar2 (4000), c019 varchar2 (4000), c020 varchar2 (4000),
    c021 varchar2 (4000), c022 varchar2 (4000), c023 varchar2 (4000), c024 varchar2 (4000), c025 varchar2 (4000),
    c026 varchar2 (4000), c027 varchar2 (4000), c028 varchar2 (4000), c029 varchar2 (4000), c030 varchar2 (4000),
    c031 varchar2 (4000), c032 varchar2 (4000), c033 varchar2 (4000), c034 varchar2 (4000), c035 varchar2 (4000),
    c036 varchar2 (4000), c037 varchar2 (4000), c038 varchar2 (4000), c039 varchar2 (4000), c040 varchar2 (4000),
    c041 varchar2 (4000), c042 varchar2 (4000), c043 varchar2 (4000), c044 varchar2 (4000), c045 varchar2 (4000),
    c046 varchar2 (4000), c047 varchar2 (4000), c048 varchar2 (4000), c049 varchar2 (4000), c050 varchar2 (4000)
   )
  )
  location (']' || v_temp_filename || q'[')
 )
 parallel 3
 reject limit unlimited]';
execute immediate v_ext_table;
It gives me a generic error on the front end. But when I hard-code this external table along with the "v_temp_filename", the table is created; when a SELECT is fired against it, this error is raised:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "binary_double, binary_float, comma, date, defaultif, char, decimal, double, float, integer, (, nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned, varchar, varraw, varrawc, varcharc, zoned"
KUP-01008: the bad identifier was: varchar2
KUP-01007: at line 6 column 15
The privilege is already granted - GRANT READ, WRITE ON APEX_DIR TO APEX_DEV;
But you should check with the DBA on the rwx permissions for the generated 'v_temp_filename'.
Pointers?
Thank you and best regards,
Nordine
(on Oracle 11.2.0.3.0, Apex 4.2.5)
Try this:
. . . E t c...
-- create the external table
v_ext_table := q'[CREATE TABLE Apex_Temp_Data_Ext
 (
  C001 VARCHAR2 (4000), C002 VARCHAR2 (4000), C003 VARCHAR2 (4000), C004 VARCHAR2 (4000), C005 VARCHAR2 (4000),
  C006 VARCHAR2 (4000), C007 VARCHAR2 (4000), C008 VARCHAR2 (4000), C009 VARCHAR2 (4000), C010 VARCHAR2 (4000),
  C011 VARCHAR2 (4000), C012 VARCHAR2 (4000), C013 VARCHAR2 (4000), C014 VARCHAR2 (4000), C015 VARCHAR2 (4000),
  C016 VARCHAR2 (4000), C017 VARCHAR2 (4000), C018 VARCHAR2 (4000), C019 VARCHAR2 (4000), C020 VARCHAR2 (4000),
  C021 VARCHAR2 (4000), C022 VARCHAR2 (4000), C023 VARCHAR2 (4000), C024 VARCHAR2 (4000), C025 VARCHAR2 (4000),
  C026 VARCHAR2 (4000), C027 VARCHAR2 (4000), C028 VARCHAR2 (4000), C029 VARCHAR2 (4000), C030 VARCHAR2 (4000),
  C031 VARCHAR2 (4000), C032 VARCHAR2 (4000), C033 VARCHAR2 (4000), C034 VARCHAR2 (4000), C035 VARCHAR2 (4000),
  C036 VARCHAR2 (4000), C037 VARCHAR2 (4000), C038 VARCHAR2 (4000), C039 VARCHAR2 (4000), C040 VARCHAR2 (4000),
  C041 VARCHAR2 (4000), C042 VARCHAR2 (4000), C043 VARCHAR2 (4000), C044 VARCHAR2 (4000), C045 VARCHAR2 (4000),
  C046 VARCHAR2 (4000), C047 VARCHAR2 (4000), C048 VARCHAR2 (4000), C049 VARCHAR2 (4000), C050 VARCHAR2 (4000)
 )
 ORGANIZATION EXTERNAL
 (
  TYPE Oracle_Loader
  DEFAULT DIRECTORY Apex_Dir
  ACCESS PARAMETERS
  (
   RECORDS DELIMITED BY NEWLINE
   FIELDS TERMINATED BY ','
   OPTIONALLY ENCLOSED BY '"' AND '"' NOTRIM
   MISSING FIELD VALUES ARE NULL
   (
    C001 CHAR (4000), C002 CHAR (4000), C003 CHAR (4000), C004 CHAR (4000), C005 CHAR (4000),
    C006 CHAR (4000), C007 CHAR (4000), C008 CHAR (4000), C009 CHAR (4000), C010 CHAR (4000),
    C011 CHAR (4000), C012 CHAR (4000), C013 CHAR (4000), C014 CHAR (4000), C015 CHAR (4000),
    C016 CHAR (4000), C017 CHAR (4000), C018 CHAR (4000), C019 CHAR (4000), C020 CHAR (4000),
    C021 CHAR (4000), C022 CHAR (4000), C023 CHAR (4000), C024 CHAR (4000), C025 CHAR (4000),
    C026 CHAR (4000), C027 CHAR (4000), C028 CHAR (4000), C029 CHAR (4000), C030 CHAR (4000),
    C031 CHAR (4000), C032 CHAR (4000), C033 CHAR (4000), C034 CHAR (4000), C035 CHAR (4000),
    C036 CHAR (4000), C037 CHAR (4000), C038 CHAR (4000), C039 CHAR (4000), C040 CHAR (4000),
    C041 CHAR (4000), C042 CHAR (4000), C043 CHAR (4000), C044 CHAR (4000), C045 CHAR (4000),
    C046 CHAR (4000), C047 CHAR (4000), C048 CHAR (4000), C049 CHAR (4000), C050 CHAR (4000)
   )
  )
  LOCATION (']' || V_Temp_Filename || q'[')
 )
 PARALLEL 3
 REJECT LIMIT UNLIMITED]';
. . . E t c...
-
Does the Oracle 12c In-Memory option work with external tables?
Hello
Does anyone know if external tables are also candidates to benefit from the Oracle 12c In-Memory option? I have read the documentation and white papers and can't find any reference to external tables. If it is possible, we are very interested to know all about it, especially any limitations.
Thank you.
This is not the right forum for this question; this forum is for the TimesTen in-memory database, not the Oracle Database In-Memory option. These are completely different things.
But it happens that I can tell you that no, external tables are not candidates for use with the In-Memory Option.
Chris