Direct load in external tables

Can we use direct load with external tables, or set DIRECT in the external table script?

Thank you.

polasa wrote:
Can we use direct load with external tables, or set DIRECT in the external table script?

Thank you.

No. Why? Because an external table does not load data. It's more like a pointer and an instruction for how to read a file.

The big difference between SQL*Loader and an external table is that SQL*Loader actually does two things:

(a) it reads a file from the file system
(b) it inserts these values into a table in the database.

An external table does only (a).

However, you can do a fast insert from this external table into an actual database table, if that is what is wanted.

insert /*+ append */ into myRealTable (colA, colB, colC)
select * from myExternalTable;

With the append hint, and perhaps also the parallel hint, this will be close to a direct-path insert.
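A hedged sketch of that suggestion, keeping the placeholder names from the example above and adding the parallel variant (whether parallel DML is worthwhile depends on your system):

alter session enable parallel dml;

insert /*+ append parallel(t) */ into myRealTable t (colA, colB, colC)
select /*+ parallel(x) */ colA, colB, colC
from myExternalTable x;

commit;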

Tags: Database

Similar Questions

  • SQL*Loader vs external tables for a single file with several types of records (interleaved)

    I have a sample data file (we will get the 'real' one at a later date, and updates after that) which includes a header, a footer, and 5 record types that have different columns and lengths, identified by the first two characters. The different record types are not all grouped together. On the contrary, some (in particular, two of the types in this example) are interleaved. I am currently working on a SQL*Loader control file when it was suggested that I use external tables. I know very little about either, so I would ask which is the better one to use.

    Scott@orcl12c > host type test.dat
    header line
    AB, 123, efg
    CD, hij, 456

    Scott@orcl12c > host type test.ctl
    options (skip=1)
    load data
    into table ab truncate when table_name = 'AB'
    fields terminated by ',' trailing nullcols
    (table_name filler position (1), col1, col2)
    into table cd append when table_name = 'CD'
    fields terminated by ',' trailing nullcols
    (table_name filler position (1), col3, col4)

    Scott@orcl12c > create table ab
      2  (col1 number,
      3   col2 varchar2 (8))
      4  /

    Table created.

    Scott@orcl12c > insert into ab values (1, 'old data')
      2  /

    1 row created.

    Scott@orcl12c > create table cd
      2  (col3 varchar2 (8),
      3   col4 number)
      4  /

    Table created.

    Scott@orcl12c > insert into cd values ('old data', 1)
      2  /

    1 row created.

    Scott@orcl12c > commit
      2  /

    Commit complete.

    Scott@orcl12c > host sqlldr scott/tiger control=test.ctl data=test.dat log=test.log

    SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 27 13:11:47 2014

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used:      Conventional
    Commit point reached - logical record count 2

    Table AB:
      1 Row successfully loaded.

    Table CD:
      1 Row successfully loaded.

    Check the log file:
      test.log
    for more information about the load.

    Scott@orcl12c > select * from ab
      2  /

          COL1 COL2
    ---------- --------
           123 efg

    1 row selected.

    Scott@orcl12c > select * from cd
      2  /

    COL3           COL4
    -------- ----------
    old data          1
    hij             456

    2 rows selected.
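    For comparison with the SQL*Loader demonstration above, here is a hedged sketch of the external-table approach under the same assumptions: one external table per record type, using LOAD WHEN on the first two characters to pick out that type (which also drops the header and footer lines). Table, column, and directory names are placeholders.

    create table ab_ext
    (
      rec_type varchar2 (2),
      col1     number,
      col2     varchar2 (8)
    )
    organization external
    ( type oracle_loader
      default directory my_dir              -- placeholder directory object
      access parameters
      ( records delimited by newline
        load when (1:2) = 'AB'              -- keep only the AB records
        fields terminated by ','
        missing field values are null
        ( rec_type char (2), col1, col2 )
      )
      location ('test.dat')
    )
    reject limit unlimited;

    A second external table defined the same way with LOAD WHEN (1:2) = 'CD' would expose the CD records, and each record type in the real file would get its own definition.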

  • External table LOAD WHEN condition


    Hello

    My db version is: Oracle 11g.

    I have a 6 gig csv file.

    I split it into several 15 MB csv files.

    But only the first csv file has a header (with column headings).

    I have to load each of these files into a target table.

    I created an external table with skip 1.

    But how can I override skip 1 for the other csv files?

    Is there a way I can do it in the external table definition?

    I can't merge the split csv files and run them as one big file, so I don't have that option. Please advise.

    You should be able to use the LOAD WHEN clause to exclude a line based on the contents of a field. Either

    LOAD WHEN (1:2) != 'ID'

    or, if the fields are quoted,

    LOAD WHEN (1:4) != '"ID"'

    or you can always add a field list to your specification and reference the field names in the LOAD WHEN condition instead of using an absolute position (a sketch follows after this reply).

    Don't know if you saw my comments about using the tail command before you replied, but if splitting the file is part of your automated process, then deleting the header row could certainly be integrated into that process, if you are on a unix platform and using the split command to split the file.

    Kind regards

    Bob
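    A minimal hedged sketch of that LOAD WHEN idea (the directory object, file names, column list, and the 'ID' header marker are assumptions carried over from the answer above):

    create table stage_ext
    (
      id  number,
      val varchar2 (100)
    )
    organization external
    ( type oracle_loader
      default directory my_dir                 -- placeholder directory object
      access parameters
      ( records delimited by newline
        load when (1:2) != 'ID'                -- drops the header row in whichever file it appears
        fields terminated by ','
        missing field values are null
        ( id, val )
      )
      location ('part01.csv', 'part02.csv')    -- the split csv files
    )
    reject limit unlimited;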

  • Skip the last record when loading using an external table

    Hi gentlemen,

    I have a requirement to load data from a text file into an Oracle database by using an external table. I need to remove the header and trailer records.

    I can use the skip option to remove the header, i.e. the first line.

    Is it possible to remove the trailer record (i.e. the last line) using the external table?


    Thanks in advance.

    Ferry

    Hello Ferry.
    Don't hesitate to mark replies as Helpful and Correct on occasion, for those of us newbies trying to get credibility :)

    Thank you
    Luke
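    A hedged sketch, assuming the trailer line can be recognized by its leading character (here 'T', purely an assumption about the file format), would reuse the LOAD WHEN idea from the previous question:

    create table ferry_ext
    (
      data_line varchar2 (4000)
    )
    organization external
    ( type oracle_loader
      default directory my_dir            -- placeholder directory object
      access parameters
      ( records delimited by newline
        skip 1                            -- drop the header record (first line)
        load when (1:1) != 'T'            -- drop the trailer record, assuming it starts with 'T'
        fields ( data_line char (4000) )
      )
      location ('data.txt')
    )
    reject limit unlimited;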

  • Error loading the external table

    I encounter an error.

    IAF.txt:

    "02T001427T04206"-1
    '478081' 12
    '131379' 200
    "158125"-100
    '152040'-800
    "151112"-4
    '481990'-5
    "150389"-300
    "481136" 3


    CREATE OR REPLACE DIRECTORY IAF_log_dir
    AS 'c:\dataformats\logs\IAF\log';
    CREATE OR REPLACE DIRECTORY IAF_bad_dir
    AS 'c:\dataformats\logs\IAF\bad';

    create table ext_IAF_table
    (
      item_no  varchar2 (15),
      quantity integer
    )
    organization external
    (
      type oracle_loader
      default directory user_dir
      access parameters
      (
        records delimited by newline
        fields terminated by whitespace optionally enclosed by '"'
        badfile IAF_bad_dir:'IAF%a_%p.bad'
        logfile IAF_log_dir:'IAF%a_%p.log'
        fields
        (
          item_no  char (15),
          quantity integer external
        )
      )
      location ('IAF.txt')
    )
    reject limit unlimited;


    SQL > select * from ext_iaf_table;
    select * from ext_iaf_table
    *
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "badfile": expecting one of: "and, column, (,
    ltrim, lrtrim, ldrtrim, missing, notrim, rtrim, reject"
    KUP-01007: at line 3 column 1
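    The KUP-01005 message indicates that the parser found BADFILE after the FIELDS clause had already started; record-level clauses such as BADFILE and LOGFILE have to come before the field definitions. A hedged sketch of the reordered access parameters section (same names as above, with the two FIELDS clauses merged into one) would be:

    access parameters
    (
      records delimited by newline
      badfile IAF_bad_dir:'IAF%a_%p.bad'
      logfile IAF_log_dir:'IAF%a_%p.log'
      fields terminated by whitespace optionally enclosed by '"'
      (
        item_no  char (15),
        quantity integer external
      )
    )
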
  • How to choose the access method (direct path or external tables) for Data Pump export?

    I have this slow Data Pump export, and I have a few suggestions for settings that might improve the speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?

    DECLARE
      PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN (OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
    BEGIN
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.METADATA_FILTER (PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
      DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
      DBMS_DATAPUMP.SET_PARALLEL (PUMP_HANDLE, DEGREE => 4);
      <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
      DBMS_DATAPUMP.START_JOB (PUMP_HANDLE);
      DBMS_DATAPUMP.DETACH (PUMP_HANDLE);
    END;

    The <<THIS_LINE_FAILS>> line throws an exception:

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;

    ORA-06512: at line 10

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* INTEGER does not seem to work either */

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a similar message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a rather different message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'Settings', VALUE => 'DISABLE_APPEND_HINT');

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39207: NULL value is not valid for the parameter Settings;

    Hello

    You have ACCESS_METHOD; it should be DATA_ACCESS_METHOD. Just give it a try.

    see you soon,

    rich
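    A minimal sketch of that suggestion, assuming the rest of the job is set up exactly as in the block above:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'DATA_ACCESS_METHOD', VALUE => 'DIRECT_PATH');
    -- 'EXTERNAL_TABLE' and 'AUTOMATIC' are the other values this parameter accepts.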

  • Hourly load from an external table

    Hello

    How can I schedule the load from the external table?

    I have the external table and the target table, so I want to schedule the following statement:

    insert into target_table
    select * from external_table;

    What is the best approach? To create a stored procedure and schedule it?

    Thank you
    Alex

    Create a DBMS_SCHEDULER job and schedule it to run hourly.

    SY.
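    A hedged sketch of that approach (the job name and the hourly repeat interval are assumptions based on the question; the insert is the statement from the question):

    begin
      dbms_scheduler.create_job (
        job_name        => 'LOAD_FROM_EXTERNAL_TABLE',     -- hypothetical job name
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'begin
                              insert into target_table
                              select * from external_table;
                              commit;
                            end;',
        start_date      => systimestamp,
        repeat_interval => 'FREQ=HOURLY',
        enabled         => true);
    end;
    /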

  • External table - load a log file with more than 4000 bytes per column

    Hello
    I'm trying to import a log file into a database table that has a single column: txt_line
    In this column, I'm trying to put one log entry per record. Each log entry is normally more than 4000 bytes, so in the external table it should be a clob.
    Below is a working external table, but it cuts off every entry after 4000 bytes. How is it possible to load the data directly into a clob column? All I've found are descriptions where I have one clob file per file.
    Any help is appreciated
    Thank you



    Source file
    ...more than 4000 bytes...]]...more than 4000 bytes...]]...more than 4000 bytes...

    ]] is the record delimiter

    External table:

    create table TST_TABLE
    (
      txt_line varchar2 (4000)
    )
    organization external
    ( type ORACLE_LOADER
      default directory tmp_ext_tables
      access parameters
      ( records delimited by ']]'
        fields (txt_line char (4000))
      )
      location ('test5.log')
    )
    reject limit 0
    ;

    user12068228 wrote:

    I'm trying to import a log file into a database table that has a single column: txt_line
    In this column, I'm trying to put one log entry per record. Each log entry is normally more than 4000 bytes, so in the external table it should be a clob.
    Below is a working external table, but it cuts off every entry after 4000 bytes. How is it possible to load the data directly into a clob column? All I've found are descriptions where I have one clob file per file.
    Any help is appreciated
    . . . E t c...

    And what did you expect, if you define both the source field and the target column as 4000 characters?

    Try this:

    CREATE TABLE tst_table
     (
       txt_line CLOB
     )
     ORGANIZATION EXTERNAL
     (TYPE oracle_loader
        DEFAULT DIRECTORY tmp_ext_tables
        ACCESS PARAMETERS (
           RECORDS DELIMITED BY ']]'
           FIELDS (txt_line CHAR(32000))
        )
      LOCATION ('test5.log')
     )
    REJECT LIMIT 0
    ;
    


  • Load the XML file into Oracle external Table


    I am loading data from an XML file into an Oracle staging table using external tables.

    Let's say the below is my XML file:

    <header>
      <A_CNT>10</A_CNT>
      <E_CNT>10</E_CNT>
      <AF_CNT>10</AF_CNT>
    </header>
    <student>
      <students-details>
        <Student_info>
          <Single_Info>
            <ID>18</ID>
            <City>New York</City>
            <country>United States</country>
            <Name_lst>
              <Student_name>
                <name>Samuel</name>
                <Last_name>Paul</Last_name>
                <DOB>19871208</DOB>
                <RecordStatus>Current</RecordStatus>
              </Student_name>
              <Student_name>
                <name>Samuel</name>
                <Last_name>Paul</Last_name>
                <DOB>19871208</DOB>
                <TerminationDt>20050812</TerminationDt>
                <RecordStatus>History</RecordStatus>
              </Student_name>
            </Name_lst>
            <Personal_Info>
              <Type>men</Type>
              <Age>27</Age>
            </Personal_Info>
          </Single_Info>
        </Student_info>

        <Student_Enrol>
          <class-A>
            <info>
              <detail>
                <student-ID>18</student-ID>
                <Major>EE</Major>
                <course-Grades>
                  <course>VLSI</course>
                  <Grade>3.0</Grade>
                </course-Grades>
                <course-Grades>
                  <course>nanotechnology</course>
                  <Grade>4.0</Grade>
                </course-Grades>
              </detail>
              <detail>
                <student-ID>18</student-ID>
                <Major>CE</Major>
              </detail>
            </info>
          </class-A>
        </Student_Enrol>
      </students-details>
    </student>

    I am loading this XML data file into a single table using an external table. Could someone please help me with the coding?

    Thank you

    Reva

    Could you please help me with how to insert my XML content into that?

    Same as before, try a plain old INSERT:

    insert into xml_pecos
    values (
      xmltype (bfilename ('XML_DIR', 'test.xml'), nls_charset_id ('AL32UTF8'))
    );

    But you'll probably hit the same limitation as with the binary XMLType table.

    In this case, you can use FTP to load the file as a resource in the XML DB repository.

    If the XML schema has been registered with the hierarchy enabled then the file will be automatically inserted into the table.

    Could you post the exact statement that you used to register the schema?

    In the meantime, you can also read this article I wrote a few years ago; it covers the XML DB features that may be useful here, including details on how to load the file via FTP:

    https://odieweblog.WordPress.com/2011/11/23/Oracle-XML-DB-a-practical-example/

    And of course the documentation: http://docs.oracle.com/cd/E11882_01/appdev.112/e23094/xdb06stt.htm#ADXDB4672

  • Loading external Table with quotes

    I have a file whose fields are delimited by TAB ~ TAB.

    Example as below:

    QM ~ CD ~ Exzm ~ BMW

    DM ~ BD ~ Exzm ~ BMW

    CREATE TABLE test
    (
      Col_1 VARCHAR2 (100),
      Col_2 VARCHAR2 (100),
      Col_3 VARCHAR2 (100),
      Col_4 VARCHAR2 (100)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY Test_Report
      ACCESS PARAMETERS
      ( records delimited by '\n'
        CHARACTERSET 'UTF8'
        fields terminated by '\t~\t'
        missing field values are null
      )
      LOCATION ('test.asc')
    )
    REJECT LIMIT UNLIMITED;

    OUTPUT:

    ----------------

    Data is loaded into the DB, but the col_4 data comes in with quotation marks, as below:

    col_4

    -------

    "BMW".

    "BMW".

    Note: Col1 - col3 data arrives correctly.

    2807064 wrote:

    A finding on my side.

    I found that the values of Col_4, after inserting into the DB, have a "carriage return character" (CHR(13)) at the end of each value, as shown below when I copy-paste the value into Notepad++:

    Example:

    ----------

    "BMW".

    "

    But if I see the file I saw that BMW.

    My question is: in this case shouldn't the external table load fail? Why is it loading the data into the DB?

    Did this file begin life on Windows, and then get transferred to *nix to serve the external table?  If so, that explains a lot.  The standard Windows record delimiter is x'0D0A' (chr(13) || chr(10)).  On *nix, it is just x'0A' (chr(10)).  When the loader process is scanning for the record delimiter it is just looking for the x'0A', and the x'0D' gets included in the data.

    Two solutions -

    1 - Make sure that the data file is transferred in such a way that the record delimiters get converted.  That is supposed to happen with ascii ftp mode, but just this week I have seen several in-house examples of it not doing so.

    2 - Fix your external table definition to look for the actual record delimiter instead of the OS default, i.e. RECORDS DELIMITED BY X'0D0A' (a sketch follows below).
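    A hedged sketch of option 2, reusing the table definition from the question with only the record delimiter changed:

    CREATE TABLE test
    (
      Col_1 VARCHAR2 (100),
      Col_2 VARCHAR2 (100),
      Col_3 VARCHAR2 (100),
      Col_4 VARCHAR2 (100)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY Test_Report
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY 0x'0D0A'        -- explicit Windows CR/LF record delimiter in hex
        CHARACTERSET 'UTF8'
        FIELDS TERMINATED BY '\t~\t'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('test.asc')
    )
    REJECT LIMIT UNLIMITED;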

  • How to load a tab-delimited file with commas inside a field into an external table

    I am trying to create an external table in Oracle 11g R2. The script is below. It fails if a field contains commas.

    for example, the following data will not be loaded.

    ERT 123, poipoipoi, yutio 567

    Please suggest how to solve this problem. Thank you!

    CREATE TABLE external_interaction
    (
      TAX_ID NUMBER (8),
      pubmed_id_list varchar2 (36),
      interaction_id_type varchar2 (36)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ET_NCBI_DIR
      ACCESS PARAMETERS
      ( records delimited by 0x'0A'
        skip 1
        BADFILE et_ncbi_log_dir: 'interactions.bad'
        LOGFILE et_ncbi_log_dir: 'interactions.log'
        fields terminated by '\t'
        missing field values are null
        REJECT ROWS WITH ALL NULL FIELDS
        (
          TAX_ID,
          pubmed_id_list,
          interaction_id_type
        )
      )
      LOCATION (ET_NCBI_DIR: 'interactions')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;

    Thanks for the reply.

    Should I use

    column_name char (2000) in the table creation script, because it is larger than the default char (255)?
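    A hedged sketch of that idea, repeating the definition from the script above with the field lengths overridden in the access parameters (and the table columns widened to match, so the longer values have somewhere to go):

    CREATE TABLE external_interaction
    (
      TAX_ID NUMBER (8),
      pubmed_id_list varchar2 (2000),
      interaction_id_type varchar2 (2000)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ET_NCBI_DIR
      ACCESS PARAMETERS
      ( records delimited by 0x'0A'
        skip 1
        BADFILE et_ncbi_log_dir: 'interactions.bad'
        LOGFILE et_ncbi_log_dir: 'interactions.log'
        fields terminated by '\t'
        missing field values are null
        REJECT ROWS WITH ALL NULL FIELDS
        (
          TAX_ID,
          pubmed_id_list      char (2000),    -- override the 255-byte default field length
          interaction_id_type char (2000)
        )
      )
      LOCATION (ET_NCBI_DIR: 'interactions')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;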

  • Load an external audio file into an array?

    Hello.

    I figured out how to load an external video file - and I can play an imported audio file, but I can't seem to combine the two methods to load an external audio file into an array. (doh)

    Here is my audio code:

    var tour_audio:Array = [tour1, tour2];

    var my_num:Number = Math.floor (Math.random () * 2);

    var ChosenSound = tour_audio [my_num];

    var playing: Sound = new ChosenSound();

    playing.play();

    Here's my video code:

    var my_videos:Array = new Array ('link1.mp4', 'link2.mp4', 'link3.mp4', 'link4.mp4');

    var randomIndex = Math.floor (Math.random () * my_videos.length);

    my_player.source = my_videos [randomIndex];

    I tried var tour_audio:Array = ["link1.mp3", "link2.np3"]; - but it does not work.

    Ty

    //  ------------------------

    the array is from your code; the array will hold the references to the two audio files

    var tour_audio:Array = new Array();

    then you must create two instances of the Sound class to hold the audio files you will load

    var soundClip1:Sound = new Sound();

    var soundClip2:Sound = new Sound();

    then you need a SoundChannel to play the audio files

    var sndChannel:SoundChannel = new SoundChannel();

    It's your randomizer code

    var my_num:Number = Math.floor (Math.random () * 2);

    now you load each sound file into your Sound instances using the URLRequest class

    soundClip1.load (new URLRequest ("tour1.mp3"));

    soundClip2.load (new URLRequest ("tour2.mp3"));

    to find out when the files have finished loading, you must add an event listener to each of the Sound instances

    these event listeners will trigger the functions onComplete1 and onComplete2 when each sound file has finished loading

    soundClip1.addEventListener(Event.COMPLETE,onComplete1,false,0,true);

    soundClip2.addEventListener(Event.COMPLETE,onComplete2,false,0,true);

    this is the handler for the first audio file's loader; it will add the Sound instance to your array once the sound has loaded

    function onComplete1(evt:Event):void {

    tour_audio.push (soundClip1);

    }

    This function does the same work for the second audio file

    function onComplete2(evt:Event):void {

    tour_audio.push (soundClip2);

    }

    Call this function to play the selected audio file

    it could be called from the onComplete2 function above

    function playRandomSound():void {

    var ChosenSound = tour_audio [my_num];

    sndChannel = ChosenSound.play ();

    }

    //  ------------------------

    I added comments to the code itself which should help to explain what is happening. Yes, AS3 is just jam-packed with intimidating stuff. As you use it, it will start to make sense.

  • Loading several flat files through an external table with only the common columns

    Hi, I have 50 flat files and each of them has some columns (fields) in common, and I need to load only the common fields into an external table. Is there any chance to do it with a single external table definition? Or do I need to load all the flat files into separate tables and then load them into just one table with ETG and UNION?

    If the layout of the files is different, I think that your only option would be to define different external tables and create a view that joins them all.

    HTH
    Srini
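    A hedged sketch of one way to combine them, with two hypothetical external tables ext_file1 and ext_file2 that share the columns col_a and col_b:

    create or replace view common_columns_v as
    select col_a, col_b from ext_file1
    union all
    select col_a, col_b from ext_file2;
    -- add one UNION ALL branch per remaining external table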

  • Difference between external tables and SQL*Loader

    Hello

    Could you please tell me the difference between
    external tables and SQL*Loader?

    I have searched on the net but didn't get a clear idea.

    Please help me

    1. SQL*Loader can be run over the network (from any client computer); external tables can't.

    2. Back in Oracle 9i, external tables could not load CLOB/BLOB (Oracle 10g changed that).

    3. Oracle 11g external tables have a preprocessor, which is a pretty damn cool feature - essentially running any OS command, e.g. unzip, before the external table runs. What's even better is the fact that the result of the operating system command is the source of the external table, which means no temporary file is required (the unzip runs and its output is the source of the external table). There are several ways to put this to great use - look at my blog for a few samples http://jiri.wordpress.com/2010/01/19/no-more-unix-scripts-in-11-2/ (a hedged sketch of the clause appears after this list).

    4. As 009 stressed, external tables don't load anything; they just show the data. Think of it more as load on demand - it's great if you have old archived files and one or two users who want to see the content once in a while.

    5. External tables require no user access to the operating system; it is a pure Oracle environment - this may seem minor, but for me it's huge. The fact that the ETL needs no special unix commands, no control file, and uses simple SQL and DDL is nice and important.

    6. External tables can load more than text files; Oracle export dump files can be loaded, and perhaps in the future more formats will be supported (hopefully excel format, right?).

    Now the same thing to kill the myth - the TWO are EXACTLY the same when it comes to speed. I would actually bet on external tables being faster going forward, because SQL*Loader is old technology that Oracle doesn't really develop any more.
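    A hedged sketch of the preprocessor clause mentioned in item 3 (directory objects, script name, column list, and file name are all placeholders; the script must be executable and write the decompressed data to standard output):

    create table sales_ext
    (
      sale_id number,
      amount  number
    )
    organization external
    ( type oracle_loader
      default directory data_dir                       -- placeholder directory objects
      access parameters
      ( records delimited by newline
        preprocessor exec_dir:'unzip_to_stdout.sh'     -- hypothetical wrapper script, e.g. around zcat
        fields terminated by ','
        ( sale_id, amount )
      )
      location ('sales.csv.gz')
    )
    reject limit unlimited;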

  • External table. How to load numbers (scientific and decimal notation format)

    Hi all, I need to load into an external table records that contain 7 fields. The last field is called AMOUNT and it is represented in some records in decimal notation and in other records in scientific notation, for example as below:

    CY001_STATU; 2009; Jan; 11220020GR;' 03900; CYZ900; -9, 99999999839929e-03
    CY001_STATU; 2009; Jan; 11200100;' 60800; CYZ900; 41380,77

    The external table script is the following:

    CREATE TABLE HYP_DATA
    (
      COUNTRY VARCHAR2 (50 BYTE),
      YEAR VARCHAR2 (20 BYTE),
      PERIOD VARCHAR2 (20 BYTE),
      ACCOUNT VARCHAR2 (50 BYTE),
      DEPT VARCHAR2 (20 BYTE),
      ACTIVITY_LOC VARCHAR2 (20 BYTE),
      AMOUNT VARCHAR2 (50 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY HYP_DATA_DIR
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        BADFILE 'HYP_BAD_DIR':'HYP_LOAD.bad'
        DISCARDFILE 'HYP_DISCARD_DIR':'HYP_LOAD.dsc'
        LOGFILE 'HYP_LOG_DIR':'HYP_LOAD.log'
        SKIP 0
        FIELDS TERMINATED BY ';'
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS
        (
          "COUNTRY" Char,
          "YEAR" Char,
          "PERIOD" Char,
          "ACCOUNT" Char,
          "DEPT" Char,
          "ACTIVITY_LOC" Char,
          "AMOUNT" Char
        )
      )
      LOCATION (HYP_DATA_DIR:'Total.txt')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;

    If, for the AMOUNT field, I use the VARCHAR data type (see above), the table is loaded, but I have some records rejected, and all these records contain the AMOUNT field in scientific notation:

    CY001_STATU; 2009; Jan; 11220020GR;' 03900; CYZ900; -9, 99999999839929e-03
    CY001_STATU; 2009; Feb; 11220020GR;' 03900; CYZ900; -9, 99999999839929e-03
    CY001_STATU; 2009; Mar; 11220020GR;' 03900; CYZ900; -9, 99999999839929e-03
    CY001_STATU; 2009; Dec; 11220020GR;' 03900; CYZ900; -9, 99999999839929e-03

    All records with a decimal NUMBER are loaded correctly.

    So my problem is that I NEED to load all the records (the comma-decimal and the scientific notation format) together (without the rejected records), but I do not know what data type I should use for the AMOUNT field...

    Someone has an idea?
    Any help would be appreciated

    Thanks in advance

    Alex

    @OP,
    What version of Oracle are you using?
    I just cut-and-pasted your script and the example worked fine for me.

    However my question is... an external table will either load all the data or none at all. How did you validate/conclude that...
    "I have a few records rejected, and all these records contain the last field AMOUNT in scientific notation"

    select * from v$version where rownum <2;
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    
    select * from mydata;
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Feb     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    CY001_STATU     2009     Mar     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Dec     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    

    Table MYDATA script is...

    drop table mydata;
    CREATE TABLE mydata
    (
    COUNTRY VARCHAR2(50 BYTE),
    YEAR VARCHAR2(20 BYTE),
    PERIOD VARCHAR2(20 BYTE),
    ACCOUNT VARCHAR2(50 BYTE),
    DEPT VARCHAR2(20 BYTE),
    ACTIVITY_LOC VARCHAR2(20 BYTE),
    AMOUNT VARCHAR2(50 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY IN_DIR
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'IN_DIR':'HYP_LOAD.bad'
    DISCARDFILE 'IN_DIR':'HYP_LOAD.dsc'
    LOGFILE 'IN_DIR':'HYP_LOAD.log'
    SKIP 0
    FIELDS TERMINATED BY ";"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (
    "COUNTRY" Char,
    "YEAR" Char,
    "PERIOD" Char,
    "ACCOUNT" Char,
    "DEPT" Char,
    "ACTIVITY_LOC" Char,
    "AMOUNT" Char
    )
    )
    LOCATION (IN_DIR:'total.txt')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    

    VR,
    Sudhakar B.
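    One hedged approach, which is not from the reply above but a common pattern: keep AMOUNT as VARCHAR2 in the external table exactly as defined, and convert it when reading, telling the session that the decimal separator in the data is a comma. Oracle's default string-to-number conversion then accepts both the plain and the scientific-notation values.

    alter session set nls_numeric_characters = ',.';   -- comma as decimal separator, as in the file

    select country, year, period, account, dept, activity_loc,
           to_number(amount) as amount_n               -- handles both 41380,77 and -9,99999999839929e-03
    from   hyp_data;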
