Date formats in SQL*Loader

Hi team,

I have two flat files that I'm trying to load with SQL*Loader. The data goes into the same table, and a few of the columns are of the DATE data type.

The problem is that the two files use different date formats: one file uses 'YYYYMMDD' and the other uses 'MM/DD/YYYY'.

I'm trying to use a single control file to load both files into the same table. How can I edit the control file so it tells SQL*Loader: if the date field is in 'YYYYMMDD' format, load it with that format; otherwise, load it as 'MM/DD/YYYY'?

So far the control file has something like adj_date DATE 'MM/DD/YYYY', but when I try to load a file whose date fields are 'YYYYMMDD' I get an error.

I'm looking for a way to edit the control file so I can load files with either 'YYYYMMDD' or 'MM/DD/YYYY' date fields.

Can anyone help? Thank you.

Sorry Greg,

but the example you posted will not work for either of these formats.

The only options I can see are:

(1) Build the control file dynamically: change a 'template' control file on the fly to set the correct format (e.g. with the sed utility).

(2) Use a function to decide the date format, like this:

LOAD DATA

REPLACE INTO TABLE My_Table

FIELDS TERMINATED BY X'05'

OPTIONALLY ENCLOSED BY '"'

TRAILING NULLCOLS

(
COL1,
OTHER_COL2,
THE_DATE "DECODE(SUBSTR(:THE_DATE,3,1), '/', TO_DATE(:THE_DATE,'MM/DD/YYYY'), TO_DATE(:THE_DATE,'YYYYMMDD'))"
)
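A minimal sketch of option (1), generating the control file from a template. The `@DATE_FMT@` token and every file name below are made up for the sketch; the detection rule mirrors the DECODE above (an MM/DD/YYYY value has '/' as its 3rd character, a YYYYMMDD value does not):

```shell
# Sketch of option (1): stamp the right date mask into a control file
# generated from a template.  Token @DATE_FMT@ and all file names are
# hypothetical -- adjust to your environment.
pick_fmt() {
  # A value like 04/25/2014 has '/' as its 3rd character; 20140425 does not.
  case "$1" in
    ??/*) echo 'MM/DD/YYYY' ;;
    *)    echo 'YYYYMMDD'   ;;
  esac
}

printf 'adj_date DATE "@DATE_FMT@",\n' > template.ctl   # stand-in template
printf '20140425,some other field\n'   > input.dat      # stand-in data file

first_date=$(head -1 input.dat | cut -d',' -f1)
fmt=$(pick_fmt "$first_date")
sed "s|@DATE_FMT@|$fmt|" template.ctl > run.ctl         # then: sqlldr ... control=run.ctl
cat run.ctl
```

With the stand-in data file above, run.ctl comes out with `DATE "YYYYMMDD"`; a file starting with 04/25/2014 would produce `DATE "MM/DD/YYYY"` instead.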

Tags: Database

Similar Questions

  • Import Excel data into Oracle tables

    Hello gurus,
    Importing Excel data into Oracle tables...

    I know it's the most common question on the forum... First, I searched the forum and found lots of threads about loading data with SQL*Loader, and about converting Excel to .txt, tab-delimited and .csv files, etc...

    Finally, I was totally confused about how to get there...

    Here's what I have:
       - Excel file on my local computer.
       - I have to load data into dev-environment tables (so no risk involved, but I want to try something simple).
       - Oracle version 11.1.0.7
       - SQL*Plus and Toad (editors)


    Here's what I want to do... I don't know if it's possible:
        - Without going to the unix server, can I do everything on my local system using the Oracle DB and SQL*Plus or Toad?

    SQL*Loader could be an option... but I don't want to go to the unix server to place files and logs and such.

    What would be the best and easiest option? And what format is best to convert the Excel file to: csv, txt, tab-delimited, etc...?


    If you suggest SQL*Loader, any code example will be greatly appreciated.


    Thank you very much!!!

    Hello

    In Toad version 9.0.0.160 you can directly load data from an Excel file (or other supported formats) into a table using the navigation "Database > Import > Import Table Data".
    You need to connect to the database, then go to the navigation above. Select the table and the commit interval (i.e. commit after each record or once after all records), map the columns in the Excel file to your table, and press OK.
    It loads the data directly into your table.

    But if the Excel file you want to load contains multibyte characters (such as Chinese), then you must change some settings on your machine first.

    I don't know if this is possible in other versions of Toad.

    Regards,
    Imran
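Since the original poster also asked for a SQL*Loader code example: a minimal sketch for a file saved from Excel as CSV, run from a local Windows command prompt so no unix server is needed. All table, column and path names here are made up for the sketch:

```sql
-- Hypothetical example: employees.csv ("Save As > CSV" from Excel),
-- loaded into an existing EMPLOYEES_STG table.
-- control file: C:\data\employees.ctl
LOAD DATA
INFILE 'C:\data\employees.csv'
BADFILE 'C:\data\employees.bad'
APPEND
INTO TABLE employees_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  emp_id,
  emp_name,
  hire_date DATE "MM/DD/YYYY"
)
-- run from a Windows command prompt:
--   sqlldr scott/tiger control=C:\data\employees.ctl log=C:\data\employees.log
```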

  • Import using sql loader: trimming trailing white spaces

    411885; JUICE 5-1; 990801;
    123777; BDG 558-1; 100101;



    I am importing data using sqlloader and having problems with spaces in the loaded table. The source of the data file is not consistent about how it supplies spaces at the beginning and end of field 2. Is there an option on my end to remove the trailing spaces?

    Hey

    You can trim the columns in the sqlloader control file using a SQL string, as listed below:

    (deptno,
    dname "TRIM (:dname)",
    loc "TRIM (:loc)")
    

    Thank you
    AJ

  • sqlloader: load two tables from a single data file in a single operation

    Oracle 11.2.0.3 SE One

    Oracle Linux 5.6

    I don't know if I need a second set of eyes or if I am missing something.

    Problem: Given a csv text file with header and detail records (identified by the first field in the file), use sql loader to load the header and detail tables in a single operation.

    The header record loads fine, but the detail records are rejected as failing the WHEN clause.

    More comments follow after the listings below:

    Given these two tables:

    SQL > desc EDSTEST_HEADER

    Name                                      Null?    Type

    ----------------------------------------- -------- ----------------------------

    EDSTEST_HEADER_ID NOT NULL NUMBER

    REC_TYPE VARCHAR2 (10)

    SOLD_TO_ACCOUNT VARCHAR2 (50)

    SCAC_RECEIVER_ID VARCHAR2 (50)

    FORMAT_TYPE VARCHAR2 (10)

    CLIENT_NAME VARCHAR2 (100)

    CUSTOMER_PICKUP_ADDRESS VARCHAR2 (100)

    CUSTOMER_PICKUP_CITY VARCHAR2 (50)

    CUSTOMER_PICKUP_STATE VARCHAR2 (10)

    CUSTOMER_PICKUP_ZIP VARCHAR2 (50)

    INSERT_USER VARCHAR2 (50)

    INSERT_USER_DATE DATE

    INSERT_STATUS_CODE VARCHAR2 (10)

    SQL > desc EDSTEST_DETAIL

    Name Null?    Type

    ----------------------------------------- -------- ----------------------------

    EDSTEST_DETAIL_ID NOT NULL NUMBER

    EDSTEST_HEADER_ID NUMBER

    REC_TYPE VARCHAR2 (10)

    SHIP_TO_NAME VARCHAR2 (100)

    SHIP_TO_ADDRESS VARCHAR2 (100)

    SHIP_TO_CITY VARCHAR2 (50)

    SHIP_TO_STATE VARCHAR2 (10)

    SHIP_TO_ZIP VARCHAR2 (50)

    STATUS_OR_APPT_REASON_CD VARCHAR2 (10)

    EVENT_DESCRIPTION VARCHAR2 (50)

    SHIPMENT_STATUS_CD VARCHAR2 (10)

    SHIPMENT_EVENT_DATE VARCHAR2 (10)

    SHIPMENT_EVENT_TIME VARCHAR2 (10)

    EVENT_TIME_ZONE VARCHAR2 (30)

    EVENT_CITY VARCHAR2 (100)

    EVENT_STATE VARCHAR2 (50)

    EVENT_ZIP VARCHAR2 (50)

    CUSTOMER_CONFIRM VARCHAR2 (100)

    DELIVERY_CONFIRM VARCHAR2 (100)

    TRACKING_NUMBER VARCHAR2 (50)

    MAIL_REC_WEIGHT VARCHAR2 (20)

    MAIL_REC_WEIGHT_CD VARCHAR2 (10)

    MAIL_RED_QTY VARCHAR2 (10)

    INSERT_USER VARCHAR2 (50)

    INSERT_USER_DATE DATE

    INSERT_STATUS_CODE VARCHAR2 (10)

    Given this data file:

    Oracle: mydb$ cat eds_edstest.dat

    HDR,0005114090,MYORG,CSV,MY COMPANY NAME,123 ELM ST,STUCKYVILLE,OH,12345

    DTL,TOADSUCK,NC,27999,NS,ARRIVED AT UNIT,X4,20140726,063100,ET,TOADSUCK,NC,27999,12345,23456,.3861,lbs,1

    DTL,TOADSUCK,NC,27999,NS,SORTING COMPLETE,X6,20140726,080000,ET,TOADSUCK,NC,27999,12345,23456,.3861,lbs,1

    DTL,TOADSUCK,NC,27999,NS,DELIVERED,D1,20140726,121800,ET,TOADSUCK,NC,27999,12345,23456,.3861,lbs,1

    Given this sqlloader control file:

    Oracle: mydb$ cat eds_edstest_combined.ctl

    LOAD DATA
    INFILE '/xfers/oracle/myapp/data/eds_edstest.dat'
    BADFILE '/xfers/oracle/myapp/data/eds_edstest.bad'
    DISCARDFILE '/xfers/oracle/myapp/data/eds_edstest.dsc'
    APPEND
    INTO TABLE estevens.edstest_header
    WHEN (rec_type = 'HDR')
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (rec_type CHAR,
    sold_to_account CHAR,
    scac_receiver_id CHAR,
    format_type CHAR,
    client_name CHAR,
    customer_pickup_address CHAR,
    customer_pickup_city CHAR,
    customer_pickup_state CHAR,
    customer_pickup_zip CHAR,
    INSERT_USER "1",
    INSERT_USER_DATE sysdate,
    INSERT_STATUS_CODE CONSTANT 'I')
    INTO TABLE estevens.edstest_detail
    WHEN (rec_type = 'DTL')
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (rec_type CHAR,
    ship_to_name CHAR,
    ship_to_address CHAR,
    ship_to_city CHAR,
    ship_to_state CHAR,
    ship_to_zip CHAR,
    status_or_appt_reason_cd CHAR,
    event_description CHAR,
    shipment_status_cd CHAR,
    shipment_event_date CHAR,
    shipment_event_time CHAR,
    event_time_zone CHAR,
    event_city CHAR,
    event_state CHAR,
    event_zip CHAR,
    customer_confirm CHAR,
    delivery_confirm CHAR,
    tracking_number CHAR,
    mail_rec_weight CHAR,
    mail_rec_weight_cd CHAR,
    mail_red_qty CHAR,
    INSERT_USER "1",
    INSERT_USER_DATE sysdate,
    INSERT_STATUS_CODE CONSTANT 'I')

    -- END CONTROL FILE

    And the run-time log:

    SQL*Loader: Release 11.2.0.3.0 - Production on Tue Jul 29 07:50:04 2014

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    Control file: /xfers/oracle/myapp/control/eds_edstest_combined.ctl
    Data file: /xfers/oracle/myapp/data/eds_edstest.dat
    Bad file: /xfers/oracle/myapp/data/eds_edstest.bad
    Discard file: /xfers/oracle/myapp/data/eds_edstest.dsc
    (Allow all discards)

    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 10000 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional

    Silent options: FEEDBACK

    Table ESTEVENS.EDSTEST_HEADER, loaded when REC_TYPE = 0X484452 (character 'HDR')
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect

    Column Name                    Position   Len  Term Encl. Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    REC_TYPE                       FIRST      *    ,    O(")  CHARACTER
    SOLD_TO_ACCOUNT                NEXT       *    ,    O(")  CHARACTER
    SCAC_RECEIVER_ID               NEXT       *    ,    O(")  CHARACTER
    FORMAT_TYPE                    NEXT       *    ,    O(")  CHARACTER
    CLIENT_NAME                    NEXT       *    ,    O(")  CHARACTER
    CUSTOMER_PICKUP_ADDRESS        NEXT       *    ,    O(")  CHARACTER
    CUSTOMER_PICKUP_CITY           NEXT       *    ,    O(")  CHARACTER
    CUSTOMER_PICKUP_STATE          NEXT       *    ,    O(")  CHARACTER
    CUSTOMER_PICKUP_ZIP            NEXT       *    ,    O(")  CHARACTER
    INSERT_USER                    NEXT       *    ,    O(")  CHARACTER
        SQL string for column : "1"
    INSERT_USER_DATE                                         SYSDATE
    INSERT_STATUS_CODE                                       CONSTANT
        Value is 'I'

    Table ESTEVENS.EDSTEST_DETAIL, loaded when REC_TYPE = 0X44544c (character 'DTL')
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect

    Column Name                    Position   Len  Term Encl. Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    REC_TYPE                       NEXT       *    ,    O(")  CHARACTER
    SHIP_TO_NAME                   NEXT       *    ,    O(")  CHARACTER
    SHIP_TO_ADDRESS                NEXT       *    ,    O(")  CHARACTER
    SHIP_TO_CITY                   NEXT       *    ,    O(")  CHARACTER
    SHIP_TO_STATE                  NEXT       *    ,    O(")  CHARACTER
    SHIP_TO_ZIP                    NEXT       *    ,    O(")  CHARACTER
    STATUS_OR_APPT_REASON_CD       NEXT       *    ,    O(")  CHARACTER
    EVENT_DESCRIPTION              NEXT       *    ,    O(")  CHARACTER
    SHIPMENT_STATUS_CD             NEXT       *    ,    O(")  CHARACTER
    SHIPMENT_EVENT_DATE            NEXT       *    ,    O(")  CHARACTER
    SHIPMENT_EVENT_TIME            NEXT       *    ,    O(")  CHARACTER
    EVENT_TIME_ZONE                NEXT       *    ,    O(")  CHARACTER
    EVENT_CITY                     NEXT       *    ,    O(")  CHARACTER
    EVENT_STATE                    NEXT       *    ,    O(")  CHARACTER
    EVENT_ZIP                      NEXT       *    ,    O(")  CHARACTER
    CUSTOMER_CONFIRM               NEXT       *    ,    O(")  CHARACTER
    DELIVERY_CONFIRM               NEXT       *    ,    O(")  CHARACTER
    TRACKING_NUMBER                NEXT       *    ,    O(")  CHARACTER
    MAIL_REC_WEIGHT                NEXT       *    ,    O(")  CHARACTER
    MAIL_REC_WEIGHT_CD             NEXT       *    ,    O(")  CHARACTER
    MAIL_RED_QTY                   NEXT       *    ,    O(")  CHARACTER
    INSERT_USER                    NEXT       *    ,    O(")  CHARACTER
        SQL string for column : "1"
    INSERT_USER_DATE                                         SYSDATE
    INSERT_STATUS_CODE                                       CONSTANT
        Value is 'I'

    value used for ROWS parameter changed from 10000 to 30

    Record 2: Discarded - failed all WHEN clauses.
    Record 3: Discarded - failed all WHEN clauses.
    Record 4: Discarded - failed all WHEN clauses.

    Table ESTEVENS.EDSTEST_HEADER:
      1 Row successfully loaded.
      0 Rows not loaded due to data errors.
      3 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.

    Table ESTEVENS.EDSTEST_DETAIL:
      0 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      4 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.

    Space allocated for bind array: 247800 bytes (30 rows)
    Read buffer bytes: 1048576

    Total logical records skipped:    0
    Total logical records read:       4
    Total logical records rejected:   0
    Total logical records discarded:  3

    Run began on Tue Jul 29 07:50:04 2014
    Run ended on Tue Jul 29 07:50:04 2014

    Elapsed time was: 00:00:00.07
    CPU time was:     00:00:00.01

    This runs on Linux, and the data file originates on a Windows system, but by the time we get it, it is in *nix format - with a simple X'0A' as the line terminator.

    If, in the control file, I comment out the INTO TABLE block for the header table, the detail records insert just fine.

    If, in the control file (going back to the initial two-table load), I change the line

    INFILE '/xfers/oracle/myapp/data/eds_edstest.dat'

    to read

    INFILE '/xfers/oracle/myapp/data/eds_edstest.dat' "str '\n'"

    the logged result becomes:

    Table ESTEVENS.EDSTEST_HEADER:

    1 row loaded successfully.

    0 rows not loaded due to data errors.

    0 rows not loading because all WHEN clauses were failed.

    0 rows not populated because all fields are null.

    Table ESTEVENS.EDSTEST_DETAIL:

    0 rows successfully loaded.

    0 rows not loaded due to data errors.

    1 row not loaded because all WHEN clauses were failed.

    0 rows not populated because all fields are null.

    I'm trying to help the developer on this, and he resists changing to external tables. Even if I can overcome that, I now have a puzzle I want to solve, just to add to my knowledge.  Plus, I have some concerns at this stage that whatever it is I'm missing here could also come into play if I convert to external tables.

    Ed,

    Are you sure that you put the POSITION(1) in the right place?  It should be located in the first field definition following each WHEN clause after the first one.  Putting it after the first WHEN clause is optional.  When I use the following with what you have provided, it loads 1 record into the header table and 3 records into the detail table.  Is this what you actually did, or told your developer to do, and are you sure that he understood and put it in the right place?

    LOAD DATA
    INFILE 'eds_edstest.dat'
    BADFILE 'eds_edstest.bad'
    DISCARDFILE 'eds_edstest.dsc'
    APPEND
    INTO TABLE edstest_header
    WHEN (rec_type = 'HDR')
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (rec_type CHAR,
    sold_to_account CHAR,
    scac_receiver_id CHAR,
    format_type CHAR,
    client_name CHAR,
    customer_pickup_address CHAR,
    customer_pickup_city CHAR,
    customer_pickup_state CHAR,
    customer_pickup_zip CHAR,
    INSERT_USER "1",
    INSERT_USER_DATE sysdate,
    INSERT_STATUS_CODE CONSTANT 'I')
    INTO TABLE edstest_detail
    WHEN (rec_type = 'DTL')
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (rec_type POSITION(1) CHAR,
    ship_to_name CHAR,
    ship_to_address CHAR,
    ship_to_city CHAR,
    ship_to_state CHAR,
    ship_to_zip CHAR,
    status_or_appt_reason_cd CHAR,
    event_description CHAR,
    shipment_status_cd CHAR,
    shipment_event_date CHAR,
    shipment_event_time CHAR,
    event_time_zone CHAR,
    event_city CHAR,
    event_state CHAR,
    event_zip CHAR,
    customer_confirm CHAR,
    delivery_confirm CHAR,
    tracking_number CHAR,
    mail_rec_weight CHAR,
    mail_rec_weight_cd CHAR,
    mail_red_qty CHAR,
    INSERT_USER "1",
    INSERT_USER_DATE sysdate,
    INSERT_STATUS_CODE CONSTANT 'I')

  • calculation of values in sqlloader before loading the data.

    Is it possible, in the sqlloader control file, to compute the value of a particular field and then load that data into a table?

    For example, say I have a, b, c, d, e, f in the csv file.

    Field e's value is based on: if a is null or = 0 then use b; if c is null or = 0 then use d;
    and finally e = <left part> || '-' || <right part>

    Yes. You can use SQL functions in your control file, such as TRIM in the example below. You could solve your problem using NVL, I think...

    LOAD DATA
    
    INFILE 'city.txt'
    APPEND
    INTO TABLE CITYOR_TOWN
    
    (
     COTN_NAM             POSITION(7:34) "TRIM(:COTN_NAM)",
     COTN_CITYCODE      POSITION(1:6),
     COTN_ID                SEQUENCE (COUNT,1),
     COTN_STT           POSITION(35:36),
     COTN_CNTY          POSITION(37:39),
    ...
     FROM_DATE POSITION(84:91) DATE "YYYYMMDD" "replace(:MED_FROM_DATE,'00000000',null)",
     H_AVGESTAY           POSITION(19:21) ":H_AVGESTAY/10",
     CLTYPE      CONSTANT "RXTZASCode",
    ...
    )
    

    Published by: Zoltan Kecskemethy on May 27, 2013 17:24
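Applying that NVL idea to the logic in the question, the field clause could look something like this sketch. Column names a–f are from the post, and the exact concatenation rule is my reading of the description, so treat this as a shape rather than a drop-in answer:

```sql
-- Sketch only: field e derived in the control file from the other fields.
-- "if a is null or 0 then b" on the left, "if c is null or 0 then d" on
-- the right, joined with '-'.
(
  a,
  b,
  c,
  d,
  e "DECODE(NVL(:a,'0'), '0', :b, :a) || '-' || DECODE(NVL(:c,'0'), '0', :d, :c)",
  f
)
```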

  • How to get the current date in a sqlloader script

    I have a sqlloader script that loads data into a table with a rather large number of columns... my load script looks like this:
    options (skip=2)
    load data
    badfile 'C:\CAP_SPS\Batchscripts\psafixbad.txt'
    discardfile 'C:\CAP_SPS\Batchscripts\psafixdiscard.txt'
    append
    into table sps_psafix
    when record_layer='Project'
    fields terminated by ','
    trailing nullcols
    (
    record_layer position(1),
    file_name,
    attr1,
    attr2 filler,
    attr3 filler,
    attr4 filler,
    attr5 filler,
    attr6 filler,
    attr7 filler,
    attr8 filler,
    ...
    ...
    read_flag constant '0',
    select current_date from dual
    )
    the last column is a load-timestamp column, and I want it to capture the run time automatically instead of being updated manually. But sqlloader errors out, saying -

    SQL*Loader-350: Syntax error at line 268.
    Expecting "," or ")", found "current_date".
    select current_date from dual
    ^

    Any idea how to get the current date into the field?

    Appreciate your inputs.

    Thank you
    Sanders.
    options (skip=2)
    load data
    badfile 'C:\CAP_SPS\Batchscripts\psafixbad.txt'
    discardfile 'C:\CAP_SPS\Batchscripts\psafixdiscard.txt'
    append
    into table sps_psafix
    when record_layer='Project'
    fields terminated by ','
    trailing nullcols
    (
    record_layer position(1),
    file_name,
    attr1,
    attr2 filler,
    attr3 filler,
    attr4 filler,
    attr5 filler,
    attr6 filler,
    attr7 filler,
    attr8 filler,
    ...
    ...
    read_flag constant '0',
    date_column_name SYSDATE
    )
    

    SY.

  • How to load date and time from a text file into an oracle table using sqlloader

    Hi friends

    I need you to show me what I'm missing when loading date and time from a text file into an oracle table using sqlloader.

    This is my data (c:\external\my_data.txt):
    7369,SMITH,17-NOV-81,09:14:04,CLERK,20
    7499,ALLEN,01-MAY-81,17:06:08,SALESMAN,30
    7521,WARD,09-JUN-81,17:06:30,SALESMAN,30
    7566,JONES,02-APR-81,09:24:10,MANAGER,20
    7654,MARTIN,28-SEP-81,17:24:10,SALESMAN,30
    my table in the database is emp2:
    create table emp2 (empno number,
                      ename varchar2(20),
                      hiredate date,
                      etime date,
                      ejob varchar2(20),
                      deptno number);
    the control file is at this path (c:\external\ctrl.ctl):
    load data
     infile 'C:\external\my_data.txt'
     into table emp2
     fields terminated by ','
     (empno, ename, hiredate, etime, ejob, deptno)
    This is the error:
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    
    Commit point reached - logical record count 5
    
    C:\>
    any help is appreciated

    Thank you

    Published by: user10947262 on May 31, 2010 09:47

    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)

    Try

    load data
     infile 'C:\external\my_data.txt'
     into table emp2
     fields terminated by ','
     (empno, ename, hiredate, etime "to_date(:etime,'hh24:mi:ss')", ejob, deptno)
    

    This is the error:

    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    
    Commit point reached - logical record count 5
    
    C:\>
    

    That isn't an error; you can see the actual errors in the log file and the bad file.

  • load data from csv file into table

    Hello

    I'm working on Oracle 11 g r2 on UNIX platform.

    We have a requirement to load data into a table from a flat file, but with one condition: we need to compare on the primary key field; if it already exists in the table then update another field, and if the record does not exist then we need to insert it.

    How can we achieve this?

    Use SQLLoader to load the CSV file data into a staging table.

    Then use the SQL MERGE command to insert/update rows from the staging table into the target table.

    Hemant K Collette
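A sketch of the staging-table-plus-MERGE step described above. The table and column names here are hypothetical, and the MERGE assumes one row per key in the staging table:

```sql
-- After SQL*Loader has filled STG_CUSTOMERS from the flat file:
MERGE INTO customers t
USING stg_customers s
   ON (t.customer_id = s.customer_id)   -- compare on the primary key field
 WHEN MATCHED THEN
   UPDATE SET t.status = s.status       -- key exists: update the other field
 WHEN NOT MATCHED THEN
   INSERT (customer_id, status)         -- key absent: insert the record
   VALUES (s.customer_id, s.status);
```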

  • SQLLOAD with various Charactersets

    Hello

    I'm working on an Oracle EBS project (version 11i) with an Oracle 11g DB hosted on Oracle Linux.

    The database character set is UTF8, since we have some Polish users.

    Logically, our ERP is interfaced with many systems from which we receive ASCII data files that we need to load into our database with the help of the SQLLOAD utility.

    The problem is that depending on the sending system, the character set of a data file may vary between the following values:

    - EE8MSWIN1250 => files sent by our Polish subsidiary

    - WE8MSWIN1252 => files sent by WINDOWS systems

    - WE8ISO8859P1 => files sent by some Unix systems

    We have developed a specific Linux shell script that submits the SQLLOAD run for each data file with the appropriate "CHARACTERSET" option in the control file.

    The problem is that so far I have not been able to reliably detect the character set of a given data file:

    - the Linux "file -i" command returns "text/plain; charset=iso-8859-1" even for a Windows file encoded with WINDOWS-1252 or WINDOWS-1250.

    - I also tried the Linux iconv command to convert the file to UTF8, but this command succeeds no matter which 'from' characterset we specify (ISO-8859-1 / WINDOWS-1252 / WINDOWS-1250).

    My question:

    How can I determine the character set of a given ASCII data file in order to properly set the CHARACTERSET for SQLLOAD in the control file?

    (in batch mode on Linux)

    Browsers like IE, Chrome or Firefox are able to detect the character set of a web page to display it correctly, so I guess a tool or command exists for this purpose.

    Thanks in advance for your assistance and for sharing your experience.

    Karim

    Toshiba France

    Toshiba France

    Karim,

    There is no completely reliable way to detect the character set of a file.  You can check how this is done by the lcsscan utility, which you will find in your Oracle Database home. This utility uses statistical analysis to guess the character set and the language of the text, but the results are not always good. See the documentation here: http://docs.oracle.com/cd/E16655_01/server.121/e17750/ch11charsetmig.htm#NLSPG982

    My recommendation is to infer the character set of each file from its source. You should have a protocol (an agreement) with the suppliers of these files to receive the files in agreed-upon character sets. Then you can use lcsscan as a quality-check tool to warn you about possible violations of the protocol.

    Thank you

    Sergiusz
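As Sergiusz notes, detection can only be heuristic. One cheap, scriptable pre-check (my own sketch, not an Oracle tool): bytes 0x80-0x9F are defined characters in the WINDOWS-1250/1252 code pages (curly quotes, the euro sign, some Polish letters) but are unused control codes in ISO-8859-1 text, so their presence at least rules out WE8ISO8859P1:

```shell
# Sketch: rule out WE8ISO8859P1 when a file contains bytes 0x80-0x9F.
check_charset() {
  # keep only bytes in the 0x80-0x9F range and count them
  n=$(LC_ALL=C tr -dc '\200-\237' < "$1" | wc -c)
  if [ "$n" -gt 0 ]; then
    echo "not-iso8859-1"     # must be one of the Windows code pages
  else
    echo "inconclusive"      # could be any of the three charsets
  fi
}

# demo with two throw-away sample files
printf 'it\222s a test\n' > win.txt     # 0x92 = WINDOWS-1252 curly apostrophe
printf 'plain ascii text\n' > iso.txt
check_charset win.txt
check_charset iso.txt
```

This only narrows the choice; distinguishing WINDOWS-1250 from WINDOWS-1252 still needs knowledge of the file's source, as recommended above.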

  • SQLLoader performance: using a sequence object vs COUNT

    All,

    I had a look at the Oracle documentation on SQLLoader about using MAX or COUNT to generate a value to insert into a column such as ROW_ID.

    ROW_ID sequence(COUNT,1)
    

    However, I could not find how this feature works or how efficient it is for large batch data loads.

    That's why I'm considering using a sequence object instead of MAX or COUNT.

    Can you give me, or point me towards, a definitive answer on how the COUNT option works?

    Does it do a full table scan just once to get the number of rows and then increment after each insert, or does it count the rows after each insert from the data file?

    I tried to use a sequence object as an alternative, but it didn't like my syntax:

    ROW_ID "Siebel.mysequence.nextval"
    

    I used this example from the forum:

    To use the oracle sequence object when loading data to sql loader

    Thank you

    The following example illustrates a number of things.  It shows that SQL*Loader only scans the table once to get the max or count for a sequence.  It demonstrates this by deliberately putting a row in the data that will be rejected: a sequence value is still assigned to it, and there is no further scan of the table.  Your statement that it did not like your syntax is a bit vague.  It would help to see what result and/or error you got, as well as the data file, control file, create table statement, create sequence statement, the command you used to run SQL*Loader, and whatever else it takes to reproduce the problem.  There are various things that could be the problem.  For example, you cannot use database sequences with a direct path load.  The following example does 2 loads using a database sequence, a SQL*Loader sequence using max, and a SQL*Loader sequence using count.  The first load uses the conventional path and the second load uses the generally faster direct path.  In the second load the database sequence values are null, and the SQL*Loader sequence using count has duplicate values.  I recommend the SQL*Loader sequence using max as the most correct and fastest.

    Scott@orcl12c > host type test.dat

    200,

    a,

    300,

    Scott@orcl12c > host type test.ctl

    load data
    infile test.dat
    append
    into table test_tab
    fields terminated by ','
    trailing nullcols
    (data_col
    , db_seq "test_seq.nextval"
    , sqlldr_max sequence(max,1)
    , sqlldr_count sequence(count,1)
    )

    Scott@orcl12c > create table test_tab
      2  (data_col number
      3  , db_seq number
      4  , sqlldr_max number
      5  , sqlldr_count number)
      6  /

    Table created.

    Scott@orcl12c > insert into test_tab values (100, 5, 15, 1)
      2  /

    1 row created.

    Scott@orcl12c > commit
      2  /

    Commit complete.

    Scott@orcl12c > select * from test_tab
      2  /

      DATA_COL     DB_SEQ SQLLDR_MAX SQLLDR_COUNT
    ---------- ---------- ---------- ------------
           100          5         15            1

    1 row selected.

    Scott@orcl12c > declare
      2    v_start number;
      3  begin
      4    select nvl(max(db_seq), 0) + 1 into v_start from test_tab;
      5    execute immediate 'create sequence test_seq start with ' || v_start;
      6  end;
      7  /

    PL/SQL procedure successfully completed.

    Scott@orcl12c > host sqlldr scott/tiger control = test.ctl log = test.log

    SQL*Loader: Release 12.1.0.1.0 - Production on Fri Nov 22 10:50:29 2013

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used: Conventional

    Commit point reached - logical record count 3

    Table TEST_TAB:

    2 rows loaded successfully.

    Check the log file:

    test.log

    for more information on the load.

    Scott@orcl12c > select * from test_tab order by sqlldr_max
      2  /

    DATA_COL DB_SEQ SQLLDR_MAX SQLLDR_COUNT

    ---------- ---------- ---------- ------------

    100          5         15            1

    200          6         16            2

    300          8         18            4

    3 rows selected.

    Scott@orcl12c > host sqlldr scott/tiger control=test.ctl log=test.log direct=true

    SQL*Loader: Release 12.1.0.1.0 - Production on Fri Nov 22 10:50:29 2013

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used: Direct

    Load completed - logical record count 3.

    Table TEST_TAB:

    2 rows loaded successfully.

    Check the log file:

    test.log

    for more information on the load.

    Scott@orcl12c > select * from test_tab order by sqlldr_max
      2  /

    DATA_COL DB_SEQ SQLLDR_MAX SQLLDR_COUNT

    ---------- ---------- ---------- ------------

    100          5         15            1

    200          6         16            2

    300          8         18            4

    200                    19            4

    300                    21            6

    5 rows selected.

  • How to find the broken blob data

    Hello

    I am using Oracle 10g Express Edition with an Oracle Apex 4.1 front-end, and have created a table with the structure below.

    >> CREATE TABLE "HR_EMPLOYEE_DETAILS"
    >> ("ID" NUMBER NOT NULL ENABLE,
    >> "HR_ID" NUMBER NOT NULL ENABLE,
    >> "PHOTO_BLOB" BLOB,
    >> "MIME_TYPE" VARCHAR2 (64)
    >> )


    I wrote a sqlloader program to load all the images into the table above. There are a few rows of data where the photos did not get uploaded: the id shows but the photo is empty. How can I identify such rows? Please advise.

    Thank you
    Sudhir

    Your photo_blob column can have NULL values; in that case you can find them with something like:

    select id
      from hr_employee_details
     where photo_blob is null
    

    Or the photo_blob was populated with a valid LOB locator that contains no data. These cases can be found with something like:

    select id
      from hr_employee_details
     where dbms_lob.getlength(photo_blob) = 0
    

    Or you could have a case where some of the bytes actually made it into the blob, but not all of them, resulting in an invalid image. That is going to be harder to find, so try the other two cases first ;-)

  • problem sqlloader

    Hello

    I can't load a simple csv file using sqlloader. Please note that the .csv file has two columns; the first column of the .csv file should go into both the first and second columns of the table.

    The last column of the table is not getting populated.

    Any help please.

    SQL > desc testtab
    Name Null? Type
    ----------------------------------------- -------- ----------------------------
    T1D DATE
    T2D DATE
    VAL VARCHAR2 (20)


    $> cat testtab_cntl.txt
    LOAD DATA
    INFILE 'testtab.csv'
    BADFILE 'testtab.bad'
    DISCARDFILE 'testtab.dsc'
    INSERT INTO TABLE testtab
    fields terminated by ','
    trailing nullcols
    (
    T1D TIMESTAMP 'YYYY-MM-DD HH24:MI:SS.FF3' NULLIF t1d = 'NULL',
    T2D "to_timestamp(:t1d, 'YYYY-MM-DD HH24:MI:SS.FF3')",
    Val)

    $> cat testtab.csv
    2012-01-25 00:00:00.000, LS
    2012-01-25 00:00:00.000, LW

    $> sqlldr USERID=scott/tiger@testdb CONTROL=testtab_cntl.txt LOG=log.log


    prod@dyl06635app11:/Home/prod > cat log.log
    ...
    ...
    Control file: testtab_cntl.txt
    Data file: testtab.csv
    Bad file: testtab.bad
    Discard file: testtab.dsc
    (Allow all discards)

    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional

    Table TESTTAB, loaded from every logical record.
    Insert the option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect

    Column Position Len term Encl. Datatype name
    ------------------------------ ---------- ----- ---- ---- ---------------------
    T1D                           FIRST     *   ,        DATETIME YYYY-MM-DD HH24:MI:SS.FF3
        NULLIF t1d = 0X4e554c4c (character 'NULL')
    T2D                            NEXT     *   ,        CHARACTER
        SQL string for column : "to_timestamp(:t1d, 'YYYY-MM-DD HH24:MI:SS.FF3')"
    VAL                            NEXT     *   ,        CHARACTER


    Table TESTTAB:
    2 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.

    ...

    SQL > select * from testtab;

    T1D T2D VAL
    ----------- ----------- --------------------
    JANUARY 25, 2012 JANUARY 25, 2012
    JANUARY 25, 2012 JANUARY 25, 2012

    Published by: chakra on August 30, 2012 19:59

    Swap entries 2 and 3 (t2d and val) in the control file, and it works fine.
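
    To spell out why the swap works (this is a sketch of the reordered control file, not the poster's exact file): SQL*Loader maps input fields to column entries positionally, so with t2d listed second it consumed the csv's second field (LS/LW). Listing val second lets it read that field, and t2d, listed last, consumes a third field that doesn't exist; TRAILING NULLCOLS allows that, and its value is still derived from the already-bound :t1d.

    ```sql
    LOAD DATA
    INFILE 'testtab.csv'
    BADFILE 'testtab.bad'
    DISCARDFILE 'testtab.dsc'
    INSERT INTO TABLE testtab
    fields terminated by ','
    trailing nullcols
    (
    -- csv field 1 feeds t1d
    t1d TIMESTAMP 'YYYY-MM-DD HH24:MI:SS.FF3' NULLIF t1d = 'NULL',
    -- csv field 2 feeds val
    val,
    -- t2d matches no input field (TRAILING NULLCOLS covers it);
    -- its value comes from the already-bound :t1d
    t2d "to_timestamp(:t1d, 'YYYY-MM-DD HH24:MI:SS.FF3')"
    )
    ```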

  • date format sqlldr

    Dear experts,
    I'm trying to load a txt file into a table via sqlldr. I receive the error "Rejected - Error on table TEST_TABLE, column PERIOD_START_TIME. ORA-01843: not a valid month".
    The date in the flat txt file is in DD/MM/YYYY format; in the table output I want to have DD.MM.YYYY.
    It seems that sqlldr interprets the input date in the wrong format: when the input is 08/07/2012, the output is 07.08.2012, so it assumes the input is in MM/DD/YYYY instead of DD/MM/YYYY format. And when the day is greater than 12 (13, 14 etc.) I get an error message.

    Do you know where and how to force SQLLDR to interpret it correctly? Is it somehow related to NLS_LANG settings?



    Control file looks as follows:

    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE test.txt
    APPEND
    INTO TABLE TEST_TABLE
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (NE_ID,
    LAC_ID,
    PERIOD_START_TIME DATE 'DD.MM.YYYY HH24:MI:SS',
    PERIOD_DURATION,
    NSCURRENT,
    NSAVERAGE
    )

    Table definition:

    NE_ID NUMBER
    LAC_ID NUMBER
    PERIOD_START_TIME DATE
    PERIOD_DURATION NUMBER
    NSCURRENT NUMBER
    NSAVERAGE NUMBER

    INPUT DATA:
    NE_ID; LAC_ID; PERIOD_START_TIME; PERIOD_DURATION; NSCURRENT; NSAVERAGE;
    576527001; 37000; 16/07/2012 09:00; 60; 24846 24956;
    576527001; 37000; 08/07/2012 10:00, 60; 1; 1


    Thanks in advance for any advice
    Rgds
    Lukasz

    Dates are stored as an internal number, not a string.
    If the file contains dates in DD/MM/YYYY format, you need to specify that as the format, not DD.MM.YYYY.
    Solution: correct the control file.
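
    A sketch of the corrected clause, assuming the sample rows above (e.g. 16/07/2012 09:00) are representative of the whole file:

    ```sql
    -- mask must match the input file (slashes, no seconds),
    -- not the display format you want in the output
    PERIOD_START_TIME DATE 'DD/MM/YYYY HH24:MI',
    ```

    The display format you see when querying the table is controlled separately by NLS_DATE_FORMAT or TO_CHAR, not by the control file.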

    Other than that: this has nothing to do with SQL, and there is a separate forum for Oracle utilities like SQL*Loader, so post in the appropriate forum.

    --------------
    Sybrand Bakker
    Senior Oracle DBA

  • SQL Loader failed to load the same source sometimes data file

    I encounter different types of data loading errors when trying to load data. What makes it really annoying is that I'm not able to identify the real culprit, since the success of the load depends on the number of lines in the source data file but not on its content. I use Toad's dataset export feature to create the delimited data set. When I take the first 50 lines, the data loads into the target table successfully. When I take the first 150 lines, the load fails indicating:
    Record 13: Rejected - Error on table ISIKUD, column SYNLAPSI.
    ORA-01722: invalid number
    I can see from the .bad file that the same line was loaded successfully when the 50-row data file was used. The content of that column for this particular row is NULL (no string).
    I suspect that Toad generates a faulty delimited text file when taking 150 lines. For confidentiality reasons I can't show the data file. What can I do? How can I investigate this problem further? The size of the table that must be loaded by SQL Loader is almost 600 MB. I use Windows XP.

    Published by: totalnewby on June 30, 2012 10:17

    I do not believe the 11g sqlloader client allows you to load into a 10g database. Can you run sqlldr on the 10g database server itself? Please also post the rest of the information I asked for.

    HTH
    Srini

  • Conditional insertion in staging table using sqlloader

    Hello
    In Oracle Apps I submit a concurrent program that calls sqlloader and inserts into a staging table.
    This table consists of 30 columns. The program has an input parameter. If the parameter value = REQUIRED, then it must insert into the first three columns of the staging table. If it is APPROVED, then it must insert into the first 10 columns of the same table.
    The data file is a pipe-delimited file that may or may not have all of the possible values: for REQUIRED, I may not have all three column values.

    The first thing that comes to my mind as an approach to this scenario is to use UTL_FILE. It provides a file handle that can read from and write to a file. You can use it within your PL/SQL code and add IF conditions to handle your different requirements.
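
    Another sketch that stays within SQL*Loader (the column names here are hypothetical, not from the original post): because the file is pipe-delimited and TRAILING NULLCOLS is available, a single control file listing all 10 columns can cover both cases, since rows that carry only three values simply leave the remaining columns NULL.

    ```sql
    LOAD DATA
    INFILE 'staging.dat'
    APPEND
    INTO TABLE staging_table
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (
    -- first three columns are always present (the REQUIRED case)
    col1,
    col2,
    col3,
    -- columns 4-10 appear only in APPROVED files;
    -- TRAILING NULLCOLS leaves them NULL otherwise
    col4, col5, col6, col7, col8, col9, col10
    )
    ```

    Whether this fits depends on whether the two parameter values really differ only in how many trailing columns are populated; if the logic is more involved, the UTL_FILE approach above gives you full control.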
