SQL Loader: Problem with WHEN

I'm using Oracle 11g on Windows XP.

I'm trying to load data with the control file below:
OPTIONS (SKIP=0, DIRECT=FALSE, PARALLEL=FALSE, BINDSIZE=50000, errors=999999,ROWS=200, READSIZE=65536)
LOAD DATA

APPEND
INTO TABLE v_table
when COL_3  = 'XXXX'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
(
COL_1  "trim(:COL_1)",
COL_2  "trim(:COL_2)",
COL_3  "trim(:COL_3)",
COL_4  "trim(:COL_4)",
COL_5  "trim(:COL_5)",
COL_6  "trim(:COL_6)",
COL_7  "trim(:COL_7)"
)

INTO TABLE v_table
APPEND
when  COL_3 = 'YYY'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
(
COL_1  "trim(:COL_1)",
COL_2  "trim(:COL_2)",
COL_3  "trim(:COL_3)",
COL_4  "trim(:COL_4)",
COL_5  "trim(:COL_5)",
COL_6  "trim(:COL_6)",
COL_7  "trim(:COL_7)"
)
Here is the sample data in the data file:
33432|"ORACLE"|"XXXX"|"555827             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
33433|"ORACLE"|"XXXX"|"555828             "|"317564"|" "|""|"ORACLE "|2011-07-24-15.37.11.879915|0001-01-01-01.01.01.000001
33434|"ORACLE"|"XXXX"|"555829             "|"317564"|" "|""|"ORACLE "|2011-07-10-15.37.11.879915|0001-01-01-01.01.01.000001
33435|"ORACLE"|"XXXX"|"555830             "|"317564"|" "|""|"ORACLE "|2011-07-22-15.37.11.879915|0001-01-01-01.01.01.000001
33436|"ORACLE"|"XXXX"|"555831             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
33437|"ORACLE"|"XXXX"|"555832             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
40048|"SAS"|"ZZZ "|"1017838            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
40049|"SAS"|"ZZZ "|"1017839            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
40050|"SAS"|"ZZZ "|"1017840            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20046|"SUNUSA"|"YYY "|"1017836            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20047|"SUNUSA"|"YYY "|"1017837            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20048|"SUNUSA"|"YYY "|"1017838            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20049|"SUNUSA"|"YYY "|"1017839            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20050|"SUNUSA"|"YYY "|"1017840            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
The issue is:
When I load data with the above control file, only the rows matching WHEN COL_3 = 'XXXX' are loaded. If I comment out the block with COL_3 = 'XXXX', then the second block takes effect (WHEN COL_3 = 'YYY'). But I am unable to load the data for both XXXX and YYY in a single run. Can someone help me with this, please?

Give this a try.

  ...
  when  COL_3  = 'XXXX' and COL_3 = 'YYY'
  ...
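
Another thing worth trying (a sketch based on documented SQL*Loader behaviour, not tested against this exact file): keep both INTO TABLE blocks, but add POSITION(1) to the first field of the second block. With delimited fields, a second INTO TABLE clause continues scanning from where the previous one stopped, so without POSITION(1) its WHEN test never sees the start of the record.

  INTO TABLE v_table
  APPEND
  when COL_3 = 'YYY'
  fields terminated by "|" optionally enclosed by '"'
  trailing nullcols
  (
  COL_1  POSITION(1) "trim(:COL_1)",  -- POSITION(1) restarts field scanning at the beginning of the record
  COL_2  "trim(:COL_2)",
  COL_3  "trim(:COL_3)",
  COL_4  "trim(:COL_4)",
  COL_5  "trim(:COL_5)",
  COL_6  "trim(:COL_6)",
  COL_7  "trim(:COL_7)"
  )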

Tags: Database

Similar Questions

  • SQL loader, problem with the POSITION & EXTERNAL

    Hi Oracle gurus.

    I have a problem with POSITION and EXTERNAL.

    I have a data file with 1 million records.

    The data is pipe-delimited and optionally enclosed by double quotes.

    Some lines fail to load due to data errors, i.e. the data contains embedded double quotes.

    We have now decided to use POSITION & EXTERNAL, but I am unable to write the control file.

    Any help would be much appreciated.

    The table name is person_memo, with 4 columns:

    ID_PERSON VARCHAR2(9 BYTE)
    TX_MEMO VARCHAR2(1000 BYTE)
    ID_USER VARCHAR2(20 BYTE)
    TM_STAMP TIMESTAMP(6)

    My control file is:

    LOAD DATA
    APPEND INTO TABLE PERSON_MEMO
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      ID_PERSON POSITION(1) "TRIM(:ID_PERSON)"
    , TX_MEMO POSITION(10) CHAR(1000) "TRIM(:TX_MEMO)"
    , ID_USER POSITION(1012) "TRIM(:ID_USER)"
    , TM_STAMP POSITION(1031) EXTERNAL(26) "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
    )

    Sample data file:

    "04725813"|"aka "Little Will""|"095TDEAN"|2013-02-21-11.13.44.632000
    "05599076"|"FIRST NAME - ADDED A 'T' AS ON THE REG MAP"|"016DDEAL"|2014-04-11-10.06.35.598000

    Thanks and greetings

    REDA

    In your control file, EXTERNAL (26) must be INTEGER EXTERNAL (26).

    Unfortunately, this forum destroyed the spacing, so I can't tell whether or not your data is positional.  If it is positional, then you can use positions, but you need both start and end positions.  The positions that you posted have nothing to do with the data you posted.  If you use positions, then you can eliminate the delimiters and the leading and trailing quotation marks by using those positions.

    If your data is not positional and you have quotation marks within your quoted data, but no pipe delimiters within the data, then you can only use the delimiters and trim the leading and trailing quotation marks from the data.

    I have demonstrated both methods below, using test1.ctl for the positional method and test2.ctl for the delimited method.

    Scott@orcl12c > host type test.dat

    "04725813"|"aka "Little Will""|"095TDEAN"|2013-02-21-11.13.44.632000
    "05599076"|"FIRST NAME - ADDED A 'T' AS ON THE REG MAP"|"016DDEAL"|2014-04-11-10.06.35.598000

    Scott@orcl12c > host type test1.ctl

    LOAD DATA
    APPEND INTO TABLE PERSON_MEMO
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( ID_PERSON POSITION(02:10)
    , TX_MEMO POSITION(14:59)
    , ID_USER POSITION(63:82)
    , TM_STAMP POSITION(85:110) INTEGER EXTERNAL(26)
      "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
    )

    Scott@orcl12c > host type test2.ctl

    LOAD DATA
    APPEND INTO TABLE PERSON_MEMO
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( ID_PERSON "TRIM('\"' FROM :ID_PERSON)"
    , TX_MEMO CHAR(1000) "TRIM('\"' FROM :TX_MEMO)"
    , ID_USER "TRIM('\"' FROM :ID_USER)"
    , TM_STAMP INTEGER EXTERNAL(26)
      "DECODE(:TM_STAMP, NULL, NULL, TO_TIMESTAMP(:TM_STAMP, 'YYYY-MM-DD-HH24.MI.SS.FF'))"
    )

    Scott@orcl12c > create table person_memo
      2  (ID_PERSON  VARCHAR2(9 BYTE)
      3  , TX_MEMO   VARCHAR2(1000 BYTE)
      4  , ID_USER   VARCHAR2(20 BYTE)
      5  , TM_STAMP  TIMESTAMP(6))
      6  /

    Table created.

    Scott@orcl12c > host sqlldr scott/tiger control=test1.ctl data=test.dat log=test1.log

    SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used:      Conventional

    Commit point reached - logical record count 2

    Table PERSON_MEMO:

    2 Rows successfully loaded.

    Check the log file:

    test1.log

    for more information about the load.

    Scott@orcl12c > select * from person_memo
      2  /

    ID_PERSON

    ---------

    TX_MEMO

    --------------------------------------------------------------------------------

    ID_USER

    --------------------

    TM_STAMP

    ---------------------------------------------------------------------------

    04725813

    aka "Little Will"

    095TDEAN

    21 FEBRUARY 13 11.13.44.632000 AM

    05599076

    FIRST NAME - ADDED A 'T' AS ON THE REG MAP

    016DDEAL

    11 APRIL 14 10.06.35.598000 AM

    2 rows selected.

    Scott@orcl12c > truncate table person_memo
      2  /

    Table truncated.

    Scott@orcl12c > host sqlldr scott/tiger control=test2.ctl data=test.dat log=test2.log

    SQL*Loader: Release 12.1.0.1.0 - Production on Thu May 15 10:53:11 2014

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used:      Conventional

    Commit point reached - logical record count 2

    Table PERSON_MEMO:

    2 Rows successfully loaded.

    Check the log file:

    test2.log

    for more information about the load.

    Scott@orcl12c > select * from person_memo
      2  /

    ID_PERSON

    ---------

    TX_MEMO

    --------------------------------------------------------------------------------

    ID_USER

    --------------------

    TM_STAMP

    ---------------------------------------------------------------------------

    04725813

    aka "Little Will"

    095TDEAN

    21 FEBRUARY 13 11.13.44.632000 AM

    05599076

    FIRST NAME - ADDED A 'T' AS ON THE REG MAP

    016DDEAL

    11 APRIL 14 10.06.35.598000 AM

    2 rows selected.

  • java.sql.SQLException: problems with loading/missing native methods library

    Hi all

    Please let me know how to fix the exception "java.sql.SQLException: problems with loading/missing native methods library: no ttJdbc in java.library.path".

    Thank you
    Prabhu

    Published by: Nina Prabhu on November 20, 2012 02:12

    Hi Prabhu,

    You probably need to set the LD_LIBRARY_PATH variable, like the following:

    export LD_LIBRARY_PATH=$TIMESTEN_HOME/ttoracle_home/instantclient_11_1
    

    Best regards
    Gennady

  • Import data... Wizard creates the file SQL Loader ctl with columns out of order

    4.1.1.19 SQL Developer version. Connected to Oracle XE to test this.

    I was trying to figure out what the problem was with the data in my import file when I finally realized that the Data Import Wizard did not honor how I mapped the columns at all. The SQL Loader ctl file generated by the wizard expects the columns in my data file to match the order in which they appear in the table definition, not how they were mapped in the wizard. Manually editing the ctl file is a workaround. Has anyone else seen this?

    I see that this is a bug.
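
    For anyone hitting the same thing, manually editing the generated ctl file basically means listing the fields in the order they occur in the data file rather than in table-definition order; a minimal sketch with made-up object names:

    LOAD DATA
    INFILE 'import.csv'
    APPEND
    INTO TABLE my_table                 -- hypothetical table
    FIELDS TERMINATED BY ',' TRAILING NULLCOLS
    ( col_b                             -- fields listed in data-file order,
    , col_a                             -- not in table-definition order
    , unused_field FILLER               -- a file column that should not be loaded
    )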

  • SQL Loader problem

    Hi, I'm using Oracle version 11.2.0.3.0, and I'm trying to load a file into my DB table using the SQL Loader utility. Below is my table structure; two of the columns will be constants, as noted in the table structure, and the FILE_DATE column must be a combination of two fields (Date and Time) from the flat file to get the proper format for that column. I get the error message below and am not able to load the data using the control file shown, so I need some help.

    sample data file

    OrderDate, name, size, Records, Date, Time
    06202014, authlogfile06202014.txt, 40777214, 198915, June 21 at 03:51
    06202014, transferfile06202014.txt, 372144, 2255, June 21 at 01:34
    06202014, balancefile06202014.txt, 651075, 10343, June 21 at 03:28

    The table structure

    Create table file_stats
    ( systemtypecode VARCHAR2(4000)   -- this will be a hardcoded value 'CBD'
    , odate          DATE
    , filename       VARCHAR2(4000)
    , filesize       NUMBER(20,0)
    , noofrecords    NUMBER(20,0)
    , file_date      VARCHAR2(4000)
    , created_date   DATE             -- this will be populated with SYSDATE
    );

    Here's my control file

    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'files.csv'
    APPEND
    INTO TABLE file_stats
    FIELDS TERMINATED BY ','
    ( systemtypecode CONSTANT 'CBD'
    , odate DATE 'MMDDYYYY'
    , filename CHAR
    , filesize INTEGER
    , noofrecords INTEGER
    , file_date_ddmon BOUNDFILLER CHAR
    , file_date "to_date('2014' || :file_date_ddmon || :file_date, 'YYYYMON DDHH24:MI')"
    , created_date CONSTANT 'sysdate'
    )

    When I run the command below, all the records are rejected with the error shown:

    sqlldr schema1/pwd@db1 control=file_stats.ctl log=file_stats.log bad=file_stats.bad

    ERROR:
    Record 1: Rejected - Error on table FILE_STATS, column FILE_DATE.
    ORA-01843: not a valid month

    You need to add TRAILING NULLCOLS, use INTEGER EXTERNAL instead of INTEGER, and your file_date column needs to have the DATE data type.  Please see the demo below with some additional corrections.  Note that since there is no year in the data file for the file_date column, it defaults to the current year.  If you want 2014, then you need to concatenate that and add YYYY to the date format.

    Scott@orcl12c > HOST TYPE files.csv

    OrderDate, name, size, Records, Date, Time

    06202014, authlogfile06202014.txt, 40777214, 198915, June 21 at 03:51

    06202014, transferfile06202014.txt, 372144, 2255, June 21 at 01:34

    06202014, balancefile06202014.txt, 651075, 10343, June 21 at 03:28

    Scott@orcl12c > HOST TYPE file_stats.ctl

    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'files.csv'
    APPEND INTO TABLE file_stats
    FIELDS TERMINATED BY ',' TRAILING NULLCOLS
    ( systemtypecode CONSTANT 'CBD'
    , odate DATE 'MMDDYYYY'
    , filename CHAR
    , filesize INTEGER EXTERNAL
    , noofrecords INTEGER EXTERNAL
    , file_date_ddmon BOUNDFILLER CHAR
    , file_date DATE 'Mon DDHH24:MI' ":file_date_ddmon || :file_date"
    , created_date "SYSDATE"
    )

    Scott@orcl12c > CREATE TABLE file_stats
      2  ( systemtypecode VARCHAR2(14)
      3  , odate          DATE
      4  , filename       VARCHAR2(24)
      5  , filesize       NUMBER(20,0)
      6  , noofrecords    NUMBER(20,0)
      7  , file_date      DATE
      8  , created_date   DATE)
      9  /

    Table created.

    Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL=file_stats.ctl LOG=file_stats.log BAD=file_stats.bad

    SQL*Loader: Release 12.1.0.1.0 - Production on Thu Feb 5 15:26:49 2015

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used:      Conventional

    Commit point reached - logical record count 3

    Table FILE_STATS:

    3 Rows successfully loaded.

    Check the log file:

    file_stats.log

    for more information about the load.

    Scott@orcl12c > SELECT * FROM file_stats
      2  /

    SYSTEMTYPECODE  ODATE                  FILENAME                  FILESIZE  NOOFRECORDS  FILE_DATE              CREATED_DATE
    --------------  ---------------------  ------------------------  --------  -----------  ---------------------  --------------------------
    CBD             Friday, June 20, 2014  authlogfile06202014.txt   40777214       198915  Sunday, June 21, 2015  Thursday, February 5, 2015
    CBD             Friday, June 20, 2014  transferfile06202014.txt    372144         2255  Sunday, June 21, 2015  Thursday, February 5, 2015
    CBD             Friday, June 20, 2014  balancefile06202014.txt     651075        10343  Sunday, June 21, 2015  Thursday, February 5, 2015

    3 rows selected.

  • Problem with when-validate-trigger

    Hi all

    I work with form of oracle 10g,

    I developed a form with two blocks: a Query_find block and a Guarantee_block (this is my main block). The query find block has 5 fields, for example Po_number, Guarantee_number, Guarantee_type, etc.

    In the query find block I have buttons called NEW and FIND. When we enter a Po_number and click FIND, it fetches the details for that PO number and moves to Guarantee_block with all the guarantee details.

    When the user clicks the NEW button, it also moves to the guarantee block, and the user needs to enter new data and save. So I wrote some simple validations for one field: when the user leaves the guarantee number field empty and tries to move on, it raises a message; likewise, if he tries to enter an existing guarantee number, he gets a message. That works fine.

    But my problem is that when the user enters the po_number in the Query_find block and clicks the Find button, it throws this message ("guarantee number already exists"). I don't know how this happens; I wrote this procedure inside a package and call it only in the WHEN-VALIDATE-ITEM trigger of that one item. Can anyone please tell me what is wrong with this code, and why this trigger fires automatically when we click the Find button?

    PROCEDURE C_GUARANTEE_NO (event VARCHAR2)
    IS
      lv_count NUMBER;
    BEGIN
      IF (event = 'WHEN-VALIDATE-ITEM')
      THEN
        IF :BANK_GUARANTEE_BLK.C_GUARANTEE_NO IS NULL
        THEN
          fnd_message.set_string ('Enter guarantee number');
          fnd_message.show;
          RAISE form_trigger_failure;
        ELSE
          SELECT COUNT (1)
            INTO lv_count
            FROM xxbgs_bank_guarantee_master
           WHERE c_guarantee_number = :BANK_GUARANTEE_BLK.C_GUARANTEE_NO;
          IF lv_count > 0 THEN
            fnd_message.set_string ('This guarantee number already exists');
            fnd_message.show ();
            RAISE form_trigger_failure;
          END IF;
        END IF;
      END IF;
    END C_GUARANTEE_NO;


    Regards
    Srikkanth

    Hello,
    Your design is quite complex to me.
    Your design inserts data into your BANK_GUARANTEE_BLK, so the validation trigger fires.
    You could do it another way using an LOV.

    Hoping I have understood correctly...

    Published by: HamidHelal on January 25, 2012 22:44

  • Temporary solution to charging problem with Satellite C650

    Here I provide a temporary solution to the charging problem with the Satellite C650 which I have faced.

    First of all, I updated my BIOS. After that, I removed the battery and plugged in AC power for 2-3 minutes.

    Then I replaced the battery and it charged.

    However, the battery only charged up to 99%; after that, charging stopped, the level stayed at 99%, and the battery indicator showed "plugged in, not charging."

    Therefore, each time I have to repeat the procedure above to charge my battery.

    If anyone has a solution to the above problem, it would be great to hear it.

    Thank you very much!

    Thanks for your posting.
    To be honest, I really don't know what else you can do except test it with a new battery.

    I think your laptop model should still have a valid warranty, right?
    If so, contact the nearest Toshiba service provider and ask for a replacement.

  • SQL Developer connected to a SQL Server DB (problem with IDENTITY_INSERT)

    I use SQL Developer to connect to a SQL Server DB using the jTDS jar file.

    The problem is that when I try to run "SET IDENTITY_INSERT <TABLE> ON;", SQL Developer tells me that it skips the command.
    Error message in the console: "SQLPLUS Command Skipped: set IDENTITY_INSERT <TABLE> ON".

    I am running:
    SQL Developer v3.2.09 (Build MAIN-09.30)
    jTDS - jtds-1.2.6.jar
    Java jdk1.7.0_05

    Any idea why this might be happening?

    NOTE: I know this could happen if IDENTITY_INSERT were already enabled, but believe me, it isn't: if I try to run an "INSERT INTO ..." command, I get the error that IDENTITY_INSERT is set to OFF for the table.

    Published by: Ali-Star August 29, 2012 11:22

    SET IDENTITY_INSERT is a SQL Server command that is not handled by SQL Developer. You have to pass the SET command as-is to SQL Server using the /*sqldev:stmt*/ prefix, for example:

    drop table test1;
    => table TEST1 dropped.

    create table test1 (col1 int identity (1,1));
    => table TEST1 created.

    insert into test1 (col1) values (10);
    => Error starting at line 4 in command:
    => insert into test1 (col1) values (10)
    => Error at Command Line: 4 Column: 0
    => Error report:
    => SQL Error: Cannot insert explicit value for identity column in table 'test1' when IDENTITY_INSERT is set to OFF.

    When I use the sqldev:stmt prefix, I can now pass the command directly to SQL Server:

    /*sqldev:stmt*/ set identity_insert test1 on;
    => set IDENTITY_INSERT succeeded.

    insert into test1 (col1) values (11);
    => 1 rows inserted.

  • Oracle <-> MS SQL Server, problem with DATE

    HS current options are:

    HS_FDS_TRACE_LEVEL = 255

    HS_FDS_SHAREABLE_NAME=/usr/lib64/libodbc.so

    HS_FDS_FETCH_ROWS = 1

    HS_FDS_SQLLEN_INTERPRETATION = 32

    The following image contains 2 screenshots.

    http://maslovd.no-IP.org/public/doc/date_problem.PNG

    1. Selection from the table on MS SQL Server.

    2. Selection in Oracle over the DB link.

    Any suggestions?

    I am asking here why the DateForm column is returned as data type (-9) SQL_WVARCHAR.

    Could you please post the SQL Server table definition?

    What's the FreeTDS version and on which platform did you configure DG4ODBC?

    - Klaus


  • Garmin iQue 3600 PDA/GPS map loading problems with Vista Home Premium 64

    I can install the Garmin DVD on my HP computer, but the operating system does not recognize the iQue 3600 device on the USB port.
    How can I get 64-bit Vista to recognize the device and download to my Garmin?
    My Toshiba 32-bit system works well, but it takes hours to download a complete set of maps.

    Thank you;  j.Kuss

    http://www.Microsoft.com/Windows/compatibility/Windows-Vista/default.aspx

    Windows Vista Compatibility Center

    First thing to do is to check its Vista compatibility at the link above, and if not to see what patches/solutions are available from its manufacturer...

    Cheers, Mick Murphy - Microsoft Partner

  • SQL*Loader vs. external tables for a single file with several record types (interleaved)

    I have a sample data file (we will get the 'real' one at a later date and update after that) which includes a header, a footer, and 5 record types that have different columns and lengths, identified by the first two characters. The different record types are not all grouped together; on the contrary, some (in particular, two of the types in this example) are interleaved. I was working on a SQL*Loader control file when it was suggested that I use external tables. I know very little about either, so I would like to ask which is the better one to use.

    Scott@orcl12c > host type test.dat

    header line
    ab,123,efg
    cd,hij,456

    Scott@orcl12c > host type test.ctl

    options (skip=1)
    load data
    into table ab truncate when table_name = 'ab'
    fields terminated by ',' trailing nullcols
    (table_name filler position(1), col1, col2)
    into table cd append when table_name = 'cd'
    fields terminated by ',' trailing nullcols
    (table_name filler position(1), col3, col4)

    Scott@orcl12c > create table ab
      2  (col1 number,
      3   col2 varchar2(8))
      4  /

    Table created.

    Scott@orcl12c > insert into ab values (1, 'old data')
      2  /

    1 row created.

    Scott@orcl12c > create table cd
      2  (col3 varchar2(8),
      3   col4 number)
      4  /

    Table created.

    Scott@orcl12c > insert into cd values ('old data', 1)
      2  /

    1 row created.

    Scott@orcl12c > commit
      2  /

    Commit complete.

    Scott@orcl12c > host sqlldr scott/tiger control=test.ctl data=test.dat log=test.log

    SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 27 13:11:47 2014

    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

    Path used:      Conventional

    Commit point reached - logical record count 2

    Table AB:

    1 Row successfully loaded.

    Table CD:

    1 Row successfully loaded.

    Check the log file:

    test.log

    for more information about the load.

    Scott@orcl12c > select * from ab
      2  /

          COL1 COL2
    ---------- --------
           123 efg

    1 row selected.

    Scott@orcl12c > select * from cd
      2  /

    COL3 COL4

    -------- ----------

    old data 1

    hij 456

    2 rows selected.
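
    Since the question also asked about external tables, here is a rough sketch of how the same split could be done that way (the directory object my_dir and the generic column names are assumptions for illustration, not something from this thread): one external table reads every line as text, and plain SQL routes the rows by record type afterwards.

    create or replace directory my_dir as '/some/path';   -- assumed location of test.dat

    create table ext_mixed
    ( rec_type varchar2(2)
    , field1   varchar2(8)
    , field2   varchar2(8)
    )
    organization external
    ( type oracle_loader
      default directory my_dir
      access parameters
      ( records delimited by newline
        skip 1
        fields terminated by ','
        missing field values are null
        (rec_type, field1, field2)
      )
      location ('test.dat')
    )
    reject limit unlimited;

    insert into ab (col1, col2) select to_number(field1), field2 from ext_mixed where rec_type = 'ab';
    insert into cd (col3, col4) select field1, to_number(field2) from ext_mixed where rec_type = 'cd';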

  • SQL Loader Help in 10g

    Hello

    Is it possible to force SQL Loader to abort when a value is not present? I have a data file like this:

    1|XXX123|XXX|20121121|
    4|XXX123|XXX|
    5|XXX123|XXX|1|
    5|XXX123|XXX|2|
    5|XXX123|XXX|
    9|XXX123|XXX|

    Layout:
    record type | batch number | batch desc | date | detail line num | other

    1, 4, 5 and 9 are the record types. If you look at the line 5|XXX123|XXX|1|..., the 1 represents a detail line number. My requirement is: if the detail line number is null for record type 5, then I want to abort the SQL Loader run.

    Is this possible?

    Published by: 940838 on November 21, 2012 23:54

    940838 wrote:
    I think I was not clear in my requirement...

    The question was how to abort the loader if the detail line number is not present in record type 5. It is, however, normal that the detail line num is not required for the other record types.

    Hello

    you were clear, and I did a quick test. Unfortunately you cannot do this check in SQL*Loader, as the WHEN clause in the control file allows AND but not OR.

    Even if you add this check as a constraint on your table and set the maximum number of errors to 0, SQL*Loader will still load the rows that come before that error.

    Let me show you an example:

    (1) Create a table with a constraint saying that, for record_type 5, detail_line_num may not be null.

    CREATE TABLE test
    (
       record_type    INTEGER
     , batch_number   VARCHAR2 (10)
     , batch_desc     VARCHAR2 (10)
     , batch_date     DATE
     , detail_line_num INTEGER
     , other          VARCHAR2 (10)
    );
    
    ALTER TABLE test
      ADD CONSTRAINT check_rec_5
         CHECK (   record_type = 5 AND detail_line_num IS NOT NULL
                OR record_type != 5) ENABLE;
                
    

    In this table, you will not be able to load lines with a record_type = 5 and NULL detail_line_num as this will be considered an error.

    We will prepare your input file:

    1|XXX123|XXX|20121121||
    4|XXX123|XXX|||
    5|XXX123|XXX||1|
    5|XXX123|XXX|||
    5|XXX123|XXX|||
    9|XXX123|XXX|||
    1|XXX123|XXX|20121121||
    4|XXX123|XXX|||
    5|XXX123|XXX||1|
    5|XXX123|XXX||2|
    5|XXX123|XXX|||
    9|XXX123|XXX|||
    1|XXX123|XXX|20121121||
    4|XXX123|XXX|||
    5|XXX123|XXX||1|
    5|XXX123|XXX||2|
    5|XXX123|XXX|||
    9|XXX123|XXX|||
    1|XXX123|XXX|20121121||
    4|XXX123|XXX|||
    5|XXX123|XXX||1|
    5|XXX123|XXX||2|
    

    As you can see, the fourth line of the input file has record_type = 5 and a NULL detail_line_num. That row violates the constraint.

    Here is the control file I used:

    --test.ctl
    load data
    INFILE 'test.dat'
    APPEND
    INTO TABLE test
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (
     record_type     ,
     batch_number    ,
     batch_desc      ,
     batch_date      Date 'YYYYMMDD',
     detail_line_num ,
     other
    )
    

    If I run SQL*Loader and ask it to stop at the first error, like this:

    sqlldr userid=yourname/yourpass@yourdb control=test.ctl errors=0 rows=100
    

    SQL*Loader processes only 3 records, because it encounters an error at line 4 and, having specified errors=0, it will not continue loading. In fact, the process continues until it reaches the commit point (in this case 100 rows), but it does not load any record after the error, nor does it continue reading the file.

    So, if I check the table

    SELECT * FROM test;
    
    RECORD_TYPE BATCH_NUMBER BATCH_DESC BATCH_DATE            DETAIL_LINE_NUM OTHER
    ----------- ------------ ---------- --------------------- --------------- ----------
              1 XXX123       XXX        21-11-2012 00:00:00
              4 XXX123       XXX
              5 XXX123       XXX                                            1           
    

    You will only see the records loaded before the error was reached.

    This cannot be avoided; as documented in the SQL*Loader reference manual:

    Loads discontinued because the maximum number of errors was exceeded


    If the maximum number of errors is exceeded, SQL*Loader stops loading records into any table and the work done so far is committed.

    As you can see, SQL*Loader interrupts processing, but it commits the records loaded before that error in any case.

    Another solution is to create an external table in Oracle and do all the checks you want before copying the data into your database table, as suggested by BluShadow.

    Kind regards.
    Al

  • SQL Loader control file (.ctl) to load data into a table

    Hello

    I have a table 'region' in my schema called dss.
    SQL > desc region
    R_REGIONKEY
    R_NAME
    R_COMMENT
    I want to LOAD data into this region table from a file called "region.tbl" located at d:\tpch\region.tbl, using a CONTROL file that is in the same place, i.e. d:\tpch\region.ctl.

    The region.tbl file contains a few records. A SAMPLE record from region.tbl is as follows:

    1|AMERICA|hs use ironic, even requests. s|

    The region.ctl file contains the following lines to load data using SQL Loader:

    LOAD DATA
    INFILE 'd:\tpch\region.tbl'
    INTO TABLE REGION
    FIELDS TERMINATED BY '|'
    (R_REGIONKEY, R_NAME, R_COMMENT)

    So I go to the command prompt, navigate to the location of the above files and run the command:
    D:\TPCH > sqlldr control=region.ctl userid=dss/dss
    and get the following error:

    SQL*Loader-128: unable to begin a session
    ORA-01017: invalid username/password; logon denied

    I have created the same table under the "system" user, but with that I get a different error:

    SQL*Loader-941: Error during describe of table REGION
    ORA-04043: object REGION does not exist

    I am using Oracle 11g on Windows XP.

    I would appreciate any valuable suggestions.

    Thank you very much.

    Best regards
    Kam

    Can you connect with dss/dss? For example, does this work:

    sqlplus dss/dss
    

    If not, and you are sure that the username/password combination is correct, have you checked things like $ORACLE_SID - i.e. does it point to the right database?

  • Problem with a colon in a LOV value

    PLUG

    Background

    Normally, I wouldn't create a dynamic LOV that has a colon in the value (i.e. the 2nd column of the SELECT).

    However, I am using a dynamic LOV to choose the names of different collections from APEX_COLLECTIONS and display the selected collection in a basic report.

    This is used to help me build "Excel parsers" which are purely SQL based.

    (The interactive report is used to develop the SQL)

    Problem

    When I have the listener configured to put every Excel sheet in its own tab, the collection names are in the format:

    {item name}:{sheet name}

    example:

    P63_FILENAME:SHEET1

    P63_FILENAME:SHEET2

    The colon seems to cause confusion within APEX; I have the LOV setting "Page Action when Value Changed" set to "Redirect and Set Value" to get an "auto-refresh".

    As a result, I only get back the string "P63_FILENAME".

    My question:

    How can I correct the SQL for the LOV and/or the REPORT so that the colon is correctly encoded / decoded and will work?

    SQL for the LOV (item name P62_COLLECTION_NAME):

    select distinct collection_name C_NAME
         , collection_name C_VALUE   -- I tried APEX_ESCAPE.HTML_ATTRIBUTE(), but that also failed
      from apex_collections

    SQL for the report:

    select seq_id, c001, c002, c003, c004, c005, c006, c007, c008, c009, c010
      from apex_collections
     where collection_name = upper(:P62_COLLECTION_NAME)

    Dear Mike,

    Setting "Page Action when Value Changed" to "Submit Page" seems to fix your problem - but to keep your current setting, one option is to replace the colon in your LOV, as well as in the report query, with something else (for example an underscore):

    SQL for the LOV:

    select distinct collection_name C_NAME
         , replace(collection_name, ':', '_') C_VALUE
      from apex_collections

    SQL for the report:

    select * from apex_collections
     where replace(collection_name, ':', '_') = upper(:P62_COLLECTION_NAME)

    Kind regards

    Peter

  • SQL Loader WHEN Condition

    Hello. I am currently using the WHEN option in my SQL Loader script:
    when (01:07) <> '9000001'
    But now I find that I must exclude several strings: '9000001', '9000002', '9000003', '9000004'. Is there a way to do this? Any help would be greatly appreciated.

    Sharpe says:
    Awesome. Thank you. One last question... If I have these strings (for example '9000001', etc.) stored in a different table, is there any way to reject the records matching those values from my load? Something like a nested select statement?

    when (01:07) not in(select mystr from mytable)
    

    Let me know whether referencing a table like that is possible or not. Thank you.

    As far as I know, you can't do that. I believe the operator must be either = or != and the comparison value must be a literal, not a select, function, or expression. You can use SQL*Loader (or an external table, if your data is on the server rather than the client) to load the data into a staging table, then use SQL to insert ... where ... not in (select mystr from mytable).
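
    A rough sketch of that staging-table approach (the table and column names below are made up for illustration):

    -- empty staging copy of the target table; sqlldr loads into STAGE_LOAD with no WHEN clause
    create table stage_load as select * from target_table where 1 = 0;

    -- then filter against the exclusion table while copying into the real table
    insert into target_table
    select *
      from stage_load s
     where substr(s.record_key, 1, 7) not in (select mystr from mytable);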
