Steps to load data into a table with SQL*Loader from Forms 10g

Hi all

I am loading the contents of a .csv file into a table. I am using Oracle Forms 10g, and I have placed the .csv and .ctl files in the C:\ directory on the application server. From the form, I call the HOST command to execute a batch file, but the data does not load and no error is raised.
The batch file is also located in C:\ on the application server.
The code I used is:

HOST('C:\TEST.BAT');

The contents of the batch file are:

C:\oraclexe\app\oracle\product\10.2.0\server\BIN\sqlldr.exe csd/a@iismig control=c:\GL2009_TEST.ctl bad=C:\GL2009_TEST.bad skip=1 log=c:\gl2009_test.log
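If SQL*Loader fails silently when launched from Forms, it helps to make the batch file capture the loader's console output and exit code. A minimal sketch, assuming the paths, user ID and connect string shown above (the console output file name is made up):

@echo off
rem Run SQL*Loader and keep its console output
C:\oraclexe\app\oracle\product\10.2.0\server\BIN\sqlldr.exe csd/a@iismig control=c:\GL2009_TEST.ctl bad=C:\GL2009_TEST.bad skip=1 log=c:\gl2009_test.log > c:\gl2009_test_console.txt 2>&1
rem Record the exit code so a failure is visible even when Forms reports nothing
echo SQL*Loader exit code: %ERRORLEVEL% >> c:\gl2009_test_console.txt

Checking c:\gl2009_test_console.txt and the .log/.bad files after pressing the button shows whether sqlldr ran at all and why rows were rejected.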


The .ctl file contents are:

LOAD DATA
INFILE 'C:\GL2009.CSV'
REPLACE
INTO TABLE F_GL_SMRY_TEMP
FIELDS TERMINATED BY ','
(ref_id, tran_date, dr_amt, cr_amt, acct_code, sub_code, cctr_code, sundry_code, dept_code)


If I run test.bat manually, outside of Forms, it works fine without errors. SQL*Loader is installed on the application server.

Please give suggestions...

Kind regards
Krishna

Hello
Are any bad or log files generated? And when you press the button to run the HOST command, does it take some time, as if it were processing?

-Clément

Tags: Oracle Development

Similar Questions

  • How to load XML data using SQL*Loader?

    Is there a script to load XML data into a table through SQL*Loader?

    Help me out here...

    Thank you

    Can you post your code here so that I can help you? Your control file format should be like this...

    Example code:

    LOAD DATA

    INFILE *

    INTO TABLE po_tab

    APPEND

    XMLTYPE (xmldata)

    FIELDS

    (xmldata CHAR(2000))

    BEGINDATA

    "" http://www.Oracle.com/po/"xmlns: xsi ="http://www.w3.org/2001/XMLSchema-instance"

    "" xsi: schemaLocation = "http://www.oracle.com/PO http://www.oracle.com/scha0/po1.xsd"

    orderDate = "1999-10-20" >

    Alice Smith

    123 maple Street

    Mill Valley

    CA

    90952

    Robert Smith

    8 oak Avenue

    Old town

    PA

    95819

    Hurry, my lawn is going wild!

    Lawn mower

    1

    148.95

    Confirm this is electric

    Baby monitor

    1

    39.98

    1999-05-21

  • SQL*Loader data loading

    Hello
    I have an Oracle 10g on AIX environment. I am trying to load data into a table that has about 200 columns. The flat file consists of records that span 2 lines of the file. When I try to load it, an error is displayed. When I change the flat file so that each record is on a single line, it works. How can I load files where the fields of one record span several lines of the flat file?
    Thanks in advance.

    Hello

    You can take a look at these two options: CONCATENATE and CONTINUEIF. If you always have exactly 2 physical records forming one logical record, you can use CONCATENATE followed by an integer, which tells SQL*Loader how many physical records to combine into one logical record. Use CONTINUEIF if you do not always have that hard limit; it decides based on values in the first record. A small sketch follows the documentation link below.

    You can find the options in the Manual: http://download.oracle.com/docs/cd/E11882_01/server.112/e16536/ldr_control_file.htm#i1005509
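    For instance, a minimal control file using CONCATENATE might look like this (a sketch; the file, table and column names are illustrative, not from the original post):

    -- combine every 2 physical lines into one logical record
    LOAD DATA
    INFILE 'wide_records.dat'
    CONCATENATE 2
    INTO TABLE wide_table
    APPEND
    FIELDS TERMINATED BY ','
    (col1, col2, col3)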

    Herald tiomela
    http://htendam.WordPress.com

  • SQL*Loader Date show

    How can I get SQL*Loader to recognize AM/PM? My control file is below. The error I get is: syntax error on line 14. Expecting Field_Name, found ")".

    OPTIONS (SKIP = 0)
    LOAD DATA
    INFILE '/ftp_data/labor_scheduling/kronos_punch_temp/kronos_punches.csv'
    BADFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.bad'
    DISCARDFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.dsc'

    INTO TABLE "STAFFING"."KRONOS_TEMP_PUNCHES"
    TRUNCATE
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (EMPLOYE_ID,
    SCHEDULE_DATE DATE "MM/DD/YY",
    PUNCH DATE "DD/MM/YY HH:MI AM"
    )

    Or try this->

    PUNCH DATE to_date(:PUNCH,'MM/DD/YY HH:MI AM')
    

    Kind regards.

    LOULOU.
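    For reference, the AM/PM mask can also be given directly on the column definition; a sketch using the column names from the question and the MM/DD/YY HH:MI AM layout suggested in the reply:

    (EMPLOYE_ID,
     SCHEDULE_DATE DATE "MM/DD/YY",
     PUNCH         DATE "MM/DD/YY HH:MI AM"
    )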

  • SQL Loader data loading problem

    Hi all, Oracle version 11.0.2. I have a table with 6 columns, all of them NOT NULL. If my flat file does not contain a date, then during the insert I want to use SYSDATE, but I cannot make the column default to SYSDATE while creating the table. My control file and data are below; the error I get is on the PLAY_DATE column.

    LOAD DATA
    INFILE *  --'c:\combo-fix.txt'
    APPEND
    INTO TABLE results
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (fixture_date DATE 'DDMMYYYY',
     play_date DATE 'DDMMYYYY' 'NVL(:play_date,sysdate)',
     home_team,
     away_team,
     home_points 'NVL(:home_points,0)',
     away_points 'NVL(:away_points,0)')
    BEGINDATA
    18082012,18082012, AUSTRALIA, NEW ZEALAND, 19, 27
    (the remaining BEGINDATA rows, covering the other fixtures, were garbled when the post was archived)

    Thank you...

    Apply to_char to sysdate to bring it to the same format as play_date. See below:

    LOAD DATA

    INFILE 'd:\fix.txt'

    INTO TABLE results

    APPEND

    FIELDS TERMINATED BY ','

    TRAILING NULLCOLS

    (

    fixture_date DATE "DDMMYYYY",

    play_date DATE "DDMMYYYY" "NVL(:play_date,to_char(sysdate,'DDMMYYYY'))",

    home_team,

    away_team,

    home_points "NVL(:home_points,0)",

    away_points "NVL(:away_points,0)"

    )

  • Index rebuild required after truncating the table and loading data

    Hello


    I have a situation where we truncate a few tables and then load data (content only) into them. Is it necessary to rebuild the indexes online afterwards, or not?


    And another question: if we drop a few indexes, is the total amount of space released or not? And will re-creating the indexes use the same amount of space? As I don't have more disk space, would rebuilding the indexes online be a better idea in this situation?

    Can you please advise on this...


    Which is better: truncating the few tables + loading them + rebuilding the indexes online, or dropping the few indexes + loading the tables + re-creating the indexes?

    Can you suggest the best way? We have a time constraint and we currently don't have enough space on the disk... (the chosen option should not affect space usage)

    user13095767 wrote:
    OK, I get it...

    You mean that if we disable the indexes while loading, then we need to spend time rebuilding them afterwards.

    And if the indexes stay enabled, then rebuilding them again after loading the tables is not necessary...

    Please reply whether my understanding is correct...

    The above is correct.

    >

    If so, what about the differences in the storage space occupied during an index rebuild versus a re-create? Does it acquire more space if we recreate (drop and create), or is an online rebuild preferable for an index?

    The space used is the same for all options.
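    As a rough illustration of the disable-then-rebuild approach discussed above (a sketch; the table and index names are made up):

    -- make the index unusable so the load does not have to maintain it
    ALTER INDEX big_table_ix UNUSABLE;

    -- ... truncate big_table and load the data here ...

    -- rebuild afterwards; ONLINE keeps the table available during the rebuild
    ALTER INDEX big_table_ix REBUILD ONLINE;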

  • SQL*Loader ctl file date format

    Hi, I have a source file where the date appears in the format [23:02:2012] [04:04:50:579]. How can I remove the '[' and ']' and load it as 23:02:2012 04:04:50:579?

    I tried like

    CallSetupTime DATE 'DD/MM/YYYY HH24MISSFFFFF' TERMINATED BY ',' ENCLOSED BY '[' AND ']'

    but it did not work. Please help me. Thank you very much.

    Kind regards
    Sam

    I suggest using the correct datatype. See here: sql loader timestamp

    for example: CallSetupTime TIMESTAMP "[DD] [HH24:MI:SS:FFF]"

    The format must exactly match the format of your data. There may be a better format element for the milliseconds; see the thread mentioned above and the timestamp format models in the docs.

    Regards,

  • How do I skip the first 5 lines of a txt file when using SQL*Loader

    Hello

    I have a txt file that contains some header information that I don't need. How can I skip those lines when importing the file into my database?

    See you soon

    You can use the SKIP option: http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_params.htm#sthref626
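    For example, to skip a 5-line header (a sketch; file, table and column names are illustrative):

    OPTIONS (SKIP=5)
    LOAD DATA
    INFILE 'report_with_header.txt'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY ','
    (col1, col2)

    The same thing can also be passed on the sqlldr command line as skip=5.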

  • SQL Loader Data Conversion

    We are receiving a column of values (for example, Product Description) in Chinese. These rows must be translated into English and loaded into a custom database table.

    Are there any options?

    Thank you and best regards,
    Balakirshna

    There are commercial translation services out there that can do it for you for a fee. Alternatively, you can do it yourself if you know both languages.

    HTH
    Srini

  • Optimize the BSO - SQL Loader data loading

    Hello

    Going left to right across the fields, do I put the sparse dimensions first, sorted and in some order, then the dense dimensions, then the data?  Or is it dense first, then sparse, then the data?  There seem to be conflicting ideas here...

    The optimal order is most certainly sparse members first, followed by dense members.  To confirm the best order, take your cube: if there is no data in it, load a small amount using an Excel lock-and-send; if there is, fine. Then do a column export from EAS (level 0 is fine) and look at the exported file. The order of the dimensions there is the optimal order for reloading.

  • Question on loading data using SQL*Loader into staging tables, and then into the main tables

    Hello

    I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in separate pipe-delimited csv files.

    I have developed a shell script to load the data and it works fine except for one thing.

    Here are the details of the data to re-create the problem.

    Structure of the staging tables into which the data will be loaded using SQL*Loader:

    create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));

    create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));

    create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));

    DATA in the csv file-

    for stg_cmts_data-

    cmts_map_03092015_1.csv

    WNLB-CMTS-01-1 | 10.15.0.1

    WNLB-CMTS-02-2 | 10.15.16.1

    WNLB-CMTS-03-3 | 10.15.48.1

    WNLB-CMTS-04-4 | 10.15.80.1

    WNLB-CMTS-05-5 | 10.15.96.1

    for stg_dhcp_data-

    dhcp_map_03092015_1.csv

    DHCP-1-1-1 | 10.25.23.10, 25.26.14.01

    DHCP-1-1-2 | 56.25.111.25, 100.25.2.01

    DHCP-1-1-3 | 25.255.3.01, 89.20.147.258

    DHCP-1-1-4 | 10.25.26.36, 200.32.58.69

    DHCP-1-1-5 | 80.25.47.369, 60.258.14.10

    for stg_link_data

    cmts_dhcp_link_map_0309151623_1.csv

    DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2

    DHCP-1-1-2 | WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5

    DHCP-1-1-3 | WNLB-CMTS-01-1

    DHCP-1-1-4 | WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3

    DHCP-1-1-5 | WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7

    WNLB-DHCP-1-13 | WNLB-CMTS-02-2

    Now, after loading these data into the staging tables, I have to fill the main database tables:

    create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));

    create table link (link_nm varchar2 (50));

    The SQL scripts that I created to load the data are like this:

    spool load_cmts.log

    Set serveroutput on

    DECLARE
      CURSOR c_stg_cmts IS SELECT * FROM stg_cmts_data;
      TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
      l_stg_cmts t_stg_cmts;
      l_cmts_cnt NUMBER;
      l_cnt      NUMBER;
      l_cnt_1    NUMBER;
    BEGIN
      OPEN c_stg_cmts;
      FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
      FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
      LOOP
        SELECT COUNT(1)
          INTO l_cmts_cnt
          FROM subntwk
         WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
        IF l_cmts_cnt < 1 THEN
          INSERT INTO subntwk (subntwk_nm)
          VALUES (l_stg_cmts(i).cmts_token);
          DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
        ELSE
          DBMS_OUTPUT.put_line('token is already present');
        END IF;
        EXIT WHEN l_stg_cmts.COUNT = 0;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for dhcp


    spool load_dhcp.log

    Set serveroutput on

    DECLARE
      CURSOR c_stg_dhcp IS SELECT * FROM stg_dhcp_data;
      TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
      l_stg_dhcp t_stg_dhcp;
      l_dhcp_cnt NUMBER;
      l_cnt      NUMBER;
      l_cnt_1    NUMBER;
    BEGIN
      OPEN c_stg_dhcp;
      FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
      FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
      LOOP
        SELECT COUNT(1)
          INTO l_dhcp_cnt
          FROM subntwk
         WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
        IF l_dhcp_cnt < 1 THEN
          INSERT INTO subntwk (subntwk_nm)
          VALUES (l_stg_dhcp(i).dhcp_token);
          DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
        ELSE
          DBMS_OUTPUT.put_line('token is already present');
        END IF;
        EXIT WHEN l_stg_dhcp.COUNT = 0;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for link -.

    spool load_link.log

    Set serveroutput on

    DECLARE
      l_cmts_1      VARCHAR2(4000 CHAR);
      l_cmts_add    VARCHAR2(200 CHAR);
      l_dhcp_cnt    NUMBER;
      l_cmts_cnt    NUMBER;
      l_link_cnt    NUMBER;
      l_add_link_nm VARCHAR2(200 CHAR);
    BEGIN
      FOR r IN (
        SELECT dhcp_token, cmts_to_add || ',' cmts_add
          FROM stg_link_data
      )
      LOOP
        l_cmts_1   := r.cmts_add;
        l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
        SELECT COUNT(1)
          INTO l_dhcp_cnt
          FROM subntwk
         WHERE subntwk_nm = r.dhcp_token;
        IF l_dhcp_cnt = 0 THEN
          DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
        ELSE
          WHILE l_cmts_add IS NOT NULL
          LOOP
            l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
            SELECT COUNT(1)
              INTO l_cmts_cnt
              FROM subntwk
             WHERE subntwk_nm = TRIM(l_cmts_add);
            SELECT COUNT(1)
              INTO l_link_cnt
              FROM link
             WHERE link_nm = l_add_link_nm;
            IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
              INSERT INTO link (link_nm)
              VALUES (l_add_link_nm);
              DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
            ELSIF l_link_cnt > 0 THEN
              DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
            ELSIF l_cmts_cnt = 0 THEN
              DBMS_OUTPUT.put_line('no CMTS FOUND for device to create the link: ' || l_cmts_add);
            END IF;
            l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
            l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
          END LOOP;
        END IF;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    Control files:

    LOAD DATA
    INFILE 'cmts_data.csv'
    APPEND
    INTO TABLE STG_CMTS_DATA
    WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
     AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (cmts_token "RTRIM(LTRIM(:cmts_token))",
     cmts_ip "RTRIM(LTRIM(:cmts_ip))")

    for dhcp.


    LOAD DATA
    INFILE 'dhcp_data.csv'
    APPEND
    INTO TABLE STG_DHCP_DATA
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token "RTRIM(LTRIM(:dhcp_token))",
     dhcp_ip "RTRIM(LTRIM(:dhcp_ip))")

    for link -.

    LOAD DATA
    INFILE 'link_data.csv'
    APPEND
    INTO TABLE STG_LINK_DATA
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token "RTRIM(LTRIM(:dhcp_token))",
     cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")

    SHELL SCRIPT-

    if [ ! -d ./log ]
    then
      mkdir log
    fi

    if [ ! -d ./done ]
    then
      mkdir done
    fi

    if [ ! -d ./bad ]
    then
      mkdir bad
    fi

    nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_cmts.sql

    nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_dhcp.sql

    nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_link.sql

    mv *.log ./log

    The problem I encounter is with loading data into the link table: I check whether the DHCP is present in the subntwk table, otherwise I log an error and continue; likewise, if the CMTS is not found, I log an error instead of creating the link.

    Now, as you can see here, multiple CMTS are associated with a single DHCP.

    So the links get created in the link table, but for the last CMTS in each comma-separated list from the stg_link_data table, the log says the CMTS was not found.

    for example

    DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2

    Here, I am supposed to link dhcp-1-1-1 with wnlb-CMTS-01-1 and wnlb-CMTS-02-2.

    All of these are present in the subntwk table, but the log still says wnlb-CMTS-02-2 could not be FOUND, even though it has already been loaded into the subntwk table.

    The same thing happens with every CMTS that comes last in its list in the stg_link_data table (I think you get what I'm trying to explain).

    But when I run the SQL scripts separately in SQL Developer, all the valid links are inserted into the link table.

    It should create 9 rows in the link table, whereas now it creates only 5 rows.

    I use COMMIT in my script as well, but that does not help.

    Run these scripts on your machine and let me know if you get the same behavior I do.

    And please give me a solution; I have tried many things since yesterday, but it's always the same.

    This is the link table log:

    link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-02-2

    link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-05-5

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-01-1

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-05-8
    NOT FOUND CMTS for device to create the link: wnlb-CMTS-05-6
    NOT FOUND CMTS for device to create the link: wnlb-CMTS-05-0

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-03-3

    link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4

    NOT FOUND CMTS for device to create the link: wnlb-CMTS-05-7

    Device not found: wnlb-dhcp-1-13

    IF NEED MORE INFORMATION PLEASE LET ME KNOW

    Thank you

    I realized later in the night that when the staging tables were loaded on the UNIX machine, every line kept a DOS end-of-line character. That is why the last CMTS is never found. I ran a DOS-to-UNIX conversion on the files and it started to work perfectly.

    It was the dos2unix error!

    Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.
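    For anyone hitting the same thing: converting the files before running SQL*Loader in the shell script is enough. A sketch, assuming dos2unix is installed and using the file names from the post:

    # strip DOS carriage returns so the last pipe-delimited field has no trailing CR
    dos2unix cmts_map_03092015_1.csv
    dos2unix dhcp_map_03092015_1.csv
    dos2unix cmts_dhcp_link_map_0309151623_1.csv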

  • SQL Loader: Problem with WHEN

    I'm using Oracle 11g on Win XP.

    I'm trying to load data with the control file below:
    OPTIONS (SKIP=0, DIRECT=FALSE, PARALLEL=FALSE, BINDSIZE=50000, errors=999999,ROWS=200, READSIZE=65536)
    LOAD DATA
    
    APPEND
    INTO TABLE v_table
    when COL_3  = 'XXXX'
    fields terminated by "|" optionally enclosed by '"'
    trailing nullcols
    (
    COL_1  "trim(:COL_1)",
    COL_2  "trim(:COL_2)",
    COL_3  "trim(:COL_3)",
    COL_4  "trim(:COL_4)",
    COL_5  "trim(:COL_5)",
    COL_6  "trim(:COL_6)",
    COL_7  "trim(:COL_7)",
    )
    
    INTO TABLE v_table
    APPEND
    when  COL_3 = 'YYY'
    fields terminated by "|" optionally enclosed by '"'
    trailing nullcols
    (
    COL_1  "trim(:COL_1)",
    COL_2  "trim(:COL_2)",
    COL_3  "trim(:COL_3)",
    COL_4  "trim(:COL_4)",
    COL_5  "trim(:COL_5)",
    COL_6  "trim(:COL_6)",
    COL_7  "trim(:COL_7)",
    )
    Here is the sample data in the data file:
    33432|"ORACLE"|"XXXX"|"555827             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
    33433|"ORACLE"|"XXXX"|"555828             "|"317564"|" "|""|"ORACLE "|2011-07-24-15.37.11.879915|0001-01-01-01.01.01.000001
    33434|"ORACLE"|"XXXX"|"555829             "|"317564"|" "|""|"ORACLE "|2011-07-10-15.37.11.879915|0001-01-01-01.01.01.000001
    33435|"ORACLE"|"XXXX"|"555830             "|"317564"|" "|""|"ORACLE "|2011-07-22-15.37.11.879915|0001-01-01-01.01.01.000001
    33436|"ORACLE"|"XXXX"|"555831             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
    33437|"ORACLE"|"XXXX"|"555832             "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
    40048|"SAS"|"ZZZ "|"1017838            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    40049|"SAS"|"ZZZ "|"1017839            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    40050|"SAS"|"ZZZ "|"1017840            "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    20046|"SUNUSA"|"YYY "|"1017836            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    20047|"SUNUSA"|"YYY "|"1017837            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    20048|"SUNUSA"|"YYY "|"1017838            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    20049|"SUNUSA"|"YYY "|"1017839            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    20050|"SUNUSA"|"YYY "|"1017840            "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
    The issue is:
    When I load data into the table with the above control file, only the rows where COL_3 = 'XXXX' are picked up. If I comment out the block that has COL_3 = 'XXXX', then the second block is picked up (when COL_3 = 'YYY'). But I am unable to load the data for both XXXX and YYY in a single run. Can someone help me with this, please?

    Give this a try.

      ...
      when  COL_3  = 'XXXX' and COL_3 = 'YYY'
      ...
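    Side note (not from the reply above): when one delimited record feeds several INTO TABLE clauses, SQL*Loader continues scanning fields where the previous clause stopped, so a commonly used fix is to reset the scan with POSITION(1) on the first field of the second clause. A sketch based on the poster's control file:

    INTO TABLE v_table
    APPEND
    when  COL_3 = 'YYY'
    fields terminated by "|" optionally enclosed by '"'
    trailing nullcols
    (
    COL_1  POSITION(1) "trim(:COL_1)",
    COL_2  "trim(:COL_2)",
    COL_3  "trim(:COL_3)",
    COL_4  "trim(:COL_4)",
    COL_5  "trim(:COL_5)",
    COL_6  "trim(:COL_6)",
    COL_7  "trim(:COL_7)"
    )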
    
  • Reg: SQL*Loader problem

    Hi Experts,

    I am trying to load data from a flat file into an Oracle table, but I am facing some problems.

    Concern 1:

    I have a directory where there are about 20 similar files named as 'rb_1', 'rb_2', 'rb_3', etc...

    All of this data should be loaded into one single table 'X'.

    Is it possible for a single CTL file to loop over and load all the files into X?

    Concern 2:

    Field delimiter is Ctrl-X (CAN: cancel) and Ctrl-M (EM: em) characters

    Line delimiter is Ctrl-Z (SUB: substitute) and Ctrl-T (DC4: Device Control) characters

    Is there a way I can specify this in my CTL file?

    (I have only ever worked with a comma as the field delimiter, not special characters like these.)

    Please let me know if any additional information is required.

    Help much appreciated.

    Thank you

    -Nordine


    I agree with Hoek: you'd be better off using external tables, where you can specify multiple file names at once. Otherwise you will need a script that supplies the input file name when calling SQL*Loader and loops over each file.

    Regarding the Ctrl characters in your delimiters, the control file supports hexadecimal versions of them, for example:

    fields terminated by X'09'

    where 09 is the hex of the ASCII value of the character (in this case a tab character).

    Ctrl-M would be X'0D', and so on.
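    Putting that together, the relevant clauses might look like this (a sketch; X'18' is Ctrl-X, X'0D' is Ctrl-M, X'1A' is Ctrl-Z, X'14' is Ctrl-T, and the file and column names are illustrative):

    LOAD DATA
    INFILE 'rb_1' "str X'1A14'"
    APPEND
    INTO TABLE x
    FIELDS TERMINATED BY X'180D'
    (col1, col2, col3)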

  • Error loading an XML file into an XMLTYPE column through SQL*Loader

    Hi gurus,

    I am trying to load an XML file into an XMLTYPE column through SQL*Loader but I am getting errors. Here are the details.

    Database version:
    SQL*Plus: Release 10.2.0.3.0 - Production on Tue Jul 24 17:17:55 2012
    
    Copyright (c) 1982, 2006, Oracle.  All Rights Reserved.
    
    
    Connected to:
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    
    BANNER
    ----------------------------------------------------------------
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    PL/SQL Release 10.2.0.3.0 - Production
    CORE    10.2.0.3.0      Production
    TNS for Linux: Version 10.2.0.3.0 - Production
    NLSRTL Version 10.2.0.3.0 - Production
    The table structure
    CREATE TABLE TH_XML
    (
      COL_ID_1   VARCHAR2(100 BYTE),
      IN_FILE_1  XMLTYPE
    )
    XMLTYPE IN_FILE_1 STORE AS CLOB (TABLESPACE SMDAT)
    XML (simple.xml) file
    <?xml version="1.0"?>
     <catalog> 
     <book id="bk101"> 
               <author>Some Author1</author> 
               <title>Some Title1</title> 
               <genre>Computer</genre> 
               <price>44.95</price> 
               <publish_date>2000-10-01</publish_date> 
               <description>creating applications</description> 
       </book> 
       <book id="bk112"> 
               <author>Some Author2</author> 
               <title>Some Title2</title> 
               <genre>Computer</genre> 
               <price>49.95</price> 
               <publish_date>2001-04-16</publish_date> 
               <description>Microsoft Visual Studio 7 is explored in depth</description> 
    </book> 
    </catalog>
    Control file
    LOAD DATA
    INFILE 'c:\simple.xml'
    APPEND
    INTO TABLE TH_XML 
    XMLTYPE(in_file_1)
    (
    col_id_1 filler  CHAR (100),
    in_file_1 LOBFILE(CONSTANT "c:\simple.xml") TERMINATED BY EOF
    )
    LOG file
    SQL*Loader: Release 10.2.0.3.0 - Production on Tue Jul 24 16:42:25 2012
    
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    
    Control File:   c:\my_file.ctl
    Data File:      c:\simple.xml
      Bad File:     c:\simple.bad
      Discard File:  none specified
     
     (Allow all discards)
    
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    
    Table TH_XML, loaded from every logical record.
    Insert option in effect for this table: APPEND
    
       Column Name                  Position   Len  Term Encl Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    COL_ID_1                            FIRST   100           CHARACTER            
      (FILLER FIELD)
    IN_FILE_1                         DERIVED     *  EOF      CHARACTER            
        Static LOBFILE.  Filename is c:\simple.xml
    
    Record 1: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 2: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 3: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 4: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 5: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 6: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 7: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 8: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 9: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 10: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 11: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 12: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 13: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 14: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 15: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 16: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 17: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 18: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 19: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 20: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 21: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 22: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    Record 23: Rejected - Error on table TH_XML.
    ORA-00904: "SYS_NC_ROWINFO$": invalid identifier
    
    
    Table TH_XML:
      0 Rows successfully loaded.
      23 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    
    
    Space allocated for bind array:                    256 bytes(64 rows)
    Read   buffer bytes: 1048576
    
    Total logical records skipped:          0
    Total logical records read:            23
    Total logical records rejected:        23
    Total logical records discarded:        0
    
    Run began on Tue Jul 24 16:42:25 2012
    Run ended on Tue Jul 24 16:42:26 2012
    
    Elapsed time was:     00:00:00.23
    CPU time was:         00:00:00.05
    I get the error ORA-00904: "SYS_NC_ROWINFO$": invalid identifier in the log file (shown above). Could someone help me see where I am going wrong?

    Thanks in advance.

    Edited by: 876991 on 24 July 2012 14:18

    Hello

    Remove this from the control file:

    XMLTYPE(in_file_1)
    

    It is used only if the target table is an XMLType object table.

    For an XMLType column LOBFILE is sufficient, for example:

    LOAD DATA
    INFILE *
    APPEND INTO TABLE TH_XML
    (
     col_id_1  CHAR (100),
     in_file_1 LOBFILE(CONSTANT "c:\simple.xml") TERMINATED BY EOF
    )
    begindata
    MYID1
    

    This tells SQL*Loader that the data consists of one record, with COL_ID_1 = 'MYID1' and IN_FILE_1 = the content of "c:\simple.xml".

    SQL> CREATE TABLE TH_XML
      2  (
      3    COL_ID_1   VARCHAR2(100 BYTE),
      4    IN_FILE_1  XMLTYPE
      5  );
    
    Table created.
    
    SQL> host sqlldr control=test.ctl
    Username:dev
    Password:
    
    SQL*Loader: Release 11.2.0.2.0 - Production on Mer. Juil. 25 01:30:46 2012
    
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    
    Commit point reached - logical record count 1
    
    SQL> set long 5000
    SQL> column col_id_1 format a15
    SQL> select * from th_xml;
    
    COL_ID_1        IN_FILE_1
    --------------- --------------------------------------------------------------------------------
    MYID1           (the loaded catalog XML document from simple.xml is displayed here; its markup was stripped when the post was archived)
     
    
  • Inserting page items when loading data

    Hi all

    I am using APEX 5.0 and the Data Load wizard to upload data.

    My table has 7 fields and I need to fill in 3 of them myself while loading the data.

    I have created a process that runs after the uploaded data is parsed, as below.

    The result is that the 5th field appears with 'Y' in the first-row column, and the 6th and 7th columns do not appear. Only the 5th field has a value.

    When I click Next to go to the data validation page, it shows the 5th field with its column name and the 6th without a column name.

    Please point me to where I must change my code.

    BEGIN
      APEX_COLLECTION.ADD_MEMBER(
        p_collection_name => 'PARSE_COL_HEAD',
        p_c001 => 'HUBCODE',
        p_c002 => 'UPLOAD_FILENAME',
        p_c003 => 'UPLOAD_DATE');

      FOR UPLOAD_ROW IN (SELECT SEQ_ID
                           FROM APEX_COLLECTIONS
                          WHERE COLLECTION_NAME = 'SPREADSHEET_CONTENT')
      LOOP
        APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
          p_collection_name => 'SPREADSHEET_CONTENT',
          p_seq => UPLOAD_ROW.SEQ_ID,
          p_attr_number => '5',
          p_attr_value => :P2_HUB_CODE);

        APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
          p_collection_name => 'SPREADSHEET_CONTENT',
          p_seq => UPLOAD_ROW.SEQ_ID,
          p_attr_number => '6',
          p_attr_value => :P2_FILE_NAME);

        APEX_COLLECTION.UPDATE_MEMBER_ATTRIBUTE(
          p_collection_name => 'SPREADSHEET_CONTENT',
          p_seq => UPLOAD_ROW.SEQ_ID,
          p_attr_number => '7',
          p_attr_value => SYSDATE);
      END LOOP;
    END;

    I want to close the loop on this and hope it will be useful for others.

    In the end, I managed the document upload using the latest excel2collections plugin for APEX 5.0.

    As MK pointed out, I can control my data using the plugin.

    Another plus is that users can upload xls or xlsx files directly; they do not need to convert them to csv format before uploading.

    I thank all of you for the help.
