SQL Loader loading data into two Tables using a single CSV file

Dear all,

I have a requirement wherein I need to load data into 2 tables from a single CSV file.

So I wrote the following control file, but it loads only the first table, and there is also nothing about the second table in the log file.

Please suggest how to achieve this.

Examples of data

Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3

Victor, New, Ven001, Vinay, Vin001, abc, def, xyz

Control file script

================

OPTIONS (ERRORS=0, SKIP=1)
LOAD DATA
REPLACE
INTO TABLE table1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
Source_system_code POSITION(1) CHAR "ltrim(rtrim(:Source_system_code))",
Record_type CHAR "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number CHAR "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name CHAR "ltrim(rtrim(:Vendor_name))"
)
INTO TABLE Table2
WHEN 1 = 1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
Vendor_name CHAR "ltrim(rtrim(:Vendor_name))",
Vendor_site_code CHAR "ltrim(rtrim(:Vendor_site_code))",
Address_line1 CHAR "ltrim(rtrim(:Address_line1))",
Address_line2 CHAR "ltrim(rtrim(:Address_line2))",
Address_line3 CHAR "ltrim(rtrim(:Address_line3))"
)

The problem here is that it loads only into the first table (table1).

Please guide me.

Thank you

Kumar

When you do not provide a starting position for the first field in table2, it starts with the next field after the last field referenced in table1, so it starts with Vendor_site_code instead of Vendor_name.  What you need to do instead is specify POSITION(1) for the first field in table2 and use FILLER fields.  In addition, it does not like WHEN 1 = 1, which you did not need anyway.  See the example below, including the corrected control file.

Scott@orcl12c> HOST TYPE test.dat

Source_system_code, Record_type, Source_System_Vendor_number, Vendor_name, Vendor_site_code, Address_line1, Address_line2, Address_line3

Victor, New, Ven001, Vinay, Vin001, abc, def, xyz

Scott@orcl12c> HOST TYPE test.ctl

OPTIONS (ERRORS=0, SKIP=1)
LOAD DATA
REPLACE
INTO TABLE table1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
Source_system_code POSITION(1) CHAR "ltrim(rtrim(:Source_system_code))",
Record_type CHAR "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number CHAR "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name CHAR "ltrim(rtrim(:Vendor_name))"
)
INTO TABLE Table2
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
source_system_code FILLER POSITION(1),
record_type FILLER,
source_system_vendor_number FILLER,
Vendor_name CHAR "ltrim(rtrim(:Vendor_name))",
Vendor_site_code CHAR "ltrim(rtrim(:Vendor_site_code))",
Address_line1 CHAR "ltrim(rtrim(:Address_line1))",
Address_line2 CHAR "ltrim(rtrim(:Address_line2))",
Address_line3 CHAR "ltrim(rtrim(:Address_line3))"
)

Scott@orcl12c> CREATE TABLE table1
  2  (Source_system_code VARCHAR2(13),
  3   Record_type VARCHAR2(11),
  4   Source_System_Vendor_number VARCHAR2(27),
  5   Vendor_name VARCHAR2(11))
  6  /

Table created.

Scott@orcl12c> CREATE TABLE table2
  2  (Vendor_name VARCHAR2(11),
  3   Vendor_site_code VARCHAR2(16),
  4   Address_line1 VARCHAR2(13),
  5   Address_line2 VARCHAR2(13),
  6   Address_line3 VARCHAR2(13))
  7  /

Table created.

Scott@orcl12c> HOST SQLLDR scott/tiger CONTROL=test.ctl DATA=test.dat LOG=test.log

SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015

Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.

Path used:      Conventional

Commit point reached - logical record count 1

Table TABLE1:
  1 Row successfully loaded.

Table TABLE2:
  1 Row successfully loaded.

Check the log file:
  test.log
for more information about the load.

Scott@orcl12c> SELECT * FROM table1
  2  /

SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER VENDOR_NAME
------------- ----------- --------------------------- -----------
Victor        New         Ven001                      Vinay

1 row selected.

Scott@orcl12c> SELECT * FROM table2
  2  /

VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
----------- ---------------- ------------- ------------- -------------
Vinay       Vin001           abc           def           xyz

1 row selected.

Scott@orcl12c >

Tags: Database

Similar Questions

  • Insert data into two tables in a single transaction

    Hi all
    I have a problem developing a piece of functionality.

    Background:
    I have two tables: OFFER_HEADER and OFFER_CONTENT

    For now, the user must insert and commit the OFFER_HEADER (single-row view), then the content becomes accessible and OFFER_CONTENT (multi-row view) can be filled. This is done by selecting the record from the PRODUCTS form and copying the values into OFFER_CONTENT. Product data can be changed on the canvas of the CONTENT form.

    My goal:
    I know this isn't a practical way to implement the functionality. I want to insert all the data (header and content) in a single transaction. What is the best way to do it?

    Thanks in advance,
    Best regards
    Bartek

    Hi,

    The error is now with the primary key. You need to check the value of the primary key as part of the operation.
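
    For reference, a minimal sketch of the single-transaction approach the question asks about (illustrative only; the column names, sequence and selection criteria below are assumptions, not the actual OFFER_HEADER/OFFER_CONTENT definitions):

    DECLARE
      v_offer_id  offer_header.offer_id%TYPE;
    BEGIN
      -- parent row first; RETURNING captures the generated key
      INSERT INTO offer_header (offer_id, offer_name)
      VALUES (offer_header_seq.NEXTVAL, 'Spring offer')
      RETURNING offer_id INTO v_offer_id;

      -- child rows reuse the same key; nothing is committed yet
      INSERT INTO offer_content (offer_id, product_id, qty)
      SELECT v_offer_id, product_id, 1
      FROM   products
      WHERE  product_group = 'STK';

      COMMIT;  -- header and content become visible together
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;  -- neither table is changed if anything fails
        RAISE;
    END;
    /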

    Kind regards

    Manu.

  • Need a SQL*Loader script to load data into a table

    Hello

    I'm new to Oracle... learning some basic things... and now I want the steps to load data from a dump file into a table...

    and the script for sql loader

    Thanks in advance

    Hello

    You can do all these steps for loading data...

    Step 1:

    Create a table in Toad to load your data...

    Step 2:

    Creating a data file... Create your data file with column headers...

    Step 3:

    Creating a control file... Create your control file to load the data from the data file into the table (there is a standard control file structure; you can search the net for it).

    Step 4:

    Move the data file and the control file to the server path...

    Step 5:

    Load the data into the staging table using sql loader.

    sqlldr control=<control_file> data=<data_file>

    connecting as username/password@instance.
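
    For example, a minimal sketch of steps 3 and 5 (the table EMP_STAGE, the file emp.csv and its columns are made-up names for illustration):

    -- emp_stage.ctl : minimal control file for a comma-separated file with a header row
    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'emp.csv'
    APPEND
    INTO TABLE emp_stage
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( emp_id
    , emp_name
    , hire_date DATE "YYYY-MM-DD"
    , salary
    )

    Then run it from the command line, for example:

    sqlldr scott/tiger@orcl control=emp_stage.ctl log=emp_stage.log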

  • Load data into a table

    Hi friends,

    I'm trying to load records from the product table into the rules table with the following...

    create table product (
    prod_id varchar2(20),
    prod_grp varchar2(20),
    from_amt number(10),
    to_amt number(10),
    share_amt number(10)
    );

    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'STK', 1, 18);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'NSTK', 1, 16.2);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'NSTK', 1, 5000, 12);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'STK', 5001, 10000, 16);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10038', 'STK', 10001, 20);
    Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10039', 'NSTK', 1, 8000, 10);
    Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10039', 'STK', 8001, 12);

    create table rules (
    rule_id varchar2(30),
    rule_grp varchar2(10),
    rate_1 number(10),
    point_1 number(10),
    rate_2 number(10),
    point_2 number(10),
    rate_3 number(10),
    point_3 number(10)
    );

    Criteria for loading into the rules table:

    rule_id  - 'RL' || product.prod_id

    rule_grp - product.prod_grp

    rate_1   - product.share_amt where from_amt = 1

    point_1  - product.to_amt

    rate_2   - if product.to_amt in point_1 is not null, then find product.share_amt of the next record with the same rule_id/prod_id, where from_amt (of the next record) = to_amt (current record's point_1) + 1

    point_2  - if product.to_amt in point_1 is not null, then find product.to_amt of the next record with the same rule_id/prod_id, where from_amt (of the next record) = to_amt (current record's point_1) + 1

    rate_3   - if product.to_amt in point_2 is not null, then find product.share_amt of the next record with the same rule_id/prod_id, where from_amt (of the next record) = to_amt (current record's point_2) + 1

    point_3  - if product.to_amt in point_2 is not null, then find product.to_amt of the next record with the same rule_id/prod_id, where from_amt (of the next record) = to_amt (current record's point_2) + 1

    I tried to load the first columns (rule_id, rule_grp, rate_1, point_1, rate_2, point_2) via SQL*Loader.

    SQL> select * from product;

    PROD_ID              PROD_GRP               FROM_AMT     TO_AMT  SHARE_AMT
    -------------------- -------------------- ---------- ---------- ----------
    10037                STK                           1                    18
    10037                NSTK                          1                    16
    10038                NSTK                          1       5000         12
    10038                STK                        5001      10000         16
    10038                STK                       10001                    20
    10039                NSTK                          1       8000         10
    10039                STK                        8001                    12

    produit.dat

    PROD_ID|PROD_GRP|FROM_AMT|TO_AMT|SHARE_AMT
    '10037'|'STK'|1||18
    '10037'|'NSTK'|1||16.2
    '10038'|'NSTK'|1|5000|12
    '10038'|'STK'|5001|10000|16
    '10038'|'STK'|10001||20
    '10039'|'NSTK'|1|8000|10
    '10039'|'STK'|8001||12

    Product.CTL

    options (skip = 1)
    load data
    into table rules
    fields terminated by '|'
    optionally enclosed by "'"
    trailing nullcols
    ( rule_id POSITION(1) "'RL'||:rule_id"
    , rule_grp
    , from_amt BOUNDFILLER
    , point_1
    , share_amt BOUNDFILLER
    , rate_1 "CASE WHEN :from_amt = 1 THEN :share_amt END"
    , rate_2 EXPRESSION "(select pr.share_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = :from_amt + 1)"
    , point_2 EXPRESSION "(select pr.to_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = :from_amt + 1)"
    )

    It does not load any values into rate_2, point_2... no error either... Not sure if there is another method to do this...

    Please give your suggestions... Thank you very much for your time

    Hello

    Thanks for posting the CREATE TABLE and INSERT statements for the sample data; it's very useful!

    Don't forget to post the exact results you want from this sample data, i.e. what you want the rules table to contain once the task is complete.

    As has already been said, there is no point in using SQL*Loader to copy data from one table to another in the same database.  Use INSERT, or perhaps MERGE.

    2817195 wrote:

    Thank you for your answers... I thought it would be easier to manipulate the data using SQL*Loader... I tried to use INSERT but do not know how to insert values into the rate_2, point_2, rate_3, point_3 columns... For example, when point_1 is not null, I need to find the next record with the same rule_id, and if that record's from_amt = point_1 + 1, then RATE_2 should be set to that record's share_amt...

    SQL> insert into rules (
      2  rule_id,
      3  rule_grp,
      4  rate_1,
      5  point_1,
      6  rate_2,
      7  point_2,
      8  rate_3,
      9  point_3)
     10  select
     11  'RL'||pr.prod_id RULE_ID,
     12  pr.prod_grp RULE_GRP,
     13  CASE WHEN pr.from_amt = 1 THEN pr.share_amt END RATE_1,
     14  pr.to_amt POINT_1,
     15  (select pr.share_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) RATE_2,
     16  (select pr.to_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) POINT_2,
     17  (select pr.share_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) RATE_3,
     18  (select pr.to_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) POINT_3
     19  from product pr;
    (select pr.share_amt from product pr where point_1 is not null and point_1 = pr.from_amt + 1) RATE_2,
    *
    ERROR at line 15:
    ORA-00904: "POINT_1": invalid identifier

    Help, please... Thank you very much

    This is what causes the error:

    The subquery on line 15 references only one table in its FROM clause, and that table is product.  There is no point_1 column in product.

    A scalar subquery like this can be correlated to a table in the outer query, but the only table in the outer FROM clause (line 19) is also product.  Since the only table you are reading is product, the only columns you can read are the columns of the product table.

    You use the same table alias (pr) to mean 5 different things, which is very confusing.  Use a unique alias for each table in any SQL statement.  (What you're trying to do, I bet you can do without all these subqueries anyway; see the sketch below.)
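
    A minimal sketch of that subquery-free idea, using the LEAD analytic function (this is an illustration, not the original poster's solution; it assumes "next record" means the next from_amt range of the same prod_id, and it only fills the columns up to point_2):

    INSERT INTO rules (rule_id, rule_grp, rate_1, point_1, rate_2, point_2)
    SELECT 'RL' || prod_id,
           prod_grp,
           CASE WHEN from_amt = 1 THEN share_amt END,
           to_amt,
           -- the next range's values apply only when the current range has an upper bound
           CASE WHEN to_amt IS NOT NULL THEN next_share_amt END,
           CASE WHEN to_amt IS NOT NULL THEN next_to_amt END
    FROM  (SELECT p.*,
                  LEAD(share_amt) OVER (PARTITION BY prod_id ORDER BY from_amt) AS next_share_amt,
                  LEAD(to_amt)    OVER (PARTITION BY prod_id ORDER BY from_amt) AS next_to_amt
           FROM   product p)
    WHERE from_amt = 1;   -- one rules row per starting range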

  • How to load data into a lookup table in OIM 11g

    Hello
    I have a lot of data that must be loaded into a lookup as code key and decode key. How can I load the data in bulk instead of manually via the Design Console?

    Is it possible to do this with an OIM API or some SQL query?

    Kind regards
    JS

    Here is the API you need:

    psLookupCode - lookup name
    psValue - code key
    psDescriptiveValue - decode key

    The rest can be empty.

    addLookupValue

    void addLookupValue(java.lang.String psLookupCode,
                        java.lang.String psValue,
                        java.lang.String psDescriptiveValue,
                        java.lang.String psLanguage,
                        java.lang.String psCountry)
                 throws Thor.API.Exceptions.tcAPIException,
                        Thor.API.Exceptions.tcInvalidLookupException,
                        Thor.API.Exceptions.tcInvalidValueException

    Adds a new entry to the specified lookup.

    Parameters:
    psLookupCode - the code for the lookup definition
    psValue - the value that will be stored in the database
    psDescriptiveValue - a descriptive version of the value to be added
    psLanguage - the language for the added entry. Leave blank to accept the default value (en)
    psCountry - the country for the added entry. Leave blank to accept the default (US)

    Throws:
    tcAPIException
    tcInvalidLookupException - thrown if the lookup code does not exist
    tcInvalidValueException - thrown if the value cannot be added to the lookup (because it is a duplicate, etc.)

    M

  • How to dynamically load data into target tables using variable source files

    Hello

    My scenario needs to load data from 5 different files into five target tables using a single interface. All target tables have the same structure. It is possible to point to variable source files using ODI, but the same approach does not work with database tables. I get errors when trying to make my target/source table dynamic.

    Can anyone suggest anything? The last option would be to write a dynamic PL/SQL block in the KM. Any other suggestions, friends?

    Kind regards
    Jay

    and not exists (
    select 'X'
    from #PLAYGROUND."v_tab_name" T

    Have you provided the resource name with the quotes? If so, please remove them and try again.

    If you have provided the variable name as v_tab_name, do not provide quotes. Can you please change the variable name to capitals, namely V_TAB_NAME, use the same in the data store too, and try again?

  • Load data into a table from two sources

    I developed an interface whose source tables come from a source database that contains data for 4 companies (we have multiple servers for different companies). Now there is a requirement to load data for another company into the same data warehouse. The problem is that the source is on another server. The database structure is the same; the table names and constraints are the same.

    I can't just duplicate my previous interfaces and change the data source on them. Does anyone know how to do this? I have about 50+ such interfaces, and re-developing them as new interfaces is quite a tedious task.

    Hello

    Here are links to examples of contexts. In the examples they use Development and Production environments to demonstrate the different environments; in your project, it will be the different companies.  Create a context for each company, provided that the schema remains the same across companies.

    https://blogs.Oracle.com/dataintegration/entry/executing_the_same_code_in_all

    Context, physical and logical schema - how does it work? -ODIExperts.com

    http://www.odigurus.com/2011/10/ODI-contexts.html

    Thank you

    ARV

  • Loading data into a table along with the file name

    Hi all

    I am new to ODI. I have a requirement to load a flat file into Oracle, along with the name of the file in one column of the table. This means that if there are 10 rows of data in this file, then the extra column "FileName" in the table will contain the name of the current file, written 10 times. Please suggest how this can be achieved. I am able to do half of it, but the file name is not populated.

    Thank you.

    Hello

    See the link below about getting the file name dynamically; it may be useful for you:

    http://blogs.Oracle.com/dataintegration/2009/04/using_parameters_in_odi_the_dy_1.html

    Thank you
    Merlina.

  • SQL*Loader data loading

    Hello
    I have an Oracle 10g AIX environment. I am trying to load data into a table that contains about 200 columns. The flat file is composed of records that extend over 2 lines of the file. While trying to load, it displays an error. When I change the flat file so that each record is on a single line, it works. How can I load files where the fields of a record span several lines in the flat file?
    Thanks in advance.

    Hello

    You can take a look at these two options: CONCATENATE and CONTINUEIF. If you always have exactly 2 physical records forming one logical record, you can use CONCATENATE followed by an integer, which says to combine that many rows (records) into one logical record. You use CONTINUEIF if you do not always have that hard limit and it instead depends on values in the first row. A sketch of both options follows the manual link below.

    You can find the options in the Manual: http://download.oracle.com/docs/cd/E11882_01/server.112/e16536/ldr_control_file.htm#i1005509
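
    A minimal sketch of both options, assuming a hypothetical table WIDE_TABLE and a file where each record spans 2 lines (the file name, table and columns are made up for illustration):

    -- Alternative 1: every logical record is exactly 2 physical lines
    LOAD DATA
    INFILE 'wide_data.dat'
    CONCATENATE 2
    INTO TABLE wide_table
    FIELDS TERMINATED BY ','
    ( col1, col2, col3 )          -- ... and so on for the ~200 real columns

    -- Alternative 2: continuation is data-driven, e.g. a '-' in column 1
    -- means "this physical line continues on the next one"
    LOAD DATA
    INFILE 'wide_data.dat'
    CONTINUEIF THIS (1:1) = '-'
    INTO TABLE wide_table
    FIELDS TERMINATED BY ','
    ( col1, col2, col3 )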

    Herald ten Dam
    http://htendam.WordPress.com

  • SELECT to generate XML data from a table using XMLELEMENT, XMLAGG gives error ORA-19011: string buffer too small

    My select statement fails with the error:


    ORA-19011: Character string buffer too small


    The select statement looks like:


    SELECT TO_CLOB(
           XMLELEMENT("accounts",
             XMLELEMENT("count",
               XMLATTRIBUTES(
                 rownum AS "recordId",
                 to_date('20130520','YYYYMMDD') AS "datestarted",
                 123456 AS "previousBatchId",
                 56789 AS "previousRecordId"
               ),
               ....
               ....
               .....
               XMLFOREST(
                 SIG_ROLE AS "SignatoryRole",
                 to_char(TRANSFER_DATE,'YYYY-MM-DD') AS "TransferDate",
                 NVL(reason,0) AS "reason"
               ) AS "transfer"
             )
           )) AS CRDTRPT
    FROM ANY_TABLE;

  • It looks like I can select only 4000 characters using the SELECT statement (please correct me if I'm wrong)

    I would use the XMLGEN package, but the environment team says no mounted drives going forward with the arrival of Exadata.

    NO MOUNTED DRIVES, NO ACCESS TO DATABASE DIRECTORIES

    No UTL_FILE

    I need to use SPOOL to spool the resulting XML data of the SELECT query.

    SQL is the standard in my org, but I can go with a PL/SQL solution too, loading the data into a table (I cannot use SPOOL from PL/SQL).

    What I plan to do is:

    1. Load the result of the above SELECT query into a table xml_report that has a column of type CLOB
    2. Then use SELECT * FROM xml_report to SPOOL the data to a file report.xml

    I have no need for XMLTYPE data afterwards; an XML data stream is fine for me.

    In addition, I also need to validate the XML file using an XSD.

    The problem is that the resulting rows of the select query are expected to be 15000 to 20000 bytes long.

    Oracle database version: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64 bit Production

    A Suggestion or a solution to this problem would be appreciated.

    (Sorry for the use of "BOLD", just to make it more readable and highlight the imp points)

    Cheers!

    Rahul

    It looks like I can select only 4000 characters using the SELECT statement (please correct me if I'm wrong)

    You use the right method.

    There is an implicit conversion from XMLType to the VARCHAR2 data type expected by the TO_CLOB function, hence the limitation and the error.

    To serialize XMLType to CLOB, use the XMLSerialize function:

    SELECT XMLSerialize(DOCUMENT
             XMLELEMENT("accounts",
               ...
             )
           )
    FROM ANY_TABLE;

    For the rest of the requirement, I wish you good luck trying to spool the XML correctly.

    You may need to play around with the SET LONG and SET LONGCHUNKSIZE commands to get it to work.
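
    For illustration, a rough SQL*Plus spooling sketch along those lines (xml_report and crdtrpt are the names mentioned above; the SET values are assumptions to tune, not prescribed settings):

    SET LONG 2000000         -- fetch long CLOB values in full
    SET LONGCHUNKSIZE 32767  -- fetch the CLOB in large pieces
    SET PAGESIZE 0           -- no headings or page breaks in the spooled file
    SET FEEDBACK OFF
    SET TRIMSPOOL ON         -- strip trailing blanks from each spooled line
    SET LINESIZE 32767

    SPOOL report.xml
    SELECT crdtrpt FROM xml_report;
    SPOOL OFF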

  • Load data from one table into another table in 2 different schemas

    Hello

    We have a requirement to insert data into a table in one schema from a table that is in another schema, in ODI.

    We are able to do so by creating interfaces and mappings, but we now want to do it using SQL instead of ODI interfaces.

    Is it possible to do this using SQL statements, given that we have source and target data sources defined in ODI?

    It should be something like "insert into src.table select * from tar.table".

    Thank you...

    Hello

    If you are trying to load from one DB to another DB using free-hand SQL statements instead of interfaces, it is possible using ODI procedures too.

    1. Select the project in the ODI Designer navigator.

    2. Create a new procedure, then add a command. In that command, fill in the 'Command on Source' and 'Command on Target' tabs and enter the SQL statements. See the screenshots below...

    I think this will help you.

    Thanks and best regards,

    A.Kavya

  • Cannot load data into Essbase using ODI

    Hi guys,

    Help. I have a problem loading data into Essbase using ODI. The error message is:
    java.sql.SQLException: unexpected token: ACCOUNT in statement [select C1_ACCOUNT "" account]

    I have a very simple flat file, similar to the below:

    Account, Resource, Period, Data
    Active, Na_Resource, Jan, 10
    Active, Na_Resource, Feb, 12

    With the same flat file, I am able to load data using a load rule.

    I am using Essbase 9.3.1.0 and ODI 10.1.3.4.0. I use ODI to load members and data into Planning without any problem.


    Thank you

    Hello

    It seems it is generating an extra set of quotation marks around the SQL. In my interface it generates:

    SQL= "select C1_ACCOUNT "Account", C2_PERIOD "Period", C3_RESOURCE "Resource", C4_DATA "Data" from "C$_0TestApp_testData" where (1=1)"

    Note the extra quotes around Account.

    If you go to Topology Manager, on the Physical Architecture tab, right-click 'Hyperion Planning' > 'Edit'.
    Select the 'Languages' tab, and for the 'JYTHON' row make sure that the 'Object Delimiter' field has no quotes; if it does, remove them, then apply and save.

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Problem loading data in APEX

    Hello world...

    I am new to Oracle APEX and I am facing problems loading data into a table. My problem is that the text data gets loaded with double quotes.

    Please suggest.

    Thank you
    Suresh

    Hello

    assuming that you load the data via the APEX UI via Home > SQL Workshop > Utilities > Data Workshop > Load Data.
    Please also provide more information, like the APEX version used, how/where you upload the data, etc.

    If you put your data in a file and select 'Upload file (comma separated or tab delimited)', then on the next screen you can set the field 'Optionally Enclosed By' to a double quote (").
    This should remove the double quotes from the data during the import.

    Regards
    Bottom

  • Loading XML into a Table

    Hi all
    I want to create a procedure to which I provide the name of the table and the location of the XML file; the procedure goes to that location, picks up the XML file and loads it into the table. Is this possible?

    Thank you

    Best regards


    Adil

    Finally, I got your issue! It is the use of DBMS_LOB.LOADFROMFILE; I used DBMS_LOB.LOADCLOBFROMFILE instead. It was actually loading the binary content! This is fully functional code based on your table and XML file structure.
    The XML file I used:

    
    
    
    <ROWSET>
     <IBSCOLYTD>
      <ACTNOI>28004125</ACTNOI>
      <MEMONOI>251942</MEMONOI>
      <MEMODTEI>05-SEP-92</MEMODTEI>
      <AMOUNTI>400</AMOUNTI>
      <BRCDSI>513</BRCDSI>
      <TYPEI>1</TYPEI>
      <TRANSMONI>0</TRANSMONI>
     </IBSCOLYTD>
     <IBSCOLYTD>
      <ACTNOI>28004125</ACTNOI>
      <MEMONOI>251943</MEMONOI>
      <MEMODTEI>04-OCT-92</MEMODTEI>
      <AMOUNTI>400</AMOUNTI>
      <BRCDSI>513</BRCDSI>
      <TYPEI>1</TYPEI>
      <TRANSMONI>0</TRANSMONI>
     </IBSCOLYTD>
    </ROWSET>

    The actual PL/SQL code:

    SQL> /* Creating Your table */
    SQL> CREATE TABLE IBSCOLYTD
      2  (
      3  ACTNOI VARCHAR2 (8),
      4  MEMONOI NUMBER (7,0),
      5  MEMODTEI DATE,
      6  AMOUNTI NUMBER (8,0),
      7  BRCDSI NUMBER (4,0),
      8  TYPEI NUMBER (4,0),
      9  TRANSMONI NUMBER (6,0)
     10  );
    
    Table created.
    
    SQL> CREATE OR REPLACE PROCEDURE insert_xml_emps(p_directory in varchar2,
      2                                              p_filename  in varchar2,
      3                                              vtableName  in varchar2) as
      4    v_filelocator    BFILE;
      5    v_cloblocator    CLOB;
      6    l_ctx            DBMS_XMLSTORE.CTXTYPE;
      7    l_rows           NUMBER;
      8    v_amount_to_load NUMBER;
      9    dest_offset      NUMBER := 1;
     10    src_offset       NUMBER := 1;
     11    lang_context     NUMBER := DBMS_LOB.DEFAULT_LANG_CTX;
     12    warning          NUMBER;
     13  BEGIN
     14    dbms_lob.createtemporary(v_cloblocator, true);
     15    v_filelocator := bfilename(p_directory, p_filename);
     16    dbms_lob.open(v_filelocator, dbms_lob.file_readonly);
     17    v_amount_to_load := DBMS_LOB.getlength(v_filelocator);
     18    ---  ***This line is changed*** ---
     19    DBMS_LOB.LOADCLOBFROMFILE(v_cloblocator,
     20                              v_filelocator,
     21                              v_amount_to_load,
     22                              dest_offset,
     23                              src_offset,
     24                              0,
     25                              lang_context,
     26                              warning);
     27
     28    l_ctx := DBMS_XMLSTORE.newContext(vTableName);
     29    DBMS_XMLSTORE.setRowTag(l_ctx, 'ROWSET');
     30    DBMS_XMLSTORE.setRowTag(l_ctx, 'IBSCOLYTD');
     31    -- clear the update settings
     32    DBMS_XMLStore.clearUpdateColumnList(l_ctx);
     33    -- set the columns to be updated as a list of values
     34    DBMS_XMLStore.setUpdateColumn(l_ctx, 'ACTNOI');
     35    DBMS_XMLStore.setUpdateColumn(l_ctx, 'MEMONOI');
     36    DBMS_XMLStore.setUpdatecolumn(l_ctx, 'MEMODTEI');
     37    DBMS_XMLStore.setUpdatecolumn(l_ctx, 'AMOUNTI');
     38    DBMS_XMLStore.setUpdatecolumn(l_ctx, 'BRCDSI');
     39    DBMS_XMLStore.setUpdatecolumn(l_ctx, 'TYPEI');
     40    DBMS_XMLStore.setUpdatecolumn(l_ctx, 'TRANSMONI');
     41    -- Now insert the doc.
     42    l_rows := DBMS_XMLSTORE.insertxml(l_ctx, v_cloblocator);
     43    DBMS_XMLSTORE.closeContext(l_ctx);
     44    dbms_output.put_line(l_rows || ' rows inserted...');
     45    dbms_lob.close(v_filelocator);
     46    DBMS_LOB.FREETEMPORARY(v_cloblocator);
     47  END;
     48  /
    
    Procedure created.
    
    SQL> BEGIN
      2  insert_xml_emps('TEST_DIR','load.xml','IBSCOLYTD');
      3  END;
      4  /
    
    PL/SQL procedure successfully completed.
    
    SQL> SELECT * FROM ibscolytd;
    
    ACTNOI      MEMONOI MEMODTEI     AMOUNTI     BRCDSI      TYPEI  TRANSMONI
    -------- ---------- --------- ---------- ---------- ---------- ----------
    28004125     251942 05-SEP-92        400        513          1          0
    28004125     251943 04-OCT-92        400        513          1          0
    
    SQL> 
    
  • How to load data into the MVDEMO sample schema

    Hi all

    I'm doing a POC on Oracle MapViewer and trying to build some reports in OBIEE using MapViewer.

    For this POC I am using the Oracle MVDEMO sample data (11g). I think this sample data covers a few countries like the USA.

    I need to do the same for Brazil, so I downloaded Brazil map data as shapefiles from this site:

    http://GADM.org/country

    In this Brazil data I got files with 4 extensions: .csv, .dbf, .shp and .shx.

    I need to know how I can load these files into my Oracle 11g DB. Should I load the data into the same mvdemo schema? If yes, then into which table?

    Any help will be appreciated a lot.

    Thank you

    Amit

    Use the Java shapefile Converter utility (http://www.oracle.com/technetwork/database/options/spatialandgraph/downloads/index-093371.html)

    or GDAL (gdal.org), FME (Safe), or MapBuilder.

    Specify the "to" SRID (i.e. the SRID of the geometries loaded into Oracle) as 4326 or 8307.

    Load into a new table named anything you want, for example brazil_gadm, with the geometry column named GEOMETRY.

    Once it's loaded, verify that there is an entry for the table and column (BRAZIL_GADM, GEOMETRY) in user_sdo_geom_metadata.

    Create a spatial index on brazil_gadm.geometry if the tool has not created one (see the sketch at the end of this answer).

    Add theme definitions for the country, state, or whatever admin areas exist in the dataset.

    Import them as layers in OBIEE.
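
    A rough sketch of the metadata and index steps above (the table/column names follow the BRAZIL_GADM example; the geodetic bounds and 0.05 m tolerance are assumed values for SRID 4326, adjust as needed):

    -- Register the geometry column if the conversion tool did not do it already
    INSERT INTO user_sdo_geom_metadata (table_name, column_name, diminfo, srid)
    VALUES ('BRAZIL_GADM', 'GEOMETRY',
            sdo_dim_array(
              sdo_dim_element('X', -180, 180, 0.05),   -- longitude range, 5 cm tolerance
              sdo_dim_element('Y',  -90,  90, 0.05)),  -- latitude range
            4326);
    COMMIT;

    -- Spatial index so MapViewer/OBIEE themes can query the layer
    CREATE INDEX brazil_gadm_sidx
      ON brazil_gadm (geometry)
      INDEXTYPE IS mdsys.spatial_index;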
