SQL*Loader - load from multiple files into multiple tables with a single control file

Hello

Is it possible to load data from multiple flat files into several tables using one control file in SQL*Loader? The flat-file-to-table relation is 1:1: all data from a given flat file goes into one table in the database.

I have about 10 flat files (large volume) that must be loaded into 10 different tables, and this must be done on a daily basis. Can I start the load for all 10 tables in parallel from a single command file? Or should I consider a different approach?

Thank you
Sisi

What operating system? ('Command Prompt' looks like Windows)

UNIX/Linux: a shell script with multiple calls to sqlldr run in the background with '&' (and possibly nohup)

Windows: a command (batch) file using 'start' to launch multiple copies of sqlldr.
http://www.PCTOOLS.com/Forum/showthread.php?42285-background-a-process-in-batch-%28W2K%29
http://www.Microsoft.com/resources/documentation/Windows/XP/all/proddocs/en-us/start.mspx?mfr=true

Published by: Brian Bontrager on May 31, 2013 16:04

Tags: Database

Similar Questions

  • Update multiple columns in multiple tables with a single UPDATE statement

    Hello

    I'm trying to figure out if I'm heading in the right direction.

    I want to update multiple columns in multiple tables with a single UPDATE statement. I would also like to update multiple columns in one table from several tables.

    Scenario 1

    UPDATE Table2, Table3
    SET T2.Column1 = T1.Column1 
    ,T2.Column2 = T1.Column2
    ,T3.Column2 = T1.Column2
    FROM Table1 T1, Table2 T2, Table3 T3
    WHERE T1.id = T2.id
    and T1.id = T3.id
    
    

    Scenario 2

    UPDATE Table3
    SET T3.Column1 = T1.Column1 
    ,T3.Column2 = T1.Column2
    ,T3.Column3 = T2.Column3
    ,T3.Column4 = T2.Column4
    FROM Table1 T1, Table2 T2, Table3 T3
    WHERE T3.id = T1.id
    and T3.id = T2.id
    
    

    Hello

    For scenario 1, you must write separate UPDATE statements for table2 and table3.

    To guard against someone else changing one of these tables while you are working, you can copy all the relevant data into a global temporary table and update table2 and table3 from that global temporary table.

    COMMIT only when both tables have been changed.

    You can write a procedure or an INSTEAD OF trigger to do all this.
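
    A minimal sketch of the two separate statements for scenario 1 (my example, assuming id is the join key), using correlated subqueries since Oracle has no UPDATE ... FROM syntax:

    UPDATE table2 t2
    SET (t2.column1, t2.column2) =
        (SELECT t1.column1, t1.column2
         FROM table1 t1
         WHERE t1.id = t2.id)
    WHERE EXISTS (SELECT 1 FROM table1 t1 WHERE t1.id = t2.id);

    UPDATE table3 t3
    SET t3.column2 =
        (SELECT t1.column2
         FROM table1 t1
         WHERE t1.id = t3.id)
    WHERE EXISTS (SELECT 1 FROM table1 t1 WHERE t1.id = t3.id);

    COMMIT;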

    For scenario 2, you can reference as many tables as you need when updating table3.  It might be more efficient and simpler to use MERGE rather than UPDATE.  For example:

    MERGE INTO table3 dst
    USING (
        SELECT t1.id,
               t1.column1,
               t1.column2,
               t2.column3,
               t2.column4
        FROM table1 t1
        JOIN table2 t2 ON t1.id = t2.id
    ) src
    ON (dst.id = src.id)
    WHEN MATCHED THEN UPDATE
    SET dst.column1 = src.column1,
        dst.column2 = src.column2,
        dst.column3 = src.column3,
        dst.column4 = src.column4
    ;

  • To create an interactive report in APEX by selecting from multiple tables

    Hi, I am creating an interactive report by selecting from multiple tables.

    SELECT w.FIRST_NAME AS first_name,
           w.SURNAME AS surname,
           i.ROAD AS road,
           i.DATE_OF_INC AS date_of_inc,
           s.STATEMENT AS statement
    FROM statement s
    JOIN witness w ON w.witness_id = s.FK1_WITNESS_ID
    JOIN incident i ON i.incident_no = w.FK1_INCIDENT_NO
    JOIN user_station ps ON ps.station_id = i.nearest_station_id
    JOIN users po ON po.STATION_ID = ps.station_id
    WHERE po.officer_id = 1

    But I keep encountering this error: "The report query requires a unique key to identify each row. The supplied key cannot be used for this query. Please change the report attributes to define a unique key column. ORA-01445: cannot select ROWID from, or sample, a join view without a key-preserved table."

    So I googled around and found that in the report attributes I need to change the "Link" column. First I changed it to "Link to custom target", but the report still failed, so I changed it to "Exclude link column"; the report still did not run and I got a blank page with only the tabs.

    I wonder, can you not create a report by selecting from multiple tables?

    If you can, please, I need your help.

    Thank you

    You can, but in this case it might be easier to build an Oracle view over the joined tables and then build your report on the newly built view...

    Or wrap a select around your select with the joins, and then put the WHERE clause on the outer select...
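
    As a rough sketch of the view approach (the view name witness_report_v and the key column statement_id are assumptions; use whatever primary key your STATEMENT table really has so the report gets its unique key):

    CREATE OR REPLACE VIEW witness_report_v AS
    SELECT s.statement_id,        -- assumed primary key, used as the report's unique key column
           w.first_name,
           w.surname,
           i.road,
           i.date_of_inc,
           s.statement,
           po.officer_id
    FROM statement s
    JOIN witness w       ON w.witness_id   = s.fk1_witness_id
    JOIN incident i      ON i.incident_no  = w.fk1_incident_no
    JOIN user_station ps ON ps.station_id  = i.nearest_station_id
    JOIN users po        ON po.station_id  = ps.station_id;

    The interactive report then just selects from witness_report_v with WHERE officer_id = 1, and statement_id can be set as the unique key column in the report attributes.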

    Thank you

    Tony Miller
    Software LuvMuffin
    Ruckersville, VA

  • Need to use bulk collect with FORALL to insert records into multiple tables

    Hello

    I have a PL/SQL record type based on several tables with multiple columns, so I used BULK COLLECT with a FORALL statement. I want to insert the records into multiple tables.

    Please give me suggestions.

    FORALL is designed to be used with a single DML statement, which may include dynamic SQL. However, I do not know what advantage this gives you over iterating your record list several times, once for each table - especially since there is overhead with dynamic SQL.

    Example 1 (dynamic SQL):

    begin

      ...

      forall i in vRecList.First..vRecList.Last
        execute immediate '
        begin
          insert into Table1 (Col1, Col2, Col3) values (:1, :2, :3);
          insert into Table2 (Col1, Col2, Col3) values (:1, :2, :3);
        end;' using vRecList(i).Col1, vRecList(i).Col2, vRecList(i).Col3;
    end;

    Another approach that should work (but is not tested) is to use INSERT ALL with record-based inserts, but you need to try it on your version of Oracle, because FORALL behaviour has changed between versions.  In this case vRecList must be compatible with both Table1%ROWTYPE and Table2%ROWTYPE.


    Example 2 (insert all):

    begin

      ...

      forall i in vRecList.First..vRecList.Last
        insert all
          into Table1 values vRecList(i)
          into Table2 values vRecList(i)
        select 1 from dual;

    end;

  • Delete query to delete records from multiple tables

    All,

    I need a delete query that will delete records from both tables. Please see the table structures and sample data below:
    CREATE TABLE TEMP1 (ID NUMBER(10),NAME VARCHAR2(40),CLASS VARCHAR2(40),COLLEGE VARCHAR2(40));
    CREATE TABLE TEMP2 (ID NUMBER(10),CITY VARCHAR2(40),STATE VARCHAR2(40));
    
    INSERT INTO TEMP1 (ID, NAME,CLASS,COLLEGE) VALUES (1000,'SAM','CS','UNIV_1');
    INSERT INTO TEMP1 (ID, NAME,CLASS,COLLEGE) VALUES (2000,'RIO','CS','UNIV_1');
    INSERT INTO TEMP1 (ID, NAME,CLASS,COLLEGE) VALUES (3000,'CHRIS','CS','UNIV_1');
    INSERT INTO TEMP1 (ID, NAME,CLASS,COLLEGE) VALUES (4000,'ALEX','CS','UNIV_1');
    
    INSERT INTO TEMP2 (ID, CITY,STATE) VALUES (1000,'Auburn','NY');
    INSERT INTO TEMP2 (ID, CITY,STATE) VALUES (2000,'Ithaca','NY');
    INSERT INTO TEMP2 (ID, CITY,STATE) VALUES (3000,'Mount Vernon','NY');
    INSERT INTO TEMP2 (ID, CITY,STATE) VALUES (4000,'Port Jervis','NY');
    Now, I need to delete the records in these tables where the ID is 2000 by using a single delete statement. Is this possible? This may be a newbie question. Please help.

    "using a single request deletion. Is this possible?

    Nope.
    You can insert into multiple tables using a single statement (INSERT ALL), but you cannot delete from more than one table using a single statement.
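
    For example, a minimal sketch of the two deletes for ID 2000, run as one transaction:

    DELETE FROM temp1 WHERE id = 2000;
    DELETE FROM temp2 WHERE id = 2000;
    COMMIT;

    If TEMP2.ID were declared as a foreign key referencing TEMP1.ID with ON DELETE CASCADE, a single DELETE on TEMP1 would remove the matching TEMP2 row as well - but that is a schema design choice, not a single-statement delete.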

  • XML: loading from a file vs using AS

    The code below fails to load the XML, but if I put the XML in a file and use
    <mx:XML id="tempXML" source="cuePoints.xml"/>
    it loads. What must I do to make the XML load from inside the MXML file?

    NO, DO NOT USE THE <mx:Model> TAG! It converts your beautiful XML into a hideous tree of dynamic objects. Don't, just don't.

    Your problem is simply that you forgot to call the init() function. Call it from the creationComplete event on the Application tag and you will be fine.

    Tracy

  • SQL LDR LKM generates incorrect CTL file

    Hello

    The LKM for SQL LDR is generating an incorrect CTL file for a fixed-length data file. Because of this, ODI errors out.
    Here is the content of the CTL file:

    SnpsOutFile "-File=//Myserver/myfile.ctl"

    OPTIONS (
    SKIP = 0,
    ERRORS = 0,
    DIRECT = TRUE
    )
    LOAD DATA
    INFILE "//myserver/myfile.RDY"
    BADFILE "//Myserver/myfile.bad"
    DISCARDFILE "//Myserver/myfile.dsc"
    DISCARDMAX 1
    INTO TABLE ORA_SCHEMA.C$_0RAW_TABLE
    (
    C1_CHAR9 POSITION (:),
    C2_CHAR2 POSITION (:),
    C3_CHAR6 POSITION (:),
    C4_CODE POSITION (:),
    C5_RAWG_CODE POSITION (:),
    C6_E_NUMBER POSITION (:),
    C7_T_NUMBER POSITION (:),
    C8_C_COLOR POSITION (:),
    C9_D_COLOR POSITION (:),
    C10_R_CODE POSITION (:)
    )

    Why does 'POSITION (:)' have no figures before and after the ':'?
    I have reverse-engineered the file and have the correct values for 'Physical length' and 'Logical length'.

    TIA,
    Ankit

    Hi Pierre,

    I think the column transformation mappings are set to execute on the Source (radio button) in the mapping window of the Interface.

    Change the column transformation mappings to execute on the Staging area.

    That should solve this problem.

    Thank you
    Fati

  • [Database Toolbox] Possibility to load from an XML file into LabVIEW and then into the database

    Before writing to my database, I want to save its contents and reload them if the user cancels the new load, which can last several minutes.

    If the user cancels the load, I want to get my previous database data back.

    I managed to save my database table to XML with the "DB Tools Save Recordset To File" VI. This VI writes the contents of my database table directly to the XML file. I assumed the corresponding load VI would do the reverse, i.e. load the file and write it back into my database, but it only gives me a recordset in LabVIEW.

    Question: is it possible to load the XML file into the database directly through LabVIEW?

    Why, in the end, don't these VIs have symmetrical behaviour?

    I don't know, but I just thought I would chime in: if you can't do it with LabVIEW, you might want to look at doing your writes inside a database "transaction", if your DBMS supports it (most do, except MS Access).
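
    If the database is Oracle (an assumption - the post does not say which DBMS is behind the toolkit), the transaction idea looks like this in plain SQL: nothing is visible to other sessions until COMMIT, and a cancel is just a ROLLBACK. my_table and its columns are made-up names for illustration.

    -- a transaction is implicitly open at the first DML statement
    INSERT INTO my_table (id, payload) VALUES (1, 'new data');
    -- ... the rest of the long-running load ...

    ROLLBACK;   -- user cancelled: the table is back to its previous contents
    -- COMMIT;  -- or, if the load finished normally, make the changes permanent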

  • Can we use a Source accounting entity group to load data from multiple entities with a single load rule?


    Page 79 of erpi_admin_11123200.pdf says that "accounting entity groups are used to extract data from several accounting entities in a single data rule execution".  I use the standard EBS adapter to load data into an HFM application, and I created an entity group made up of several accounting entities, but I can't find the place in FDMEE where you get to select/use this group... When you define an import format you type the name and select the Source system (e.g. EBS); you can select either the Source adapter (for example EBS11i) or the accounting entity (which is what I select to define data load mappings), but not both.  Note that there is no place to select the accounting entity group...  The Location's entity group drop-down does not show my accounting entity group, which I believe is something different anyway... and creating a location pointing to an import format compatible with the selected Source adapter is no good either... I'm confused, so is it possible to load data from several accounting entities with one load rule, or am I misreading the documentation?  Thank you in advance.

    Leave the accounting entity field blank in the Import Format. If you leave it blank there, then when you define the data load rule (for a location with EBS as the Source system), you will be able to select either a single accounting entity or an accounting entity group.

    You can see here

    Please mark this as helpful or as the answer to the question so that others can see.

    Thank you

  • Loading data from multiple tables into Essbase using ODI

    Hello

    We have a scenario where the data comes from more than one table. I would like to know how ODI will load the data for the right combination of members.

    Hello

    Assuming each data table has a field that corresponds to the other table, you can simply drag the source datastores onto the interface and create a join between the tables.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Creating a view that shows data from multiple rows in a single column

    Hi all - this is probably posted in the wrong forum, but I couldn't find one that was right.

    I'm almost a complete beginner in SQL, but I need to create a view on 10g that will scale (volumes are likely to be high when it actually runs) and that will do the following.

    Original table with columns Parent_Code, Child_Code:

    Parent_Code Child_Code
    1000        2000
    1000        3000
    1000        4000
    2000        3000
    2000        5000

    (note that a parent may have several children and a child can have multiple parents!)

    What I need to end up with in my view is the following:

    Child_Code Parent_List
    2000       1000 (3)
    3000       1000 (3), 2000 (2)
    4000       1000 (3)
    5000       2000 (2)

    Note the number in parentheses is the number of children that parent has - i.e. in the original table parent 1000 has 3 rows (one for each child).

    This view will be used as a quick lookup (on the child code) for a Business Objects report.

    Is there someone who could PLEASE, PLEASE help me quickly, as I have very little time to find a solution?

    Hello

    You can test these:

    select child_code
         , ltrim(sys_connect_by_path(parent_info,', '), ', ') as parent_list
    from (
      select child_code
           , to_char(parent_code) ||
             ' (' ||
             count(*) over(partition by parent_code) ||
             ')' as parent_info
           , row_number() over(partition by child_code order by parent_code) rn
      from your_table
    )
    where connect_by_isleaf = 1
    connect by prior rn = rn-1
           and prior child_code = child_code
    start with rn = 1
    ;
    
    select child_code,
           rtrim(
             extract(
               xmlagg(xmlelement("e",parent_info||', ') order by parent_info)
             , '//text()'
             )
           , ', '
           ) as parent_list
    from (
      select child_code,
             to_char(parent_code) ||
             ' (' ||
             count(*) over(partition by parent_code) ||
             ')' as parent_info
      from your_table
    )
    group by child_code
    ;
    

    What you need is called "string aggregation".
    See here for the various techniques, including the two above: http://www.oracle-base.com/articles/misc/StringAggregationTechniques.php
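
    On 11g Release 2 and later (the poster mentions 10g, where this is not available), the built-in LISTAGG function does this string aggregation natively - a minimal sketch reusing the same inner query as the first answer:

    select child_code,
           listagg(parent_info, ', ') within group (order by parent_info) as parent_list
    from (
      select child_code,
             to_char(parent_code) ||
             ' (' ||
             count(*) over(partition by parent_code) ||
             ')' as parent_info
      from your_table
    )
    group by child_code;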

  • Disadvantage of a dedicated task per channel versus multiple channels in a single task

    Hello

    My current acquisition software (C/C++, GCC) wraps the rather clumsy NI-DAQmx C functions for talking to the data acquisition hardware in a class that represents an acquisition task. That way I can create several instances, for example counter input, analog input, analog output, with their terminals, and the class takes care of all the low-level work, such as making sure a dummy internal analog input task is started when there are only counter inputs so that the sample clock runs, or configuring N-sample callbacks, etc.

    It seems to work very well, and the timing also seems to be good, because I first start all the tasks on the multiple instances of my wrapper. For triggered starts, I use

    DAQmxCfgDigEdgeStartTrig(mTask,mTriggerTerminal.toAscii().constData(),DAQmx_Val_Rising)

    internally.

    Now my real question: what is the advantage of multiple channels per task, when everything seems to work fine with multiple tasks and only one channel per task? I don't see the disadvantage; it would just require sorting the acquisitions into types (AI, AO, ...) first, because several channels in a single task must be of the same type. With my approach I don't need to care, because each channel gets its own task anyway.

    I'm sure I'm missing something here. Maybe someone can explain it to me - perhaps some limitation of multiple tasks that I have not yet read about.

    Hey!

    Unless you have hardware designed for it (simultaneous sampling, or certain modular instruments - see link), you cannot run two tasks at the same time that access the analog inputs, for example, because the ADC is a shared resource behind a multiplexer and only one task can use it at any given time (see here).

    Similar restrictions often apply to other types of operations.

    I'm not aware of any performance issues, perhaps a little more memory could be used.

    So as long as your hardware supports what you are doing, you should be ok, I think,

    and it is only a question of clarity and intelligibility, ease of use and structure.

    As you use classes, I'm sure you've heard about encapsulation - so it is a

    question of how you want to design your application.

    In addition, when you work in LabVIEW, tasks feel more natural with the dataflow principle, because you have one wire for your data acquisition, and that works very well with our standard design patterns.

    So, if it works better for you (and works with the hardware), you can give each channel its own task.

    Hopefully this might clarify some things,

    Kind regards

    Rome

    NI Germany

  • Update using several conditions from multiple tables

    Hello

    I'm quite new to PL/SQL.

    I need to update a record in a table with several conditions in the WHERE clause coming from several tables.

    On the internet I found something like

    Update <table>
    Set <column> = <value>
    From <table>
    Join <table>
    on <condition>
    Where <condition>
    And <another condition>

    Now, I have:

    Update outbound_order
    Set outbound_order_sorting_code = 'A'
    from orderref o, outbound_order oo
    where oo.ordernumber = o.ordernumber
    and o.delivery_status = 'c';

    PL/SQL gives ORA-00933 on this ->

    Can't I use more than one table in an UPDATE?

    Kind regards

    Chi Wai
    update outbound_order o
    set outbound_order_sorting_code = 'A'
    where exists(select 1 from orderref oo
                  where oo.ordernumber = o.ordernumber)
    and delivery_status = 'c';
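
    If you also need to copy a value from orderref into outbound_order (not just filter on it), the usual Oracle pattern is a correlated subquery in the SET clause as well - a sketch only, where sorting_code on orderref is a made-up column name for illustration:

    update outbound_order o
    set o.outbound_order_sorting_code =
          (select oo.sorting_code            -- hypothetical source column on orderref
           from orderref oo
           where oo.ordernumber = o.ordernumber)
    where exists (select 1
                  from orderref oo
                  where oo.ordernumber = o.ordernumber
                    and oo.delivery_status = 'c');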
    
  • Problem with image display in a Datagrid loaded from local files

    Hello

    I'm trying to get images (loaded from the local file system) to display in a DataGrid control.

    The paths are fine, but no images are displayed in the DataGrid control (I'm using an AIR-based project, AS3).

    DataGrid code is:

    <mx:DataGrid id="dg" dataProvider="{myArrayCollection}">
        <mx:columns>
            <mx:DataGridColumn headerText="Image" width="100">
            </mx:DataGridColumn>
        </mx:columns>
    </mx:DataGrid>

    The Actionscript code is:

    fileStream = new FileStream();
    var imageBytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();
    array7 = new Array();

    for (var j:int = 0; j < fileList1.length; j++)
    {
        var file1:File = File.desktopDirectory;
        file1 = new File();
        file1 = File.desktopDirectory.resolvePath("DG_TEST/" + fileList1[j].name);
        fileStream = new FileStream();
        stream.open(file1, FileMode.READ);
        stream.readBytes(imageBytes);
        stream.close();
        array7.push(imageBytes);
        // array7.push(j);
    }

    myArrayCollection.source = array7;

    I also have: [Bindable] private var myArrayCollection:ArrayCollection = new ArrayCollection();

    If I uncomment the line array7.push(j), the numbers show in the DataGrid and seem to leave a row for the images?

    Any ideas/code (preferably code, because I am very new to Flex/AIR) to help me display the images in the DataGrid control would be greatly appreciated.

    Thanks in advance,

    Guida

    You are welcome

    Don't forget to mark the discussion as "answered" to close it.

  • Multiple child cursors for a single query (run once)

    Hello

    in a Standard Edition database, version 11.2.0.3, I see strange behaviour that I cannot explain right now: if I run a simple query on dba_temp_free_space I get several child cursors for the first execution:

    -- the first execution of the query (which returns 3 rows)

    select /* test: duplicate cursor */ *
    from dba_temp_free_space;

    select child_number, open_versions, fetches, rows_processed,
           executions, px_servers_executions, parse_calls, buffer_gets
    from v$sql
    where sql_id = '69azwxdshhffc'
    order by child_number;


    CHILD_NUMBER OPEN_VERSIONS FETCHES PX_SERVERS_EXECUTIONS PARSE_CALLS BUFFER_GETS ROWS_PROCESSED EXECUTIONS

    ------------ ------------- ---------- -------------- ---------- --------------------- ----------- -----------

    0             1          2              3          1                     0           1           7

    1             0          0              0          0                     1           1          55

    2             0          0              0          0                     1           1           2

    select version_count, loaded_versions, executions, px_servers_executions,
           parse_calls, buffer_gets, rows_processed
    from v$sqlarea
    where sql_id = '69azwxdshhffc';
    VERSION_COUNT LOADED_VERSIONS PX_SERVERS_EXECUTIONS PARSE_CALLS BUFFER_GETS ROWS_PROCESSED EXECUTIONS

    ------------- --------------- ---------- --------------------- ----------- ----------- --------------

    3               3          1                     2           3          64              3


    -- second run

    select /* test: duplicate cursor */ *
    from dba_temp_free_space;

    select child_number, open_versions, fetches, rows_processed,
           executions, px_servers_executions, parse_calls, buffer_gets
    from v$sql
    where sql_id = '69azwxdshhffc'
    order by child_number;


    CHILD_NUMBER OPEN_VERSIONS FETCHES PX_SERVERS_EXECUTIONS PARSE_CALLS BUFFER_GETS ROWS_PROCESSED EXECUTIONS

    ------------ ------------- ---------- -------------- ---------- --------------------- ----------- -----------

    0             0          2              3          1                     0           1           7

    1             0          0              0          0                     1           1          55

    2             0          0              0          0                     1           1           2

    3             1          2              3          1                     0           1           7

    4             0          0              0          0                     1           1          55

    5             0          0              0          0                     1           1           2

    select version_count, loaded_versions, executions, px_servers_executions,
           parse_calls, buffer_gets, rows_processed
    from v$sqlarea
    where sql_id = '69azwxdshhffc';

    VERSION_COUNT LOADED_VERSIONS PX_SERVERS_EXECUTIONS PARSE_CALLS BUFFER_GETS ROWS_PROCESSED EXECUTIONS

    ------------- --------------- ---------- --------------------- ----------- ----------- --------------

    6               6          2                     4           6         128              6

    Just for completeness: the system in question is a two-node RAC cluster.


    Thus, every subsequent run creates new child cursors:

    • I don't understand why I get three child cursors for the first execution.
    • I don't understand why the subsequent executions do not re-use an existing cursor.

    Taking a look at v$sql_shared_cursor I can see the reasons the optimizer gives for not reusing the cursors - but again, I don't understand them:

    select child_number,
           px_mismatch,
           use_feedback_stats,
           top_level_rpi_cursor
    from v$sql_shared_cursor
    where sql_id = '69azwxdshhffc'
    order by child_number;

    CHILD_NUMBER P U T
    ------------ - - -
    0 N Y N
    1 Y N N
    2 Y N N
    3 N N Y
    4 Y N N
    5 Y N N

    Once again, I don't see why these reasons should apply. If I run the query often enough I see strange plan output when I use dbms_xplan.display_cursor - as described by Timur Akhmadeev in https://timurakhmadeev.wordpress.com/2012/03/19/obsolete-cursors/.

    So the question is: has anyone seen something like this before? And is there an explanation why the server keeps creating new child cursors instead of reusing them?

    Thanks for your replies in advance.

    Regards

    Martin Preiss

    Martin:

    It's a slow day before a long weekend, and this kind of intrigued me.  There are a few bugs listed on MOS, at least one of them noting that it is supposed to be a regression in 11.2.0.3:

    Bug 14016187 - GV$ queries cause high version count due to PX_MISMATCH on RAC (Doc ID 14016187.8)

    Bug 14711917 - High version count due to PX_MISMATCH (Doc ID 14711917.8)

    John
