Variant data to 2D array data

I am using a DataSocket server to send and receive data from my camera (a wavefront sensor) on the same computer. The data I receive is a variant, and I need to convert the variant to another data type that I can use in LabVIEW for calculation and further processing.

The variant I receive has a lot of attributes, and the attribute I need is a 2D array. However, I only see the size of my array in LabVIEW, not the numeric values. The indicator shows me

4 attributes:
'Centroid_Pos_X' -> [33 x 41]

'Centroid_Pos_Y' -> [33 x 41]

'Centroids_X' -> 15

'Centroids_Y' -> 15

These four attributes should each give me back an array, but all of them only return the size of the array...

I tried 'Get Variant Attribute' and also 'Variant To Data'... but I cannot get the correct data.

Can someone give me some advice?

You are performing an additional step that is not necessary. As I pointed out, the Get Variant Attribute function has a 'default value' input, to which you can wire a 2D array for your array directly:

Tags: NI Software

Similar Questions

  • How to retrieve data from a table

    I'm new to APEX 4.0; how do I retrieve data from a table?

    My table is CR_USERPROFILE... and I want to retrieve data according to the following requirements:

    Select title, username, address, email, mobile from cr_userprofile where email = APEX_CUSTOM_AUTH.GET_USERNAME


    10 P110_EMAIL    Text Field
    20 P110_TITLE    Select List
    30 P110_USERNAME Text Field
    40 P110_ADDRESS  Text Field
    50 P110_PHONE    Text Field
    60 P110_MOBILE   Text Field

    Could anybody please help...

    Thanks and greetings
    Luke

    Hello

    1. Create a page process.
    2. Select Data Manipulation.
    3. In the category, select "Automated Row Fetch".
    4. Enter the process name and sequence, and select "On Load - Before Header" for the process point.
    5. Specify the owner, table, primary key column, and primary key item (the item that holds the primary key value).
    6. Create the process.
    7. For each item, set "Database Column" as the Source Type. (A rough hand-written equivalent of what this fetch does is sketched below.)
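    For comparison only: the Automated Row Fetch process does roughly what this hand-written page process would do, based on the query in the question (a sketch, not the code APEX actually generates):

    BEGIN
      -- Fetch the profile row for the logged-in user into the page items
      SELECT title, username, address, email, mobile
        INTO :P110_TITLE, :P110_USERNAME, :P110_ADDRESS, :P110_EMAIL, :P110_MOBILE
        FROM cr_userprofile
       WHERE email = APEX_CUSTOM_AUTH.GET_USERNAME;
    END;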

    Kind regards

    Patel Kartik
    ------------------------------------------------------------------------
    http://patelkartik.blogspot.com/
    http://Apex.Oracle.com/pls/Apex/f?p=9904351712:1

  • Extracting data from a table without refresh and without using the Tab key

    Hi friends,

    I have a problem: I want to extract data from a table into text fields without a refresh and without using the Tab key. When I enter a value in one field, the corresponding text should appear in the corresponding text field without using the Tab key.

    For example, when I enter emp_id 101 in a text field, then first_name, last_name and address should appear in the corresponding text fields without a refresh and without using the Tab key.

    How can I do that?

    Thank you
    Maury

    Hi Maury,

    I guess it's similar to: retrieving data without refreshing, rather than Re: value of a textfield should enter into another textfield without using the TAB?

    If so, the only change you want to make to the first one is to use the "onkeyup" event instead of "onchange" in the item's 'HTML Form Element Attributes'.

    Note, however, that the user must move away from the item at some point (for example, to click a button), so onchange will fire anyway.

    Andy

  • How to pass a value from one table to another using JavaScript/Oracle: APEX 5.0

    Hello

    Step 1:

    Two tables (product, product2).

    Created an IR where all data come from table product.

    -> one cell in a column is editable.

    Step 2:

    Whenever the user changes certain cell values and clicks the update button, the cell value must also be updated in the second table.

    -> the entire product row (table product) must be inserted into product2 (table) with the updated cell value.

    JS:

    var arr_f01 = [];

    $("input[name='f01']").each(
      function() {
        if ($(this).val() > 0)
        {
          arr_f01.push($(this).val());
        }
    });

    apex.server.process(
      "Update"
    , { f01: arr_f01 }
    , { dataType: "text", success: function (pData) { alert("Data inserted into the Product table"); } }
    );

    Thank you.

    Hi Dominique,


    Follow the steps below.

    Step 1: Give a static ID to the other columns in your interactive report

    Region Attributes -> Column Definition -> Static ID

    Step 2: Change your JavaScript code to read the values of the other columns

    Check line no. 8 below (the line that reads EMPNO); in this way you can read the values of other columns and push them into an array.

    EMPNO is the static ID I gave to the column.

    Do the same for the other columns you want to insert.

    var arr_f01 = [];
    var arr_f02 = [];
    var empno;
    $("input[name='f01']").each(
    function() {
    if($(this).val() > 0)
    {
      empno = $(this).closest('tr').children('td[headers="EMPNO"]').text();
      arr_f01.push($(this).val());
      arr_f02.push(empno);
    }
    });
    
    apex.server.process (
      "Update"
    , {  f01: arr_f01, f02: arr_f02
      }
    , { dataType: 'text',success: function(pData){alert('Data Inserted in Product Table');
    } }
    );
    

    Step 3: Use the APEX_APPLICATION arrays in your AJAX process to insert the records; replace the table name and the columns with your own.

    begin
      for i in 1..apex_application.g_f01.count loop
        insert into test(A,B) values (APEX_APPLICATION.G_F02(i), APEX_APPLICATION.G_F01(i));
        commit;
      end loop;
    end;
    

    Hope this helps you,

    Kind regards

    Jitendra

  • Insert into MDQ_OLD select * from table (lt_monitorMdq);

    I'm trying to insert into a table that has only a single column, which is a column of a user-defined type (UDT). The UDT is nested, that is, one of the attributes of the UDT is another UDT.

    I aim to insert into the table with pseudo-code like this:

    INSERT INTO T1 SELECT * FROM TABLE(the UDT);

    CREATE TABLE MDQ_OLD (myMDQ UDT_T_MONITOR_MDQ)
    NESTED TABLE myMDQ STORE AS T1_NEW
    (NESTED TABLE MONITOR_MDQ_PRIM_RIGHTS STORE AS T2_NEW);

    The MONITOR_MDQ_CLI.Read procedure below returns the parameter lt_monitorMdq, which is of the UDT type as declared. The statement "insert into MDQ_OLD select * from table (lt_monitorMdq);" fails, while the second insert statement works.

    Is it possible to get the first statement to work?

    I'm on Oracle 11gR2.

    DECLARE
        lt_monitorMdq UDT_T_MONITOR_MDQ;
    BEGIN
        MONITOR_MDQ_CLI.Read (TRUNC(SYSDATE),
                              TRUNC(SYSDATE),
                              NULL,
                              NULL,
                              'MILLION BTU',
                              lt_monitorMdq);   -- Note: lt_monitorMdq is an OUT parameter

        -- This insert does not work
        INSERT INTO MDQ_OLD SELECT * FROM TABLE (lt_monitorMdq);

        FOR i IN 1 .. lt_monitorMdq.count
        LOOP
            dbms_output.put_line ('lt_monitorMdq: ' || lt_monitorMdq(i).mdq_id);

            -- This insert works
            INSERT INTO MDQ_OLD (MYMDQ)
            VALUES (UDT_T_MONITOR_MDQ (UDT_R_MONITOR_MDQ (
                        lt_monitorMdq(i).gasday,
                        1,
                        NULL, NULL, NULL, NULL, NULL, NULL, NULL,
                        NULL, NULL, NULL, NULL, NULL, NULL, NULL,
                        UDT_T_MONITOR_MDQ_PRIM_RIGHT (
                            UDT_R_MONITOR_MDQ_PRIM_RIGHT (
                                1,
                                NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL)))));
        END LOOP;
    END;

    Have you tried:

    INSERT INTO MDQ_OLD (myMDQ) VALUES (lt_monitorMdq);

    Out of curiosity:

    Is there a particular reason why you created a table with a single column of the UDT type instead of:

    CREATE TABLE... OF UDT_T_MONITOR_MDQ;

    I can tell you from experience that, using a nested table, you can easily query the data in the nested table.
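    For illustration, a minimal sketch of that object-table alternative; the type and column names here are simplified placeholders, not the real UDT_T_MONITOR_MDQ definition:

    -- Hypothetical, much-reduced element type standing in for UDT_R_MONITOR_MDQ
    CREATE TYPE demo_row_t AS OBJECT (mdq_id NUMBER, gasday DATE);
    /
    CREATE TYPE demo_tab_t AS TABLE OF demo_row_t;
    /
    -- Object table: each row is one element of the collection,
    -- so a collection unloads into it with a plain INSERT ... SELECT.
    CREATE TABLE mdq_old_obj OF demo_row_t;

    DECLARE
        lt demo_tab_t := demo_tab_t(demo_row_t(1, TRUNC(SYSDATE)));
    BEGIN
        INSERT INTO mdq_old_obj SELECT * FROM TABLE(lt);
    END;
    /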

    MK

  • Problem with "select * from table" for dynamic IN the list

    I have a 'for loop' based on a query that does not work. The query is supposed to return the table name, the data type and the column name for columns matching a number of name filters. The problem I have is that when I run the query in TOAD with:

    :schema_list set to 'SCOTT, MED' and the 'in' clause written as "in (select * from table (DATAPUMP_UTIL.in_list_varchar2 (:schema_list)))"

    the query returns the expected rows.

    When I have it in my code as shown below, it returns no rows. I don't know what is wrong, but any help would be great! I'm on Oracle 11.1.0.6.0.
    PROCEDURE export_schema_ondemand (schema_list VARCHAR2, encrypt_file NUMBER default 0, mask_sensitive_data NUMBER default 0) IS  
        ...
        schema_list_t := my_package.in_list_varchar2(schema_list);
        ... 
        for c1 in
           (
            with ok_to_mask as (
            select 
                owner,
                table_name, 
                column_name
            from   
               all_tab_columns
            where
                owner in (select * from table(schema_list_t))
            minus
            (SELECT 
                c.owner,
                p.table_name,
                cc.column_name
            FROM 
                all_cons_columns cc, 
                all_constraints p,
                all_constraints c
            WHERE 
                c.owner in (select * from table(schema_list_t))
                AND c.constraint_type = 'R'
                AND p.owner = c.r_owner
                AND p.constraint_name = c.r_constraint_name
                AND cc.owner = c.owner
                AND cc.constraint_name = c.constraint_name
                AND cc.table_name = c.table_name
            UNION ALL
            SELECT 
                c.owner,
                cc.table_name,
                cc.column_name
            FROM 
                all_cons_columns cc,
                all_constraints p,
                all_constraints c
            WHERE
                p.owner in (select * from table(schema_list_t))
                AND p.constraint_type in ('P','U')
                AND c.r_owner = p.owner
                AND c.r_constraint_name = p.constraint_name
                AND c.constraint_type = 'R'
                AND cc.owner = c.owner
                AND cc.constraint_name = c.constraint_name
                AND cc.table_name = c.table_name))
            select 
                atc.table_name as mask_tab, 
                atc.column_name as mask_col, 
                atc.data_type as mask_type
            from   
                all_tab_columns atc,
                ok_to_mask otm
            where
                atc.owner = otm.owner
                and atc.table_name = otm.table_name
                and atc.column_name = otm.column_name
                and atc.owner in (select * from table(schema_list_t))
                and 
                (
                atc.column_name like '%LAST%NAME%'
                or atc.column_name like '%FIRST%NAME%'
                or atc.column_name like '%NAME_LAST%'
                or  atc.column_name like '%NAME_FIRST%'
                or  atc.column_name like '%ENAME%'
                or atc.column_name like '%SSN%'
                or atc.column_name like '%DOB%'
                or atc.column_name like '%BIRTH%'
                )
                and atc.column_name not like '%PHYSICIAN_%'
                and atc.column_name not like '%DR_%'
                and atc.column_name not like '%PROVIDER_%'
                and atc.column_name not like 'PRESCRIBER_%'     
           )
          loop
             ...
    
    FUNCTION in_list_varchar2 (p_in_list  IN  VARCHAR2)  RETURN VARCHAR2_TT is
    
        l_tab   VARCHAR2_TT := VARCHAR2_TT();
        l_text  VARCHAR2(32767) := p_in_list || ',';
        l_idx   NUMBER;
            
    BEGIN
        LOOP l_idx := INSTR(l_text, ',');
            EXIT WHEN NVL(l_idx, 0) = 0;
            l_tab.extend;
            l_tab(l_tab.last) := TRIM(SUBSTR(l_text, 1, l_idx - 1));
            l_text := SUBSTR(l_text, l_idx + 1);
        END LOOP;
    
        RETURN l_tab;
            
    END in_list_varchar2;
    Published by: BluShadow on June 29, 2011 16:11
    Addition of {noformat}{noformat} tags.  PLEASE READ {message:id=9360002} TO LEARN TO DO THIS YOURSELF.

    Hello

    If you have a query that works fine when you run it directly, but breaks down when you run it from a procedure, this can be a privileges problem.

    The ALL_* views only show the objects you have access to, but inside a procedure, privileges must have been granted directly to the user and not through a role.

    You should check which SELECT privileges your user only has through roles, and grant them directly to the user.
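    As a rough sketch of that check (the table name and grantee below are placeholders, not from this thread):

    -- Table privileges that currently arrive only through roles
    SELECT role, owner, table_name, privilege
    FROM   role_tab_privs
    WHERE  role IN (SELECT granted_role FROM user_role_privs);

    -- Grant the same privilege directly so it is usable inside the procedure
    GRANT SELECT ON scott.emp TO proc_owner;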

    Hope this will help.

    Sylvie

  • Remove items from a 1D array

    I am trying to delete an element from a 1D array using the 'Delete From Array' function.

    I don't know why, but the output array is the same length as the original, with the element I tried to remove replaced by a copy of another element.

    Does anyone know why this happens?

    Thank you


  • Oracle 10g - problem with "DELETE from TABLE WHERE ID in (1,2,3)" (using cfqueryparam)

    Hello, everyone.

    I am having problems executing a DELETE statement on an Oracle 10g server.

    DELETE 
    FROM tableA
    WHERE ID in (1,2,3)
    

    If there is only a single ID in the IN clause, it works.  But if more than one ID is provided, I get the error message "SQL command not properly ended".  Here's the query in CF:

    DELETE 
    FROM TRAINING
    WHERE userID = <cfqueryparam cfsqltype="CF_SQL_VARCHAR" value="#trim(form.userID)#">
         AND TRAINING_ID in <cfqueryparam value="#form.trainingIDs#" cfsqltype="CF_SQL_INTEGER" list="yes">
    

    Does anyone who works with Oracle have ideas that can help me with this?  I'm an experienced MS SQL developer; Oracle is new to me.

    Thank you

    ^_^

    Never mind... a colleague just told me I should always use parentheses around the values in the IN clause.

  • Delete from Table on the Cancel button.

    Hello

    I am facing a funny problem.

    I created a form with a report page, and I have an option to implement sending an e-mail on the page, so I am attaching one more document with the e-mail. If I attach a document, it goes into the DUMY_DOC_FILE table.

    I created a process:
     
    
    delete from DUMY_DOC_FILE;

    Process Point is On Submit - After Computations and Validations

    On Page No. 6.
    If I press the Cancel button, it redirects to page no. 4 and does not run my code on page 6.
    The page redirects to page 4, but the doc is not deleted from the DUMY_DOC_FILE table.

    How do I remove the doc from the table when I press the Cancel button?

    Thank you

    Published by: 805629 on January 13, 2011 05:25

    Published by: 805629 on January 13, 2011 23:49

    Published by: 805629 on January 14, 2011 12:43 AM

    Make this validation conditional, so that it does not fire for the Cancel/send button (for example: :REQUEST != 'BUTTON NAME').

  • Extracting data from table.

    Hello and happy Friday.

    I'm reworking an instrument driver for an HP 662X power supply.  We have four in the lab and I want to have a panel that writes the channel information and reads what the instrument settings are.  I have the inputs working fine, but I can't extract the data from the "read the voltage and current" VI.  What I would like is the opposite of the "build" subVI, where I can take an array and the elements go, channel by channel, to the next available line.  I have attached the control program, which should give you a better understanding of what I'm trying to accomplish here.  Any ideas will be greatly appreciated.  The driver I started with is the one downloaded from NI!

    Thank you

    Gary

    Also: why the property node for the VISA resource? Use a wire.

    Attached is a slight mod showing the use of a cluster instead of individual controls if you want to try it.

  • Save and write data from array to file - easy

    Hello

    I got this system delivered to me. I'm new to LabVIEW and just want to save the data from the "average voltage" array (inside the while loop) to do some additional testing of our product.

    I would like to do something like this:

    (1) Save to an Excel file.

    (2) Save only when a button is pressed, and then save 5 iterations.

    (3) Save and arrange the data so it is displayed in 6 columns (one per LED) instead of 1 long column.

    I tried different things with a T/F case structure, which resolved the button request. But I am in doubt about which I should use: Write To Measurement File or Write Delimited Spreadsheet (using LabVIEW 15.0)? Which offers the best possibility?

    It also seems to be too much to handle when I try to write to a txt file, because it pops up with an error that I do not know how to fix; it says this:

    Error-200279

    Possible reasons:

    The application is not able to keep up with the hardware acquisition.

    Increasing the buffer size, reading the data more frequently, or specifying a fixed number of samples to read instead of reading all available samples might correct the problem.

    Property: RelativeTo

    Corresponding Value: Current Read Position

    Property: Offset

    Corresponding value: 0

    Task name: analog channel

    Thanks in advance

    I agree with Taki, but want to make some additional remarks:

    • LabVIEW is a Data Flow language.  Think about the "flow" of your data.  You talk about "saving only when a button is pressed" and a finite set of data.  Are you collecting data before the button press and simply not saving it?
    • Data are collected at some rate, and you likely don't want to "miss" data points.  This means that you shouldn't do anything in the collection loop that takes a long time.  If your acquisition rate is low and your processing is fast, you can have everything in a single loop.  Otherwise, use a streaming technique (Producer/Consumer is a good one) to process the data in one loop in parallel with the collection in an independent (and asynchronous) loop.
    • How do you want to write your data?  Do you want to write "on the fly", as it arrives, or can you wait, collect everything, format it, and then write it "all at once"?
    • What do you mean by "save the file in Excel"?  Do you mean a "native" Excel file, one with the extension .xls or .xlsx?  Or do you mean a Comma-Separated Values (.csv) file that Excel can read (and, indeed, usually registers itself to open, changing the icon of .csv files to "look like" they are really Excel files)?  If the former, I recommend using the Report Generation Toolkit.  For the latter, you can also use Write Delimited Spreadsheet, which can be easier to use.

    Bob Schor

  • Loading data from a table to a file in ODI

    I need to load data from a table to a file without using an interface in ODI.  How do I do it?

    Hello

    Using the OdiSqlUnload tool, you can unload the table directly.

    I have unloaded an Oracle DB table to a file without using interfaces. I did it with a procedure.

    To achieve this, create the procedure in ODI and set the technology of the command on the target tab to: ODI Tool.

    OdiSqlUnload "-FILE=D:\TEXT\Test data procedure.txt" "-DRIVER=oracle.jdbc.OracleDriver" "-URL=jdbc:oracle:thin:@192.0.0.0:1521:odiuser" "-USER=odiuser" "-PASS=hpfHiT7Ql0Hd79KUseSWYAVIA" "-FILE_FORMAT=VARIABLE" "-FIELD_SEP=," "-ROW_SEP=\r\n" "-DATE_FORMAT=YYYY/MM/DD hh:mm" "-CHARSET_ENCODING=ISO8859_1" "-XML_CHARSET_ENCODING=ISO-8859-1"

    Select * from odiuser.DWT_SECTOR

    I think this will help you.

    Thanks in advance,

    A.Kavya.

  • Find highest date from table

    Hi all!

    From the two tables (master and users), I want to find the case_num and the username of the user with the highest routing date (r_date).

    For example, for 'PHHY2009PV1001', username 'Kerry' should be returned. I am using the approach below, but it does not work.

    Can someone help me?

    with master as
    (
    select 'PHHY2009PV1001' as case_num, 1001 as user_id, 'Japan' as country, '10/05/1999' as r_date from dual union
    select 'PHHY2009PV1001', 1002, 'Korea',    '10/09/1999' from dual union
    select 'PHHY2009PV1001', 1003, 'Japan',    '10/05/2005' from dual union
    select 'PHHY2009PV1001', 1004, 'Japan',    '10/08/2006' from dual union
    select 'PHHY2009PV1007', 1005, 'US',       '10/05/2001' from dual union
    select 'PHHY2009PV1007', 1006, 'USA',      '10/06/2002' from dual union
    select 'PHHY2009PV1002', 1007, 'Ireland',  '10/11/2003' from dual union
    select 'PHHY2009PV1002', 1009, 'Ireland',  '10/07/2003' from dual union
    select 'PHHY2009PV1009', 1011, 'Pakistan', '15/11/2004' from dual union
    select 'PHHY2009PV1009', 1012, 'Pakistan', '15/12/2004' from dual union
    select 'PHHY2009PV1009', 1013, 'Pakistan', '15/09/2004' from dual
    ),
    users as
    (
    select 1001 as user_id, 'John Abraham' as username from dual union
    select 1002, 'Andréanne' from dual union
    select 1003, 'Jawahar' from dual union
    select 1004, 'Kerry' from dual union
    select 1005, 'Kofi' from dual union
    select 1006, 'Kofi1' from dual union
    select 1007, 'Kofi2' from dual union
    select 1008, 'Kofi3' from dual
    )
    select ms.case_num, u.username, ms.r_date
    from
    (select case_num, max(r_date) as r_date, user_id from master group by case_num, user_id) ms, users u
    where ms.user_id = u.user_id
    order by 1

    The first thing that came to my notice: they are not DATEs, but strings.

    Something like this?

    with master as
    (
    select 'PHHY2009PV1001' as case_num, 1001 as user_id, 'Japan' as country, '10/05/1999' as r_date from dual union
    select 'PHHY2009PV1001', 1002, 'Korea',    '10/09/1999' from dual union
    select 'PHHY2009PV1001', 1003, 'Japan',    '10/05/2005' from dual union
    select 'PHHY2009PV1001', 1004, 'Japan',    '10/08/2006' from dual union
    select 'PHHY2009PV1007', 1005, 'US',       '10/05/2001' from dual union
    select 'PHHY2009PV1007', 1006, 'USA',      '10/06/2002' from dual union
    select 'PHHY2009PV1002', 1007, 'Ireland',  '10/11/2003' from dual union
    select 'PHHY2009PV1002', 1009, 'Ireland',  '10/07/2003' from dual union
    select 'PHHY2009PV1009', 1011, 'Pakistan', '15/11/2004' from dual union
    select 'PHHY2009PV1009', 1012, 'Pakistan', '15/12/2004' from dual union
    select 'PHHY2009PV1009', 1013, 'Pakistan', '15/09/2004' from dual
    ),
    users as
    (
    select 1001 as user_id, 'John Abraham' as username from dual union
    select 1002, 'Andréanne' from dual union
    select 1003, 'Jawahar' from dual union
    select 1004, 'Kerry' from dual union
    select 1005, 'Kofi' from dual union
    select 1006, 'Kofi1' from dual union
    select 1007, 'Kofi2' from dual union
    select 1008, 'Kofi3' from dual
    )
    ----
    SELECT x.case_num, us.username
    FROM (
        SELECT case_num, user_id FROM (
            SELECT case_num, user_id,
                   RANK() OVER (PARTITION BY case_num ORDER BY TO_DATE(r_date,'dd/mm/yyyy') DESC) rnk
            FROM master
        ) ms WHERE rnk = 1
    ) x LEFT JOIN users us
    ON (x.user_id = us.user_id);

    Output:

    CASE_NUM        USERNAME
    --------------  --------
    PHHY2009PV1001  Kerry
    PHHY2009PV1007  Kofi1
    PHHY2009PV1002  Kofi2
    PHHY2009PV1009

    -Nordine

  • Load data from a table into an index-by table

    Hello

    We need to load table data into an index-by table. The code below works fine.

    declare
      query varchar2(200);
      type l_emp is table of emp%rowtype index by binary_integer;
      rec_1 l_emp;
    begin
      query := 'SELECT * FROM emp';
      execute immediate query bulk collect into rec_1;

      forall i in rec_1.first .. rec_1.last
        insert into emp_b values rec_1(i);
    end;
    /
    

    But the source table and the target table are dynamic.

    Ex:

    In the code above, the table emp (source) and the target emp_b are static.

    But in our scenario the target depends on the source table and would change as below:

    If the source is emp, the target is emp_b.

    If the source is emp1, the target is emp_b1...

    create or replace procedure p(source in varchar2, target in varchar2)
    as
    query varchar2(200);
    source varchar2(200);
    Type l_emp is TABLE OF emp%rowtype INDEX BY Binary_Integer;
    rec_1 l_emp;
    
    begin
    query :=' SELECT * FROM ' || source;
     EXECUTE IMMEDIATE query BULK COLLECT INTO rec_1 ;
    
    For ALL i   in rec_1 .First .. rec_1 .Last
     execute immediate 'INSERT INTO ' || target || ' values ' ||rec_1(i);
    end;
    /
    
    
    

    It throws an error. How do I implement this scenario... please help with this.

    Is there any particular reason to use BULK COLLECT & FORALL here? Why not a plain

    INSERT INTO target
    SELECT * FROM source;
    

    However, if that's what you need, you need a dynamic PL/SQL block, which comes with additional side effects (SQL injection). Dynamic SQL does not come for free here.
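    As a rough sketch of that plain-INSERT approach with dynamic table names (the procedure name and the DBMS_ASSERT guard are my additions, not from this thread):

    CREATE OR REPLACE PROCEDURE copy_table (p_source IN VARCHAR2, p_target IN VARCHAR2)
    AS
    BEGIN
      -- DBMS_ASSERT.SIMPLE_SQL_NAME rejects strings that are not valid SQL identifiers,
      -- which limits the SQL-injection risk mentioned above.
      EXECUTE IMMEDIATE 'INSERT INTO ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_target) ||
                        ' SELECT * FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_source);
      COMMIT;
    END copy_table;
    /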

    Kind regards

  • Extracting data from a table using a date condition

    Hello

    I have a table structure and data as below.

    create table production
    (
    IPC            VARCHAR2(200),
    PRODUCTIONDATE VARCHAR2(200),
    QUANTITY       VARCHAR2(2000),
    PRODUCTIONCODE VARCHAR2(2000),
    MOULDQUANTITY  VARCHAR2(2000));

    Insert into production values ('1111', '20121119', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121122', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121126', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121127', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121128', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121201', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121203', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20121203', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20130103', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20130104', '1023', 'AAB77', '0002');
    Insert into production values ('1111', '20130105', '1023', 'AAB77', '0002');


    Now I want to extract the data with the condition:

    PRODUCTIONDATE >= Monday of the current week

    so I would skip only the first two rows and get all the remaining rows.

    I tried the condition below, but it does not return the data for the 2013 values.

    TO_NUMBER(TO_CHAR(TO_DATE(PRODUCTIONDATE, 'yyyymmdd'), 'IW')) >= TO_NUMBER(TO_CHAR(SYSDATE, 'IW'))

    Any help would be appreciated.

    Thank you
    Mahesh

    Hello

    HM wrote:
    By the way: it is generally a good idea to store date values in DATE columns.

    One of the many reasons why storing date information in VARCHAR2 columns (especially VARCHAR2(200)) is a bad idea is that invalid data can get in there, causing errors. Avoid converting columns like that to DATEs, if possible:

    SELECT     *
    FROM     production
    WHERE     productiondate     >= TO_CHAR ( TRUNC (SYSDATE, 'IW')
                              , 'YYYYMMDD'
                           )
    ;
    
