apex_item and APEX global arrays

Hello
I use apex_item.checkbox and apex_item.text in a report SQL query like this:
SELECT
id,
apex_item.checkbox(1,id) " ", 
apex_item.text(2,name) "name"
FROM APEX_APPLICATION_FILES
WHERE REGEXP_LIKE(name,'txt');
and an After Submit process like this:
DECLARE
BEGIN

--Loop through the selected ids
FOR i IN 1..apex_application.g_f01.COUNT
LOOP
  IF apex_application.g_f01(i) IS NOT NULL
  THEN
     INSERT INTO INC9_TEST (t2) VALUES (apex_application.g_f02(i));
     wwv_flow.debug('MY PROCESS:' || APEX_APPLICATION.G_F02(i));
  END IF;
END LOOP;
END;
I have two rows of sample data:
Id  name
1   abc
2   def
When I select the checkbox for Id 2, the global array apex_application.g_f01 keeps returning Id 1 instead of Id 2. But if I select both checkboxes, then it correctly comes through with Ids 1 and 2. Does anyone know why this is happening and what the fix is for this strange behavior?

Thank you

OK - I explained that on the thread I linked to. You need to set the checkbox values to the row numbers. This can be done by using something like:

APEX_ITEM.CHECKBOX (1, '#ROWNUM#')

Now, when the user ticks the boxes, the submitted values will be the row numbers.

You can then use this to retrieve the value of NAME for the same row:

DECLARE
 vROW NUMBER;
 vNAME VARCHAR2(100);
BEGIN
 FOR i IN 1..APEX_APPLICATION.G_F01.COUNT
 LOOP
  vROW := APEX_APPLICATION.G_F01(i);
  vNAME := APEX_APPLICATION.G_F02(vROW);
  -- ... etc ...
 END LOOP;
END;

So, first you get the row number for each checked box, and then you use it to look up the NAME value for the corresponding row.
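
For the original example, the two pieces put together might look like this (a sketch only, reusing the report query and the INC9_TEST/T2 insert from the question):

SELECT id,
       apex_item.checkbox(1, '#ROWNUM#') " ",
       apex_item.text(2, name) "name"
FROM APEX_APPLICATION_FILES
WHERE REGEXP_LIKE(name, 'txt');

BEGIN
 -- G_F01 holds only the checked rows (their row numbers);
 -- G_F02 holds the NAME item for every row, indexed by row number.
 FOR i IN 1..APEX_APPLICATION.G_F01.COUNT
 LOOP
  INSERT INTO INC9_TEST (t2)
  VALUES (APEX_APPLICATION.G_F02(TO_NUMBER(APEX_APPLICATION.G_F01(i))));
 END LOOP;
END;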

Andy

Tags: Database

Similar Questions

  • How do I route out of the VRF to the global routing table

    How do I build static routes (two-way) between the VRF and the global routing table?

    Cat 6509

    12.2 (33)

    Single VRF, Full BGP. EIGRP inside the VRF.

    I do not have a 6509,

    but on IOS you attach the 'global' keyword to the static route inside the VRF, and on the incoming interfaces I created a policy map to send traffic into the VRF.

  • ISE Plus and APEX licenses on version 1.2?

    Hello

    A customer has ISE 1.2 and the license will expire next week. They are not ready to upgrade from v1.2 to v2.1 before the licence expires. The question is, can the new Plus and APEX licenses be applied to ISE v1.2?

    Hello

    You cannot install APEX licenses on ISE 1.2. They can only be installed on 1.3 and later. For the rest, the existing license needs to be renewed.

    For reference:

    http://www.Cisco.com/c/en/us/TD/docs/security/ISE/1-2/user_guide/ise_use...

    http://www.Cisco.com/c/en/us/TD/docs/security/ISE/2-0/admin_guide/b_ise _...

    Regards

    Gagan

    PS: Rate if this helps!

  • Hide and show an advanced table programmatically based on a condition

    Hi all

    I have two tables on my page. I have a requirement to hide and show an advanced table programmatically based on a condition.

    I used the following statements to hide and show the advanced table, but the table still appears on my page.

    if (TblBn != null)
    {
        TblBn.setRendered(true);
    }

    if (TbBn != null)
    {
        TbBn.setRendered(false);
    }

    Does anyone have any idea on this?

    Yes, you could do this using SPEL.

    Create a dummy PVO with 2 attributes, say "RenderTable1" and "RenderTable2", of type Boolean.

    Bind the corresponding attributes to the Rendered property of each advanced table.

    Write the methods below in the AM.

    public void createPVORow()
    {
        OAViewObjectImpl dummyPVO = (OAViewObjectImpl)getXXDummyPVO1();
        if (dummyPVO != null)
        {
            if (dummyPVO.getRowCount() == 0)
            {
                Row dummyPVORow = dummyPVO.createRow();
                if (dummyPVORow != null)
                {
                    dummyPVORow.setNewRowState(Row.STATUS_INITIALIZED);

                    // Insert the initialized row into the PVO.
                    dummyPVO.insertRow(dummyPVORow);
                }
            }
        }
    }
    
    public void setDummyPVOValues(String[] name, String[] val)
    {
        OAViewObject dummyPVO = (OAViewObject)getXXDummyPVO1();
        if (dummyPVO != null)
        {
            dummyPVO.setRangeSize(-1);
            Row row = dummyPVO.getRowAtRangeIndex(0);
            if (row != null)
            {
                for (int i = 0; i < name.length; i++)
                {
                    if ("Y".equals(val[i]))
                    {
                        row.setAttribute(name[i], Boolean.TRUE);
                    }
                    else
                    {
                        row.setAttribute(name[i], Boolean.FALSE);
                    }
                }
            }
        }
    }
    

    Call the two methods in PR (processRequest) with appropriate parameters.

    Call the second method with the appropriate parameters, likely only in PFR (processFormRequest), based on your condition.

    Hope it helps.

    Cheers

    AJ

  • ORA-00054. Is it possible to interrupt the rollback and then drop the table?

    Hello.

    I am importing a large table with the IMP utility.

    After it had run for many hours and the process was nearly complete, I got ORA-30036 and a rollback was started.

    I am trying to drop/truncate the table to avoid the wait, but because of the rollback I get ORA-00054, and I have to wait several hours before I can try the import again.

    My question is: is there a way to interrupt this rollback and drop the table without leaving the database in an inconsistent state?

    I am working with Oracle 9i on the Windows platform.

    Regards

    No, there is not.

  • Regarding table import and export

    Hi team,

    My question is: what is the quickest way to export and import the data in a table? I have to import and export data fairly often.

    If there are fewer than a few hundred rows, I'll just use SQL Developer's export to an insert script.

    But my question is how to deal with importing and exporting a table when there are around 17 lakh records.

    Note: we do not have SYS access.

    The versions currently in use are 9i and 10g.

    Thank you

    Suman

    Data Pump only works on 10g and later:

    ORACLE-BASE - Oracle Data Pump (expdp and impdp) in Oracle Database 10g

    Have a look here for more pre-10g options:

    Ask Tom "export and import the best option to use in..."
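
    On the 10g side, the export can also be driven from PL/SQL through the DBMS_DATAPUMP API. A minimal sketch of a table-mode export (the job name, table name, dump file and directory object below are placeholders, not from the question):

    DECLARE
      l_job   NUMBER;
      l_state VARCHAR2(30);
    BEGIN
      -- Open a table-mode export job (Data Pump exists in 10g and later only).
      l_job := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                                  job_mode  => 'TABLE',
                                  job_name  => 'EXP_ONE_TABLE');

      -- Write the dump file to an existing directory object.
      DBMS_DATAPUMP.ADD_FILE(handle    => l_job,
                             filename  => 'one_table.dmp',
                             directory => 'DATA_PUMP_DIR');

      -- Restrict the job to the one table of interest.
      DBMS_DATAPUMP.METADATA_FILTER(handle => l_job,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''MY_TABLE'')');

      DBMS_DATAPUMP.START_JOB(l_job);
      DBMS_DATAPUMP.WAIT_FOR_JOB(l_job, l_state);
    END;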

  • Code and database tables in different schemas

    Hello
    My db version: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production

    I would like to understand the advantages and disadvantages of the following situation:
    We have only one application.
    The db design for this application has 44 tables, of which 28 are base tables related by PK and FK relationships. The other 16 are lookup and reference tables that are not related to one another.
    The team has decided to place the 28 tables in one schema A and the other 16 in another schema B within the same database. (The reason for this is... because it's done that way in other projects, so do it here too.)
    Coming now to the code (stored procedures, functions, packages, etc.): the team wants to put most of the code in schema B with the 16 reference tables (the reason again being the same).
    What are the advantages and disadvantages of this?
    Please advise.

    PS:
    I have googled and found something along these lines:
    cons:  
    o harder to manage
    o harder to upgrade
    o harder to patch
    o harder to maintain
    o causes your shared pool size to increase 1,000 times (shared sql goes down the tubes)
    o takes more space
    o queries against the dictionary will be impacted
    o latching on the shared pool goes WAY up (latching = locks = serialization device = 
    slows you down)
    
    pros:
    o none that I can think of.

    In my experience, if you see natural divisions that give you smaller, well-contained schemas, you should take advantage of them and start with separate schemas. If done correctly, it can greatly improve the maintainability of your application.

    Assume that your initial project will have 60 tables and that they fall into three groups of 15, 20 and 25 tables respectively. There are also code objects such as PL/SQL packages. Your choice is:
    - Use a single schema of 60 tables, say with 100 code objects.
    - Use 3 schemas divided as described. The total number of code objects will probably be a little higher than in the single-schema design, because they need to provide interfaces to each other. So let's assume the schemas will have 30, 40 and 50 code objects respectively.

    Now, wait 3 years.

    The single-schema story:
    - The single schema has grown to 120 tables. There has been a corresponding increase in code, now more than 200 objects, and they are very complex. Over time, each piece of code interacts with more and more tables and with many other code objects.
    - You (the architect or manager) cannot really enforce any internal structure, since no grant is required to add dependencies. Therefore, if you change a table there is a wide impact.
    - You cannot easily divide responsibilities, because everything affects everything else. You cannot manage simultaneous development efforts, because the widespread impact of changes causes the same tables/components to be affected by multiple projects. You end up with project/execution dependencies as well as technical/code interdependencies.
    - You have a "big ball of mud" and it is very difficult to get out of this situation.

    The multi-schema story:
    - Each of the 3 original schemas has grown, but some more than others. A fourth was added a year ago, when there was a major expansion into a new functional area, following the "natural divisions".
    - Inside each of these schemas it is still a challenge to impose internal structure, but the schemas are smaller and therefore the problems are smaller.
    - Across schemas, grants are necessary to allow interaction; it is not a "wild west" where everyone can access everything (and introduce dependencies on everything). Teams define interfaces (packages or views that are specifically intended to be interface points) where interactions with other schemas are necessary (a rough sketch of such an interface appears below).
    - If changes to a table are needed, the impact can be limited to the schema that naturally contains it. You know that no one else has created dependencies on that table, because no grants were ever issued on it.
    - Responsibility for these schemas was divided between 2 teams, each team having full responsibility for its 2 schemas.
    - Teams can complete projects much more easily, even with parallel development, because there is less of a "domino effect" with each change.

    Now, these benefits depend on "natural divisions that give you smaller, well-contained schemas". If you make the wrong choice here, you will still have a mess. Design with the end scenario in mind.
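
    As a rough sketch of such an interface point between two schemas (the schema, package and view names here are placeholders, not from the thread):

    -- In schema B: expose only the intended interface objects to schema A.
    GRANT EXECUTE ON b.ref_lookup_pkg TO a;
    GRANT SELECT  ON b.ref_codes_v    TO a;

    -- In schema A: optional synonyms so code does not hard-code the owner.
    CREATE SYNONYM ref_lookup_pkg FOR b.ref_lookup_pkg;
    CREATE SYNONYM ref_codes_v    FOR b.ref_codes_v;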

  • Pre-built virtual machine with OBIEE and APEX

    Hi all

    I have searched for a virtual machine with OBIEE and APEX pre-installed, but I can't find one... or any posts on this subject. Does anyone know if there is a pre-built developer VM with these two tools?
    I currently have the "Database App Development VM" with APEX in it, but I can't figure out how to install OBIEE inside it.

    Any help will be appreciated.

    Please, if this is not the right category where I should ask the question, let me know and I'll move it.

    Thank you.
    Elena.

    Pl see the deployment guide in the link above - the virtual machine template includes database version 11.2.0.3, which should have APEX installed in it by default. If that is not the case, it will be much easier to install APEX in the OBIEE VM.

    For sharing files between the host and the guest, pl see the VirtualBox documentation - http://www.virtualbox.org/manual/ch04.html#sharedfolders

    HTH
    Srini

  • Question about the creation and population of the I$ table under different conditions

    Hello
    I have a question about the creation and population of the I$ table under different conditions. Under which conditions is the I$ table created? The conditions are given below:

    (1) *source and staging area* are on the same server (i.e. the target is on another server)
    (2) *staging area and target* are on the same server (i.e. the source is on another server)
    (3) *source, staging area and target* are on *3 different* servers
    (4) source, staging area and target are on the same server
    Thank you

    I'm not quite clear on your question, but I will try my best to clear it up.

    In all of the scenarios above, the I$ table will be created.
    If staging is the same as the target (one database, one user), then all temporary tables are created under that user.
    If staging is different from the target (one database, two users (A, B)), then all temporary tables will be created under user A (say) and the data will be inserted into the target table that belongs to user B.

    If staging is different from the target (two databases, two users (A1, A2), an architecture that is not recommended), then all temporary tables will be created under user A1 (in database A1) and the data will be inserted into the target table that belongs to user A2 (in database A2).

    If the source, staging area and target are all in one database, then no LKM is required; an IKM is sufficient to load the data into the target. For this case in particular, you can see an example given by Craig:
    http://S3.amazonaws.com/ora/ODI-Simple_SELECT_and_INSERT-interface.swf

    Thank you.

  • ODI CKM and LKM temporary tables

    Hello

    I am new to Oracle Data Integrator part.
    I have found that the I$ and C$ temporary tables are created in the target database schema.
    I just wanted to know: is there any significance to these tables being created in the target schema?
    Can these tables be created in a schema other than the target schema?
    If so, how can they be created?

    I really need this information ASAP.

    Thanks for the clarification.

    Thank you and best regards,
    Mahesh

    Hello Manu,

    Good to know you understood it well.
    If you use another schema, then the temporary tables will not be created in your target schema. Thus, it will stay clean of temporary tables.

    You shouldn't use another database, because when you deal with millions of records you will face a huge problem with load times and resource consumption.
    I was suffering with that scenario, but now I use the other approach, as I told you before.

    Thank you

  • difference between external tables and the UTL_FILE utility

    Can someone clearly explain to me what the difference is between an external table and the UTL_FILE utility? And in which cases should UTL_FILE be used versus an external table?



    Thanks in advance.

    Hello
    To get a clear idea of external tables, check the link below:

    http://www.orafaq.com/node/848
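
    As a rough illustration of the difference (the file, directory object, table and column names below are made up for the example): an external table exposes a flat file so it can be queried with SQL, while UTL_FILE gives procedural, line-by-line file access from PL/SQL.

    -- External table: the file is presented as a read-only table you can query.
    -- DATA_DIR is assumed to be an existing directory object.
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('emp.csv')
    );

    -- UTL_FILE: you open the file yourself and process it line by line.
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_line VARCHAR2(4000);
    BEGIN
      l_file := UTL_FILE.FOPEN('DATA_DIR', 'emp.csv', 'R');
      LOOP
        UTL_FILE.GET_LINE(l_file, l_line);   -- raises NO_DATA_FOUND at end of file
        DBMS_OUTPUT.PUT_LINE(l_line);
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        UTL_FILE.FCLOSE(l_file);
    END;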

  • Difference between the error table and the DML error table in OWB 11g

    Hello

    Could you please let me know the difference between the error table and the DML error table in OWB 11g?
    To my knowledge, DML errors (such as value too large or NOT NULL violations) are stored in DML error tables, and referential integrity errors are stored in the error table.

    Thank you
    Murali.

    Hi Murali

    Error tables hold OWB data rule violations, which are more like logical errors, rather than purely physical DML-style errors.

    Cheers
    David
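
    For context, OWB's DML error tables sit on top of the database's DML error logging feature; a minimal standalone sketch of that feature (the table names are placeholders, not from the question):

    -- Create an error log table for the target table.
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name     => 'TARGET_T',
                                   err_log_table_name => 'ERR$_TARGET_T');
    END;

    -- Rows that fail with DML errors (value too large, NOT NULL, ...) are
    -- diverted to the error log instead of failing the whole statement.
    INSERT INTO target_t (id, name)
    SELECT id, name FROM source_t
    LOG ERRORS INTO err$_target_t ('load run 1') REJECT LIMIT UNLIMITED;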

  • Partitioning and Clustering of Table

    I just inherited a table with 10 million records. The data can be grouped by a key, each key usually having around 10-200 rows.

    Data for a given key trickles in over a long period of time. There are currently ~0.7 million distinct keys in this table.

    All our access to this table is based on this key. As you can imagine, the data for a key is scattered across many blocks, so there is considerable I/O and all queries take a long time to run. Our instance can be described as a reporting instance; however, inserts and updates to this table happen 24/7 and access to the system is 24/7.

    My question is threefold:

    Should I create a single-table cluster on that table with my key as the cluster column?

    Can partitioning coexist with clusters?

    Do other strategies exist to optimize this case?

    Thanks in advance

    Daniel

    If each key corresponds to 10-200 rows, however, and there is a regular b-tree index on the key, the fact that there are millions of rows in the table should be (mostly) irrelevant - Oracle will get 10-200 ROWIDs from the index and do 10-200 single-block fetches. This should be pretty quick, so I'd be interested in why you think this distribution is causing significant delays in the reporting instance.

    Are you licensed for partitioning (it's an extra-cost option on top of your Enterprise Edition license)? If you are, partitioning the table would seem reasonable. A single-table cluster or an index-organized table also seems reasonable. A cluster table, however, will slow down inserts and updates. And you cannot partition a clustered table.
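
    To make those options concrete, a rough sketch (the table and column names are invented, not from the thread) of hash-partitioning by the access key versus an index-organized table that keeps each key's rows physically together:

    -- Option 1: hash-partition by the access key (requires the Partitioning option).
    CREATE TABLE readings_part (
      key_id  NUMBER        NOT NULL,
      ts      DATE          NOT NULL,
      payload VARCHAR2(200)
    )
    PARTITION BY HASH (key_id) PARTITIONS 16;

    CREATE INDEX readings_part_ix ON readings_part (key_id) LOCAL;

    -- Option 2: index-organized table, so the rows for a key are stored together.
    CREATE TABLE readings_iot (
      key_id  NUMBER        NOT NULL,
      ts      DATE          NOT NULL,
      payload VARCHAR2(200),
      CONSTRAINT readings_iot_pk PRIMARY KEY (key_id, ts)
    )
    ORGANIZATION INDEX;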

    Justin

  • Oracle HTTP server on a different machine than the database and APEX

    Hi guys!

    I wonder if it is possible to have Oracle HTTP Server installed on a different machine than the database and APEX?

    With respect,

    PsmakR

    Hello

    Exactly. Details were recently discussed here: {thread:id=1955437}. You will find the link to the license document there.
    But OHS is not only licensed with the database. If you have an OAS running somewhere, you can use the HTTP server that comes with that instance for APEX as well.

    Another option could be the APEX Listener, which runs on (almost) any J2EE container. The officially supported ones include the embedded GlassFish and the Open Source Edition of GlassFish, both of which need no extra license.

    -Udo

  • How to copy and paste a table from PDF to Excel using Acrobat X

    How do I copy and paste a table from PDF to Excel using Acrobat X?

    I was able to do it easily using Acrobat 9 but cannot do it in Acrobat 10.

    Has the option gone, or am I missing something?

    Tomas

    I found that both the "Export Selection As" and the "Copy With Formatting" functions were able to go directly to Excel without going through Word, although that also works. The key in Excel is to simply use Ctrl+V to paste; do not right-click and try to use Paste Special or the default, which came out as "keep text only".

Maybe you are looking for

  • Libretto 50ct unbootable - screen remains blank

    Hello, I have a Libretto 50CT and it no longer starts. I hear the hard drive, but the screen remains blank. But if I push the power button in combination with F12, then it comes up in Flash mode. And if I insert the BIOS disk, then it reads it, and the laptop n

  • Change of operating system on Satellite C855-1TC

    I have a Toshiba Satellite C855-1TC and want to remove Windows 8.1 and replace it with Windows 7 Home Premium. Does anyone have advice on this change of operating system?... namely driver availability etc.

  • Adding memory and an extra hard drive

    I have an HP 595, which is a little less than a year old, and would like some advice on the following topics: (1) I currently have 6 GB of memory, using 3 of the 4 available slots. I understand that the system has a MAX of 16 GB. Can I replace one of

  • HP Core i5-5200U 2.2 GHz 4 GB 250: unable to connect to 5g on router

    Hello. Please can someone tell me how I get my laptop to recognize the 5 GHz signal from my wireless router? It can see the 2.4 GHz connection. I would have thought that a 2015 machine would see this automatically. Regards

  • My new HighDing SATA Blu-ray BD-R/RE burner drive does not play Blu-ray discs.

    I have an HP Pavilion dv7-1245dx Entertainment Notebook PC running Win 7 Home Premium (64-bit). Experienced a DVD/CD drive failure and replaced it with a new HighDing SATA Blu-ray BD-R/RE burner drive. The new drive does anything but play Blu-ray dis