Dynamic (lazy) loading of .as source

I know this is probably a pure AS3 question, but since the purpose is related to the PlayBook, I'm asking here.

Is it possible to lazy-load ActionScript 3 code?  (It would be something like dynamic linking of libraries, rather than static linking.)

In the Python world, 'import' statements are real executable statements that trigger runtime loading (and even on-the-fly compilation, if necessary) of other modules.  You can even put 'import' inside functions, so that they run well after the application has loaded, perhaps only if a rarely used function is called.

Is this also possible with AS3?  (I confess I have not yet tried an 'import' inside a function.)  If it is, does it work the same way, where the code is not loaded from the SWF file (or wherever it may live) until the import statement executes?

A use case would be to allow a very large application to start executing quickly, perhaps displaying a custom splash screen, while it continues loading modules in the background.  This was alluded to in a few videos (the first WebWorks webcast?) where they mentioned that the custom splash-screen image referenced in your blackberry-tablet.xml file "could even be integrated into the application."  That makes no sense to me unless some form of lazy loading is available.

You can use Modules for this, to dynamically load views, code, services, whatever.  Unless something has changed, imports are resolved during the compilation step, not at execution time (however, I have been wrong before).

Tags: BlackBerry Developers

Similar Questions

  • Question about lazy loading the cache

    Hello

    First off, please forgive my ignorance; this is my first time working with a database that is put in a cache.

    I'm trying to set up SQLFire so that it acts as a front for a traditional back-end RDBMS (i.e., data applied to the RDBMS will appear in the SQLFire cache).

    Chapter 22, 'SQLFire Cache Strategies', of the vFabric SQLFire User's Guide mentions that this can be done through lazy loading of data into the cache. I don't completely understand how to set this up, and I don't see any further details in the User's Guide or anywhere else.

    Would anyone be able to point me to a more detailed explanation of lazy loading the cache (or explain it further)?

    Thank you.

    Hi, there have been some problems with the published documentation being out of sync with the GA version. Also, based on your question: I came across this yesterday and hit a few challenges myself, so I wrote a blog post that shows how to use SQLFire for end-to-end caching. I hope it will be useful to you.

  • Declarative modals in APEX 5 and dialog-closed actions on the source page

    Hi, I have a question about the interaction between the source page and the new declarative modals in APEX 5.

    Let's say I have a report region on my source page that displays a set of data, and I have a modal dialog page with a form to add data to the underlying table.  When I create a new entry through the modal, I want to refresh the report region on the source page and trigger an alert indicating that a new row has been added, after the modal has closed.

    What is the right way to manage communication between the pages?  Currently the source page must be refreshed to show the newly added rows, but I would prefer to handle this with a partial page refresh.

    It looks like I can create a dynamic action fired on the source page by a dialog-closed event, but to handle each specific situation, it seems that when I launch the dialog I will also need to track the particular action I intended to perform in the modal (say, an update vs. a create, etc.).  Then, when the dialog closes, I will need to look up some sort of field I set on the source page to determine what the dialog was launched for, and then take the appropriate measures through the dynamic action.

    Does the community think this is the best way to go, or is there an easier way to link an action in the modal dialog directly back to the source page?

    Thank you

    OK, I think I understand how to do this.  I don't necessarily understand the mechanics of it, but I can now predict the outcome.

    Let's say I create a button on page 1 that opens a modal dialog on page 2.  I also have a report region with update buttons that also point to the modal form on page 2.

    I can set up two dialog-closed dynamic actions on page 1.

    For the create confirmation, I can choose a "Dialog Closed" event with a selection type of "Button", choose my create button, and then as the true action add my create-success alert.

    For the update confirmation, I can choose a selection type of "Region", choose the report region that contains my update buttons, and add a true action triggering my update alert.

    The selection type, together with the region/button/item chosen, is the differentiator that let me adapt the behavior accordingly.

    The modal dialog is actually an iframe pointing to the dialog page, dynamically added to the body of the source page when it is called.  The URL has a long 'cs=' string appended to it.  The "source" button also includes a similar 'cs=' query string.  I assume these values are what bind them together and allow the proper communication between the source page and the iframe, enabling the dynamic actions to synchronize properly.

  • Dynamically loading an XML file with missing elements through ODI

    Hi guys,

    I have an XML file with the two nodes Employee and Address, shown below. On a daily basis, the Address element sometimes does not come in the source XML file, but my interface has columns mapped to the Address elements, and that is why it may fail: either the source element is not found in the file, or the data cannot be loaded because of the 'and' condition in the SQL query that is generated between the Employee and Address elements.  Is there a way to load the data dynamically, where I look in the file only for the elements that are present and dynamically load data only for those elements?

    XML file:

    <?xml version="1.0" encoding="UTF-8"?>
    <EMP>
      <Empsch>
        <Employee>
          <EmployeeID>12345</EmployeeID>
          <Initials>t</Initials>
          <LastName>John</LastName>
          <FirstName>DOE</FirstName>
        </Employee>
        <Address>
          <WorkPhone>12345</WorkPhone>
          <WorkAddress>test 234</WorkAddress>
        </Address>
      </Empsch>
    </EMP>

    Thank you

    Fabien Tambisetty

    I managed to solve it by using left outer joins and by referring to the table structure from the XSD.

  • Dynamic action report refresh triggers item source query on null

    I have a page that reproduces the issue on apex.oracle.com. It has one shuttle item and a report that is refreshed when the shuttle's values change.

    When the shuttle's value is not null on entering the page, and you then move elements left and right, the report refreshes properly until there are no items left in the right pane. At that point, it seems to revert to the original value from page load.

    The session state popup shows the empty value, the Net tab in Firebug shows that the XMLHttpRequest sets the value to empty, and the debug output shows the value is set to blank as well.

    What I noticed, looking at all the different scenarios in the debug output, is that once the value is null, the item's source query is executed, resetting the state to that of page load. The item's source setting is "Always" rather than "Only when null". In any case, I didn't expect the item source to be used during a dynamic-action refresh of the report.

    This is not expected behavior, given that the query only runs when I set the value to null. It is really only supposed to run during page load, not during a dynamic-action refresh of the region.

    My workaround is to remove the item's source query and move it to page-load processing.

    You can check it out for yourself:

    Workspace: ferguson
    User: Tester/Testerperson
    App ID = 13546

    Run it and then click the Appointments tab. From there, just click the edit icon in one of the report rows. It has a region named Vendors and under it a Notes region. Notes refreshes when you move items around in the Vendors shuttle item.

    I have a copy of the application running on 4.1 and it happens there too.

    Greg

    I just changed it a little bit, and if I understand you correctly, it will now keep the selected values if you refresh the page.

    What I have is:

    1. Moved the item's source query to a computation, conditioned to execute only if the item is NULL.

    What I couldn't figure out is: what process puts the shuttle's on-change value into session state?

    Denes Kubicek
    -------------------------------------------------------------------
    http://deneskubicek.blogspot.com/
    http://www.Apress.com/9781430235125
    http://Apex.Oracle.com/pls/OTN/f?p=31517:1
    http://www.Amazon.de/Oracle-Apex-XE-Praxis/DP/3826655494
    -------------------------------------------------------------------

  • I clicked on the page source and removed nodes with the web developer tools, and the removal became permanent. eBay does not load pictures. How to fix?

    The problem is on my desktop and my laptop. I got a little click-happy on my desktop and clicked into the Web Developer tools. I was trying to remove a really annoying Flash ad on the side of the screen. I think I went into the page source and removed the node. I really don't know what happened, but eBay stopped loading pictures and everything is a written list down the side of the page. I cleared the cache and cookies. I reset Mozilla. I uninstalled and reinstalled Mozilla. The laptop was not immediately affected, but it now has the same problem.

    It is a very strange problem.

    Changes made with the web developer tool are not saved and should be cleared when the page is reloaded.

    Try disabling graphics hardware acceleration. Since this feature was added to Firefox, it has gradually improved, but there are still some problems.

    You may need to restart Firefox for this to take effect, so save any work first (e.g., mail you are composing, online documents you are editing, etc.).

    Then perform the following steps:

    • Click the orange Firefox button at the top left, then select 'Options'; or, if there is no Firefox button at the top, go to Tools > Options.
    • In the Firefox options window, click the Advanced tab, then select 'General'.
    • In the list of settings you will find the checkbox 'Use hardware acceleration when available'. Clear this checkbox.
    • Now restart Firefox and see if the problems persist.

    In addition, please check for updates to your graphics driver by following the steps in the following knowledge base articles:

    Did this solve your problem? Please report back shortly.

  • Dynamically loading the localization file

    Hi all

    Please give me an idea about dynamic localization. At present I am using a localization file and it works fine locally, but there is a new scenario: I need to download the localization file from the server side and change the values dynamically.

    Please help me if anyone has any ideas about this.

    Thank you

    I'm not saying it's impossible, but I really don't see a way to use the built-in localization facility and still have the ability to dynamically load additional locales.

    My only suggestion is to build a similar facility yourself.

    If you are looking for something to base this on: in addition to the BlackBerry application, I was looking at the Android application, which uses XML "translations".  Using this approach, you would be able to download a new XML file to get a new translation.
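    A minimal sketch of that roll-your-own approach (shown in Python for brevity; the XML format here is hypothetical, loosely modeled on Android's strings.xml, and in a real app the document would be downloaded from the server rather than held in a literal):

```python
import xml.etree.ElementTree as ET

# Hypothetical translation file as it might be downloaded from the server
TRANSLATION_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<resources>
    <string name="greeting">Bonjour</string>
    <string name="farewell">Au revoir</string>
</resources>"""

def load_translations(xml_bytes):
    """Parse a key/value translation document into a dict."""
    root = ET.fromstring(xml_bytes)
    return {node.get("name"): node.text for node in root.findall("string")}

translations = load_translations(TRANSLATION_XML)
print(translations["greeting"])  # -> Bonjour
```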

  • FDMEE 11.1.2.4 - how to load the same source record to 2 different custom members in HFM

    Hi Experts,

    We have the following situation:

    We need to load a single source record to both a Custom2 ending-balance member and a Custom2 rollforward movement member (Additions, for example), for the SAME account.

    In our GL, we have separate accounts for additions, disposals, depreciation, etc.   In HFM, we use custom members for these.

    I tried multi-dimension mapping with individual Custom2 member maps based on the account.   It works fine, but when I add another condition using the same source account mapped to the ending-balance member, it only ends up populating one of the 2 custom members (whichever mapping was processed last during import).

    What would be the best way to load a single source record to 2 custom members (same account)?

    I wanted to use logic accounts, since I know that would work, but our finance team will not accept that approach.

    Is it even possible to load the same record to 2 different HFM members?  Even if we loaded the same file a second time to another location, I think it would still replace what was loaded the first time, instead of loading again to another custom member on the same account, even if we loaded to HFM with 'Merge'.

    Any ideas at all would be appreciated.

    Thank you
    Mark Smith

    Why will your finance team not accept the logic-accounts approach? It is the obvious solution to your problem. In FDM or FDMEE, you can only load data to a single target intersection. If you want to load the same piece of data to several target intersections, then you need that piece of data repeated n times in the source, and you will need something in the source that identifies each copy uniquely so you can map it correctly. As noted, this is one of the main functions of logic accounts; other approaches are likely to be messy, more error-prone, and much harder to audit.

  • Generic procedure to load data from a source table to a target table

    Hi all

    I want to create a generic procedure to load data from X number of source tables into X number of target tables.

    such as:

    Source1-> Target1

    Source2-> Target2

    Source3 -> Target3

    Each target table has the same structure as the source table.

    The indexes are the same as well. Constraints are not defined on the source or target tables. There is no business logic involved in loading the data.

    The load would simply append.

    This procedure will be scheduled during off hours, probably only once a month.

    I created a procedure that does this, along these lines:

    (1) Take the source and target table names as inputs to the procedure.
    (2) Find the indexes on the target table.
    (3) Get the DDL metadata of the target table's indexes and save it.
    (4) Drop the above indexes.
    (5) Load the data from the source into the target (append).
    (6) Re-create the indexes on the target table using the saved metadata.
    (7) Delete the records from the source table.

    Sample procedure (error logging is missing):

    CREATE OR REPLACE PROCEDURE pp_load_source_target (p_source_table IN VARCHAR2,
                                                       p_target_table IN VARCHAR2)
    IS
       TYPE v_varchar_tbl IS TABLE OF VARCHAR2 (32);
       l_varchar_tbl    v_varchar_tbl;

       TYPE v_clob_tbl_ind IS TABLE OF VARCHAR2 (32767)
          INDEX BY PLS_INTEGER;
       l_clob_tbl_ind   v_clob_tbl_ind;

       g_owner  CONSTANT VARCHAR2 (10) := 'STG';
       g_object CONSTANT VARCHAR2 (6) := 'INDEX';
    BEGIN
       -- (2) Find the indexes on the target table
       SELECT DISTINCT index_name
         BULK COLLECT INTO l_varchar_tbl
         FROM all_indexes
        WHERE table_name = p_target_table
          AND owner = g_owner;

       -- (3) Save the DDL of each index
       FOR k IN l_varchar_tbl.FIRST .. l_varchar_tbl.LAST
       LOOP
          SELECT DBMS_METADATA.GET_DDL (g_object, l_varchar_tbl (k), g_owner)
            INTO l_clob_tbl_ind (k)
            FROM DUAL;
       END LOOP;

       -- (4) Drop the indexes
       FOR i IN l_varchar_tbl.FIRST .. l_varchar_tbl.LAST
       LOOP
          EXECUTE IMMEDIATE 'DROP INDEX ' || l_varchar_tbl (i);
          DBMS_OUTPUT.PUT_LINE ('INDEX DROPPED: ' || l_varchar_tbl (i));
       END LOOP;

       -- (5) Load the data from source to target (append)
       EXECUTE IMMEDIATE   'INSERT /*+ APPEND */ INTO ' || p_target_table
                        || ' SELECT * FROM ' || p_source_table;
       COMMIT;

       -- (6) Re-create the indexes from the saved DDL
       FOR s IN l_clob_tbl_ind.FIRST .. l_clob_tbl_ind.LAST
       LOOP
          EXECUTE IMMEDIATE l_clob_tbl_ind (s);
       END LOOP;

       -- (7) Clear out the source table
       EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || p_source_table;
    END pp_load_source_target;

    I want to know:

    1. Has anyone built a similar solution? If so, what kind of challenges did you face?
    2. Is this a good approach?
    3. How can I minimize data-load failures?

    Why not just:

    create table checkin as
    select 'SOURCE1' source, 'TARGET1' target, 'Y' flag from dual union all
    select 'SOURCE2', 'TARGET2', 'Y' from dual union all
    select 'SOURCE3', 'TARGET3', 'Y' from dual union all
    select 'SOURCE4', 'TARGET4', 'Y' from dual union all
    select 'SOURCE5', 'TARGET5', 'Y' from dual

    SOURCE   TARGET   FLAG
    SOURCE1  TARGET1  Y
    SOURCE2  TARGET2  Y
    SOURCE3  TARGET3  Y
    SOURCE4  TARGET4  Y
    SOURCE5  TARGET5  Y

    declare
       the_command varchar2 (1000);
    begin
       for r in (select source, target from checkin where flag = 'Y')
       loop
          the_command := 'insert /*+ append */ into ' || r.target || ' select * from ' || r.source;
          dbms_output.put_line (the_command);
          -- execute immediate the_command;
          the_command := 'truncate table ' || r.source || ' drop storage';
          dbms_output.put_line (the_command);
          -- execute immediate the_command;
          dbms_output.put_line (r.source || ' table processed');
       end loop;
    end;

    insert /*+ append */ into TARGET1 select * from SOURCE1
    truncate table SOURCE1 drop storage
    SOURCE1 table processed

    insert /*+ append */ into TARGET2 select * from SOURCE2
    truncate table SOURCE2 drop storage
    SOURCE2 table processed

    insert /*+ append */ into TARGET3 select * from SOURCE3
    truncate table SOURCE3 drop storage
    SOURCE3 table processed

    insert /*+ append */ into TARGET4 select * from SOURCE4
    truncate table SOURCE4 drop storage
    SOURCE4 table processed

    insert /*+ append */ into TARGET5 select * from SOURCE5
    truncate table SOURCE5 drop storage
    SOURCE5 table processed

    Regards

    Etbin

  • Getting ADE 4.5 SCR error message: file is not found, the source file cannot be read.  Tried de-authorizing & re-authorizing, including a factory reset and re-loading, but still cannot read library books on my Kobo Touch.  Using Windows 10...

    Hi, when trying to read a book from the public library on my Kobo Touch, I get the ADE 4.5 SCR error message: file is missing, could not read the source.   I tried removing it and re-downloading it from the public library's website (no luck), and I did the resets, including de-authorizing and re-authorizing with a factory reset and re-loading (no luck).  This problem started after the last update to ADE 4.5; before that, no problem.   If anyone has any ideas, it would be appreciated.   Thank you.

    Try whether it works when you uninstall 4.5 and do a fresh install of 3.0: see the Adobe Digital Editions software downloads page.

  • Getting information about the clip loaded in the Source monitor using ExtendScript

    Hello

    So, the short version: I was wondering if there is a way to get information about the clip currently loaded in the Source monitor using ExtendScript. Specifically, I am interested in the name, starting timecode, and in/out points. I don't see anything obvious in the ExtendScript Toolkit data browser, but I thought I'd ask.

    The long version of what I'm trying to do: I have a clip in the Source monitor panel that I know is in the active sequence. I would like to take the in/out points from the source clip and set them as in/out points in the sequence. It's kind of a hack to get around the fact that you can't associate speech analysis with a multicam clip, so I cut with one of the audio clips inside the multicam clip, and then I translate those points to the multicam editing sequence. I have a way to do it with an AppleScript macro, but I would like to find a more robust (and cross-platform) solution.

    Thank you!

    There is no way to get from an item in the Source monitor back to the actual projectItem in Premiere Pro CC 2015.

    However, you are not the first to ask for such a feature, and (ahem) I can neither confirm nor deny that it may be added in the near future.

  • OutOfMemoryError: GC overhead limit exceeded when loading directly from the source using IKM SQL to SQL. Increasing ODI_MAX_HEAP does not solve the problem.

    OutOfMemoryError: GC overhead limit exceeded when running an interface that loads directly SQL to SQL with no staging table.

    I get the error "Exception OutOfMemoryError: GC overhead limit exceeded" when executing an interface doing a direct load using IKM SQL to SQL Append, with a 150-million-row source table.

    I have increased ODI_MAX_HEAP, and the interface ran longer but still failed. I'm already at ODI_MAX_HEAP=12560m; I tested with ODI_MAX_HEAP=52560m and still got the error.

    I am monitoring the server's memory, and there is still memory available...

    Apart from the memory problem, I know this type of load should be possible, because the load-data step of LKM SQL to Oracle is able to load the C$ work table. Ideally, I want to emulate that behavior using IKM SQL to SQL.

    1 - What is the right path to follow here? (Change the memory parameters, or modify the IKM?)


    2 - Any ideas on how to solve the "OutOfMemoryError: GC overhead limit exceeded" error? (GC means Garbage Collector.)

    Running the interface's IKM in the simulator generates this code:

    Load (source) command:

    select
        source_tbl.col1   COL1,
        source_tbl.col2   COL2,
        source_tbl."COL3" COL3
    from public.source_tbl AS source_tbl
    where
        (1 = 1)

    Default command (destination):

    insert into source_tbl
    (
        col1,
        col2,
        col3
    )
    values
    (
        :COL1,
        :COL2,
        :COL3
    )

    My experience with ODI is very limited, so I don't know about changing the KMs' code.

    Thanks in advance.

    I found a workaround for the GC overhead limit exceeded error:

    - In my case I was running without the IDE, so changes made to odiparams.sh had no effect.

    - This means I needed to change the JVM settings in:

    $ODI_HOME/oracledi/client/odi/bin/odi.conf

    AddVMOption -XX:MaxPermSize=NNNNM

    $ODI_HOME/oracledi/client/ide/bin/ide.conf

    AddVMOption -XmxNNNNM
    AddVMOption -XmsNNNNM

    where NNNN is a higher value.

  • How to switch the connection pool dynamically while a load is happening

    Hello

    I have two databases that contain the same data, i.e. Prod_db and Prod_db1.

    I want to switch the connection pool dynamically during load times.

    For example, while the load is happening I want to hit prod_db1, and after the full load I want to hit prod_db. How do I get there?

    Use the status column of the W_ETL_RUN_S table to validate the load time.

    PS: mark as correct or helpful.

  • SQL Loader sometimes fails to load the same source data file

    I am encountering various data-loading errors when trying to load data. What makes it really annoying is that I'm not able to identify the real culprit, since the success of the load depends on the number of lines in the source data file, not on its content. I use Toad's dataset export feature to create a delimited data set. When I take only the first 50 lines, the data loads into the target table successfully. When I take the first 150 lines, the load into the target table fails, indicating:
    Record 13: Rejected - Error on table ISIKUD, column SYNLAPSI.
    ORA-01722: invalid number
    I can see in the .bad file that the same line was loaded successfully when the 50-row data file was used. The content of that column for this particular row is NULL (no string).
    I suspect that Toad generates a faulty delimited text file when taking 150 lines. For confidentiality reasons, I can't show the data file. What can I do? How can I investigate this problem further? The size of the table to be loaded by SQL Loader is almost 600 MB. I use Windows XP.

    Published by: totalnewby on June 30, 2012 10:17

    I do not believe the 11g sqlloader client allows you to load into a 10g database. Can you run sqlldr on the 10g database server itself? Please also post the rest of the information I asked for.

    HTH
    Srini
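    To help pinpoint which row actually triggers ORA-01722 before involving SQL*Loader at all, a small script can scan the exported delimited file for values in the suspect column that are neither empty (NULL) nor parseable as numbers. This is only a sketch: the delimiter and the zero-based column index are assumptions about your Toad export.

```python
import csv

def find_bad_numbers(path, column_index, delimiter=","):
    """Return (line_number, value) pairs where the given column holds a
    value that is neither empty (loaded as NULL) nor a valid number."""
    bad = []
    with open(path, newline="", encoding="utf-8") as f:
        for lineno, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            value = row[column_index].strip() if column_index < len(row) else ""
            if value == "":
                continue  # empty loads as NULL, which SQL*Loader accepts
            try:
                float(value)
            except ValueError:
                bad.append((lineno, value))
    return bad
```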

  • How do I know whether a loaded SWF file contains source code?

    Hi! I need to know whether a loaded SWF file contains ActionScript code. I am developing an application running on iOS where an admin can upload SWF files that are shown in the app, but before showing them (in the mobile Adobe AIR app) or activating them (via a PHP server script), I must know whether the SWF file contains code, so that I can refuse to show or enable it.

    Thank you!

    I've got it!

    All we need to do is:

    1. Check the first character of the file:
       F - the SWF is uncompressed
       C - compressed with Deflate
       Z - compressed with LZMA
    2. If it is compressed, uncompress it with the right method.
    3. Search for the string "MainTimeline": if it is found, the SWF contains ActionScript; if it is not found, there is no ActionScript code.
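    The steps above can be sketched as follows (in Python rather than ActionScript, and against fabricated byte strings rather than real SWFs; treat the "MainTimeline" marker as the heuristic from this post, not a guarantee):

```python
import zlib

def swf_has_actionscript(data: bytes) -> bool:
    """Heuristic from the post: decompress the SWF body if needed, then
    look for the 'MainTimeline' marker left by the AS3 compiler."""
    signature = data[:1]
    if signature == b"F":    # FWS: uncompressed; body starts after the 8-byte header
        body = data[8:]
    elif signature == b"C":  # CWS: body is Deflate-compressed from byte 8
        body = zlib.decompress(data[8:])
    elif signature == b"Z":
        # ZWS: LZMA-compressed; the raw-LZMA layout needs extra handling,
        # which is omitted in this sketch
        raise NotImplementedError("LZMA-compressed SWF not handled here")
    else:
        raise ValueError("not a SWF file")
    return b"MainTimeline" in body

# Fabricated examples (not real SWFs, just exercising the logic):
plain = b"FWS" + bytes(5) + b"...MainTimeline..."
packed = b"CWS" + bytes(5) + zlib.compress(b"no ActionScript here")
print(swf_has_actionscript(plain))   # -> True
print(swf_has_actionscript(packed))  # -> False
```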
