Sharing a data source with multiple components

I have an ArrayCollection of time intervals that I want to use as the data source for a series of ComboBoxes in several separate components.

What is the best way to implement this?

In fact, you can have global data objects. Since Flex is built around components, that approach is somewhat discouraged, but if you create a public collection property in your main application file, you can access it from any component as Application.application.yourVariable. When you type 'Application.application.' the code editor will show you what is available.

Tags: Flex

Similar Questions

  • Combine data sources with different granularity in the same fact table?

    I have two operational tables, 'Incident' (157 columns) and 'Unit' (70 columns). For every 'Incident' that happens there can be one or more records in the 'Unit' table.

    As part of my data mart design, I have merged the tables into one 'Fact Incident' (227 columns) and insert records from both tables with a join condition between them [Incident.IN_NUM = Unit.IN_NUM].

    My question is: is this correct, or am I mixing data sources with different granularity in the same fact table? I appreciate your help.

    Best regards
    Bees

    Bees,
    Are the 'Incident' measures repeated, for a given incident, across more than one record in the Unit table? If so, then sum(incident.measure) will give an incorrect result.

    Is there any need to physically merge the tables outside OBIEE? Within OBIEE you could have a logical 'fact' table to present to the report user, sourced from the separate Units and Incidents tables, which would stop the incorrect aggregations from occurring. A common piece of modeling handled the same way in OBIEE is order headers and order lines: it is quite common to have a logical fact 'Orders' that contains both the order header and the order line, which translates to your Incidents -> Units relationship.

    Doing what I mentioned is relatively simple: you need a 'Dim - Incident' with two levels, Unit and Incident; map the unique identifiers as the level keys, then use these levels to define the content levels correctly on the two logical table sources of your logical 'fact' table, i.e. the Incidents LTS at the Incident level and the Units LTS at the Unit level.

    Hope this helps, let us know if you get stuck.
    Cheers
    Alastair

  • Waveform chart indicator on Data Dashboard with multiple plots

    I have a waveform chart with multiple plots on my main VI that is running.

    I use the Data Dashboard application 2.2.1 on my iPad to monitor the waveform chart of my running application.  I placed an indicator on my iPad and linked it to my waveform chart variable.  I can't run my Data Dashboard application because it shows 'unable to connect to the server'.  I noticed that this is because of the multiple plots being plotted on my waveform chart.

    I could only run the Data Dashboard app when there is just a single plot linked to an indicator in the app.

    My question is whether it is possible to have a waveform chart indicator on a Data Dashboard that has several plots being plotted, not just a single plot, and how to configure it?

    Thank you.

    Click on Kudos and mark this as an accepted answer.  You are welcome.

  • Max date with multiple table joins

    Looking for expert advice on the use of max(date) with multiple table joins. Several people have tried (and failed) - HELP please!

    The goal is to retrieve the joined row with the most current NBRJOBS_EFFECTIVE_DATE for each unique NBRJOBS_PIDM. There are several rows per PIDM with various EFFECTIVE_DATEs. The following SQL returns only about a third of the rows, and there are also some duplicates.

    The keys are PIDM, POSN and SUFF.

    Select NBRJOBS.*,
           NBRBJOB.*
    from POSNCTL.NBRBJOB NBRBJOB
    inner join POSNCTL.NBRJOBS NBRJOBS on (NBRBJOB.NBRBJOB_PIDM = NBRJOBS.NBRJOBS_PIDM)
                                      and (NBRBJOB.NBRBJOB_POSN = NBRJOBS.NBRJOBS_POSN)
                                      and (NBRBJOB.NBRBJOB_SUFF = NBRJOBS.NBRJOBS_SUFF)
    where NBRJOBS.NBRJOBS_SUFF <> 'LS'
    and NBRBJOB.NBRBJOB_CONTRACT_TYPE = 'P'
    and NBRJOBS.NBRJOBS_EFFECTIVE_DATE =
        (select max(NBRJOBS1.NBRJOBS_EFFECTIVE_DATE) as "EffectDate"
         from POSNCTL.NBRJOBS NBRJOBS1
         where NBRJOBS1.NBRJOBS_PIDM = NBRJOBS.NBRJOBS_PIDM
         and NBRJOBS1.NBRJOBS_POSN = NBRJOBS.NBRJOBS_POSN
         and NBRJOBS1.NBRJOBS_SUFF = NBRJOBS.NBRJOBS_SUFF
         and NBRJOBS1.NBRJOBS_SUFF <> 'LS'
         and NBRJOBS1.NBRJOBS_EFFECTIVE_DATE <= to_date('2011/11/15','yyyy/mm/dd'))
    order by NBRJOBS.NBRJOBS_PIDM

    Welcome to the forum!

    It's not clear what you are trying to do.
    Do you want all of the columns from the rows where NBRJOBS_EFFECTIVE_DATE is the latest date on or before a given date (November 15, 2011 in this example), out of all the rows in the result set with that NBRJOBS_PIDM? If so, here is one way:

    with GOT_R_NUM as
    (
        select  NBRJOBS.*,
                NBRBJOB.*     -- You may have to give aliases, so that every column has a unique name
        ,       rank () over ( partition by NBRJOBS.NBRJOBS_PIDM
                               order by     NBRJOBS.NBRJOBS_EFFECTIVE_DATE desc
                             )           as R_NUM
        from        POSNCTL.NBRBJOB NBRBJOB
        inner join  POSNCTL.NBRJOBS NBRJOBS  on   (NBRBJOB.NBRBJOB_PIDM = NBRJOBS.NBRJOBS_PIDM)
                                             and  (NBRBJOB.NBRBJOB_POSN = NBRJOBS.NBRJOBS_POSN)
                                             and  (NBRBJOB.NBRBJOB_SUFF = NBRJOBS.NBRJOBS_SUFF)
        where   NBRJOBS.NBRJOBS_SUFF           != 'LS'     -- Is this what you meant?
        and     NBRBJOB.NBRBJOB_CONTRACT_TYPE   = 'P'
        and     NBRJOBS.NBRJOBS_EFFECTIVE_DATE <= to_date ('2011/11/15', 'yyyy/mm/dd')
    )
    select    *     -- Or list all columns except R_NUM
    from      GOT_R_NUM
    where     R_NUM = 1
    order by  NBRJOBS_PIDM
    ;
    

    Normally this site does not display the <> inequality operator; it thinks it's some kind of tag.
    Whenever you post on this site, use the other (equivalent) inequality operator, !=.

    I hope that answers your question.
    If not, post a small example of data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
    Explain, using specific examples, how you get those results from that data.
    Always say which version of Oracle you are using.
    You will get better answers faster if you always include this information whenever you have a problem.

  • Advice on setting up multiple data sources with the Health app without double counting

    Does anyone have any advice on setting up several data sources with the Health app... and on how to make sure you are not double counting data?

    I am on an iPhone running iOS 9.3.1 (my first iPhone ever). I use Map My Walk for specific walks, a Garmin vívosmart HR for general steps, heart rate and sleep, and a scale as well. I think the iPhone itself also records some general measurements.

    Thank you

    Anthony

    The Health app manages all of this automatically.

  • Error 14031 with a denormalised physical data source and a time dimension, with no real keys (simple POC model)

    Hello

    OBIEE 11.1.1.6

    Modeling a very simple physical model:

    1 large denormalised table => 1 time dimension

    My denormalised table lacks real keys, so I am modeling flat (total/detail) dimension hierarchies using text-based descriptors (this is pending the final version of the data warehouse), but I get a persistent error that moves from dimension to dimension... i.e. if I recreate just one dimension the error no longer appears on that dimension but moves on to the next, and by now it has gone full circle...

    The error is: 14031 'The content filter of a source for logical table X references multiple dimensions', together with error 15001 'Could not load navigation space for subject area Y'.

    As I said, if I delete and recreate logical table 'X' above, the error simply moves to another dimension table.

    I saw a 'hit' saying that the problem is with the joins on the business model. Can someone advise how, in my scenario, the joins should be set up on the logical tables, given that (aside from the time dim) everything comes from the same physical source? (The only real physical join is from my large denormalised table to the time dimension table.)

    Thanks for your comments,

    Robert.

    Hello

    I found my own fix. For reference, in case anyone else hits this problem: the detail for the different dimensions all came from the same source, so where I had set the source levels for the various logical table sources, in the fact and in the corresponding dims, at different levels, that was causing the error. Clearing the error was a case of undoing the levels for everything apart from my separate time table.

    The error message is less than helpful...

  • How to create a datastore with multiple LUNs

    Could someone please tell me how I can have multiple LUNs in a single datastore? Is it possible?

    See the "Storage management" section in the document provided in the previous comment. You are probably looking for information on page 104.

  • Data source with 2 different data fields

    Hi all

    I have a data file that has 2 different data fields belonging to 2 different members of the Account dimension.
    Can I load it using a rules file?
    My rules file does not validate with 2 data fields.
    Please guide me.

    Thank you

    To clarify John's post a bit: you would not define the columns as 'Data', but rather make the column header the actual Account member that you want to load into. It is a case where the header is not the dimension name, but rather members of that dimension. (Note that I use the term 'column', but strictly speaking it is a 'field'.)

  • Aggregating data from multiple sources with the ESB

    Hello

    I want to aggregate data from multiple data sources with an ESB service and then call a BPEL process with the combined data:
    1. Read data from data source A (DB adapter select call)
    2. Read data from data source B (DB adapter select call)
    3. Assemble the data in an XSL transformation
    4. Call BPEL

    Is this possible? How can I get the data from the first call and the second call into the transformation? When I receive the data from the second call, the data from the first call seems to be lost.

    Any ideas?
    Gregor

    Gregor,

    It seems that this kind of data aggregation is not possible in the ESB. It can be done in BPEL, but only using assigns, not using transformations. I tried to use transformations by passing a third argument to the ora:processXSLT function, but could not get the desired result.

    For more information on passing a second variable (of another schema) as a parameter to XSLT, please refer to the post

    http://blogs.Oracle.com/rammenon/2007/05/

    and the bug fix 'Passing BPEL Variable content into XSLT as parameters'.

    Hope this helps you.

    Thank you, Vincent.
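
    (For reference, the same general idea, passing a second XML document into a stylesheet as a parameter instead of as a second input, looks roughly like the sketch below in plain JAXP. This is only an illustration outside the Oracle SOA toolchain, not the ora:processXSLT call discussed above; the file names and the secondDoc parameter are invented for the example.)

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    import org.w3c.dom.Document;

    public class TwoSourceTransform {
        public static void main(String[] args) throws Exception {
            // Result of the first select (data source A) is the main transform input.
            StreamSource mainInput = new StreamSource("sourceA.xml");

            // Result of the second select (data source B) is parsed separately and
            // handed to the stylesheet as a parameter value.
            Document secondDoc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse("sourceB.xml");

            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("merge.xsl"));

            // merge.xsl declares <xsl:param name="secondDoc"/> and can then read
            // from both documents while building the combined output.
            t.setParameter("secondDoc", secondDoc);

            t.transform(mainInput, new StreamResult("merged.xml"));
        }
    }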

  • "BOLD" is supported by two streams of source with data source?

    I created a DataSource with two source streams - one for audio and one for video. During DataSource init, both source streams are created. When dataSource.start() is called, my application calls start() on both source streams.  When the player calls getStreams(), the code returns an array containing both streams.

    However, once started, the player only calls the first stream's read() method. In other words, the second stream's read() method is never called.

    Are multiple source streams per data source supported by RIM, in particular on the Bold? I tested on a Bold 9000 running v4.6.0.304.

    Thanks for your thoughts/help.

    It is not supported.
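
    (For context, the arrangement described in the question corresponds roughly to the sketch below against the MMAPI javax.microedition.media.protocol classes. The content types and the in-memory stream are placeholders invented for illustration; as the reply says, on the Bold 9000 tested only the first stream's read() ever gets called.)

    import java.io.IOException;
    import javax.microedition.media.Control;
    import javax.microedition.media.protocol.ContentDescriptor;
    import javax.microedition.media.protocol.DataSource;
    import javax.microedition.media.protocol.SourceStream;

    // One DataSource handing the Player two SourceStreams (video + audio).
    public class TwoStreamDataSource extends DataSource {

        private SourceStream[] streams;

        public TwoStreamDataSource(String locator) {
            super(locator);
        }

        public String getContentType() {
            return "video/mpeg";                 // assumed container type
        }

        public void connect() throws IOException {
            // Both streams are created up front, as described in the question.
            streams = new SourceStream[] {
                new ByteStream("video/mpeg", new byte[0]),
                new ByteStream("audio/amr", new byte[0])
            };
        }

        public void disconnect() { streams = null; }

        public void start() throws IOException { /* start feeding both streams */ }

        public void stop() throws IOException { }

        public SourceStream[] getStreams() {
            // The Player receives both streams, but on the Bold 9000 tested
            // only streams[0].read() was ever invoked.
            return streams;
        }

        public Control getControl(String controlType) { return null; }

        public Control[] getControls() { return new Control[0]; }

        // Minimal in-memory SourceStream used for both slots above.
        private static final class ByteStream implements SourceStream {
            private final ContentDescriptor descriptor;
            private final byte[] data;
            private int pos;

            ByteStream(String contentType, byte[] data) {
                this.descriptor = new ContentDescriptor(contentType);
                this.data = data;
            }

            public ContentDescriptor getContentDescriptor() { return descriptor; }
            public long getContentLength() { return data.length; }
            public int getSeekType() { return NOT_SEEKABLE; }
            public int getTransferSize() { return 2048; }

            public int read(byte[] b, int off, int len) throws IOException {
                if (pos >= data.length) {
                    return -1;                   // end of stream
                }
                int n = Math.min(len, data.length - pos);
                System.arraycopy(data, pos, b, off, n);
                pos += n;
                return n;
            }

            public long seek(long where) throws IOException { return pos; }
            public long tell() { return pos; }
            public Control getControl(String controlType) { return null; }
            public Control[] getControls() { return new Control[0]; }
        }
    }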

  • Error: configuring the data source in Planning

    After successfully configuring:

    (1) Foundation Services (Hyperion Shared Services)
    (2) Essbase Administration Services
    (3) Essbase Server
    (4) Hyperion Reporting and Analysis
    (5) Planning -> product options, register with Shared Services, configure the database, deploy to the application server (Apache), register the instance.

    I am stuck at the data source configuration step in Planning:

    Steps in detail:
    Checked 'configure data source', then clicked Next
    Checked 'create data source', then clicked Next
    Data source name: essdb (my choice)
    Data source description: essbase DB (my choice), clicked Next
    Selected my instance name: plan (from the drop-down), clicked Next
    Selected the database: MS SQL Server, clicked Next
    Database type: SQL
    Port: 1433 (default)
    Database information:
    Product: Planning
    Database: p2DB (new DB, SQL)
    User: p2user (new user, SQL)
    Password: *

    Clicked Next

    Essbase server information:
    Server: my machine name (neeraj-pc)
    User: planning (newly created external Essbase user)
    Password: *

    Clicked Next


    Error:

    at java.awt.Component.processMouseEvent (unknown Source)
    at javax.swing.JComponent.processMouseEvent (unknown Source)
    at java.awt.Component.processEvent (unknown Source)
    at java.awt.Container.processEvent (unknown Source)
    at java.awt.Component.dispatchEventImpl (unknown Source)
    at java.awt.Container.dispatchEventImpl (unknown Source)
    at java.awt.Component.dispatchEvent (unknown Source)
    at java.awt.LightweightDispatcher.retargetMouseEvent (unknown Source)
    at java.awt.LightweightDispatcher.processMouseEvent (unknown Source)
    at java.awt.LightweightDispatcher.dispatchEvent (unknown Source)
    at java.awt.Container.dispatchEventImpl (unknown Source)
    at java.awt.Window.dispatchEventImpl (unknown Source)
    at java.awt.Component.dispatchEvent (unknown Source)
    at java.awt.EventQueue.dispatchEvent (unknown Source)
    at java.awt.EventDispatchThread.pumpOneEventForHierarchy (unknown Source)

    at java.awt.EventDispatchThread.pumpEventsForHierarchy (unknown Source)
    at java.awt.EventDispatchThread.pumpEvents (unknown Source)
    at java.awt.EventDispatchThread.pumpEvents (unknown Source)
    at java.awt.EventDispatchThread.run (unknown Source)



    Please suggest steps to get this configured properly.

    Hello

    The problem is that it cannot resolve your machine name; I don't know whether that is down to a Vista issue.
    Just out of curiosity, do you have anything in your hosts file, \Windows\System32\Drivers\etc\hosts?

    If you try to create the data source with 127.0.0.1, does it work?

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • EJB web service consuming a data source

    Hello

    I use JDeveloper 12.1.3.0.

    I have to develop an EJB web service that uses a data source defined in WebLogic.

    I have googled and searched, but I found no examples or tutorials that explain how it works and how to do it.

    The only examples I found use a direct connection to the database, like this one: http://waslleysouza.com.br/en/2014/10/restful-web-service-in-jdeveloper-12c/

    Can you help me?

    Thank you

    The same applies for MySQL. Add the entity class name to the following.

    1. In WebLogic Server, set up a data source with the JNDI name jdbc/OracleDS.

    2. In persistence.xml, specify that data source as the JTA data source:

    <?xml version="1.0" encoding="UTF-8"?>
    <persistence xmlns="http://java.sun.com/xml/ns/persistence"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
                                     http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd"
                 version="1.0">
      <persistence-unit name="MyUnit" transaction-type="JTA">
        <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
        <jta-data-source>jdbc/OracleDS</jta-data-source>
        <class><!-- name of the class --></class>
        <properties>
          <property name="eclipselink.ddl-generation" value="create-tables"/>
        </properties>
      </persistence-unit>
    </persistence>

  • 2 applications on the same data source vs different data sources

    Hello

    What is the difference between having 2 Planning applications on the same data source with the same DB user, versus each application on a separate user name with its own data source?

    I have no problem with space or privileges on the two DB schemas. I just want to know which is technically best.

    Thank you

    You need a separate data source per application, otherwise each application will overwrite the other, since they would share the same relational tables. If you want the Planning applications to work properly, then two apps = two data sources.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • Is it possible to delete a data source?

    I created a data source with the same name. Is it possible to delete it?

    By 'data source', I assume you mean a data store (i.e., created with short-cmd create-ds).

    You can remove a data store this way:

    1. Stop the data store (the short-cmd stop-ds command).

    2. Detach the data store (the short-cmd detach-ds command). This unregisters the data store from the server.

    3. Go to where the data files were created and delete that directory with your operating system's delete command.

    The data files directory is named after the data store plus "indexes" (for example, testindexes). By default, it is created under the server's data directory.

  • 11g: WebLogic deployment failing (data sources)

    Hello!

    I am now trying to deploy my ADF BC/Faces RC application to an external WebLogic server.
    I deploy a WAR file, but the database connection is missing and my application reports that the connection was not found.

    Inspecting my WAR file, I realized that the connections.xml file is not deployed... for some reason. The credential store is not deployed either.
    Very well, I thought, then I don't have to set up the credential store on the external server; we will use a data source instead.

    So I tried to change my AM configuration to use a data source (java:comp/env/jdbc/MEDORADS).

    In WebLogic, I created the data source with the JNDI name 'comp/env/jdbc/MEDORADS', set up the database connection (the test succeeded) and targeted it to the AdminServer. Then I deployed my application to the AdminServer.

    When I now call the application, I get a 'JNDI failure. Unable to lookup Data Source at context java:comp/env/jdbc/MEDORADS' and a 'while trying to look up comp/env/MEDORADS in app/webapp/cockpit.war/32181625'.

    What am I doing wrong, or what have I not done?
    I admit that I am new to data sources in this context, since our web modules all use dynamic JDBC credentials.

    All I want to do is to deploy my application on a demo server without complaint.

    Please, any help is welcome.

    Sascha

    Hello

    If you have created the data source in WebLogic properly, the only thing left is to change java:comp/env/jdbc/MEDORADS to jdbc/MEDORADS in your application module configuration.
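
    (As a rough illustration of why the name matters, and not ADF-specific code: jdbc/MEDORADS is the global JNDI name the data source is bound under in WebLogic, while the java:comp/env/... form is an application-local reference that only resolves when a matching resource-ref is declared in the deployment descriptors. The class and method below are invented for the sketch.)

    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class DataSourceLookup {

        public static DataSource lookupMedorads() throws Exception {
            InitialContext ctx = new InitialContext();

            // Global JNDI name, exactly as the data source is bound on the server.
            DataSource ds = (DataSource) ctx.lookup("jdbc/MEDORADS");

            // The java:comp/env/... form is an application-local reference and only
            // resolves if the module declares a matching <resource-ref>:
            // DataSource scoped = (DataSource) ctx.lookup("java:comp/env/jdbc/MEDORADS");

            return ds;
        }
    }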
