Loading data into an ASO cube

Hello

Is it possible to load data to a member in the outline that has a formula attached to it?

For example:

Account - Interest Paid:

Budget must be loaded to the member
Actual must be a calculated value

I tried something like:

CASE
WHEN [Scenario].CurrentMember IS [Actual] THEN -[Finance Costs (IFRS)]
END

but it does not work.

Is this possible?

Thank you

Hello

In ASO, you cannot load data to a member that has a formula attached to it. One option is to have three accounts: two system accounts and the original. Load the data to the system accounts (VAC - Bud and VAC - Act), and then set a formula on the original account that picks up the values of the two system accounts based on the scenario.
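A minimal sketch of such a member formula on the original account, assuming the two system accounts are called [VAC - Bud] and [VAC - Act] and the Scenario members are [Budget] and [Actual] (adjust the names to your outline):

CASE
    WHEN IS([Scenario].CurrentMember, [Budget]) THEN [VAC - Bud]
    WHEN IS([Scenario].CurrentMember, [Actual]) THEN [VAC - Act]
END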

Let me know if it helps.

However, you can do it in BSO by loading the data to Budget and having a calculation script calculate the Actual value for the account member.
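As a rough illustration only, such a BSO calculation script could look like the following; the member names are taken from the question above and are assumptions:

/* hypothetical calc script: derive Actual interest paid from finance costs */
FIX ("Actual")
    "Interest Paid" = -"Finance Costs (IFRS)";
ENDFIX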

Cheers
RS

Tags: Business Intelligence

Similar Questions

  • Extract the ASO cube data slices

    Hello

    For Essbase version 11.1.1.2, is there a way (using scripts) to extract data slices to a text file?  We have figured out how to zero out data slices in the ASO cube, but none of us has managed to extract the data.  Please keep in mind that we have no reporting tools for this; we use Excel for our reporting needs.

    Thanks as always!

    There are pretty much two different ways (well, there is also a third way) to get data out of Essbase:

    (1) write a report script

    (2) write an MDX query

    (3) buy Star Integration Server (now owned by IBM) and point it at the database in question.

    Not surprisingly, there is no quick answer with these choices.  I wrote a blog post about report scripts vs MDX, and the answer to which one is best is... it depends:

    http://camerons-blog-for-Essbase-hackers.blogspot.com/2013/07/what-makes-Essbase-data-extraction-fast.html

    I hope to run some of the tests above against a Star Integration Server box (no, I don't work for them) in the near future, to see if it can beat those extraction times.
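    As a rough illustration of option (2), you can run an MDX query through a MaxL session and spool the output to a text file; the application, database, and member names below are placeholders, so treat this as a sketch rather than a ready-made extract:

    /* run with: essmsh extract_aso.msh */
    login admin password on localhost;
    spool on to 'c:/temp/aso_extract.txt';
    SELECT
      {[Measures].[Sales]} ON COLUMNS,
      NON EMPTY [Product].Levels(0).Members ON ROWS
    FROM ASOApp.ASODb;
    spool off;
    logout;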

    Kind regards

    Cameron Lackpour

  • shared members disappear automatically after the deployment of the ASO cube

    Hello

    What we were doing is as below:


    The dimension members are loaded from the source into EPMA by ODI, and then we deploy the ASO and BSO cubes from EPMA.

    The process type in the profile used to load the dimension members is defined as "merge as primary".


    the question as below:

    (1) Initially, the hierarchy of the dimension is like this:

    Dimension AA:

    A

    A1

    A2

    A3

    This dimension is in the ASO cube, and the ASO cube deployed successfully.

    (2) The users changed the hierarchy in the source table.

    It changed to the below (they just added a parent A' for A1 and A2 under the initial parent A):

    A

    A'

    A1

    A2

    A3

    After this change, when the members are loaded to EPMA once again, the hierarchy of the dimension in EPMA is as below:

    A

    A'

    A1

    A2

    A1 (Shared)

    A2 (Shared)

    A3

    This load process and the application deploy processes run automatically as an overnight process.

    I deleted the shared A1/A2 members from the hierarchy, intending to deploy the cube again after the overnight load and deploy processes.

    But what I saw in the ASO cube outline after the deploy process ran is this:

    There are no shared A1/A2 members in the outline, just the members below in the cube hierarchy in Essbase:

    A

    A'

    A1

    A2

    A3

    It's really weird. Why do the shared members disappear automatically after deploy, leaving different hierarchies between the EPMA and Essbase outlines?

    can you help me?

    I have no idea...

    In EPMA, is the ASO dimension set to "multiple hierarchies enabled"? If not, please turn it on and then try to deploy.

    Thank you

    ~ KKT ~.

  • How to add a new dimension in the ASO cube without losing all the data

    Hello

    I have an ASO cube with 18 dimensions and I want to add a new (regular) one. When I add this new dimension it asks me to clear all data before the restructure can take place. If I click Yes I lose all data, regardless of whether I export level 0 first, because when I tried to reload the export into the cube it gave an error since not all dimensions were present in the export. Does anyone know how I would be able to accomplish this task?


    Thank you
    Mickael

    You can try this workaround solution.

    (1) export level 0 data
    (2) clear the data in the cube
    (3) add the new regular dimension
    (4) add a default member Reg00 in the new dimension and save
    (5) open the Lev0 data file in a text editor, insert 'Reg00' at the beginning of the file and press Enter
    (6) the Lev0 file will then have 'Reg00' as its first record, with the rest of the export starting from the second record (see the illustration below)
    (7) load the changed file with a rules file

    You can then verify that all your historical data is loaded against the Reg00 member of the new regular dimension.
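    As a rough illustration with made-up member names, the edited level 0 export might start like this, with 'Reg00' on its own first line so the rules file can map everything to that member of the new dimension:

    "Reg00"
    "Jan" "Actual" "Product1" 100
    "Feb" "Actual" "Product1" 120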

  • Report script on ASO cube data

    A question like this:

    We have an ASO cube; the Employee dimension is used.

    We added an attribute dimension PayType on the Employee dimension, with attribute values Exempt/NonExempt.

    Now the question: not all employees have attribute values assigned; there are some employees without a PayType value.

    We want to create a report script to extract data for these employees with no PayType.

    We tried to put

    the following selection and output commands for the Employee dimension:

    < WITHATTR ("PayType", "<>", "Exempt")

    < WITHATTR ("PayType", "<>", "NonExempt")

    but it did not work.

    I don't know if there are other commands that could help.

    Can anyone help?

    Try:

    Cheers...!!
    Rahul S.

  • loading dynamic data into the component

    Hello
    I'm having a problem loading text into the component. Until the new component came along, I was loading data from a component called SlideShowPro into a dynamic text field. My script looked like this (t_txt being the dynamic text field):

    import net.slideshowpro.slideshowpro.*;

    function onImageData(event:SSPDataEvent):void {
        if (event.type == "imageData") {
            t_txt.htmlText = event.data.caption;
        }
    }

    my_ssp.addEventListener(SSPDataEvent.IMAGE_DATA, onImageData);


    I now want to load the data into a text layout component instead (named t2_text). How would I change the script above to flow the data into the text layout component rather than the dynamic text field? Thanks.

    Take a look at the ImportMarkup example from the component's author; it shows how to convert text from markup. The TLF Labs build doesn't have HTML conversion. If it's plain text, look at the HelloWorld example.

  • Load xml data into the table

    Hi all

    I have an XML (emp.xml) with data below:

    <root>
      <row>
        <lastname>steve</lastname>
        <Age>30</Age>
      </row>
      <row>
        <lastname>Paul</lastname>
        <Age>26</Age>
      </row>
    </root>

    I would like to create a stored procedure to store the xml data into the EMP table.

    EMP
    LastName age
    Steve 30
    Paul 26

    I have looked through the related threads in this forum, but cannot find the right one. Any help is greatly appreciated. Thank you.

    With

    SQL>  select * from xmltable('root/row' passing xmltype('
            <root>
              <row><lastname>steve</lastname><Age>30</Age></row>
              <row><lastname>Paul</lastname><Age>26</Age></row>
            </root>') columns lastname path 'lastname',
                              Age      path 'Age')
    /
    LASTNAME   AGE
    ---------- ----------
    steve      30
    Paul       26
    

    You can now just do an

    insert into emp select ...
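    For example, something like the following, assuming the XML file is exposed through an Oracle directory object (the XML_DIR directory and the EMP column names here are assumptions):

    insert into emp (lastname, age)
    select x.lastname, x.age
    from xmltable('root/row'
           passing xmltype( bfilename('XML_DIR','emp.xml'), nls_charset_id('AL32UTF8') )
           columns lastname varchar2(30) path 'lastname',
                   age      number       path 'Age'
         ) x;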

  • Copying an ASO cube

    Hello.
    I would like to copy an ASO cube from one application.database to another.

    The reason is that I want to be able to refresh the source ASO application's data & metadata while the target ASO application has security applied against it and is in use by users. Ideally, I then just want to "publish" the source ASO application to the target application using a copy of the source files.

    So far, I have had limited success. Should I simply be copying the temp, metadata, log, default and <database name> folders, or also the essbaseapp.instance file, or the .app & .apb files as well?

    Any help appreciated

    Thank you

    In theory, if it's just the data and outline that are changing, you should be able to get away with copying the application's metadata and default folders and the outline.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Member property codes for ASO cubes

    Hi all

    In an ASO cube, I have a dimension that is defined as multiple hierarchies enabled:

    Dimension (Multiple Hierarchies Enabled)
    Member1 (Stored)
    A1
    A2
    Member2 (Stored)
    B1
    B2
    Member3-(Dynamic)
    C1
    C2
    Member4-(Dynamic)
    D1
    D2

    I am trying to use 'X' to define dynamic members in my source file (.CSV format). I took this from the member property reference codes in Table 40: http://docs.oracle.com/cd/E17236_01/epm.1112/esb_dbag/frameset.htm?dotrules.html#dotrules_2

    Is there a table like this for ASO, or can we use the same table?

    Thanks in advance.

    VIC

    Published by: Vrockz on 16 January 2013 09:39

    You can find the information at this link: http://docs.oracle.com/cd/E17236_01/epm.1112/esb_dbag/frameset.htm?alocare.html

    You can get more information at: http://docs.oracle.com/cd/E17236_01/epm.1112/esb_dbag/frameset.htm?ainaggr.html

    You should use 'X' - hope it helps.

    Thank you
    Sunil

    Published by: sunil k on January 16, 2013 10:05

    Published by: sunil k on January 16, 2013 10:15

  • How to load HFM data into Essbase

    Hello

    How can we bring HFM data into an Essbase cube without using EAL, since we have performance problems using EAL - DSS as a source for OBIEE reporting?

    With Extended Analytics, I heard we can only get level 0 HFM data into Essbase and would need to write the currency conversion and intercompany (ICP) elimination calculations in Essbase to roll up to the parent members. Is this true?

    Also, how can we convert HFM security to Essbase security?

    Please suggest me on this.

    Thank you
    Vishal

    Security will be a bit tricky, as Essbase generally uses filters while HFM uses security classes. You can potentially use shared groups in Shared Services, but licensing issues may arise depending on provisioning. Your best bet is maybe to look at LCM artifact exports to handle it.

  • Load xml data into an Oracle table

    Hello

    I went through some threads in the forum itself, but nothing comes close to my requirement, so I am writing my request. I have an XML like this:
    <?xml version="1.0"?>
    <ACCOUNT_HEADER_ACK>
      <HEADER>
        <STATUS_CODE>100</STATUS_CODE>
        <STATUS_REMARKS>check</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>2</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>3</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic administration</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>4</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>5</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
      <HEADER>
        <STATUS_CODE>500</STATUS_CODE>
        <STATUS_REMARKS>process exception</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>20</SEGMENT_NUMBER>
          <REMARKS>base polytechnic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>30</SEGMENT_NUMBER>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>40</SEGMENT_NUMBER>
          <REMARKS>base polytechnic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>50</SEGMENT_NUMBER>
          <REMARKS>base polytechnic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
    </ACCOUNT_HEADER_ACK>

    That is the XML structure, with a master (header) and child (detail) structure. I want to insert the data into Oracle tables using SQL*Loader. Initially I tried to create a single control file, but I don't know how to handle the record terminators in one control file, so I created two control files:

    load data
    infile 'acct.xml' "str '</DETAIL>'"
    truncate
    into table xxrp_acct_detail
    trailing nullcols
    (
      dummy filler terminated by '<DETAIL>',
      SEGMENT_NUMBER enclosed by '<SEGMENT_NUMBER>' and '</SEGMENT_NUMBER>',
      REMARKS enclosed by '<REMARKS>' and '</REMARKS>'
    )


    load data
    infile 'acct.xml' "str '</HEADER>'"
    truncate
    into table xxrp_acct_header
    fields terminated by '<HEADER>'
    trailing nullcols
    (
      dummy filler terminated by '<HEADER>',
      STATUS_CODE enclosed by '<STATUS_CODE>' and '</STATUS_CODE>',
      STATUS_REMARKS enclosed by '<STATUS_REMARKS>' and '</STATUS_REMARKS>'
    )

    I refer to the same XML file in both control files. With the first control file I was able to load the records, but with the second one, which I use for the header table, I am not able to load the remaining records. I get the error below:

    Record 2: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length
    Record 3: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length

    In fact, if it is possible to consolidate this into a single control file, that would be very useful for me. I am also open to the external table option. Please help me in this regard.

    Thanks in advance.

    Regards,
    Nagendra

    Here are two possible solutions:

    (1) reading the headers and details separately, using two XMLTables:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
     from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
     ) x1,
     xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
     ) x2
     ;
    
    END;
    

    The first XMLTable (alias x1) retrieves all the headers as separate rows. The generated HEADER_NO column is used to keep track of the position of each header within the document.
    Then we join to a second XMLTable (x2), passing it HEADER_NO, so that we can access the corresponding DETAIL items.

    (2) reading with one XMLTable, but somewhat more complex XQuery:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x.*
     from xmltable(
      'for $i in /ACCOUNT_HEADER_ACK/HEADER
       return
        for $j in $i/following-sibling::DETAILS[1]/DETAIL
        return element r {$i, $j}'
      passing acct_doc
      columns status_code    number        path 'HEADER/STATUS_CODE',
              status_remarks varchar2(100) path 'HEADER/STATUS_REMARKS',
              segment_number number        path 'DETAIL/SEGMENT_NUMBER',
              remarks        varchar2(100) path 'DETAIL/REMARKS'
     ) x
     ;
    
    END;
    

    Here we use an XQuery query to extract the information we need.
    Basically it's the same logic as above, but with two nested loops that access each HEADER, then each DETAIL element immediately following it in document order.

    Here is the link to the XMLTable and XQuery documentation for Oracle:
    http://download.Oracle.com/docs/CD/B28359_01/AppDev.111/b28369/xdb_xquery.htm#CBAGCBGJ

  • LOAD CSV DATA INTO A NEW TABLE

    I have a basic CSV file: 12 columns, 247 rows. I have tried to import or paste it into a new table, and with each try not all records get loaded into the new table. Using the "Load Data" tool I tried text data load, spreadsheet data load, .csv import, and copy/paste. I put the records in PK ID order for the table, and I added dummy entries where fields were null (that is, I added the word "None" for empty fields). But nothing works; I get about half of the records.

    Why?

    The max size for VARCHAR2 is 4000. If you have text that is longer, you must use a CLOB column.
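    For illustration, a minimal sketch of a table using CLOB for the long text column (the table and column names are made up):

    create table csv_import (
      id        number primary key,
      long_text clob
    );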

    If you want to upload the entire file, you need a BLOB column, and you need to implement an upload script like this: http://www.dba-oracle.com/t_easy_html_db_file_browse_item.htm

  • Member formula in an ASO cube

    Hi all
    I'm trying to calculate a month member in an ASO cube.
    This formula works: ([CDA], [&CurrMonth]) - ([CDA], [&PrevMonth]), but it means that the substitution variables must be redefined each month. Is there a better way to make this formula more dynamic? Maybe using CurrentMember in the formula?

    Hello

    Yes, use ([CDA], CurrentMember) - ([CDA], CurrentMember.Lag(1)).

    Or use [&CurrMonth].Lag(1) together with [&CurrMonth] and [CDA], depending on what exactly you want to calculate.

    Used on the Period dimension, Lag(1) will return the previous month.
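    Putting that together, a minimal sketch of the member formula, keeping the [CDA] member from your formula and assuming the time dimension is called [Period]:

    ([CDA], [Period].CurrentMember) - ([CDA], [Period].CurrentMember.Lag(1))

    Note that for the very first month in the dimension, Lag(1) has no previous member, so you may want to wrap the formula in a CASE.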

    Thank you
    JM

    Edited by: J.M. on January 7, 2013 12:04 AM

  • Problem with member formulas in an ASO cube

    Hello

    I was asked to convert a Planning (BSO) application cube into a single ASO cube. I managed to convert the BSO cube to ASO through the Administration Services console and added members to the new cube through a BSO rules file.

    Now I must write formulas for the level 0 members of the Account dimension. These are very simple formulas, such as:

    IF (@ISMBR("New_Seats"))
        "Value" = "Assets Total Cost";
    ELSE
        "Value" = "Asset_Value";
    ENDIF;

    and


    "Empty_Seats" = (("New Seat Additions" + "Available_Seats") - "Required_Seats");


    This is the first time I'm working with ASO. I get this error when writing these formulas:
    "Error (1260052) Syntax error in input MDX query on line 1 at token '=' Empty_Seats ..."

    Please help me write these formulas and also choose the appropriate member properties.

    Try something like

    CASE
    WHEN IS([DimName].CurrentMember, [New_Seats]) THEN [Assets Total Cost]
    ELSE [Asset_Value]
    END

    and

    ([New Seat Additions] + [Available_Seats]) - [Required_Seats]

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Load table data into Oracle Essbase

    Hi all

    I have a problem

    An Oracle table has 50 million records.

    Essbase loads data from that table via ODI.

    ODI error message:

    Caused by: java.sql.BatchUpdateException: means: Java heap space

    at org.hsqldb.jdbc.JDBCPreparedStatement.executeBatch (unknown Source)

    at oracle.odi.runtime.agent.execution.sql.BatchSQLCommand.execute(BatchSQLCommand.java:44)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)

    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:87)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    I think the agent is loading so much data that it cannot fit in memory.

    How do I fix this?

    Please give me a solution.

    Thank you

    As Craig said, first move the staging area off the SUNOPSIS MEMORY ENGINE. That should only be used when you are moving/transforming small amounts of data (which 50 million rows isn't :-) ). Why not set the staging area on your source Oracle DB? That way you remove an unnecessary data movement (i.e. the LKM) and don't rely on the in-memory engine.
