Loading data into Planning

I'm a little confused about how data loading works in Planning. I created a new Planning application with the Application Wizard, created the database, etc. Then, following the documentation (the Planning Administrator's Guide), I loaded the metadata using the outline load utility (OutlineLoad). Now I want to load data. Following the steps from the documentation, I added the DIRECT_DATA_LOAD = False parameter (it is not there out of the box) and pointed the DATA_LOAD_FILE_PATH parameter at a directory on the server (I also tried creating an empty file and having DATA_LOAD_FILE_PATH point to that file). After saving the configuration and restarting Planning, I used the Administration data load settings to choose the data load dimension and the driver dimension, saved it, and then... no new file was created at the DATA_LOAD_FILE_PATH location. What am I doing wrong?

EPM 11.1.1.1.0 on Windows 2003 Server EE

Hello

Let me give you an example of loading data into the sample Planning application.

Dimensions:
Account, Currency, Entity, Period, Scenario, Version, Year, Segments

Data load dimension = Account

Driver dimension = Scenario

Selected driver member = Actual

The format of the csv file is:

Account, Point-of-View, Data Load Cube Name, Actual
330000, "USD, Jan, E05 NoSegment, Working, FY08", Consol, 1000
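A minimal sketch of producing such a record programmatically (the member names and layout are just the ones from the example above; the key detail is that the comma-separated Point-of-View must be quoted as a single field):

```python
import csv
import io

# Build a Planning data-load file: data-load dimension member,
# quoted Point-of-View (the remaining dimension members), cube name,
# then one column per selected driver member (here: Actual).
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerow(["Account", "Point-of-View", "Data Load Cube Name", "Actual"])
writer.writerow(["330000", "USD,Jan,E05 NoSegment,Working,FY08", "Consol", "1000"])
print(buf.getvalue())
```

The csv module quotes the Point-of-View field automatically because it contains commas, which is exactly the quoting shown in the sample row.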

I hope that gives you an idea.

Cheers

John
http://John-Goodwin.blogspot.com/

Tags: Business Intelligence

Similar Questions

  • Loading dynamic data into a component

    Hello
    I'm having a problem with loading text into a component. Until the new component came along, I was loading data from a component called SlideShowPro into a dynamic text field. My script looked like this (with the dynamic text field t_txt):

    import net.slideshowpro.slideshowpro.*;

    function onImageData(event:SSPDataEvent):void {
        if (event.type == "imageData") {
            t_txt.htmlText = event.data.caption;
        }
    }

    my_ssp.addEventListener(SSPDataEvent.IMAGE_DATA, onImageData);


    I now want to load the data into a Text Layout component of the same name (t2_text). How would I change the script above to flow the data into the Text Layout component rather than the dynamic text field? Thx.

    The author of the component might look at the ImportMarkup example. That shows how to convert text from markup. The TLF Labs builds don't have HTML conversion. If it's plain text, look at the HelloWorld example.

  • Load XML data into a table

    Hi all

    I have an XML file (emp.xml) with the data below:

    <root>
      <row>
        <lastname>steve</lastname>
        <Age>30</Age>
      </row>
      <row>
        <lastname>Paul</lastname>
        <Age>26</Age>
      </row>
    </root>

    I would like to create a stored procedure to load the XML data into the EMP table.

    EMP
    LASTNAME   AGE
    Steve      30
    Paul       26

    I tried to look through all the related threads on this forum, but cannot find the right thread. Any help is greatly appreciated. Thank you.

    With XMLTable:

    SQL> select * from xmltable('root/row' passing xmltype('
           <root>
             <row><lastname>steve</lastname><Age>30</Age></row>
             <row><lastname>Paul</lastname><Age>26</Age></row>
           </root>
         ') columns lastname path 'lastname',
                    Age path 'Age')
    /
    LASTNAME   AGE
    ---------- ----------
    steve      30
    Paul       26

    You can now just do an

    insert into emp
    select ...
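    Outside the database, the same shredding logic (one output row per root/row, one column per child element) can be sketched with Python's standard library; this is just an illustration of the XPath logic, not part of the Oracle solution:

```python
import xml.etree.ElementTree as ET

doc = """<root>
  <row><lastname>steve</lastname><Age>30</Age></row>
  <row><lastname>Paul</lastname><Age>26</Age></row>
</root>"""

# One tuple per <row>, reading the lastname and Age child elements,
# mirroring: xmltable('root/row' ... columns lastname path 'lastname', Age path 'Age')
rows = [(r.findtext("lastname"), int(r.findtext("Age")))
        for r in ET.fromstring(doc).findall("row")]
print(rows)  # [('steve', 30), ('Paul', 26)]
```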

  • Load data into Hyperion Planning - problem with Thai font/characters

    Hello

    Can someone help me? I just loaded data into Hyperion Planning with ODI; however, there are Thai characters in my data. When I view the data in Hyperion Planning, the Thai characters are converted to ?. So, how can I load the data without this character/font problem with ODI? Any experts, please help me with this.

    http://img10.imageshack.us/img10/6087/20090316155228.th.png


    In the Hyperion log there is no error:

    2009-03-16 15:14:47,847 [DwgCmdExecutionThread] INFO: Oracle Data Integrator Adapter for Hyperion Planning - Release 9.3.1.1
    2009-03-16 15:14:47,847 [DwgCmdExecutionThread] INFO: Connecting to planning application [Budget] on [192.168.3.20]:[11333] using username [hypadmin].
    2009-03-16 15:14:47,925 [DwgCmdExecutionThread] INFO: Successfully connected to the planning application.
    2009-03-16 15:14:47,941 [DwgCmdExecutionThread] INFO: The planning load options for the load are:
    Dimension name: Location, Sort parent-child: true
    Load order by input: true
    Refresh database: false
    2009-03-16 15:14:48,019 [DwgCmdExecutionThread] INFO: Load process starting.
    2009-03-16 15:14:48,019 [DwgCmdExecutionThread] DEBUG: Number of columns in the source result set does not match the number of planning target columns.
    2009-03-16 15:14:48,066 [DwgCmdExecutionThread] INFO: Load type is [Load dimension member].
    2009-03-16 15:14:48,285 [DwgCmdExecutionThread] INFO: Possible circular reference detected, abandoning sort and continuing with load. 1368 possible circular reference records found.
    2009-03-16 15:15:14,660 [DwgCmdExecutionThread] INFO: Load process complete.
    2009-03-16 15:27:45,821 [DwgCmdExecutionThread] INFO: Oracle Data Integrator Adapter for Hyperion Planning - Release 9.3.1.1
    2009-03-16 15:27:45,821 [DwgCmdExecutionThread] INFO: Connecting to planning application [Budget] on [192.168.3.20]:[11333] using username [hypadmin].
    2009-03-16 15:27:45,883 [DwgCmdExecutionThread] INFO: Successfully connected to the planning application.
    2009-03-16 15:27:45,899 [DwgCmdExecutionThread] INFO: The planning load options for the load are:
    Dimension name: Location, Sort parent-child: true
    Load order by input: true
    Refresh database: true
    2009-03-16 15:27:45,962 [DwgCmdExecutionThread] INFO: Load process starting.
    2009-03-16 15:27:45,962 [DwgCmdExecutionThread] DEBUG: Number of columns in the source result set does not match the number of planning target columns.
    2009-03-16 15:27:45,993 [DwgCmdExecutionThread] INFO: Load type is [Load dimension member].
    2009-03-16 15:27:46,165 [DwgCmdExecutionThread] INFO: Possible circular reference detected, abandoning sort and continuing with load. 1368 possible circular reference records found.
    2009-03-16 15:28:14,540 [DwgCmdExecutionThread] INFO: Planning cube refresh initiated.
    2009-03-16 15:28:22,993 [DwgCmdExecutionThread] INFO: Planning cube refresh operation completed successfully.
    2009-03-16 15:28:22,993 [DwgCmdExecutionThread] INFO: Load process complete.

    Hello

    I'm glad you find the blog useful.

    I understand that you are using 9.3. To prove that it is not an ODI issue, log into Planning from the web, manually add a member with Thai characters, then refresh the application to push the information to Essbase, and look at the member in EAS.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Load xml data into an Oracle table

    Hello

    I went through some threads in the forum, but nothing comes close to my requirement, so I am writing my request. I have an XML like this:
    <?xml version="1.0"?>
    <ACCOUNT_HEADER_ACK>
      <HEADER>
        <STATUS_CODE>100</STATUS_CODE>
        <STATUS_REMARKS>check</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>2</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>3</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic administration</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>4</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>5</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
      <HEADER>
        <STATUS_CODE>500</STATUS_CODE>
        <STATUS_REMARKS>process exception</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>20</SEGMENT_NUMBER>
          <REMARKS>rp basic polytechnic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>30</SEGMENT_NUMBER>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>40</SEGMENT_NUMBER>
          <REMARKS>rp basic polytechnic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>50</SEGMENT_NUMBER>
          <REMARKS>rp basic polytechnic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
    </ACCOUNT_HEADER_ACK>

    Here is the xml structure, master and child, and I want to insert the data into Oracle tables using SQL*Loader. Initially I tried to create a single control file, but I don't know how to terminate the records in the control file, so I created two control files:

    load data
    infile 'acct.xml' "str '</DETAIL>'"
    truncate
    into table xxrp_acct_detail
    trailing nullcols
    (
      dummy filler terminated by '<DETAIL>',
      SEGMENT_NUMBER enclosed by '<SEGMENT_NUMBER>' and '</SEGMENT_NUMBER>',
      REMARKS enclosed by '<REMARKS>' and '</REMARKS>'
    )


    load data
    infile 'acct.xml' "str '</HEADER>'"
    truncate
    into table xxrp_acct_header
    fields terminated by '<HEADER>'
    trailing nullcols
    (
      dummy filler terminated by '<HEADER>',
      STATUS_CODE enclosed by '<STATUS_CODE>' and '</STATUS_CODE>',
      STATUS_REMARKS enclosed by '<STATUS_REMARKS>' and '</STATUS_REMARKS>'
    )

    I refer to the same xml file in both control files. With the first control file I was able to load the records, but with the second one, which I use for the header table, I am not able to load the remaining records. I get the below error:

    Record 2: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length
    Record 3: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length

    Actually, if it is possible to do this with a single control file, that would be very useful for me. I'm also open to the external table option. Please help me in this regard.

    Thanks in advance.

    Regards,
    Mr. Nagendra

    Here are two possible solutions:

    (1) Reading the headers and the details separately, using two XMLTables:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
     from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
     ) x1,
     xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
     ) x2
     ;
    
    END;
    

    The first XMLTable (alias x1) retrieves all the headers in separate rows. The generated column HEADER_NO keeps track of the position of each header in the document.
    Then we join a second XMLTable (x2), passing it HEADER_NO, so that we can access the corresponding DETAIL items.
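    For intuition, the positional pairing that the HEADER_NO join achieves can be sketched in Python: walk the document in order, remember the current HEADER, and emit a row for every DETAIL in the DETAILS block that follows it (element names taken from the XML above; the sample here is abbreviated):

```python
import xml.etree.ElementTree as ET

doc = """<ACCOUNT_HEADER_ACK>
  <HEADER><STATUS_CODE>100</STATUS_CODE><STATUS_REMARKS>check</STATUS_REMARKS></HEADER>
  <DETAILS>
    <DETAIL><SEGMENT_NUMBER>2</SEGMENT_NUMBER><REMARKS>rp polytechnic</REMARKS></DETAIL>
    <DETAIL><SEGMENT_NUMBER>3</SEGMENT_NUMBER></DETAIL>
  </DETAILS>
  <HEADER><STATUS_CODE>500</STATUS_CODE><STATUS_REMARKS>process exception</STATUS_REMARKS></HEADER>
  <DETAILS>
    <DETAIL><SEGMENT_NUMBER>20</SEGMENT_NUMBER><REMARKS>rp basic polytechnic</REMARKS></DETAIL>
  </DETAILS>
</ACCOUNT_HEADER_ACK>"""

rows = []
header = None
for child in ET.fromstring(doc):
    if child.tag == "HEADER":
        # Remember the current header; its DETAILS sibling follows it in document order.
        header = (int(child.findtext("STATUS_CODE")), child.findtext("STATUS_REMARKS"))
    elif child.tag == "DETAILS":
        for d in child.findall("DETAIL"):
            rows.append(header + (int(d.findtext("SEGMENT_NUMBER")), d.findtext("REMARKS")))

print(rows)
```

A DETAIL without REMARKS (segment 30 in the full file) simply yields None for that column, just as the SQL solution yields NULL.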

    (2) Reading with a single XMLTable, but a somewhat more complex XQuery:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x.*
     from xmltable(
      'for $i in /ACCOUNT_HEADER_ACK/HEADER
       return
        for $j in $i/following-sibling::DETAILS[1]/DETAIL
        return element r {$i, $j}'
      passing acct_doc
      columns status_code    number        path 'HEADER/STATUS_CODE',
              status_remarks varchar2(100) path 'HEADER/STATUS_REMARKS',
              segment_number number        path 'DETAIL/SEGMENT_NUMBER',
              remarks        varchar2(100) path 'DETAIL/REMARKS'
     ) x
     ;
    
    END;
    

    Here, we use an XQuery to extract the information we need.
    Basically it's the same logic as above, but with two nested loops which access each HEADER, then each DETAIL located immediately after it in document order.

    Here is the link to the XMLTable and XQuery documentation for Oracle:
    http://download.Oracle.com/docs/CD/B28359_01/AppDev.111/b28369/xdb_xquery.htm#CBAGCBGJ

  • LOAD CSV DATA INTO A NEW TABLE

    I have a basic csv file - 12 columns, 247 rows. I tried to import it or paste it into a new table, and with each try not all records get uploaded into the new table. Using the "Load Data" tool, I tried loading text data, loading spreadsheet data, importing the .csv, and copy/paste. I put the records in the order of the table's PK ID, and I added dummy entries so that no fields are null (that is, I added the word "None" to empty fields). But nothing works. I get only about half of the existing records.

    Why?

    The max size for VARCHAR2 is 4000. If you have text which is longer, you must use a CLOB column.

    If you want to upload the entire file, you need a BLOB column and an upload script like this: http://www.dba-oracle.com/t_easy_html_db_file_browse_item.htm
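    If the dropped records really are length-related, a quick pre-check of the file will show which fields exceed the VARCHAR2 limit (the column names and sample data here are made up; substitute the real 12-column file):

```python
import csv
import io

VARCHAR2_MAX = 4000  # byte limit for a VARCHAR2 column in this release

# Hypothetical sample standing in for the real 12-column, 247-row file.
sample = io.StringIO("id,notes\n1,short\n2," + "x" * 5000 + "\n")

too_long = []
for lineno, row in enumerate(csv.DictReader(sample), start=2):
    for col, val in row.items():
        # Compare encoded byte length, since VARCHAR2 limits are in bytes.
        if val is not None and len(val.encode("utf-8")) > VARCHAR2_MAX:
            too_long.append((lineno, col))
print(too_long)  # rows/columns that would need a CLOB instead
```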

  • How to load HFM data into Essbase

    Hello

    How can we bring HFM data into an Essbase cube without using EAL, since we have performance problems using EAL - DSS as a source for OBIEE reporting?

    Using Extended Analytics, I heard we can only get level 0 HFM data into Essbase, and we would need to write the currency conversion and the intercompany elimination calculations in Essbase to roll up to the parent members. Is this true?

    Also, how can we convert HFM security to Essbase security?

    Please advise me on this.

    Thank you
    Vishal

    Security will be a bit tricky, as Essbase generally uses filters and HFM uses security classes. You can potentially use shared groups in Shared Services, but there may be licensing issues depending on provisioning. Your best bet is maybe to look at an LCM artifact export to handle it.

  • Load Oracle table data into Essbase

    Hi all

    I have a problem.

    An Oracle table has 50 million records.

    Essbase loads data from it via ODI.

    ODI error message:

    Caused by: java.sql.BatchUpdateException: Java heap space

    at org.hsqldb.jdbc.JDBCPreparedStatement.executeBatch(Unknown Source)
    at oracle.odi.runtime.agent.execution.sql.BatchSQLCommand.execute(BatchSQLCommand.java:44)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:87)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:662)

    I think the agent is loading so much data that it cannot fit in memory.

    How do I fix this?

    Please give me a solution.

    Thank you

    As Craig said, move the staging area off the SUNOPSIS MEMORY ENGINE first. It should only be considered if you are moving/transforming small amounts of data (which 50 million rows isn't :-)). Why not set the staging area on your source Oracle DB? That way you remove an unnecessary data movement (i.e. the LKM) and don't rely on the in-memory engine.
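    The general point (never pull the whole 50-million-row set into memory at once; stream it in bounded batches) can be sketched with plain cursor batching in Python, here against in-memory sqlite databases purely for illustration:

```python
import sqlite3

# Hypothetical source table standing in for the 50M-row Oracle table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE big (id INTEGER, val TEXT)")
src.executemany("INSERT INTO big VALUES (?, ?)", [(i, f"v{i}") for i in range(10_000)])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE target (id INTEGER, val TEXT)")

BATCH = 1000  # only one batch is ever held in memory
cur = src.execute("SELECT id, val FROM big")
copied = 0
while True:
    rows = cur.fetchmany(BATCH)
    if not rows:
        break
    dst.executemany("INSERT INTO target VALUES (?, ?)", rows)
    copied += len(rows)
print(copied)
```

In ODI terms this is what setting the staging area on the source database achieves: the full row set never has to fit into the agent's heap.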

  • Error loading data into Essbase from SQL Server

    Hello experts!

    I have another urgent and confusing issue. I am loading data from a SQL Server view into Essbase (the view is pivoted with several data columns) using ODI 11.1.1.5, and I get the following error at the "3-Loading-SrcSet0-Load Data" step:

    ODI-1227: Task SrcSet0 (Loading) fails on the source MICROSOFT_SQL_SERVER connection.
    Caused by: java.sql.SQLException: [FMWGEN][SQLServer JDBC Driver][SQL Server] Incorrect syntax near the keyword 'View'.

    where 'View' is the name of the (data) dimension column in the SQL view.


    Please help me with advice! Thank you very much!

    Have you checked the generated SQL code? You can run the generated SQL directly in SQL Server Management Studio.
    Are you using a column called View, or using a view? If so, this can be the issue.
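    To illustrate the reserved-keyword clash: a column literally named View has to be quoted to parse. SQL Server uses [bracket] quoting for such identifiers; the sketch below uses sqlite via Python purely because it accepts the same bracket quoting, so the idea can be demonstrated end to end:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# "View" is a keyword, so the column name is bracket-quoted everywhere it appears.
con.execute("CREATE TABLE src ([View] TEXT, amount REAL)")
con.execute("INSERT INTO src VALUES ('Sales', 42.0)")
value = con.execute("SELECT [View] FROM src").fetchone()[0]
print(value)
```

In ODI the practical fix along these lines is usually to alias or rename the column in the source view so the generated SQL never emits the bare keyword.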

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Help loading xml data in a loop


    Hello people, I come to you because I am making a small application that shows the files for my new download area. It is made using a liquid layout, and the file information loads from an xml file.
    I have a movieclip loaded with addChild, which in turn also loads several panels with addChild using a loop. The thing is that I wanted to use that same loop so that each time it adds a new panel it also loads the xml data into it, but I can't do it; every time it shows me a 1009 error saying that the object is null. Here I leave you part of the code, in case any of you can help me with this.
    Thank you in advance for the help, and here is a sample of the app: http://nesmothdesign.com/Media/home.swf

    // load the XML

    var imgLoader:Loader;
    var xml:XML;
    var xmlList:XMLList;
    var xmlLoader:URLLoader = new URLLoader();
    xmlLoader.load(new URLRequest("listado.xml"));
    xmlLoader.addEventListener(Event.COMPLETE, xmlLoaded);
    function xmlLoaded(event:Event):void
    {
        xml = XML(event.target.data);
        xmlList = xml.children();
        trace(xmlList.length());
    }

    // add the container for the panels

    var miContenedor:contenedor = new contenedor();
    addChild(miContenedor);
    Tweener.addTween(miContenedor, {x:stage.stageWidth/2 - 465, time:1, transition:"easeIn"});
    miContenedor.y = body_mc.y + 10;

    // add the container's children

    var miPartA:panelTipoA;
    var miPartB:panelTipoB;
    for (var a:int = 0; a <= 3; a++)
    {
        miPartA = new panelTipoA();
        miPartB = new panelTipoB();
        miContenedor.addChild(miPartA);
        miContenedor.addChild(miPartB);
        miPartA.y = a * miPartA.height + (a * 10);
        miPartB.y = a * miPartB.height + (a * 10);
        miPartB.x = miPartB.width + 15;
        imgLoader = new Loader();
        imgLoader.load(new URLRequest(xmlList[a].attribute("thumb")));
        miContenedor.miParteA.addChild(imgLoader);
    }

    Attention: the last 3 lines of code should add the respective xml data to each panel.

    miContenedor.miParteA.addChild(imgLoader);

    is the problem. There is no miContenedor.miParteA. Use:

    miParteA.addChild(imgLoader);

  • Loading the data into Essbase is slow

    Loading data into Essbase is slow.
    The loading speed is about 10 seconds per record.

    The standard KM is used.

    How can the loading of data into Essbase be optimized?

    Thank you

    --
    Gelo

    Just to let you know, the patch has been released:

    ORACLE DATA INTEGRATOR 10.1.3.5.2_02 ONE-OFF PATCH

    Patch ID - 8785893

    8589752: IKM SQL to Essbase (DATA) - load in bulk mode instead of row-by-row processing when an error occurs during the load

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Cannot load metadata from EBS

    Hi guys!

    I'm using 11.1.2.3.500 and trying to pull metadata from EBS 12.

    Everything is well configured (so far).

    I am trying to load 3 dimensions (Account, Entity and Product, in that order) from EBS, and I got a green OK on the Entity dim and a yellow triangle for the others (Account and Product).

    It seems that it loses the connection between Planning and Essbase. But why is it OK for the Entity dim?

    [Wed Jul 02 19:46:05 EST 2014] Executing cube refresh...

    [Wed Jul 02 19:46:05 EST 2014] An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    Thank you

    RMarques

    The whole log:

    ACCOUNT

    2014-07-02 19:45:20,399 [AIF] INFO: FDMEE process start, process ID: 30
    2014-07-02 19:45:20,399 [AIF] INFO: FDMEE logging level: 4
    2014-07-02 19:45:20,399 [AIF] INFO: FDMEE log file: /u01/app/oracle/product/epm/EPMSystem11R1/HPL/EPRI/outbox/logs/HPL_DEV_30.log
    2014-07-02 19:45:20,400 [AIF] INFO: User: admin
    2014-07-02 19:45:20,400 [AIF] INFO: Location: EBS_ACT_DL_AUD (Partitionkey:2)
    2014-07-02 19:45:20,400 [AIF] INFO: Period Name: NA (Period Key: null)
    2014-07-02 19:45:20,400 [AIF] INFO: Category Name: NA (Category Key: null)
    2014-07-02 19:45:20,400 [AIF] INFO: Rule Name: 1 (Rule ID:1)
    2014-07-02 19:45:23,681 [AIF] INFO: Jython Version: 2.5.1 (Release_2_5_1:6813, September 26 2009, 13:47:54)
    [JRockit (R) (Oracle Corporation)]
    2014-07-02 19:45:23,682 [AIF] INFO: Java Platform: java1.6.0_37

    2014-07-02 19:45:33,283 [AIF] INFO: COMM Pre Dimension Processing - Multi Process Validation - START
    2014-07-02 19:45:33,293 [AIF] INFO: COMM Pre Dimension Processing - Multi Process Validation - END
    2014-07-02 19:45:33,378 [AIF] INFO: LKM EBS/FS Extract Members Table VS - Process Value Sets - START
    2014-07-02 19:45:33,396 [AIF] INFO: LKM EBS/FS Extract Members Table VS - Process Value Sets - END
    2014-07-02 19:45:33,681 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - START
    2014-07-02 19:45:33,745 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - END
    2014-07-02 19:45:34,079 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - START
    2014-07-02 19:45:34,186 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - END
    2014-07-02 19:45:34,470 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - START
    2014-07-02 19:45:34,532 [AIF] INFO: EBS Extract Members Ind VS - Populate Members Dim Step - END
    2014-07-02 19:45:36,383 [AIF] INFO: EBS/FS Load Concat Dim - Load Concat Dim Members - START
    2014-07-02 19:45:36,386 [AIF] INFO: EBS/FS Load Concat Dim - Load Concat Dim Members - END
    2014-07-02 19:45:41,815 [AIF] INFO: EBS Hierarchy Processing Ind VS - Process Insert Details Ind VS - START
    2014-07-02 19:45:41,824 [AIF] INFO: EBS Hierarchy Processing Ind VS - Process Insert Details Ind VS - END
    2014-07-02 19:45:42,270 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - START
    2014-07-02 19:45:42,331 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - END
    2014-07-02 19:45:42,897 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - START
    2014-07-02 19:45:43,044 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - END
    2014-07-02 19:45:43,531 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - START
    2014-07-02 19:45:43,657 [AIF] INFO: EBS Extract Hierarchies Ind VS - Populate Hierarchy Dim Step - END
    2014-07-02 19:45:43,803 [AIF] INFO: EBS Extract Hierarchies Table VS - Populate Hierarchy Dim Step - START
    2014-07-02 19:45:43,805 [AIF] INFO: EBS Extract Hierarchies Table VS - Populate Hierarchy Dim Step - END
    2014-07-02 19:45:43,891 [AIF] INFO: COMM Load Single Dim Hierarchies - Load Single Dim Hierarchies - START
    2014-07-02 19:45:45,883 [AIF] INFO: COMM Load Single Dim Hierarchies - Load Single Dim Hierarchies - END
    2014-07-02 19:45:45,989 [AIF] INFO: COMM Load Concat Dim Hierarchies - Load Concat Dim Hierarchies - START
    2014-07-02 19:45:45,993 [AIF] INFO: COMM Load Concat Dim Hierarchies - Load Concat Dim Hierarchies - END
    2014-07-02 19:45:48,177 [AIF] INFO: COMM Dimension Member Attribute Processing - Process ICP Attributes - START
    2014-07-02 19:45:48,179 [AIF] INFO: COMM Dimension Member Attribute Processing - Process ICP Attributes - END
    2014-07-02 19:45:56,815 [AIF] INFO: HPL Load Metadata - Application: HPL_DEV
    2014-07-02 19:45:56,818 [AIF] INFO: Number of dimensions to load into HPL_DEV: 3
    2014-07-02 19:45:56,818 [AIF] INFO: Building SQL query for the Account dimension...
    2014-07-02 19:45:56,819 [AIF] INFO: Number of alias tables in HPL_DEV: 1
    2014-07-02 19:45:56,825 [AIF] INFO: Loading members for dimension: Account
    2014-07-02 19:45:56,825 [AIF] INFO: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Account.log
    2014-07-02 19:46,571 [AIF] INFO: Property file arguments: -C /RIC:* /D:Account /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR
    No arguments came from the command line. Resulting (merged) command line:
    -C /RIC:* /D:Account /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR
    [Wed Jul 02 19:45:56 EST 2014] The query specified with the /RIQ switch did not correspond to a key in the properties file of the "null" command, so it is run as a sql query directly:

    SELECT m.NAME "Account"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
         , m.MemberValidForPlan1 "Plan Type (FinPlan)"
         , m.MemberValidForPlan2 "Plan Type (RevPlan)"
         , m.MemberValidForPlan3 "Plan Type (CapProj)"
         , m.AccountType "Account Type"
         , m.TimeBalance "Time Balance"
         , m.VarianceReporting "Variance Reporting"
         , m.SourcePlanType "Source Plan Type"
    FROM (
      SELECT mem.NAME
           , CASE hie.PARENT WHEN '#root' THEN NULL ELSE hie.PARENT END PARENT
           , hie.CHILD_DEPTH_NUM
           , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
           , mem.TimeBalance
           , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
           , hie.MemberValidForPlan1
           , hie.MemberValidForPlan2
           , hie.MemberValidForPlan3
           , hie.MemberValidForCapex
           , hie.MemberValidForWorkforce
           , hie.SourcePlanType
           , CASE hie.DataStorage
               WHEN 'NeverShare' THEN 'Never Share'
               WHEN 'StoreData' THEN 'Store'
               WHEN 'ShareData' THEN 'Shared'
               ELSE hie.DataStorage
             END DataStorage
           , hie.HIERARCHY_ID
           , hd.BASE_HIERARCHY_FLAG
           , (SELECT prop.VALUE
              FROM AIF_HS_DIM_PROPERTYARRAY prop
              WHERE prop.LOADID = mem.LOADID
              AND prop.DIMENSION = mem.DIMENSION
              AND prop.PROPERTY = 'Alias'
              AND prop.NAME = mem.NAME
              AND prop.KEY = 'Default'
             ) VALUE1
      FROM AIF_HS_DIM_MEMBER mem
      INNER JOIN AIF_HS_DIM_HIERARCHY hie
        ON hie.LOADID = mem.LOADID
        AND hie.DIMENSION = mem.DIMENSION
        AND hie.CHILD = mem.NAME
      LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
        ON hd.HIERARCHY_ID = hie.HIERARCHY_ID
      WHERE mem.LOADID = 30
      AND mem.DIMENSION = 'ACCOUNT1'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
         , m.HIERARCHY_ID
         , m.CHILD_DEPTH_NUM
         , m.PARENT
         , m.NAME

    [Wed Jul 02 19:45:56 EST 2014] Attempting to establish input RDB connection.
    [Wed Jul 02 19:45:56 EST 2014] Input RDB source "EPM_REPO" at jdbc:oracle:thin:@server:1521/instace connected successfully.
    [Wed Jul 02 19:45:56 EST 2014] Input RDB login successfully completed.
    [Wed Jul 02 19:45:56 EST 2014] Header record fields: Account, Parent, Data Storage, Alias: Default, Plan Type (FinPlan), Plan Type (RevPlan), Plan Type (CapProj), Account Type, Time Balance, Variance Reporting, Source Plan Type
    [Wed Jul 02 19:45:56 EST 2014] Located and using 'Account' dimension for loading data into application 'HPL_DEV'.
    [Wed Jul 02 19:45:56 EST 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 886045942
    [Wed Jul 02 19:45:59 EST 2014] com.hyperion.planning.HspRuntimeException: Exchange Rate must be None if the Data Type is Non-currency, Percentage, SmartList, Date or Text.  Member: ATRDAY
    [Wed Jul 02 19:45:59 EST 2014] com.hyperion.planning.HspRuntimeException: Exchange Rate must be None if the Data Type is Non-currency, Percentage, SmartList, Date or Text.  Member: ATRDAY
    [Wed Jul 02 19:46:00 EST 2014] com.hyperion.planning.HspRuntimeException: An alias with the name Trading already exists for the Account member.
    [Wed Jul 02 19:46:00 EST 2014] com.hyperion.planning.HspRuntimeException: An alias with the name Trading already exists for the Account member.
    [Wed Jul 02 19:46:00 EST 2014] Load dimension 'Account' has been successfully opened.
    [Wed Jul 02 19:46:00 EST 2014] A cube refresh operation will not be performed.
    [Wed Jul 02 19:46:00 EST 2014] A create security filters operation will not be performed.
    [Wed Jul 02 19:46:00 EST 2014] Planning Outline data store load process finished. 1099 data records were read, 1099 data records were processed, 1085 were accepted for loading (verify actual load with Essbase log files), 14 were rejected.
    2014-07-02 19:46,571 [AIF] INFO: Account dimension load completed - records read: 1099, records rejected: 14, records processed: 1099.

    ENTITY

    2014-07-02 19:46, 574 [AIF] INFO: building for the dimension entity SQL query...

    2014-07-02 19:46, 579 [AIF] INFO: loading of the dimension members: entity

    2014-07-02 19:46, 579 [AIF] INFO: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Entity.log

    2014-07-02 19:46:04,247 [AIF] INFO: Property file arguments: /C /RIC:* /D:Entity /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR

    No arguments came from the command line. Resulting (merged) command line:

    /C /RIC:* /D:Entity /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR

    [Wed Jul 02 19:46:00 EST 2014] The query specified with the /RIQ switch did not correspond to a key in the "null" properties file, so it will be run directly as a SQL query:

    SELECT m.NAME "Entity"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
         , m.MemberValidForPlan1 "Plan Type (FinPlan)"
         , m.MemberValidForPlan2 "Plan Type (RevPlan)"
         , m.MemberValidForPlan3 "Plan Type (CapProj)"
    FROM (
        SELECT mem.NAME
             , CASE hie.PARENT WHEN '#root' THEN NULL ELSE hie.PARENT END PARENT
             , hie.CHILD_DEPTH_NUM
             , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
             , mem.TimeBalance
             , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
             , hie.MemberValidForPlan1
             , hie.MemberValidForPlan2
             , hie.MemberValidForPlan3
             , hie.MemberValidForCapex
             , hie.MemberValidForWorkforce
             , hie.SourcePlanType
             , CASE hie.DataStorage
                   WHEN 'NeverShare' THEN 'Never Share'
                   WHEN 'StoreData' THEN 'Store'
                   WHEN 'ShareData' THEN 'Shared'
                   ELSE hie.DataStorage
               END DataStorage
             , hie.HIERARCHY_ID
             , hd.BASE_HIERARCHY_FLAG
             , (SELECT prop.VALUE
                  FROM AIF_HS_DIM_PROPERTYARRAY prop
                 WHERE prop.LOADID = mem.LOADID
                   AND prop.DIMENSION = mem.DIMENSION
                   AND prop.PROPERTY = 'Alias'
                   AND prop.NAME = mem.NAME
                   AND prop.KEY = 'Default'
               ) VALUE1
          FROM AIF_HS_DIM_MEMBER mem
         INNER JOIN AIF_HS_DIM_HIERARCHY hie
            ON hie.LOADID = mem.LOADID
           AND hie.DIMENSION = mem.DIMENSION
           AND hie.CHILD = mem.NAME
          LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
            ON hd.HIERARCHY_ID = hie.HIERARCHY_ID
         WHERE mem.LOADID = 30
           AND mem.DIMENSION = 'Entity'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
           , m.HIERARCHY_ID
           , m.CHILD_DEPTH_NUM
           , m.PARENT
           , m.NAME

    [Wed Jul 02 19:46:00 EST 2014] Attempting to make input RDB connection.

    Input RDB source "EPM_REPO" on jdbc:oracle:thin:@server:1521/instace connected successfully.

    [Wed Jul 02 19:46:00 EST 2014] Input RDB sign-on completed successfully.

    [Wed Jul 02 19:46:00 EST 2014] Record header fields: Parent, Entity, Alias: Default, Data Storage, Plan Type (FinPlan), Plan Type (RevPlan), Plan Type (CapProj)

    [Wed Jul 02 19:46:00 EST 2014] Located and using "Entity" dimension for loading data into application "HPL_DEV".

    [Wed Jul 02 19:46:00 EST 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 872481248

    [Wed Jul 02 19:46:04 EST 2014] Load of dimension "Entity" completed successfully.

    [Wed Jul 02 19:46:04 EST 2014] A cube refresh operation will not be performed.

    [Wed Jul 02 19:46:04 EST 2014] A create security filters operation will not be performed.

    [Wed Jul 02 19:46:04 EST 2014] Planning Outline data store load process finished. 881 data records were read, 881 data records were processed, 881 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

    2014-07-02 19:46:04,247 [AIF] INFO: Entity dimension load completed - records read: 881, records rejected: 0, records processed: 881.

    PRODUCT

    2014-07-02 19:46:04,249 [AIF] INFO: Building SQL query for Product dimension...

    2014-07-02 19:46:04,253 [AIF] INFO: Loading dimension members: Product

    2014-07-02 19:46:04,253 [AIF] INFO: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Product.log

    2014-07-02 19:46:05,556 [AIF] INFO: Property file arguments: /C /RIC:* /D:Product /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR

    No arguments came from the command line. Resulting (merged) command line:

    /C /RIC:* /D:Product /RIU:* /RIR:* /RID:* /RIP:* /RIQ:* /IR

    [Wed Jul 02 19:46:04 EST 2014] The query specified with the /RIQ switch did not correspond to a key in the "null" properties file, so it will be run directly as a SQL query:

    SELECT m.NAME "Product"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
    FROM (
        SELECT mem.NAME
             , CASE hie.PARENT WHEN '#root' THEN NULL ELSE hie.PARENT END PARENT
             , hie.CHILD_DEPTH_NUM
             , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
             , mem.TimeBalance
             , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
             , hie.MemberValidForPlan1
             , hie.MemberValidForPlan2
             , hie.MemberValidForPlan3
             , hie.MemberValidForCapex
             , hie.MemberValidForWorkforce
             , hie.SourcePlanType
             , CASE hie.DataStorage
                   WHEN 'NeverShare' THEN 'Never Share'
                   WHEN 'StoreData' THEN 'Store'
                   WHEN 'ShareData' THEN 'Shared'
                   ELSE hie.DataStorage
               END DataStorage
             , hie.HIERARCHY_ID
             , hd.BASE_HIERARCHY_FLAG
             , (SELECT prop.VALUE
                  FROM AIF_HS_DIM_PROPERTYARRAY prop
                 WHERE prop.LOADID = mem.LOADID
                   AND prop.DIMENSION = mem.DIMENSION
                   AND prop.PROPERTY = 'Alias'
                   AND prop.NAME = mem.NAME
                   AND prop.KEY = 'Default'
               ) VALUE1
          FROM AIF_HS_DIM_MEMBER mem
         INNER JOIN AIF_HS_DIM_HIERARCHY hie
            ON hie.LOADID = mem.LOADID
           AND hie.DIMENSION = mem.DIMENSION
           AND hie.CHILD = mem.NAME
          LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
            ON hd.HIERARCHY_ID = hie.HIERARCHY_ID
         WHERE mem.LOADID = 30
           AND mem.DIMENSION = 'Product'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
           , m.HIERARCHY_ID
           , m.CHILD_DEPTH_NUM
           , m.PARENT
           , m.NAME

    [Wed Jul 02 19:46:04 EST 2014] Attempting to make input RDB connection.

    Input RDB source "EPM_REPO" on jdbc:oracle:thin:@server:1521/instace connected successfully.

    [Wed Jul 02 19:46:04 EST 2014] Input RDB sign-on completed successfully.

    [Wed Jul 02 19:46:04 EST 2014] Record header fields: Parent, Product, Alias: Default, Data Storage

    [Wed Jul 02 19:46:04 EST 2014] Located and using "Product" dimension for loading data into application "HPL_DEV".

    [Wed Jul 02 19:46:04 EST 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 677666057

    [Wed Jul 02 19:46:04 EST 2014] java.lang.RuntimeException: com.hyperion.planning.DuplicateObjectException: An object with the name WULC1 already exists.

    [Wed Jul 02 19:46:05 EST 2014] com.hyperion.planning.InvalidDimensionMemberNameException: Dimension member name "ORDER" is a Report Script command.

    [Wed Jul 02 19:46:05 EST 2014] Load of dimension "Product" completed successfully.

    [Wed Jul 02 19:46:05 EST 2014] Performing cube refresh...

    [Wed Jul 02 19:46:05 EST 2014] An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase. Verify that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 EST 2014] Unable to obtain analytic information and/or perform a data load: An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase. Verify that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 EST 2014] Information trace: com.hyperion.planning.utils.HspOutlineLoad::parseAndLoadInputFile:1912, com.hyperion.planning.utils.HspOutlineLoad::halAdapterInfoAndLoad:304, com.hyperion.planning.utils.HspOutlineLoad::loadAndPrintStatus:4667, com.hyperion.planning.utils.HspOutlineLoad::outlineLoadAsyncImpl:3752, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3692, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3810, com.hyperion.aif.webservices.HPLService::loadMetadata:500, sun.reflect.NativeMethodAccessorImpl::invoke0:-2, sun.reflect.NativeMethodAccessorImpl::invoke:39, sun.reflect.DelegatingMethodAccessorImpl::invoke:25, java.lang.reflect.Method::invoke:597, com.hyperion.aif.servlet.ODIServlet::doPost:97, javax.servlet.http.HttpServlet::service:727, javax.servlet.http.HttpServlet::service:820, weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction::run:227, weblogic.servlet.internal.StubSecurityHelper::invokeServlet:125, weblogic.servlet.internal.ServletStubImpl::execute:301, weblogic.servlet.internal.TailFilter::doFilter:26, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.security.jps.ee.http.JpsAbsFilter$1::run:119, oracle.security.jps.util.JpsSubject::doAsPrivileged:324, oracle.security.jps.ee.util.JpsPlatformUtil::runJaasMode:460, oracle.security.jps.ee.http.JpsAbsFilter::runJaasMode:103, oracle.security.jps.ee.http.JpsAbsFilter::doFilter:171, oracle.security.jps.ee.http.JpsFilter::doFilter:71, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.dms.servlet.DMSServletFilter::doFilter:163, weblogic.servlet.internal.FilterChainImpl::doFilter:56, weblogic.servlet.internal.RequestEventsFilter::doFilter:27, weblogic.servlet.internal.FilterChainImpl::doFilter:56, weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction::wrapRun:3730, weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction::run:3696, weblogic.security.acl.internal.AuthenticatedSubject::doAs:321, weblogic.security.service.SecurityManager::runAs:120, weblogic.servlet.internal.WebAppServletContext::securedExecute:2273, weblogic.servlet.internal.WebAppServletContext::execute:2179, weblogic.servlet.internal.ServletRequestImpl::run:1490, weblogic.work.ExecuteThread::execute:256, weblogic.work.ExecuteThread::run:221

    [Wed Jul 02 19:46:05 EST 2014] Planning Outline data store load process finished. 707 data records were read, 707 data records were processed, 674 were accepted for loading (verify actual load with Essbase log files), 33 were rejected.

    2014-07-02 19:46:05,556 [AIF] INFO: Product dimension load completed - records read: 707, records rejected: 33, records processed: 707.

    2014-07-02 19:46:05,558 [AIF] INFO: HPL metadata load finished with return status: true

    2014-07-02 19:46:05,653 [AIF] INFO: End of FDMEE process, process ID: 30
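    When triaging loads like the one above, the OLU summary lines can be scraped out of a log programmatically to spot partial rejections. A minimal sketch in Python (the regex targets the standard "Planning Outline data store load process finished" message; the function names are my own, not part of any Oracle tooling):

```python
import re

# Matches OutlineLoad utility summary lines of the form:
# "... 1099 data records were read, 1099 data records were processed,
#  1085 were accepted for loading (verify actual load with Essbase log
#  files), 14 were rejected."
SUMMARY = re.compile(
    r"(\d+) data records were read, "
    r"(\d+) data records were processed, "
    r"(\d+) were accepted for loading.*?"
    r"(\d+) were rejected"
)

def parse_summaries(log_text):
    """Return (read, processed, accepted, rejected) for each load in the log."""
    return [tuple(map(int, m.groups())) for m in SUMMARY.finditer(log_text)]

def loads_with_rejections(log_text):
    """Return only the summaries where at least one record was rejected."""
    return [s for s in parse_summaries(log_text) if s[3] > 0]
```

    Run over the log above, loads_with_rejections would flag the Account (14 rejected) and Product (33 rejected) loads but not Entity.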

    Hi all,

    It works now.

    The Oracle engineer suggested changing some variables in setCustomParamErpIntegrator.sh, and it worked.

    As I had some time, I tried to track down the exact change, and found it to be LD_LIBRARY_PATH, which must include the Planning LD_LIBRARY_PATH.

    Francisco Amores

    Cheers,
    Safiya

  • OutlineLoad utility not loading text or dates in Hyperion Planning

    Good afternoon

    I have tried several times to use the OutlineLoad utility to load text and dates into a Planning application, without success. I am using EPM version 11.1.2.1. Here are the steps I took:

    1. In EPMA, created a new member, CommentsText, under the Account dimension, set its data type to "Text", and then deployed the application.

    2. Created the following source file:

    Driver Member,Value,Point-of-View,Data Load Cube Name

    CommentsText,here go my comments,"Jan, FY10, USD, Actual, Final, C101",BUD_IS

    While preparing the source file, I made sure every dimension was represented by one lowest-level member.
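    (As an aside: one way to make sure the comma-separated Point-of-View survives as a single quoted field is to generate the file with a small script. A sketch in Python, reusing the values from this post; the file name and members are just examples:)

```python
import csv

# One row per driver member. The Point-of-View names one lowest-level
# member from every dimension not otherwise on the record, and must end
# up as a single quoted field because it contains commas.
rows = [
    ("CommentsText", "here go my comments",
     "Jan, FY10, USD, Actual, Final, C101", "BUD_IS"),
]

with open("loadtext5.csv", "w", newline="") as f:
    writer = csv.writer(f)  # default QUOTE_MINIMAL quotes fields with commas
    writer.writerow(["Driver Member", "Value", "Point-of-View",
                     "Data Load Cube Name"])
    writer.writerows(rows)
```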

    3. I ran the command below at the DOS command line:

    OutlineLoad /S:localhost /A:PLN_GAL /U:admdem /M /I:d:\temp\loadtext5.csv /TR /L:d:\temp\Outlineload-log.log /X:d:\temp\Outlineload-exceptions.exc

    The command appeared to run successfully. This is what the log file shows:

    [Thu Sep 26 19:16:36 CDT 2013] Input file located and opened successfully "d:\temp\loadtext5.csv".

    [Thu Sep 26 19:16:36 CDT 2013] Record header fields: Driver Member, Value, Point-of-View, Data Load Cube Name

    [Thu Sep 26 19:16:36 CDT 2013] Located and using "OFADataLoadDimension" dimension for loading data into application "PLN_GAL".

    [Thu Sep 26 19:16:36 CDT 2013] The parent-child ordering option (/H switch) will not be performed: this option is not available for dimension "OFADataLoadDimension".

    [Thu Sep 26 19:16:36 CDT 2013] The member input-file ordering option (/O switch) will not be performed: this option is not available for dimension "OFADataLoadDimension".

    [Thu Sep 26 19:16:38 CDT 2013] A cube refresh operation will not be performed.

    [Thu Sep 26 19:16:38 CDT 2013] A create security filters operation will not be performed.

    [Thu Sep 26 19:16:38 CDT 2013] Examine the Essbase log files to determine the status of whether the Essbase data was loaded.

    [Thu Sep 26 19:16:38 CDT 2013] Planning Outline data store load process finished. 1 data record was read, 1 data record was processed, 1 was successfully loaded, 0 were rejected.

    But even though the log indicates the comment was written, no text is actually written!

    Two possible problems I've noticed are:

    1. The log says: Located and using "OFADataLoadDimension" dimension for loading data into application "PLN_GAL". I don't know what this "OFADataLoadDimension" dimension can be.

    2. Even though I set the data type to Text in EPMA, the data type shows as NUMERIC, NOT TEXT, once I deploy the application and check in the Administration Services console. This blows my mind. I don't know why this is happening.

    Can someone please help? I have tried everything from A to Z for 2 days without success.

    Thank you!

    Luis

    I assume that you have set the evaluation order in EPMA - the Data Type evaluation order setting.

    To be honest, I have not used the OutlineLoad utility to load data when Planning is in EPMA mode, only in classic mode. In theory it should work, as it is data and not metadata, but it is not something I have tested.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • Loading data into a Hyperion application

    Hello
    I have data in an Essbase cube and want to load this data into my Hyperion Planning application.
    What tools should I use, and are they free?

    1. If you have data in your Essbase cube, and assuming that cube was created from the Planning application (Create/Refresh Database), then you should be able to see the data in your Planning data forms without any kind of data copy.

    2. If the data is in a completely separate Essbase cube, then you can export the data from that cube using DATAEXPORT (http://docs.oracle.com/cd/E12825_01/epm.111/esb_techref/dataexport.htm) and import the result into the Essbase cube that is created from the Planning application.
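    For reference, a minimal DATAEXPORT calculation script looks something like the following (the FIX members and file path are placeholders; see the Tech Ref link above for the full list of export options):

```
SET DATAEXPORTOPTIONS
{
    DataExportLevel "LEVEL0";
    DataExportColFormat ON;
};
FIX ("Actual", "Final", "FY14")
    DATAEXPORT "File" "," "/tmp/plan_export.txt" "#MI";
ENDFIX
```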

    Any questions?

  • Loading data into Web Forms

    Quick question: can we load data into Planning directly using web forms? I believe we can. Apart from creating a web form and starting to key in numbers, is there anything else to do?

    Web forms are for data input, so the answer is yes; you are really entering data into Essbase through Planning.
    Just design your web form and you are ready to enter data.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Can ODI load data into Planning and REPLACE the existing data?

    Hello
    Can ODI (9.3.1.1) load data into Planning and REPLACE the existing data?
    Currently our ODI load "ADDs" to the existing data - so if I accidentally run the load twice, it would double the Planning numbers. I want to make it REPLACE, so that running it more than once has no effect.

    Using the Essbase adapter you would use a load rule; in the load rule you can set whether to replace or add to existing values. It is not difficult to switch.
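    For example, in MaxL a data load through a rules file looks roughly like this; whether incoming values replace or add to existing data is controlled inside the rule itself (the application, database, file and rule names below are placeholders):

```
import database PlanApp.Plan1 data
    from data_file 'plan_data.txt'
    using server rules_file 'ldData'
    on error write to 'plan_load.err';
```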

    Cheers

    John
    http://John-Goodwin.blogspot.com/
