HFM data question

HFM data issue

We have a problem: you can see the data for an entity when you run the FR (Financial Reporting) report for an account, but when I open the same POV in an HFM data grid, it shows nothing (empty).


Since this is an elimination account (the data should be eliminated at all times), HFM shows the negative of the amounts displayed in the report that were entered by the user. HFM is showing bad values after the consolidation (it is consolidating all the elimination entries). Has any organization faced this kind of problem before? HFM and FR both get their values from the same database.



Hello

Are you sure you are looking at the exact same POV?

Kind regards

Thanos

Tags: Business Intelligence

Similar Questions

  • Custom FDM scripts to control HFM (for example, run a translate in HFM) after an FDM data load via the Batch Loader

    Currently, we have a Microsoft Access database + VBA scripts that use the HFM API objects to connect to the HFM application and perform the following tasks:

    (1) clear data for a specific HFM POV

    (2) load data from a text file into the HFM application

    (3) run a data consolidation in HFM for 3 different POVs

    (4) run a data translate in HFM for a specific POV

    (5) when the process is complete, send an e-mail with the logs to the Hyperion administration team

    We want to replace this MS Access database with FDM.

    From a custom FDM script, I am able to run the Batch Loader to load data into our HFM application.

    However, from FDM I also want to connect to our HFM application to run a consolidation (in HFM) and a translate (in HFM) after the data has been loaded by the FDM Batch Loader.

    The problem I have is that I can't use the following VB script from FDM (this code works in MS Access):

    Function OpenHfmApp(sDomain As String, sUser As String, fun As String, sServeur As String, sApp As String)

    Dim cClient As HsxClient

    Dim cSession As HsvSession

    Dim cServer As HsxServer

    Set cClient = New HsxClient

    cClient.SetLogonInfoSSO sDomain, sUser, "", fun

    cClient.OpenApplication sServeur, "Financial Management", sApp, cServer, cSession

    Set OpenHfmApp = cSession

    End Function

    FDM does not like the 'As' clauses - so I should write:

    Function OpenHfmApp(sDomain, sUser, fun, sServeur, sApp)

    Dim cClient

    Dim cSession

    Dim cServer

    Set cClient = New HsxClient

    cClient.SetLogonInfoSSO sDomain, sUser, "", fun

    cClient.OpenApplication sServeur, "Financial Management", sApp, cServer, cSession

    Set OpenHfmApp = cSession

    End Function

    When I run this code from FDM, I get the following error message (in Financial Data Management Workbench):

    500 Variable is undefined: 'HsxClient'

    Line: 565

    My questions are:

    (1) Is it possible to control HFM via VB scripts from FDM to perform tasks such as HFM clear, HFM consolidate and HFM translate?

    (2) If so, how can I reference the HFM objects in the FDM VB script editor [custom general] (to use the HFM API via FDM VB scripts)?

    Thank you

    Claude

    Good to know that the 'out of the box' functionality meets your requirements. You can mark the thread as answered now.

  • ODI error when loading data into a classic HFM application

    Hi all

    I'm doing a one-step integration from the source (flat file) to the HFM application. I created the integration, and on loading data I get the following error:

    org.apache.bsf.BSFException: exception of Jython:

    Traceback (most recent call last):

    File "<string>", line 3, in <module>

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    Caused by: Traceback (most recent call last):

    File "<string>", line 3, in <module>

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at org.python.core.PyException.fillInStackTrace(PyException.java:70)

    at java.lang.Throwable.<init>(Throwable.java:181)

    at java.lang.Exception.<init>(Exception.java:29)

    at java.lang.RuntimeException.<init>(RuntimeException.java:32)

    at org.python.core.PyException.<init>(PyException.java:46)

    at org.python.core.PyException.<init>(PyException.java:43)

    at org.python.core.Py.JavaError(Py.java:455)

    at org.python.core.Py.JavaError(Py.java:448)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)

    at org.python.core.PyObject.__call__(PyObject.java:355)

    at org.python.core.PyMethod.__call__(PyMethod.java:215)

    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)

    at org.python.core.PyMethod.__call__(PyMethod.java:206)

    at org.python.core.PyObject.__call__(PyObject.java:397)

    at org.python.core.PyObject.__call__(PyObject.java:401)

    at org.python.pycode._pyx15.f$0(<string>:6)

    at org.python.pycode._pyx15.call_function(<string>)

    at org.python.core.PyTableCode.call(PyTableCode.java:165)

    at org.python.core.PyCode.call(PyCode.java:18)

    at org.python.core.Py.runCode(Py.java:1204)

    at org.python.core.Py.exec(Py.java:1248)

    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)

    ... 19 more

    Caused by: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)

    ... 33 more

    Caused by: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at com.hyperion.odi.hfm.ODIHFMUtil.validateRequiredProperties(ODIHFMUtil.java:87)

    at com.hyperion.odi.hfm.ODIHFMMetadataLoader$OptionsMetadataLoad.validate(ODIHFMMetadataLoader.java:66)

    at com.hyperion.odi.hfm.ODIHFMMetadataLoader.validateOptions(ODIHFMMetadataLoader.java:188)

    at com.hyperion.odi.hfm.ODIHFMAppStatement.validateLoadOptions(ODIHFMAppStatement.java:168)

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:195)

    ... 38 more

    Any help appreciated

    Thank you

    Pratik

    Hi all

    Thanks for the help!

    I solved the error.

    Solution: I had been using the duplicate model, i.e. a copy of HFMData; what solved it was using the original data model, i.e. HFMData itself.

    Generally we use a duplicate, but here that is not the way.

    Thank you

    Pratik

  • Strange issue - duplicated data in a table, JDev 12c

    Hello

    I have a strange issue with my table. I use Oracle JDeveloper 12c. On my page I use 2 VOs, and the secondary one has a problem displaying data - it displays everything twice. Even after I deleted the table and re-added it to my page, the problem is the same. nodatatodisplay.png

    No matter whether there is data for my record in the primary table or not, when there is data to display it shows up twice. I don't know if it helps, but for more information: the table is located in a popup.

    Anyone have this problem?

    Kind regards

    WK

    You must specify the exact JDev version (there are two versions of 12c).

    I had this problem in 12.1.2 (a long time ago, and because of too many bugs in that version I didn't bother to find the reason), but it seems this problem went away in 12.1.3 (at least, for my application).

    Dario

  • HFM/FDM data loading

    I inherited the company's Hyperion environment; I am an Essbase user who is now also working on HFM.

    A user wants write access to a specific entity in HFM. Do I:

    (a) give the user write access to the security group that the entity is part of? or
    (b) go into FDM, create a location, and link it to HFM?

    The entity into which the person wishes to load data does not yet have a location in FDM (I added the new entity last week). (For reference: HFM/FDM 11.1.2.1, SQL db, Windows operating system)

    Help, please!

    RIM,

    (a) Yes. Even if they have FDM access to a location that corresponds to the HFM entity, they will not be able to actually export/load data to HFM without the right security access.
    (b) It depends a bit. If the new entity is loaded as part of an existing ledger/location, you just need to update the entity map for the existing location. If it's a totally new company, then you go the whole nine yards and create the FDM location, HFM entity, maps, etc.

  • HAL Data Source question

    We are still using HAL (I know, we are in the stone age here). I am trying to load members into Planning via HAL and can find no documentation on HAL anywhere. The specific question I have is that I don't know how to set the Data Storage property for the members being loaded. If I try to load a dynamic member as 'Dynamic' I get the error "com.hyperion.planning.InvalidHALValueException: Data Storage: Dynamic" - does anyone know where I can find the list of valid Data Storage settings for the load?

    In theory, the properties should be the same as what is in the Planning documentation; here is an example for Account - http://docs.oracle.com/cd/E17236_01/epm.1112/hp_admin_11122/ch05s02s04s02.html

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Error loading data into an HFM application

    Hello
    Friends, I'm totally new to HFM. I created an application and loaded the metadata; when loading data, it gives error messages like:

    Line: 2, error: invalid cell for the period January.
    Actual; 2009; January; Periodic; Marina Beach; INR; Income; [ICP None]; Tea; [None]; [None]; [None]; 5000

    Line: 3, error: invalid cell for the period July.
    Budget; 2010; July; Periodic; White field; INR; Income; [ICP None]; Cofee; [None]; [None]; [None]; 4800

    Please help me debug this error

    The data seems to be OK, so I suggest you log into your application and check that the intersection is a valid data-entry point. Essentially, in a data grid select the same members as in the data row. This will tell you what the matter is...

  • small data mart tools question

    Sorry, I'm pretty new to OLAP!

    We have a small operational system that creates about 120K records a year in a single table, with a couple of two-level lookup tables. The "hour" component is stored with the measure, which is already aggregated to the desired 'day' granularity.

    CategoryLevel1-> CategoryLevel2-> measurement with date <-LocationLevel1 <-LocationLevel2

    We want to target a "light" BI design using Pentaho, Mondrian and Saiku against an Oracle database. If we need another schema, it's OK to have it in the same database.

    We are considering simply using materialized views as the fact and dimension tables for ETL, as described here:

    http://WW1.ucmss.com/books/LFS/CSREA2006/IKE4645.PDF

    Is this a common approach? Are there disadvantages that matter for our effort?

    Appreciate any ideas you can provide.

    I don't know if this will help, but there is a nice white paper on how the Oracle Database OLAP option can be used, at http://www.oracle.com/technetwork/database/options/olap/oracle-olap-11gr2-twp-132055.pdf.

    There are other benefits to the Oracle OLAP option as well.

    -Ken Chin

  • HFM 11.1.1.3 annotations question

    I have a report that has 3 columns - actual, budget, forecast - and then later I have three more columns that take the current POV for year and month and retrieve the annotations for each scenario as listed earlier. Any reason why it's only retrieving annotations for one column and not 3 separately?

    Thank you

    Are you on the latest versions of FR and HFM? There were several changes to annotations from base 11.1.1.3 to the latest patch - Patch 14170443, Patch Set Update: Oracle Hyperion Reporting and Analysis, Financial Reporting 11.1.1.3.526

    THX
    VIVEK

  • data grid question

    I have a DataGrid with 5 columns, and I can change the column order in the data grid by drag and drop. Now I have a reset button; on clicking it I want to restore the previous column order. How do I implement this? Please help me if anyone knows the answer.

    For example:

    I have columns A, B, C, D, E; I change the order to B, A, D, E, C.

    Now, when the reset button is clicked, I want column order A, B, C, D, E.

    Hello

    columns is an array. After the dataProvider is set, save the columns in a temporary array (use slice() to save a copy rather than a reference):

    private var tempColumns:Array;
    tempColumns = datagrid.columns.slice();
    

    When the user clicks the reset button, restore the saved columns:

     datagrid.columns = tempColumns;
    
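A note on why the temporary array should be a copy: ActionScript arrays are assigned by reference, so if the grid reorders its columns array in place, a backup made with plain assignment changes along with it (in ActionScript, `datagrid.columns.slice()` returns a shallow copy). Here is the same pitfall sketched in Python, purely for illustration:

```python
columns = ["A", "B", "C", "D", "E"]

saved_ref = columns         # reference: follows later changes
saved_copy = list(columns)  # shallow copy: frozen snapshot

# Simulate the user dragging columns into a new order (in-place change).
columns[:] = ["B", "A", "D", "E", "C"]

print(saved_ref)   # the "backup" changed too: ['B', 'A', 'D', 'E', 'C']
print(saved_copy)  # safe to restore from: ['A', 'B', 'C', 'D', 'E']
```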
  • Newbie date conversion/comparison question

    I am trying to compare the date of a file with the current time using this code:


    <cfset myPath = "C:\path\to\myfile.txt">
    <cfset myExists = FileExists(myPath)>
    <cfset timeout = DateAdd("h", -6, Now())>
    <cfset myFile = CreateObject("java", "java.io.File")>
    <cfset myFile.init(myPath)>
    <cfset last_modified = myFile.lastModified()>
    <cfif (last_modified lt timeout) or (not myExists)>

    <!--- if myPath does not exist or is at least 6 hours old, regenerate the file --->

    </cfif>

    My problem is that the DateAdd function creates the date and time in {ts xxx} format, while the .lastModified() function returns a timestamp in milliseconds.

    Which function do I use to compare these two different time stamps?

    You might have more luck with cfdirectory and its dateLastModified field. Here are some excerpts from one of my programs that does date-based file maintenance.

    ThisDate1 = DateAdd("d", -90, now());


    select name from AllFiles
    where dateLastModified <

    etc.
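Another option for the original question is to convert the millisecond timestamp into a date value before comparing, rather than the other way around. Here is the logic as a minimal Python sketch (the function name and 6-hour threshold are illustrative, not from the original code):

```python
import datetime

def is_stale(last_modified_ms: int, max_age_hours: int = 6) -> bool:
    """Return True when a file's mtime (epoch milliseconds, as returned by
    java.io.File.lastModified) is older than max_age_hours."""
    mtime = datetime.datetime.fromtimestamp(last_modified_ms / 1000)
    cutoff = datetime.datetime.now() - datetime.timedelta(hours=max_age_hours)
    return mtime < cutoff

# A file modified 7 hours ago is stale; one modified just now is not.
seven_hours_ago = datetime.datetime.now() - datetime.timedelta(hours=7)
print(is_stale(int(seven_hours_ago.timestamp() * 1000)))             # True
print(is_stale(int(datetime.datetime.now().timestamp() * 1000)))     # False
```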

  • dba_segments data question

    Hello

    I run this query every day to see the size:

    select sum(bytes)/1024/1024/1024 from dba_segments;


    Is it possible to also get the previous figure, i.e. what we had yesterday?

    For example: today we got 10002; can we also get yesterday's 10000?


    Kind regards
    Brijesh

    select sum(bytes)/1024/1024/1024 from dba_segments;

    today we got 10002; can we also get yesterday's 10000?

    Why not? Do you drop something?
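To add to the reply above: dba_segments only reflects the current sizes, so yesterday's figure has to come from a snapshot you record yourself, e.g. a nightly job inserting the total into a small history table. A minimal sketch of the idea, with SQLite standing in for Oracle and made-up table/column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE size_history (snap_date TEXT PRIMARY KEY, gb REAL)")

# A nightly Oracle job could run something like:
#   INSERT INTO size_history
#   SELECT TRUNC(SYSDATE), SUM(bytes)/1024/1024/1024 FROM dba_segments;
conn.execute("INSERT INTO size_history VALUES ('2016-01-10', 10000.0)")
conn.execute("INSERT INTO size_history VALUES ('2016-01-11', 10002.0)")

# Daily growth = latest snapshot minus the previous one.
rows = conn.execute(
    "SELECT snap_date, gb FROM size_history ORDER BY snap_date"
).fetchall()
growth = rows[-1][1] - rows[-2][1]
print(f"grew by {growth} GB since yesterday")  # grew by 2.0 GB since yesterday
```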

  • Simple date question...

    I know it's pretty simple, but I just can't come up with a solution. I want to display the week of the month and I can't figure out how. I've done it before, but I can't remember how, silly me!

    I want to output: this is the 2nd week of October, or something to that effect.

    Thank you

    Handset-

    Edit 1: There was a bug in the code that I'm working on. Will post once it is fixed.

    Edit 2: Found it... here is the code that works.

    CR
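The working code was not preserved in this thread, but a common way to compute the week of the month is from the day of the month. A minimal sketch (note this counts days 1-7 as week 1, which may differ from calendar-aligned definitions that start each week on Sunday or Monday):

```python
import datetime

def week_of_month(d: datetime.date) -> int:
    # Days 1-7 -> week 1, days 8-14 -> week 2, and so on (max week 5).
    return (d.day - 1) // 7 + 1

def ordinal(n: int) -> str:
    # Adequate for week numbers 1-5; a general version must handle 11-13.
    suffix = {1: "st", 2: "nd", 3: "rd"}.get(n, "th")
    return f"{n}{suffix}"

d = datetime.date(2023, 10, 9)
print(f"this is the {ordinal(week_of_month(d))} week of {d.strftime('%B')}")
```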

  • FDMEE to HFM load question

    Hi all

    I am facing a problem loading a file from FDMEE to HFM (version 11.1.2.4). I am able to complete all the steps in the workbench. However, once the export step runs from the workbench, even though I can see the exported records in the HFM ".dat" file, I can see nothing in HFM. The FDMEE log indicates that the records were loaded to HFM. I copied the log for the load below. I checked the Enable Load Adjustment setting; it is set to Yes. I have administrator access for FDMEE and application administrator access in HFM. Is there anything else I should check in order to load data? Appreciate any thoughts on this issue.

    2016-01-11 11:29,783 INFO [AIF]: FDMEE process: 77, log level: 4, log file: \\75612-AWA\fdmee1\Data\outbox\logs\SPDEV_77.log

    2016-01-11 11:29,783 INFO [AIF]: Location: NtzGL (Partitionkey:3)

    2016-01-11 11:29,784 INFO [AIF]: Period name: Jul-2015 (period key: 7/31/15 12:00 AM)

    2016-01-11 11:29,784 INFO [AIF]: Category name: real (category key: 1)

    2016-01-11 11:29,785 INFO [AIF]: Rule name: NtzGL (rule ID:7)

    2016-01-11 11:29:02,097 INFO [AIF]: FDM version: 11.1.2.4.100

    2016-01-11 11:29:02,097 INFO [AIF]: Log file encoding: UTF-8

    2016-01-11 11:29:04,009 INFO [AIF]: - START IMPORT STEP -

    2016-01-11 11:29:04,063 INFO [AIF]: - END IMPORT STEP -

    2016-01-11 11:29:04,117 INFO [AIF]: - START EXPORT STEP -

    2016-01-11 11:29:04,318 INFO [AIF]: File name: dummy.txt

    2016-01-11 11:29:04,336 INFO [AIF]:

    Moving data for period "Jul-2015".

    2016-01-11 11:29:04,365 INFO [AIF]: Executing the following script: D:\Oracle\Middleware\EPMSystem11R1/products/FinancialDataQuality/bin/HFM_EXPORT.py

    2016-01-11 11:29:04,668 INFO [AIF]: Cluster name: DEV3

    2016-01-11 11:29:04,975 INFO [AIF]: Executing the following script: D:\Oracle\Middleware\EPMSystem11R1/products/FinancialDataQuality/bin/HFM_LOAD.py

    2016-01-11 11:29:05,084 INFO [AIF]: ************************************************************************************

    2016-01-11 11:29:05,084 INFO [AIF]: * HFM_LOAD.py started for LoadID: 77

    2016-01-11 11:29:05,085 INFO [AIF]: ************************************************************************************

    2016-01-11 11:29:05,294 INFO [AIF]: Cluster name: DEV3

    2016-01-11 11:29:05,438 INFO [AIF]: Connected to: SPDEV

    2016-01-11 11:29:09,509 INFO [AIF]: Data load started: 11/01/2016 11:29:05.

    2016-01-11 11:29:09,509 INFO [AIF]: Data load completed: 11/01/2016 11:29:05.

    2016-01-11 11:29:09,522 INFO [AIF]: Records loaded: 2

    2016-01-11 11:29:11,917 INFO [AIF]: URL load started: 11/01/2016 11:29:09.

    2016-01-11 11:29:11,917 INFO [AIF]: Loading URL FDMEE_URL

    2016-01-11 11:29:11,917 INFO [AIF]: URL load completed: 11/01/2016 11:29:09.

    2016-01-11 11:29:11,941 INFO [AIF]: ************************************************************************************

    2016-01-11 11:29:11,974 INFO [AIF]: - END EXPORT STEP -

    2016-01-11 11:29:12,030 INFO [AIF]: Executing the following script: D:\Oracle\Middleware\EPMSystem11R1/products/FinancialDataQuality/bin/HFM_CONSOLIDATE.py

    2016-01-11 11:29:12,282 INFO [AIF]: Cluster name: DEV3

    2016-01-11 11:29:12,588 INFO [AIF]: FDMEE process end, process ID: 77

    Kind regards

    Nik

    OK, I think you need to open an SR with Oracle.

    It could be that FDMEE/ODI is not performing the HFM load operations; it may be a communication problem.

    Can you confirm: if you had manually loaded data to the intersection through Smart View and then ran the FDMEE process, would the data be replaced?

    Have you ever been able to load properly in this environment?

  • How to load HFM data into Essbase

    Hello

    How can we bring HFM data into an Essbase cube without using EAL, since we have performance problems using EAL - DSS as a source with OBIEE reporting?

    With Extended Analytics, I heard we can only bring level-0 HFM data into Essbase, and we would need to write the currency conversion and ICP elimination calculations in Essbase to roll up to the parent members. Is this true?

    Also, how can we convert HFM security to Essbase security?

    Please advise me on this.

    Thank you
    Vishal

    Security will be a bit tricky, as Essbase generally uses filters while HFM uses security classes. You can potentially use shared groups in Shared Services, but licensing issues can arise depending on provisioning. Your best bet may be to look at LCM artifact exports to handle it.
