HFM processes

Hello

I have 2 HFM applications, but I see 7 HsvDatasource.exe processes running on the server. Can you tell me why the difference?

Appreciate your answer...

Hello

You can check hsveventlog.log for the application name for which each hsvdatasource.exe is created, in order to verify whether any of them point to another application or a different EPS.
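
If it helps to see which processes are out there before checking the log, a small standalone VBScript run on the HFM application server can list them with their PIDs. This is only a convenience sketch, not anything HFM-specific:

' List the running HsvDatasource.exe processes with their process IDs,
' so each one can be matched against the application name recorded in
' hsveventlog.log.
Dim wmi, procs, p
Set wmi = GetObject("winmgmts:\\.\root\cimv2")
Set procs = wmi.ExecQuery( _
    "SELECT ProcessId, CommandLine FROM Win32_Process WHERE Name = 'HsvDatasource.exe'")
For Each p In procs
    WScript.Echo p.ProcessId & vbTab & p.CommandLine
Next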

Kind regards

Abdul

Tags: Business Intelligence

Similar Questions

  • Control of HFM process data

    Hello

    I am using HFM 11.1.2.3. I have a question about process control of HFM data.

    For example, the Japan entity has 3 level-0 members (A, B, C). I would like to promote the data for only two members (A and B). Is it possible to start the process for members A and B only? I want to ignore member C.

    I know that all of the child members must be at the same level in order to promote to the review level. But I need to know if there are any other possibilities.

    Thank you

    Michel k

    Yes, you can start only A and B, but when you are finished with your consolidation and want to submit, publish and lock the parent (Japan) entity, you will also need to submit, publish and lock the C entity (for example, by disabling its validation account).

  • HFM process management

    A process management question:

    If a user promotes data at the local level and adjustments are later made to it at the corporate level (which now affects all of the processing units), can the local user then re-review and promote? One thought is to give the local user Review Supervisor status so they can review and approve, since Submit seems to end access at that review level. That seems like too much access for a local user, though.

    Any thoughts?

    Jeo123,

    Thanks for the reply. I have implemented GLdata, LocalAdjs, and CorporateAdj members in a local instance of HFM. Would these three members be children of AllCustomX (with rules to calculate the combined data in that parent)? Also, would users then have a 4th submission phase for AllCustomX to promote, submit, etc., with GLdata loaded over the Adjs?

  • How to select entities for HFM process management?

    Hi all

    I am a bit confused. As far as I know, process management is enabled through the SupportsProcessManagement property on my Scenario metadata. Now let's say I have an application with more than 800 entity members in the Entity dimension. Will all of those entities then require process management, and how can I decide which ones?

    Help, please.

    Thank you

    Zitouni

    Hello. Once you enable it for the scenario, all entities will require starting, promotion, etc. The Process Control screen lets you apply actions to a selected entity and its descendants, so it is not difficult.

    Eric

  • EN error: * Error: invalid report object *.

    Hi all

    When we run the report we get the error "* Error: invalid report object *". It happens for a specific base entity for some users; other users are able to display data at that level. All users have the same level of access. If anyone can help...

    Thanks in advance.

    Ministry of health

    Dear Ministry of health,

    Have you implemented process management in HFM for your application? If yes, you need to start the entity first.

    Cheers,
    Hanane

  • Copying data between custom dimension members or scenarios

    Dear Experts,

    I'm back with another silly question.
    Does anyone know how to copy all the data (not only entity currency) between members of a custom dimension? Our HFM application holds different types of data depending on the statement type (legal and audit). Whenever we are done with our legal data, we copy our consolidated legal data over for audit. We tried to create a formula (using HS.Exp) to move data from legal to audit, but apparently this formula has to be written inside Sub Calculate(). That is not what we want, because we do not want to run the data copy every time a user clicks Calculate.

    We also perform data copies between scenarios, such as Actual to Budget. This activity also requires the data to be fully consolidated and final.

    So the conditions for the data copy are:
    - the data must be consolidated
    - copy all the data, including adjustments

    Is there a formula or a way to copy the data that can be triggered individually (one click that runs only the data copy) by the user?

    Thank you very much for your kind response,
    -Anna

    Hi Anna,
    As you say, you cannot trigger different parts of the rules to run in HFM. Instead, you use a condition. In your case this condition could be driven by process management. You can use a combination of the GetSubmissionPhase, ReviewStatus and ReviewStatusUsingPhaseID functions, which return the current process management level and submission phase details. To me it sounds like, as soon as you have finished legal, you complete one part of your submission cycle and enter another review stage. Those are parts of your submission process whether you have defined process management in HFM or not, which means you should consider using it.
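
    A minimal sketch of that idea inside Sub Calculate(), assuming hypothetical Legal and Audit Custom1 members and assuming HS.ReviewStatus() returns status strings such as "Published" (check the exact function names and return values in the rules documentation for your release):

    Sub Calculate()
        ' Only run the Legal -> Audit copy once the current data unit has
        ' been published in process management; on any other calculate the
        ' copy is skipped.
        Dim strStatus
        strStatus = HS.ReviewStatus()

        If strStatus = "Published" Then
            ' Copy every account from the (hypothetical) Legal member of
            ' Custom1 to the Audit member.
            HS.Exp "C1#Audit = C1#Legal"
        End If
    End Sub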

    -Kostas

  • FDM help urgently...

    Experts,
    Can you please confirm whether my understanding below is correct or not? I am asking about FDM (without ERPi). I would appreciate it if we can go through it point by point.

    1. FDM is a tool for loading data to Essbase and HFM
    2. FDM can extract data from flat files and load the data to Essbase and HFM
    3. FDM can extract data from an Oracle table and load the data to Essbase and HFM
    4. FDM cannot connect to Essbase to extract data directly. Rather, some native process such as a calculation script data export first extracts the data to a flat file, and FDM can take this flat file and load it to another Essbase database or to HFM as needed
    5. FDM cannot connect to HFM to extract data directly. Rather, some native process such as the HFM data export first extracts the data to a flat file, and FDM can take this flat file and load it to other Essbase databases or HFM applications
    6. FDM can never load data to an Oracle table

    I have really struggled with these points and have not got a definitive answer. If any of you know the answers, it would be a great help.

    Thanks for your help.
    Dev

    Published by: Dev on June 25, 2010 03:58

    Experts,
    Can you please confirm whether my understanding below is correct or not? I am asking about FDM (without ERPi). I would appreciate it if we can go through it point by point.

    1. FDM is a tool for loading data to Essbase and HFM

    FDM is a data transformation tool that is used to map source General Ledger data to the members of the target system. The target system can be HFM or Essbase, so that is correct.

    2. FDM can extract data from flat files and load the data to Essbase and HFM

    FDM doesn't "extract" data from a flat file; it parses a flat file defined via the import format in the application, then maps the data and loads it to Essbase or HFM.

    3. FDM can extract data from an Oracle table and load the data to Essbase and HFM

    FDM is able to pull directly from a database table by using an integration script.
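
    A rough illustration of the kind of database pull such a script performs, as generic VBScript (the OLE DB provider, connection details and the gl_balances table/columns are assumptions; a real FDM integration script would also hand the rows to FDM's work table through the FDM API, which is not shown here):

    Dim cn, rs
    ' Connect to the source Oracle schema (hypothetical connection string).
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=OraOLEDB.Oracle;Data Source=GLDB;User Id=fdm_read;Password=secret;"

    ' Pull the General Ledger rows for one period (hypothetical table).
    Set rs = cn.Execute("SELECT entity, account, amount FROM gl_balances WHERE period = '2010-06'")
    Do Until rs.EOF
        ' Echo the rows here; FDM would instead write each row to its work table.
        WScript.Echo rs("entity") & ";" & rs("account") & ";" & rs("amount")
        rs.MoveNext
    Loop

    rs.Close
    cn.Close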

    4. FDM cannot connect to Essbase to extract data directly. Rather, some native process such as a calculation script data export first extracts the data to a flat file, and FDM can take this flat file and load it to another Essbase database or to HFM as needed

    FDM does not extract Essbase data; it only loads data to Essbase.

    5. FDM cannot connect to HFM to extract data directly. Rather, some native process such as the HFM data export first extracts the data to a flat file, and FDM can take this flat file and load it to other Essbase databases or HFM applications

    FDM does not extract data from HFM; it only loads data to HFM.

    6. FDM can never load data to an Oracle table

    FDM can extract the data held in the FDM database to a flat file, to be loaded anywhere, by using the Data Mart adapter.

  • HFM 11.1.2.2 HsvDatasource for application ABC with a process ID of 5424 on server HfmServer could not start!

    Hi all

    We are upgrading our HFM application from 11.1.1.3 to 11.1.2.2 using the HFM copy utility.

    We have also run the HFM schema upgrade utility and then registered the application with Shared Services.

    But we can see the following errors in the HFM system messages:

    HsvDatasource for application ABC with a process ID of 5424 on server Hhfserver could not start!

    Server: HIBTV-HYP1T with ID 0 has been removed from the list.

    Unspecified error

    It seems that the HsvDatasource process for the HFM application does not start; it does not even appear in Task Manager.

    Do you have any idea about it?

    Kind regards

    Dattatray Mate

    Hello

    This is how Oracle solved this problem.

    We had HFM calc script rules in Calculation Manager for this HFM application.

    So Oracle asked us to remove the references by running the following SQL against the binaryfiles table in the HFM schema:

    delete from ABC_BINARYFILES where label like '%CalcRules%';

    After executing this statement, we could open our HFM application successfully without any problem!

    Kind regards

    -Matt Dattatray

  • How HFM processes records

    I'm on a learning curve, playing with the consolidation rules and debugging.

    In order to understand how HFM works, I added the following consolidation rule, with an appropriate WriteToFile() sub.

    Sub Consolidate()

        Dim MyDataUnit
        Dim lNumItems
        Dim i

        Set MyDataUnit = HS.OpenDataUnit("")
        lNumItems = MyDataUnit.GetNumItems

        For i = 0 To lNumItems - 1
            Call MyDataUnit.GetItem(i, strAccount, strICP, strCustom1, strCustom2, strCustom3, strCustom4, dData)
            Call WriteToFile("Line 1;" & lNumItems & ";" & i & ";" & strAccount & ";" & strICP & ";" & strCustom1 & ";" & strCustom2 & ";" & strCustom3 & ";" & strCustom4 & ";" & dData)
        Next

    End Sub
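
    The WriteToFile() helper itself is not shown in the post; a minimal sketch of such a sub, using the Scripting.FileSystemObject with a hard-coded output path (the path is an assumption, and the HFM service/DCOM user needs write access to it):

    Sub WriteToFile(strLine)
        ' Append one line, plus a timestamp, to a fixed trace file.
        Dim objFSO, objFile
        Set objFSO = CreateObject("Scripting.FileSystemObject")
        Set objFile = objFSO.OpenTextFile("D:\Temp\ConsolTrace.log", 8, True) ' 8 = ForAppending
        objFile.WriteLine strLine & "; " & Now
        objFile.Close
    End Sub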

    After running a consolidation, my result file looks like this:

    Line 1; 258; 1; Sales; [ICP no]; P3000_Phones; Comma_Phone_Stores; Retail_Direct; [None]; 15887.488716; 08/08/2013-13:36:40

    Line 1; 258; 2; Sales; [ICP no]; Comma_PDAs; Electronic_City; National_Accts; [None]; 19239.0919236; 08/08/2013-13:36:40

    Line 1; 258; 3; Sales; [ICP no]; Comma_PDAs; Radio_Central; National_Accts; [None]; 21959.2720157; 08/08/2013-13:36:40

    Line 1; 258; 4; Sales; [ICP no]; Comma_PDAs; Western_Electronics; National_Accts; [None]; 24728.9099251; 08/08/2013-13:36:40

    Line 1; 21; 0; Wages; [ICP no]; [None]; [None]; [None]; [None]; 5997.2854618; 08/08/2013-13:36:40

    Line 1; 258; 5; Sales; [ICP no]; Comma_PDAs; Freds; National_Accts; [None]; 4955.7155768; 08/08/2013-13:36:40

    Line 1; 21; 2; Legal; [ICP no]; [None]; [None]; [None]; [None]; 237.3450281; 08/08/2013-13:36:40

    Line 1; 21; 3; Housekeeping; [ICP no]; [None]; [None]; [None]; [None]; 84.0907017; 08/08/2013-13:36:40

    Line 1; 258; 6; Sales; [ICP no]; Comma_PDAs; Power_Price; Distributor_Sales; [None]; 13046.6934218; 08/08/2013-13:36:40

    ...

    My first entity has 258 records (lNumItems), the second has 21.

    I expected the system to work through each entity in sequential order, but I see them interleaved. So my first question:

    (1) Is this due to parallel processing?

    Also, the numbers are different from what I loaded. For example, for the first line I loaded the amount 15887.48871, not 15887.488716.

    (2) Where is this extra 6 coming from?

    Thanks in advance for any idea,

    Patrick

    Yes, I can say that HFM uses PARALLEL processing. That makes the logs hard to read. I think the processing is split up by entity, although I have not really dug too deep into it. You would need to modify the WriteToFile routine to write a separate log for each entity to confirm.

    As for that 6 at the end, are you sure it was not loaded that way into the system? Also, this is the consolidation routine, which means it is translating. Is it possible that this is the result of exchange rates?

    Go to a grid, drill to that intersection, right-click and click on cell information. At the bottom it will show you the displayed data, the stored data and the full-resolution data. The last two probably hold the extra 6.

  • The user running the HFM export process

    Hello everyone,

    We need to export data from HFM to a network share.
    To tell the IT department which user they need to give access to that network share, I need to know the following (in general terms). Out of HFM, we use VBScript in the rules file to create a file and write lines into it.

    In general: which SYSTEM user (i.e. the user shown in Task Manager as running the process at the operating-system level, and of course not the Shared Services user) will write this file? For example, in Essbase it is the user who runs the Essbase process => that one I know ;).

    - Is it the user who runs the HFM process? Is there any special mechanism for exports / when the rules file is run?
    - Is it usually the user that was used during the installation?
    - Is it a general system user?

    Unfortunately I am not able to log on to the servers directly to take a look, but I just need to ask the installation team the right questions.

    All responses are much appreciated.

    Thanks & BR,
    Julius

    If you use WriteToFile in HFM rules, the file is generated by the HFM DCOM user. Thus, the DCOM user must have write access to whichever network share you want to write the output files to, and then all human users should have access to the same share.

    -Chris

  • HFM process control

    Hi all, I have HFM v.11.1.2.1.

    I am trying to generate custom reports from the HFM database, but I need some help with a few tables that I have not been able to find.

    In the Process Control panel, each time I right-click a phase, the "Process flow history" option appears. If I choose it, a pop-up window with the historical records of that phase is displayed. I was wondering if it is possible to find the exact table or tables behind this, so that I can manually build a query.

    Any help?

    Thank you guys

    I bet they are ID numbers that correspond to a process level, just as there are ID numbers for the different HFM task types.

    I know that in order to come up with the associations between task names and ID numbers, I actually went through the ASP pages. For example, when you go to the task audit page, there is a drop-down which shows all the task types. Guess what... if you look at the source code of that page, there is a list containing the names and ID numbers.

    I'd be willing to bet that you can do the same thing to determine the process ID / name associations.

    While I'm not a super pro on this product, I did stay at a Holiday Inn Express last night, and I hope this advice does not send you on another wild goose chase.

    Charles

  • HFM: NoAccess while process is submitted

    We noticed that we are not able to see the data in HFM (we receive NoAccess) when the process unit is submitted, unless the user has the Submitter role or the Review Supervisor role. Is this correct? Or is there another setting we are missing?

    Any input is greatly appreciated.

    Thank you!

    I may be misinterpreting your response, but I think you misunderstand what "Submitter" actually means. If the user needs to load data, they must have a review role, in other words at least Reviewer 1. "Submitter" is a role that allows data to be submitted for final review before it becomes published. It has no connection with the act of sending (loading) data.

    -Chris

  • HFM 11.1.2.3: process history is empty for previous years

    Hi people,

    In Production, when we right-click and select process history, it shows blank for 2014 and prior years. But it shows the log correctly for the year 2015.

    In Development, we do not have this problem.


    We upgraded the Production and Development environments from 11.1.1.2 to 11.1.2.3 in January. But the problem exists only in Production, and only for the prior years.

    Has anyone of you seen this in your applications? And how did you rectify the issue?

    Your help is appreciated.

    Kind regards

    Chavigny

    The process workflow history is stored in the PFLOW tables. My hypothesis is that the PFLOW tables were not copied correctly from 11.1.1.2.

    To resolve this problem, run the 11.1.2.3 version of the application copy utility to copy the app again, then run the schema upgrade. I strongly suggest that you apply the 11.1.2.3.700 patch and then do the migration steps (app copy and schema upgrade) again.

  • HFM: Cannot lock entities due to process control state

    Hello

    I want to lock some previous years, e.g. 2000-2010.

    In the metadata settings for the scenario, the property 'PhasedSubmissionStartYear' is set to 2013.

    If I try to lock the 'old' years before 2013, an error occurs: the process unit cannot be locked because its process level is not "Published".

    Why is that?

    So do I need to go through all the process control steps for all years?

    Thanks in advance!

    Hello. The short answer is yes, you have to go through and promote each year to Published before locking (start, submit and publish). The phased submission start year applies only to phased submissions (multiple approvals per period).

    A suggestion: lock only the base-level entities, not the parents. Locking the parents will prevent the data from changing when the hierarchy is changed, but if you ever have a change which must be retroactive, then all the changes get applied anyway. Org by Period is a better feature for keeping the old hierarchies in place. Do not forget that parent accounts and parent customs are calculated in memory, so they do not get locked in place; just the entities do.

    Eric

  • Disabling HFM process management

    Dear Experts,

    What are the consequences for historical data if I disable process management for all scenarios except ACTUAL?

    Thanks for your help.

    Benoit

    Hello Thanos,

    We will have entities locked for prior periods in every scenario.

    It takes too long to administer for scenarios that are not often used. Also, we cannot keep up with process management for these scenarios.

    Thanks for your help!

    Best regards

    Benoit
