Loading data - YTD vs. Periodic

Just to confirm... you cannot load some data as YTD and some as MTD, correct? Same app, same scenario.

Within a data file, you can mix YTD and Periodic records (for different accounts). But I have not tried mixing YTD and Periodic records for the same account and loading with the "Accumulate Within File" option.
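    For illustration, a native HFM data file mixing views (YTD on one account, Periodic on another) might look like the sketch below. Member names here are placeholders, not taken from the thread:

```
!Data
Actual;2014;Jan;YTD;E_1000;<Entity Currency>;BalanceAcct;[ICP None];[None];[None];[None];[None];5000
Actual;2014;Jan;Periodic;E_1000;<Entity Currency>;FlowAcct;[ICP None];[None];[None];[None];[None];250
```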

Tags: Business Intelligence

Similar Questions

  • Loading YTD data into a periodic scenario

    Dear,

    I think it is an easy one, but I want to be 100% sure of what I should expect from HFM.

    I need to load YTD balances into a regular scenario (default view = Periodic, ZeroView = Periodic, Consolidate YTD = No).

    Do I have to specify the "YTD" string in the file that I load?

    Are periodic data automatically calculated as the difference between consecutive YTD balances?

    Is there any advice/caveat I should consider?

    Many thanks, everyone.

    Yes, you must specify YTD in the files that you load.  As a general rule, you want to avoid loading YTD data to a periodic scenario, and vice versa.  In your example, a problem arises if you have loaded YTD data to a flow account and then the YTD amount in the following month goes to zero, so your data file has no record for that intersection in the next month.  For example, if you load $1,000 YTD in January to a sales account but the YTD amount in February is zero (rare, but it happens), then the file has no record for that sales account in February, and after loading you expect to see a zero YTD number in HFM.  However, with ZeroView set to Periodic, HFM expects a periodic amount, and where there is no data it assumes that the periodic value (not the YTD one) is zero and derives the YTD number from it.  In this case it would show a February periodic value of $0 and a YTD value of $1,000, when you expected $0 YTD and a periodic value of -$1,000.  Bottom line: load periodic data to a periodic scenario and vice versa.
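    The derivation described above can be sketched in a few lines of Python (illustrative only, not HFM code): with ZeroView = Periodic, a missing periodic record is treated as 0, so the derived YTD balance carries the prior month forward instead of dropping to zero.

```python
# Sketch of ZeroView = Periodic behavior: a month with no record is
# assumed to have a *periodic* value of 0, and YTD is derived from it.
def derive_ytd(periodic_by_month):
    """periodic_by_month: dict month -> periodic amount (missing = no record)."""
    ytd, out = 0.0, {}
    for month in ["Jan", "Feb", "Mar"]:
        p = periodic_by_month.get(month)
        ytd += p if p is not None else 0.0  # NoData assumed to be periodic 0
        out[month] = ytd
    return out

# $1,000 loaded in January; no record in February because the YTD was 0.
# The derived February YTD stays at 1000 instead of the expected 0.
print(derive_ytd({"Jan": 1000.0}))
```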

  • Error in loading data FDMEE

    Hello all, I am trying to load data using FDMEE, but I get these errors and I do not understand what they mean:

    "

    2016-01-25 12:34:31,792 INFO [AIF]: FDMEE Process Start, Process ID: 104

    2016-01-25 12:34:31,792 INFO [AIF]: FDMEE Logging Level: 4

    2016-01-25 12:34:31,793 INFO [AIF]: User:

    2016-01-25 12:34:31,793 INFO [AIF]: Location: APC_Data_location (Partitionkey:11)

    2016-01-25 12:34:31,793 INFO [AIF]: Period Name: Dec-2015 (Period Key: 12/31/15 12:00 AM)

    2016-01-25 12:34:31,793 INFO [AIF]: Category Name: Actual (Category Key: 1)

    2016-01-25 12:34:31,793 INFO [AIF]: Rule Name: Data_loadRule_1 (Rule ID: 13)

    2016-01-25 12:34:33,792 INFO [AIF]: FDM Version: 11.1.2.4.000

    2016-01-25 12:34:33,792 INFO [AIF]: Log File Encoding: UTF-8

    2016-01-25 12:34:34,997 INFO [AIF]: - START IMPORT STEP -

    2016-01-25 12:34:35,157 INFO [AIF]: Executing the following script: /u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py

    2016-01-25 12:34:35,171 INFO [AIF]: FusionCloudAdapter.importDataFromFusion - START

    2016-01-25 12:40:57,601 INFO [AIF]: Standard output: INFO: From FusionCloudAdapter script...

    Proxy configuration for deployment in Production SEEP

    Starting FusionAdapter main program...

    Execution mode: importDataFromFusion, pid: 104

    FusionAdapter initialized.

    fusionGlWebServiceWsdl: http://

    fusionGlWebServiceUser: sysadmin

    fusionProductType: GL

    The main program of FusionAdapter is complete.

    INFO: The FusionCloudAdapter script failed as described above.

    2016-01-25 12:40:57,601 INFO [AIF]: Standard error: com.sun.xml.ws.wsdl.parser.InaccessibleWSDLException: 2 counts of InaccessibleWSDLException.

    java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"

    java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"

    at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.tryWithMex(RuntimeWSDLParser.java:182)

    at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:153)

    at com.sun.xml.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:284)

    at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:246)

    at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:197)

    at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:187)

    at weblogic.wsee.jaxws.spi.WLSServiceDelegate.<init>(WLSServiceDelegate.java:86)

    at weblogic.wsee.jaxws.spi.WLSProvider$ServiceDelegate.<init>(WLSProvider.java:632)

    at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:143)

    at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:117)

    at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:88)

    at javax.xml.ws.Service.<init>(Service.java:77)

    at com.hyperion.aif.ws.client.ErpIntegrationService.ErpIntegrationService_Service.<init>(ErpIntegrationService_Service.java:74)

    at com.hyperion.aif.fusion.FusionAdapter.<init>(FusionAdapter.java:177)

    at com.hyperion.aif.fusion.FusionAdapter.main(FusionAdapter.java:85)

    2016-01-25 12:40:57,602 ERROR [AIF]: The script failed to execute:

    2016-01-25 12:40:57,605 FATAL [AIF]: Error in Comm.executeJythonScript

    Traceback (most recent call last):

    File "<string>", line 528, in executeJythonScript

    File "/u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 163, in <module>

    fusionCloudAdapter.importDataFromFusion)

    File "/ u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 60, in importDataFromFusion

    raise RuntimeError

    RuntimeError

    2016-01-25 12:40:57,692 FATAL [AIF]: Error in COMM Pre Import Data

    2016-01-25 12:40:57,697 INFO [AIF]: FDMEE Process End, Process ID: 104

    Thank you

    The script attempts to connect to a WSDL URL and fails with the error:

    "java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect""

    I see that it is a SEEP instance; maybe you have not set the correct details for Fusion Cloud. Are you sure you configured the appropriate WSDL URL in the Source Connection Configuration section in FDMEE?

    Cheers

    John

  • Error while loading data in Planning

    Data load to Planning 11.1.2.3.200 using ODI 11.1.1.7 fails.

    Please find the errors at the bottom of the logs:

    INFO [SimpleAsyncTaskExecutor-2]: Oracle Data Integrator Adapter for Hyperion Planning

    INFO [SimpleAsyncTaskExecutor-2]: Connecting to planning application [xxxxxxx] on [xxxxxxxxxxx]:[xxxx] using username [admin].

    INFO [SimpleAsyncTaskExecutor-2]: Successfully connected to the planning application.

    INFO [SimpleAsyncTaskExecutor-2]: Loading options for the Planning load

    Dimension name: Account, parent-child: false

    Order load by input: false

    Refresh database: false

    INFO [SimpleAsyncTaskExecutor-2]: Beginning the load process.

    DEBUG [SimpleAsyncTaskExecutor-2]: Number of columns in the source result set does not match the number of planning target columns.

    INFO [SimpleAsyncTaskExecutor-2]: Load type is [Load dimension member].

    ERROR [SimpleAsyncTaskExecutor-2]: Record [[A603010, null, null, null, null, null, null, null, null, null, null, null, xxxxx,-100, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.

    ERROR [SimpleAsyncTaskExecutor-2]: Record [[A601060, null, null, null, null, null, null, null, null, null, null, null, xxxxx,-250, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.

    log.err:

    Account, Data Load Cube Name, Budget, Point of View, Error_Reason

    A603010, xxxxx,-100, F3E0, C011, E7172_93275, FY17, Stage 1, current service level, Jul, cannot load the dimension member, error message is: RemoteException occurred in the server thread; nested exception is:

    java.rmi.UnmarshalException: not recognized hash method: method not supported by the remote object

    A601060, xxxxx,-250, F3E0, C011, E7172_93275, FY17, Stage 1, current service level, Jul, cannot load the dimension member, error message is: RemoteException occurred in the server thread; nested exception is:

    java.rmi.UnmarshalException: not recognized hash method: method not supported by the remote object

    FDMEE log:

    [AIF] ERROR: No record exists for period "Pd 2 - 01-08-2014".

    [AIF] ERROR: No record exists for period "Pd 3 - 2014-09-01".

    FDMEE logging level is set to 5

    The Planning PSE you applied contains a new version of HspJS.jar, so that could be one possible way this error surfaced. Personally, I think you'd better get everything patched up to the 11.1.2.3.500 PSU before continuing, at least because this is a known problem in this version, and there are the notes I mentioned previously to help with the patching.

    It is clear from the error that there is a version mismatch between the FDMEE ODI agent and the Planning server jar files. One thing you might try on this front is to back up the current HspJS.jar in the FDMEE ODI home C:\Oracle\Middleware\odi\oracledi.sdk\lib and replace it with a copy of the same file from your Planning server's C:\Oracle\Middleware\EPMSystem11R1\products\Planning\lib folder (or equivalent).

    I've only personally seen this error where the 500 patch had not been applied. Which approach you take is really up to you, but I suggest patching to 500 if at all possible and going from there.

    Regards

    Craig

  • Problem loading data HFM

    Hi all

    We had an EPMA-type HFM application whose dimensions were all local; it validated and deployed successfully.

    We tried to load data into the HFM application, and the data load was a success.

    Then we decided to convert all the local dimensions of the HFM application mentioned above to shared dimensions. After successfully converting all the dimensions to shared, we get errors when loading data into the same HFM application (the app is valid and was deployed after the change).

    The error log is below:

    Load data started: 29/11/2014 10:53:15.

    Line: 216, error: invalid cell for the period Oct.

    ACTUAL;2014;Oct;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;11979

    >>>>>>

    Line: 217, error: invalid cell for the period Nov.

    ACTUAL;2014;Nov;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;23544

    >>>>>>

    Line: 218, error: invalid cell for the period Dec.

    ACTUAL;2014;Dec;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;58709

    >>>>>>

    Line: 219, error: invalid cell for the period Oct.

    ACTUAL;2014;Oct;YTD;E_2100;<Entity Currency>;28050000;E_6000_20;[None];[None];[None];FARM21000;-11979

    >>>>>>

    Line: 220, error: invalid cell for the period Nov.

    ACTUAL;2014;Nov;YTD;E_2100;<Entity Currency>;28050000;E_6000_20;[None];[None];[None];FARM21000;-11565

    > > > > > >

    Wanted to know if there is something I might have missed in converting the local dimensions to shared (whether there is a sequence to follow, or some constraint I am not aware of, although the conversion looks good, as the application validates and deploys after the changes).

    What can be the reason for the failed data load? Can anyone help?


    Thank you

    Rachid

    Hello

    You could look at the account properties for this account (89920000) and check the TopCustom1...4Member settings; there you will find the reason behind the invalid cells.

    When you converted the local dimensions to shared, did you check the 'Dimension Associations' for the Account and Custom dimensions?

    The dimension associations can get lost if a proper sequence is not respected.

    Kind regards

    S

  • FDM 11.1.2.2 - data deleted after loading data

    Hello

    I'm loading data to a Planning application from FDM for 30/04/12 to 31/3/12 (April) of FY13, but it lands in 30/04/13 to 31/03/13 (April) of FY14 in Planning, and the FY13 data is deleted automatically.

    I use the Replace option when exporting data.

    I assumed that FDM could not remove data already loaded to the Planning application.

    Can someone help me solve this problem?

    Thank you

    Answer: the data is indeed loaded to April of FY14.

    Have you reviewed your period control table? Maybe you map Apr-2013 to Apr-FY14.

    Answer: Yes, Apr-2013 is mapped to Apr-FY14 > This is the reason why the data is loaded to FY14. You need to update your period control table.

    Then your FY13 data is deleted in Planning when using Replace. When you load in Replace mode, FDM runs a clear script before loading data, which erases all intersections for the category (scenario), period, and entity (all descendants of your target entity).

    Question: why is the FY13 data deleted in Planning if we choose the Replace option? > By default the CLEAR part of the LOAD ACTION script does not include the Year dimension. That is why the data is deleted for all years. You must include it in the script.
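    As a sketch, a clear section that does restrict on the Year dimension might look like the calc-script fragment below. All member names are placeholders; the actual statement lives in the adapter's LOAD action script and must match your outline:

```
/* Hypothetical clear fixed on scenario, year, period, and entity subtree */
FIX ("Budget", "FY13", "Apr", @IDESCENDANTS("TotalEntity"))
    CLEARBLOCK ALL;
ENDFIX
```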

  • How to load files from several periods of time in FDMEE?

    Hello

    Is there a way in FDMEE to load a file covering several periods, with the period amounts in separate columns (example below)? At present I can load several periods at the same time provided each period is in a separate file, but I could not find a way to load the file below.

    Entity   Account   Amount-Jan   Amount-Feb   Amount-Mar   Amount-Apr

    100      3921      110          140          145          180

    Thank you and best regards,

    Sandeep.

    You don't need any script for multi-period files; FDMEE can read files having a column per period.

    In your case, you have multiple columns for different periods, and this can be achieved using the standard Multi-Period import format and data load rules.

    Everything is detailed in the Administrator's Guide. You can also take a look at Fishing with FDMEE: FDMEE PSU2 released (11.1.2.3.200).

    Regards

  • ODI - Hyperion Essbase - loading data

    Hello

    There is an option to run a calc script after a data load so that you can AGG the period etc.   What about running one before the data load to CLEAR the period that you are loading?  Or do you create another 'dummy' data load that clears before the load, then run your real data load and run the AGG?  I was wondering if there is a way to add an option to the KM, a little like 'ORDER_BY'.

    Thank you

    You can create an interface with the RUN_CALC_SCRIPT_ONLY option set to yes.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • ODI error - running MaxL in an ODI data load

    I have an ODI interface that works very well to load data into an Essbase cube.  However, I had to add a step that runs a calc script before loading, to clear data for the current year and period.   I built the MaxL script and tested it successfully; it works OK. However, in the options on my target in the flow section, I added this entry:

    PRE_LOAD_MAXL_SCRIPT: C:\ODI_Data\Scripts\MaxL\clr_act.mxl

    When I try to run it I get the error below.  Any ideas what it could be?  It shows the full path of the MaxL script, so I thought that's how it is wanted.  Is the problem that I am not referencing it correctly?

    org.apache.bsf.BSFException: exception from Jython:

    Traceback (most recent call last):

    File "<string>", line 89, in <module>

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl (unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad (unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    Caused by: Traceback (most recent call last):

    File "<string>", line 89, in <module>

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl (unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad (unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at org.python.core.PyException.fillInStackTrace(PyException.java:70)

    at java.lang.Throwable.<init>(Throwable.java:181)

    at java.lang.Exception.<init>(Exception.java:29)

    at java.lang.RuntimeException.<init>(RuntimeException.java:32)

    at org.python.core.PyException.<init>(PyException.java:46)

    at org.python.core.PyException.<init>(PyException.java:43)

    at org.python.core.Py.JavaError(Py.java:455)

    at org.python.core.Py.JavaError(Py.java:448)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)

    at org.python.core.PyObject.__call__(PyObject.java:355)

    at org.python.core.PyMethod.__call__(PyMethod.java:215)

    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)

    at org.python.core.PyMethod.__call__(PyMethod.java:206)

    at org.python.core.PyObject.__call__(PyObject.java:397)

    at org.python.core.PyObject.__call__(PyObject.java:401)

    at org.python.pycode._pyx0.f$0(<string>:89)

    at org.python.pycode._pyx0.call_function(<string>)

    at org.python.core.PyTableCode.call(PyTableCode.java:165)

    at org.python.core.PyCode.call(PyCode.java:18)

    at org.python.core.Py.runCode(Py.java:1204)

    at org.python.core.Py.exec(Py.java:1248)

    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)

    ... 19 more

    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl (unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad (unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)

    ... 33 more

    Caused by: com.essbase.api.base.EssException: error occurred while running script maxl. Error message is:

    at com.hyperion.odi.essbase.wrapper.EssbaseConnection.executeMaxl (unknown Source)

    ... 40 more

    Log in to Oracle Support and search for document 1152893.1.

    Cheers

    John

    http://John-Goodwin.blogspot.com/
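    For reference, a minimal clr_act.mxl of the kind described in the question might look like the sketch below. Server, credentials, and application/database/script names are all placeholders, and this assumes the script manages its own session:

```
/* Hypothetical MaxL clear script -- all names are placeholders */
login 'admin' 'password' on 'essbaseserver';
execute calculation 'AppName'.'DbName'.'clr_act';
logout;
exit;
```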

  • Problem loading data into a cube with duplicate members in different dimensions

    I have a cube with duplicate members allowed in only two dims (Period.1 and Periode2).

    The outline saves without error, but when I try to load data, I get some errors:

    "Member 2012-12 is a duplicate member in the outline.

    A6179398-68BD-7843-E0C2-B5EE81808D0B 01011 cd00 st01 2905110000 EK fo0000 2012 NNNN-NN-NN-12 cust00000$ $$ 1 "

    The two dims are similar period hierarchies. Should I change the member names and aliases a little in one of them (e.g. yyyy1-mm -> yyyy1-mm1)?

    Users wouldn't like it...

    Period.1

    yyyy1

    yyyy1/q1

    yyyy1-mm1

    yyyy1-mm2

    yyyy1-mm3

    yyyy1/q2

    ....

    ....

    Periode2

    yyyy1

    yyyy1/q1

    yyyy1-mm1

    yyyy1-mm1-dd1

    yyyy1-mm1-dd2

    ......

    yyyy1-mm2

    yyyy1-mm3

    yyyy1/q2

    ....

    ....

    Thanks

    Yes, your input records must use fully qualified member names.
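    For example, with duplicate member names enabled, Essbase expects the load records to disambiguate the member with qualified names of the form [Dimension].[Member]; using the dimension names from the outline above, the 2012-12 fields would become something like:

```
/* Qualified-name form -- the rest of the record stays unchanged */
[Period.1].[2012-12]   [Periode2].[2012-12]   ...   100
```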

  • EAS data load and outline question

    Dear all,

    I am currently using Hyperion Planning 11.1.2.2. I created an application with the appropriate outline (see below):

    HSP_Rates/account/period/year/scenario/Version/currency/entity/brand/product/customer/measure

    and created the corresponding databases in Essbase. I want to load data via EAS, so I created a data load file (a table) with exactly the same column names as my Planning dimension names EXCEPT HSP_Rates, which is absent from my data file. I'm sure all the dimension members are correct / exist and there are no empty fields.

    When loading the file, the following error message appears:
    Data value [2056] encountered before all selected dimensions, [2] records completed
    Unexpected Essbase error 1003007

    Question: do I need to add a column named HSP_Rates to my data file? What value should I put in this column?


    PS: the data load file is an Excel file

    Thanks in advance for your help.
    Best regards.

    The problem is resolved:

    Load the data into Essbase using a rules file.

    The answer is yes: create a column named HSP_Rates with 'HSP_InputValue' as the value in the data file.
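    So a data file matching the outline above would gain one extra column, sketched here; all member names other than HSP_Rates/HSP_InputValue are illustrative placeholders:

```
HSP_Rates,Account,Period,Year,Scenario,Version,Currency,Entity,Brand,Product,Customer,Measure,Data
HSP_InputValue,Sales,Jan,FY13,Actual,Working,USD,E_1000,BrandA,Prod1,Cust1,Units,2056
```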

    Thank you

  • Problem loading data

    Dear,

    We are trying to distribute the data load feature to basic users.

    However, it seems that in our tests this kind of user can only use the "Replace by Security" option.

    If they try to run the data load without selecting this option, the system returns the following error:

    "No write access to the period December."

    What is very odd is that the December period is open for input to the same users through data entry forms.

    Help, please.

    There are no roles that control that. The only role that comes close is the process management reviewer access role.

    Other than that, it is security classes that control where you can load data. But if you have access to an intersection, you can load to it.

  • Failed to load data using the Outline Load utility

    I am unable to load data using the Outline Load utility.
    Data load dimension: Account
    Driver dimension: Period
    Driver member: Jan

    Load file:

    Account,Jan,Point-of-View,Data Load Cube Name
    Investment,1234,"India, current, heck, FY13",Plan1


    Outline Load utility command:
    OutlineLoad /A:pract /U:admin /I:C:\test1.csv /D:Account /L:C:\lg.log /X:C:\ex.exc


    Exception file

    [Thu Mar 28 18:05:39 GMT+05:30 2013] Error loading data record 1: investments, 1234, '' ' India, common, project, FY14' ' ', Plan1
    [Thu Mar 28 18:05:39 GMT+05:30 2013] com.hyperion.planning.InvalidMemberException: Member India, common, rough, FY14 does not exist or you do not have access to it.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] Planning Outline data store load process finished with exceptions: exceptions occurred, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were successfully loaded, 1 was rejected.


    Log file:


    Successfully logged into "Rlap" application, Release 11.113, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3
    [Thu Mar 28 18:05:38 GMT+05:30 2013] Successfully located and opened input file "C:\load.csv".
    [Thu Mar 28 18:05:38 GMT+05:30 2013] Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
    [Thu Mar 28 18:05:38 GMT+05:30 2013] Located and using "Account" dimension for loading data into the "Rlap" application.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] Load dimension "Account" has been successfully opened.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] A cube refresh operation will not be performed.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] Create security filters operation will not be performed.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] Examine the Essbase log files for status if Essbase data was loaded.
    [Thu Mar 28 18:05:40 GMT+05:30 2013] Planning Outline data store load process finished with exceptions: exceptions occurred, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were successfully loaded, 1 was rejected.



    In fact, the members do exist in the outline.
    Any help would be appreciated.

    Can you double-check your csv file by opening it in a text editor? The error is showing additional quotes:

    [Thu Mar 28 18:05:39 GMT+05:30 2013] Error loading data record 1: investments, 1234, '' ' India, common, project, FY14' ' ', Plan1

    Cheers

    John
    http://John-Goodwin.blogspot.com/
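    John's point about stray quotes is easy to reproduce with a few lines of Python: if a field already carries literal quote characters when it is written out again, a CSV writer doubles them, which produces exactly the kind of repeated-quote pattern shown in the error. The values below are made up for illustration:

```python
import csv
import io

# A POV field that already contains literal double quotes, e.g. because
# it was quoted once upstream before being written to the csv again.
row = ["investments", "1234", '"India, common, project, FY14"', "Plan1"]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerow(row)
line = buf.getvalue().strip()
print(line)
# The embedded quotes are doubled, so the POV field now begins with
# three quote characters in a row -- the pattern the load rejects.
```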

  • Replace option - data in other versions deleted when loading data

    Hello

    While loading data using FDM, we use the "Replace" option. As expected, the existing data for the period in question and all accounts is deleted before the new data file is loaded.

    We also observed that data in another version is deleted due to this option. Is this standard behavior? If not, kindly tell me a way to prevent the deletion of data in other versions.

    Kind regards

    Amjad

    What is "replaced" in Essbase is determined by the cleardata command in the LOAD action of the Essbase adapter. The default cleardata command fixes on the scenario and the entities that are part of the load and will erase data for the period(s) being loaded. If you want a more targeted replace, update the cleardata statement in the load script.
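    As a sketch, a narrower clear that also restricts on the Version dimension might look like the fragment below. Member names are placeholders, and the exact statement to edit lives in the adapter's LOAD action script:

```
/* Hypothetical: limit the clear to the version being loaded */
FIX ("Actual", "Working", "Apr", @IDESCENDANTS("TargetEntity"))
    CLEARBLOCK ALL;
ENDFIX
```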

  • How to load data from MS SQL to Essbase via a rules file and MaxL

    Hi, everybody!
    Pretty sure this kind of topic already exists on the forum, but unfortunately I can't find it.

    I want to understand how to load data from an MS SQL database into an Essbase application.

    (I) So I have:
    1. An application 'Company'.
    2. And its database 'Plan'.
    3. With a simple outline, which contains only two dimensions:

    - Period <1> (Label only)
        - Total Year (~)
            - Qtr1 (+)
                - Jan (+)
                - Feb (+)
                - Mar (+)
            - Qtr2 (+)
                - Apr (+)
                - May (+)
                - Jun (+)

    - Accounts <1> (Label only)
        - Lost / based (~) (Label only)
            - Revenues (+)
                - L1.1 (+)
                - L1.2 (+)
            - Costs (-)
                - L2.1 (+)
                - L2.2 (+)

    (II) Also, I created a rules file called "CO_DWH" and associated it with this outline.
    The rules file has 3 columns:
    'Period', 'Account', Data.

    The "load data" option is also checked.

    (III) In MS SQL Server, I have a database "hypdb" and a login "essadmin" with password "password".

    (IV) I executed my bat file:
    C:\Hyperion\EssLoad.bat
    It contains only one line:

    EssLoad.bat:
    startMaxl.cmd EssLoad.txt
    ----------------------------------------------

    EssLoad.txt:
    login 'admin' 'password' on erpserver;
    import database 'Company'.'Plan' data connect as 'essadmin' identified by 'password' using server rules_file 'CO_DWH' on error abort;
    logout;
    exit;
    --------------------------------------------


    Everything I copied from the working ERP example ran fine, but I don't understand: which table exactly in the MS SQL database is the data loaded into the Essbase db from?

    You have to do a few things:
    1. On the Essbase server, set up a system ODBC connection for your MS SQL database.
    2. In the load rule, go to the File menu, select 'Open SQL data source', and enter your SQL statement as the data source. A few tips: where it says Select, don't put in the SELECT keyword, and where it says From, don't put in the FROM. The system adds them for you. To enter a simple SQL statement, do it in the Select box: if your SQL would normally be select * from myTable, just enter * in the Select box and myTable in the From box.

    Then click OK/Retrieve and fill in your connection information. It should bring the data back into the load rule. Save the load rule and use it.
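    For example, if the source query were select * from myTable, the SQL dialog in the load rule would be filled in roughly like this (the table name is a placeholder):

```
Select box:  *
From box:    myTable
Where box:   (leave empty, or a condition without the WHERE keyword)
```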
