FDM event scripts fired twice during data loads

Here's an interesting question. I added the following script to different events (one at a time, making sure only one of them was active) to clear data before loading to Essbase:


Event script content:

' Declare local variables
Dim objShell
Dim strCMD

' Call MaxL script to perform the data clear calculation
Set objShell = CreateObject("WScript.Shell")
strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
API.DataWindow.Utilities.mShellAndWait strCMD, 0


MaxL script:

login * identified by * on *;
execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *.***;
exit;




However, it seems that the clear is performed twice: both before and after the data has been loaded to Essbase. This has been verified at every step by checking the Essbase application log:

No event script:
- No Essbase data clear entries in the application log.

After adding the script to the "BefExportToDat" event:
- The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
- The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

After adding the script to the "AftExportToDat" event:
- The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
- The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

After adding the script to the "BefLoad" event:
- The script does not run when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed).
- The script is run AFTER the data is loaded to Essbase, when the OK button is clicked in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

Some notes on the above:
1. "BefExportToDat" and "AftExportToDat" are both fired twice, before and after the "Target System Load" modal popup. :-(
2. "BefLoad" is executed AFTER the data is loaded to Essbase. :-( :-(

Does anyone have any idea how we can clear the Essbase database before the data is loaded, rather than after we have loaded the current data? And perhaps why the event scripts above seem to be fired twice? There doesn't seem to be any logic to this!


BefExportToDat - Essbase application log entries:

[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


AftExportToDat - Essbase application log entries:

[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


BefLoad - Essbase application log entries:

[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]

James, the Export and Load event scripts will fire four times, once for each file type: the .DAT file (the main TB file), -A.DAT (log file), -B.DAT, and -C.DAT.

To work around this so the logic only runs during the load of the main TB file, add the following (or something similar) at the beginning of your event scripts. This assumes that strFile is in the subroutine's parameter list:

Select Case LCase(Right(strFile, 6))
    Case "-a.dat", "-b.dat", "-c.dat"
        Exit Sub
End Select
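For what it's worth, the same guard in a Jython/Python event script (FDMEE style) might look like the sketch below; the function names and the `str_file` parameter are illustrative, not part of the FDM API:

```python
def is_auxiliary_export_file(file_name):
    # FDM writes four export files per load; only the main TB .dat file
    # should trigger the clear. Auxiliary files end in -A.DAT/-B.DAT/-C.DAT.
    return file_name[-6:].lower() in ("-a.dat", "-b.dat", "-c.dat")

def bef_load(str_file):
    # Hypothetical event handler: bail out early for auxiliary files.
    if is_auxiliary_export_file(str_file):
        return "skipped"
    # ... run the Essbase clear here ...
    return "processed"
```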

Tags: Business Intelligence

Similar Questions

  • How to maintain report data consistency during continuous data loads?

    Hi all

    I have a few reports that draw data from a sales data mart. This mini-mart / data warehouse is loaded hourly, and my report has to show the updated data. What configuration changes do I have to make in the RPD so that the reports pick up the most recent data?

    Thank you
    DK

    DK... The important thing you need to take care of is to disable the cache, or purge the cache after each load by using the event polling method. This will ensure that you see the most recent data and not stale data.

    Hope this helps

  • Using an event script to load data into an Excel template for users

    Hello

    Has anyone done this in FDM 11.1.2.1?

    We want to use an event script to load validated FDM data into a standard template so that users can make some adjustments and then load the same data back through FDM. Is there a better alternative? Basically, we are trying to avoid making adjustments in HFM data forms, given the number of columns we have and the validations required for each adjustment line.

    Thank you

    PM

    Hello

    Google the Administrator's Guide.

    It's just a matter of thinking through the solution:

    - Run a SQL query to export data from the data table (TDATASEG). You can filter the query by LOADID.

    - Write the records to a CSV file.
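    A minimal sketch of that idea in Python (sqlite3 is used here as a stand-in for the FDM repository database, and the column list is illustrative; the real query would hit the TDATASEG table in your FDM schema):

```python
import csv
import sqlite3

def export_load_to_csv(conn, load_id, out_path):
    # Select only the rows for one LOADID, then write them with a header row.
    cur = conn.execute(
        "SELECT ACCOUNT, ENTITY, AMOUNT FROM TDATASEG WHERE LOADID = ?",
        (load_id,),
    )
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])
        writer.writerows(cur.fetchall())
```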

  • Page load event fires twice

    Consider the following code (the page markup was not preserved in this post):
    this is a test

    The "page load event, context is [object Event]" message is printed twice. This causes havoc in my application. It only happens in the Ripple emulator, not in any other browser.

    Is there a workaround for this? I found that if I wrap it in a setTimeout I am OK, but that seems a little hokey...

    It's a quirk with Ripple and how it loads pages. You can use your setTimeout workaround during development and take it out when you test on the device or submit.

  • HFM consolidation during data load

    Hi Experts,

    I have a small question about FDM triggering consolidation in HFM at data load time.

    I know that FDM passes parameters to HFM to perform the consolidation for all the entities available in the VALIDATION ENTITIES list for the respective location and validation group.

    My question is: which scenario does HFM perform the consolidation on? (Where can I check this?)

    1. Is it based on the SCENARIO mentioned in the validation logic of the VALIDATION RULE? (I find a difference between the two scenarios for the same point of view.)

    If Abs(|ACTUAL_1,ASSETS,ICP,Custom1,Custom2,Custom3,Custom4| - |ACTUAL,ASSETS,ICP,Custom1,Custom2,Custom3,Custom4|) > 10 Then

        RESULT = False

        RES.PstrCheckMessage1 = "Difference between the scenarios, please check."

    Else

        RESULT = True

    End If

    2. For the rule logic above, which scenario will be consolidated?

    Regards,

    Kath

    You're right: the target category mapped for your FDM category determines the HFM scenario.

    The active FDM category is the one selected in the FDM POV bar. Categories are used to "categorize" the data, much like scenarios. So you can think of the FDM category as the scenario source.

  • Is it possible to ignore some accounts during data load?

    Hi, I have a rules file that I use to load data.

    I want to ignore some accounts (for now only 112123, 123453, 546567) during my data load.

    Is there a way to do this...?

    Thanks in advance

    Check "Using a Rules File to Perform Operations on Records, Fields, and Data":

    Rejecting Records

    You can specify which records Essbase ignores by defining rejection criteria. Rejection criteria are string and number conditions that, when met by one or more fields of a record, cause Essbase to reject the record. You can define one or more rejection criteria. If any field in the record meets the rejection criteria, Essbase does not load the record. For example, to reject the actual data from a data source and load only the budget data, create a rejection criterion that rejects records where the first field is Actual.

    Regards,

    Celvin Kattookaran
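    The behaviour described above boils down to a per-record predicate. As a plain-Python illustration of the logic only (this is not Essbase rules-file syntax):

```python
def should_load(record, reject_value="Actual", field_index=0):
    # A simple rejection criterion: refuse the record when the chosen
    # field matches the rejection value; load it otherwise.
    return record[field_index] != reject_value

records = [
    ["Actual", "Sales", 100],
    ["Budget", "Sales", 120],
]
loaded = [r for r in records if should_load(r)]  # only the Budget record survives
```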

  • Incremental data loads with FDM/ERPi from an Oracle data source

    Hello

    I am using ERPi 11.1.2.1. In the workspace, it is possible to set the data load rule option to snapshot or incremental. However, I am using FDM/ERPi to load data from Oracle GL into Essbase. Is it possible for me to set up FDM so that the data load rules run as incremental loads? Could it be a parameter in the ERPi source adapter?

    Thanks for any information you could provide.

    Yes, in the ERPi source adapter there is an option for "Data Load Method" that will let you define how the data load rule is run. By default it is "FULL REFRESH", but it can be changed.

    (A) Connect to the application via the Workbench and the source system adapters.
    (B) Right-click the ERPi source adapter and choose "Options".

    You will see an option for the load method, set to the full refresh value; choose the value you want from the drop-down menu and save.

  • Move rejected records to a table during a data load

    Hi all

    When I run my interfaces I sometimes get errors caused by invalid records; I mean, some field contents are not valid.

    I would like to move these invalid records to another table while the data load continues, so as not to interrupt the loading of data.

    How can I do this? Are there any examples I could follow?

    Thanks in advance

    concerning

    Hi Alvaro,

    Here you can find different ways to achieve this goal and choose according to your requirement:

    https://community.Oracle.com/thread/3764279?SR=Inbox

  • Run an Essbase script from FDM without .bat files

    I want to run an Essbase script directly from FDM without needing to use a .bat file.
    I have dug through posts where MaxL script calls are discussed, but I have not found a solution anywhere for sending a script to Essbase.

    I am having problems connecting to Essbase using the already existing adapter configurations and subsequently sending an Essbase script to the server. This is done, for example, in the Load event, where the Essbase data is cleared. I need assistance on how to implement this kind of solution in an event script, for example BefImport.

    Hi Nicklas,

    A couple of things to note.

    You must pass 1 as the first parameter to use a file as a calculation script. Calculation scripts are the .csc files in your database; for example, if yours is called AggMonth.csc you would pass in:
    Set objHWReturn = API.IntBlockMgr.IntegrationMgr.PobjIntegrate.varCon.DataManipulation.fCalculate(1, "AggMonth")

    If you use 2 as the parameter to the fCalculate function, it means the calculation script is a string; you can search for "Essbase calculation string" in the adapter to see how it works.

    If you really want to use a .maxl script and not an Essbase calc script, you would not use the fCalculate function and would instead shell out to essmsh.exe with parameters, which does not use the FDM adapter functions.

    P.S. You can also configure calculation features in FDM, assign scripts to those, and tell FDM to use "file". For Planning/Essbase, just having a script is probably the method that makes the most sense, which is what you are following.
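    Shelling out to essmsh from a script could be sketched as below. This is a sketch only: the MaxL script path and positional parameters are illustrative, and the `launcher` argument exists purely so the function can be exercised without an Essbase client installed (essmsh itself passes extra arguments to the script as $1, $2, ...):

```python
import subprocess

def run_maxl(script_path, *params, launcher="essmsh"):
    # Builds e.g.: essmsh D:/Test.mxl user password server
    cmd = [launcher, script_path] + list(params)
    completed = subprocess.run(cmd, capture_output=True, text=True)
    return completed.returncode == 0
```

    For example, `run_maxl("D:/Test.mxl", user, password, server)` would return True when the MaxL shell exits cleanly.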

    Kind regards

    John A. Booth
    http://www.metavero.com

    Published by: Jbooth on March 18, 2011 09:25

  • FDMEE event Script - BefFileImport

    Hi all

    I was looking to use the feature to dynamically change the import format, referencing the FDMEE administrator's guide (see below).  However, the BefFileImport event script does not seem to be an option in the Script Editor.  I also created a file called BefFileImport.py and it did not register when I ran a load process.

    Is this an oversight in the Administrator's Guide (did they forget to add this to FDMEE), or am I missing something?

    THX,

    Mike

    Dynamically Changing Import Formats

    You can use the BefFileImport event to change the import format for a location dynamically:

    if fdmContext["LOCNAME"] == "ITALY":
        filename = fdmContext["FILENAME"]
        if filename[:12] == "ProductSales":
            fdmAPI.updateImportFormat("SALESJOURNAL", fdmContext["LOADID"])

    To answer my own question, it seems that I can dynamically change the import format using the BefImport event script.  I believe the reference to "BefFileImport" in the FDMEE Administrator's Guide is an obsolete classic-FDM reference... they should refresh that.

    THX,

    Mike

  • Using .split in an import script with a comma-delimited data file

    Hi everyone-

    I am attempting to create an import script for the amount field to remove apostrophes (') from an account description field in a .csv data import file (any record with an apostrophe is rejected during the import phase).  Now if it were a file delimited by semicolons (or any delimiter other than commas), I could remove the apostrophes from the record with a string.replace command, then return the amount with a string.split command.  Unfortunately, there is a problem with this solution when using comma delimiters: my data file is a comma-delimited .csv file with several amount fields that have commas in them.   Even though the fields are surrounded by quotes, the .split ignores the quotes and treats the commas inside the amount fields as delimiters.  Therefore, the script does not return the correct amount field.

    Here is an example of a reference data record:

    "","0300-100000","Account description with an apostrophe '","$1,000.00","$1,000.00","$1,000.00","$1,000.00","$1,000.00"

    My goal is to remove the apostrophe from field 3 and return the amount in field 8.

    Some things to note:

    • If possible, I would like to keep this as an import script for the amount field to simplify administration, but am willing to use the BefImport event script if that is the only option or is more straightforward than an import-script-based solution.
    • I tried using regular expressions, which seem conceptually the simplest option to respect the quotes as an escape character, but I think I am not implementing them properly and could not find regex examples for FDMEE.
    • I know that we cannot use the Jython csv module in import scripts, per Francisco's blog post (Fishing with FDMEE: import scripts do not use the same version of Jython as event/custom scripts (PSU510)). This may be a factor in going with an event script instead.
    • It's probably an over-engineered solution, but I have considered trying to write a script to determine where all the quotes start and end.  Assuming there are no quotation marks inside my account description (or I could remove them beforehand), I could then use the positions of the quotes to remove the commas inside those positions, leaving the delimiter commas as-is.  I could then use .split since the description/amount fields would have no commas.  From the point of view of keeping administration as simple as possible, I think it may be better to create an event script rather than go down this path.
    • Yes, we could do a search and replace in the Excel file to remove the apostrophes before import, but that's no fun.

    Thanks for any advice or input!

    Dan

    Hi Dan,

    If your line is comma-delimited and quote-qualified, you can consider the delimiter to be QuoteCommaQuote, or "," since it comes between every field.  Think about it like that, then simply split on that value:

    split("\",\"")

    Here's something I put together in Eclipse:

    '''

    Created on Aug 26, 2014

    @author: robb salzmann

    '''

    strRecord = "\"\",\"0300-100000\",\"Account description with an apostrophe ' \",\"$1,000.00\",\"$1,000.00\",\"$1,000.00\",\"$1,001.00\",\"$1,002.00\""

    strFields = strRecord.split("\",\"")

    strDescriptionWithoutApos = strFields[2].replace("'", "")   # remove the apostrophe

    strAmountInLastCol = strFields[-1].replace("\"", "")        # strip the trailing quote from the last field

    print strDescriptionWithoutApos

    print strAmountInLastCol

    Account description with an apostrophe

    $1,002.00
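    For comparison, in an event script (where, per Dan's note above, the standard csv module is available) the quote handling comes for free. A Python sketch:

```python
import csv
import io

record = ('"","0300-100000","Account description with an apostrophe \'",'
          '"$1,000.00","$1,000.00","$1,000.00","$1,001.00","$1,002.00"')

# csv honours the quote qualification, so commas inside amounts stay put.
fields = next(csv.reader(io.StringIO(record)))
description = fields[2].replace("'", "")
amount = fields[-1]
```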

  • How do you check whether Validate succeeded in an event script?

    Hello

    I am facing a problem with an "AftValidate" event script that retrieves data from the TDATASEG table. The script writes to a file as expected, but I want to put a condition on top of it: I want to run the event script only if validation is successful (that is to say, the fish is orange). I thought we could use lngProcState = 11 for success, but it still comes back as 11 even after validation has failed (i.e. the fish is grey), and therefore the event script runs anyway. Is there another function I can use, if this isn't the right way to test whether validation succeeded?

    One more thing: is there any function I can use in the script to delete the imported data? I want to use it in the AftValidate script once validation is successful and the data file has been generated.

    Thanks in advance.

    PM

    If you want users to export data but not load it, you can try setting the adapter option "Enable Load" to false.

    With this configuration, users will be able to click Export but they won't get a LOAD popup.

  • Data load error 10415 when exporting data to an EPMA Essbase app

    Hi Experts,

    Can someone help me solve this issue I am facing in FDM 11.1.2.3?

    I am trying to export data to an EPMA Essbase application from FDM.

    Import and Validate worked fine, but when I click Export it fails.

    I am getting the error below:

    Failed to load data

    10415 - Data Load Errors

    Essbase API Procedure: [EssImport] Threw code: 1003029 - 1003029

    Encountered formatting error in spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv

    I have these dimension members:

    1. Account
    2. Entity
    3. Scenario
    4. Year
    5. Period
    6. Regions
    7. Products
    8. Acquisitions
    9. Servicesline
    10. Functionalunit

    When I click the Export button it fails.

    One more thing I checked: the .DAT file, but the file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem.

    In my case I was loading data to a classic Planning application. When all the dimension members are ignored in the mapping for the combination you are trying to load and you click Export, you get the same message, and an empty .DAT file is created.

    You can check this.

    Thank you

    Praveen

  • ODI - SQL for Hyperion Essbase data loading

    Hello

    We have created a view in SQL Server that contains our data.  The view currently has every year and periods from Jan 2011 to the present.  Each period is about 300,000 records.  I want to load only one period at a time, for example May 2013.  Currently we use ODBC through a data load rule, but the customer wants to use ODI to be consistent with the dimension metadata versions.  Here's the SQL on the view, which works very well.   Is there a way I can run this SQL in the ODI interface so it pulls only what I declare in the WHERE clause?  If yes, where can I do it?

    SELECT
        CATEGORY, YEAR, LOCATION, SCENARIO, DEPT, PROJECT, EXPCODE, PERIOD, ACCOUNT, AMOUNT
    FROM
        PS_LHI_HYP_PRJ_ACT
    WHERE
        YEAR >= '2013' AND PERIOD = 'MAY'
    ORDER BY
        CATEGORY ASC, FISCAL_YEAR ASC, LOCATION ASC, PERIOD ASC, EXPCODE ASC, PROJECT ASC, DEPT ASC, SCENARIO ASC, ACCOUNT ASC;

    Hello

    Simply use the following KM to load the data - IKM SQL to Hyperion Essbase (DATA) - in an ODI interface that has the view you created as the source model. You can add filters on the source, driven dynamically by ODI variables, to build the WHERE clause based on the month and year. Make sure you specify a load rule in the KM's load method options for loading the data.

  • problems with the JSON data loading

    Hello

    I have followed Simon Widjaja's (EDGEDOCKS) YouTube lesson on loading external JSON data, but I am not even able to log the data to the console.

    I get this error: "Javascript error in event handler! Event Type = element".

    content.json is located in the same folder. The data in it is very simple:

    [
        {
            "title": "TITLE 1",
            "description": "DESCRIPTION 1"
        },
        {
            "title": "TITLE 2",
            "description": "DESCRIPTION 2"
        }
    ]

    And here's the code in edgeActions.js:

    (function($, Edge, compId){

        var Composition = Edge.Composition, Symbol = Edge.Symbol; // aliases for commonly used Edge classes

        //Edge symbol: "stage"
        (function(symbolName) {

            Symbol.bindElementAction(compId, symbolName, "document", "compositionReady", function(sym, e) {
                // load external json data
                $.ajax({
                    type: "GET",
                    cache: false,
                    url: "content.json",
                    dataType: "json",
                    success: function(data) { console.log("data: ", data); },
                    error: function() { console.log("something went wrong"); }
                });
            });
            //Edge binding end

        })("stage");
        //Edge symbol end: "stage"

    })(window.jQuery || AdobeEdge.$, AdobeEdge, "EDGE-11125477");

    I also tried $.getJSON, as mentioned in the YouTube video.

    Please note: I do not see "something went wrong" logged either.

    I am using the free trial version. Is this a limitation of the free trial version?

    Well, same question as here: loading external data using ajax

    $.ajax() and $.getJSON() cannot run because the jQuery file is missing.

    You must add the jQuery file; see: http://jquery.com/download/

    Note: without loading the jQuery file, you can use these functions: Adobe Edge Animate CC JavaScript API
