Help using MaxL to automate Essbase data loading

Hello

I am trying to automate loading data into an Essbase database (for a Planning application). My current script, run in the MaxL shell, looks like this:

MaxL> import database Application.Database data
      from server text data_file 'data.csv'
      using server rules_file 'csv'
      on error write to 'c:\hyperion\MaxL_logs\data_csv.err';

In the script, the application, database, and file names shown are the specific names for my case.

Using this script, the data loads successfully into my Planning application as long as data.csv sits in the Planning application's database directory. What I want is to be able to load data.csv from a directory that I specify, because the data file cannot be placed in the database directory and must have its own directory instead.

Does anyone know the syntax I need in a MaxL script to load a data file that is not in the database directory? I can provide more information if needed. Thank you.

Hello

Have you tried...

import database Application.Database data
    from text data_file 'C:\temp\data.csv'
    using server rules_file 'csv'
    on error write to 'c:\hyperion\MaxL_logs\data_csv.err';

or

import database Application.Database data
    from local text data_file 'C:\temp\data.csv'
    using server rules_file 'csv'
    on error write to 'c:\hyperion\MaxL_logs\data_csv.err';

Cheers

John
http://John-Goodwin.blogspot.com/
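
For the automation side, a thin shell wrapper can generate the script and feed it to the MaxL shell (essmsh). This is only a sketch; the credentials, host name, and paths are placeholders, not values from this thread:

```shell
#!/bin/sh
# Write a MaxL script with a parameterised data-file path, then hand it to
# the MaxL shell. Credentials, host, and paths are placeholder values.
DATAFILE=${1:-C:/temp/data.csv}
cat > load_data.mxl <<EOF
login 'admin' 'password' on 'localhost';
import database Application.Database data
    from local text data_file '$DATAFILE'
    using server rules_file 'csv'
    on error write to 'c:/hyperion/MaxL_logs/data_csv.err';
logout;
EOF
# essmsh load_data.mxl   # run on a machine where the MaxL shell is installed
```

Passing the data-file path as an argument keeps the loaded file out of the database directory, which is exactly the requirement above.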

Tags: Business Intelligence

Similar Questions

  • Can you use a SubVar in the header of a data load rule?

    I think it's a 'no'.  We get a data file:

    Jan | FY13 | Petty cash | Dept200 | Forecasts | Working | 25.00

    We want to ignore the 6th field, 'Working', and instead load the data against whatever member the &Version subvar holds.   I have tried it and it does not work, but maybe someone has thought of a new way forward in the DLR?

    Thank you

    Yes, you can do it. I found problems when I had it as the last value among multiple values in the header; moving it forward fixed my problem. Just put it in as &variablename, or for a list of members as

    &variablename, "Real", "nextdim"

    with no quotes around the variable and commas between the member names. You don't mention what version you're on, but I think the oldest version that supports this is 9.3.1.
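
    As a sketch of the reply above, the sub var would be set from MaxL before the load; App.Db, the file names, and the variable name are placeholders:

```maxl
alter database App.Db set variable 'Version' 'Working';

import database App.Db data
    from server text data_file 'data.txt'
    using server rules_file 'rul'
    on error write to 'load.err';
```

    The load rule's header definition then references &Version in place of the hard-coded field.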

  • Loading data to Essbase using EAS versus a back-end script

    Good afternoon

    We have noticed recently that loading our ASO cube with back-end scripts (esscmd etc.) seems to carry much more overhead than loading the data files individually through EAS.  When loading using scripts, the total size of the cube was 1.2 GB.  When loading the files individually through EAS, the size of the cube was 800 MB.  Why the difference?  Is there anything we can do in the scripts to reduce this overhead?

    Thank you

    Are you really using EssCmd to load ASO cubes?  You should use MaxL with buffers for loading. By default, EAS uses a load buffer when you load multiple files; EssCmd (and MaxL without the buffer clauses) will not. That means longer loads and larger files. On an ASO load, Essbase takes the existing .dat file and adds the new data. When you are not using a load buffer, it takes the .dat file and the .tmp file and merges them together; then when you load the second file, it takes the .dat file (which now includes the first load's data) and repeats the process. Every time it does that it has to merge the .dat file (twice over), and the .dat file grows. If nothing else I'd call it fragmentation, but I don't think it's as simple as that; I think it's just the way the data is stored. When you use a buffer (and no slices), it only needs to do this once.
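
    In MaxL, the buffered ASO load described above looks roughly like this; the application, database, and file names are placeholders:

```maxl
alter database ASOApp.ASODb initialize load_buffer with buffer_id 1;

import database ASOApp.ASODb data
    from server text data_file 'file1.txt' to load_buffer with buffer_id 1
    on error write to 'load1.err';
import database ASOApp.ASODb data
    from server text data_file 'file2.txt' to load_buffer with buffer_id 1
    on error write to 'load2.err';

import database ASOApp.ASODb data from load_buffer with buffer_id 1;
```

    Both files land in the buffer, and the .dat file is only rewritten once, on the final commit.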

  • Failed to load data to Essbase (IKM SQL to Hyperion Essbase (DATA))

    I am trying to load data into Hyperion Essbase. Unfortunately it is not going so well. I followed the instructions but I get a 'BUFFER_SIZE' error. I have not changed the default BUFFER_SIZE (it is set to <Default>: 80) and I have not changed any other settings in the KM.

    Appreciate any thoughts...



    com.hyperion.odi.common.ODIConstants has no attribute 'BUFFER_SIZE'

    org.apache.bsf.BSFException: exception of Jython:
    Traceback (innermost last):
    File "<string>", line 82, in ?
    AttributeError: class 'com.hyperion.odi.common.ODIConstants' has no attribute 'BUFFER_SIZE'
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.k.a (k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession (SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute (DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i (e.java)
    at com.sunopsis.dwg.cmd.h.y (h.java)
    at com.sunopsis.dwg.cmd.e.run (e.java)
    at java.lang.Thread.run (unknown Source)

    Published by: Chris Rothermel on April 13, 2010 15:44

    Hello

    Which ODI version and patch level are you running?
    It looks like you are using a KM newer than the Java files in the oracledi\drivers directory; I would say you need to update the Essbase Java files to a newer version.
    In the last few patch releases a buffer size setting was added to the Essbase data load KM for use with ASO, but this meant you also had to update the Java files along with the KM version.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Archiving old data from an Essbase database

    Running Essbase 6.5.3 (I know, it's old, but...) on UNIX.

    I have an Essbase database with data going back to 2002. I want to archive all data prior to 2006. I still want this data available to users in a separate database, but want to reduce the amount of data in our primary database. Does anyone have the steps to do this?

    I was intending to make a copy of the active application/database using EAS, then perform a level-0 export/import of the data. My problem is that I don't know exactly how to divide the export files.

    I read a few forum posts that suggested using grep against the export file to divide old data from new data. I know how to use grep, but I don't know exactly what to grep for in the file. Initially I thought I could grep based on the period (Jan_02, Feb_02, etc.), but after looking at the export file, there are lines that do not have these members; they look like some kind of "header" lines. Do I need to include all the header-type lines in both the 'current' and 'old' files?

    Also, FYI, the export file is too large to open in vi, which makes it a little more interesting.

    I wouldn't try to manipulate the export files.

    Simply copy the existing application to a new application along with the data, then open the outline and remove the unwanted year members. Save and restructure the outline and you'll have your archived data.

    Another option would be to use a data load rule to load the export file. In the rule, set the column properties to select or reject the required years, and load the files.
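
    For completeness, the grep split the question describes could be sketched like this. The member names (Jan_02 etc.) and the tiny inline sample file are assumptions; a real run would use the level-0 export itself:

```shell
# Split a level-0 export into pre-2006 and current files by the year suffix
# on the period members. Build a tiny stand-in for the export file first:
printf '%s\n' '"Sales" "Jan_02" 100' '"Sales" "Jan_07" 200' '"Actual"' > export.txt
OLD='_0[2-5]"'                                  # periods from 2002-2005
grep -E  "$OLD" export.txt > old_data.txt       # rows to archive
grep -Ev "$OLD" export.txt > current_data.txt   # 2006+ rows plus header-type lines
# Note: the header-type lines end up in current_data.txt only; they would need
# to be prepended to old_data.txt as well before loading the archive cube.
```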

  • Passing parameters to MaxL to be used in a data load rule

    I would like to create a MaxL script that accepts parameters, with a parameter then used in a given load rule, i.e. it would be used to set the fiscal year field in a data load rule.

    Is it possible somehow? I don't see how I can use parameters in a data load rule, but maybe I could use MaxL to set the parameter as the value of a substitution variable and use it that way?


    Thank you.

    You gave the answer yourself. You cannot pass parameters from a MaxL statement to a load rule, but load rules can use substitution variables (System 9 or newer). Set the sub var's value in a MaxL statement, then in the load rule use that sub var in the header or as a column header.
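
    As a sketch of that approach: MaxL scripts run through essmsh accept positional parameters ($1, $2, ...), so the year can be passed on the command line and pushed into a sub var. The names below are placeholders:

```maxl
/* invoked as:  essmsh load_year.mxl FY16 */
login 'admin' 'password' on 'localhost';

alter database App.Db set variable 'CurYear' '$1';

import database App.Db data
    from server text data_file 'data.txt'
    using server rules_file 'rul'
    on error write to 'load.err';
logout;
```

    The load rule's header (or a column header) then references &CurYear, as the reply describes.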

  • When I try to use Automatic Updates, I get code 8007005. I ran Check Disk and there is no problem there.


    Hello

    See if this helps to fix it:

    How to reset the Windows Update components?

    There is also an automatic 'Fix it' here:

    http://support.Microsoft.com/kb/971058

    Also, try putting the KB numbers into the search on the link below, then manually download them:

    http://www.Microsoft.com/downloads/en/default.aspx

     

    Or please repost your question in the correct Windows Update forum:

    http://answers.Microsoft.com/en-us/Windows/Forum/windows_vista-windows_update?page=1&tab=all

  • Is there a way to partially clear data in a BSO cube using MaxL?

    There is a BSO cube for which I have a script that deletes all data using the MaxL reset data feature. Now we want to delete data only for a given year; any ideas on what the best approach would be using MaxL? Just to clarify, this is also a script that is scheduled to run every day.

    I know we have "clear data in region" for ASO cubes, but is there something similar for BSO?

    Appreciate any help

    You can use the third method from the examples above and do the calc inline:

    execute calculation
    'FIX("2015")
       CLEARBLOCK ALL;
    ENDFIX;'
    on Sample.basic;

    This avoids the need to create a calculation script.

  • Cannot load data into Essbase using ODI

    Hi guys,

    Help please. I have a problem loading data into Essbase using ODI. The error message is
    java.sql.SQLException: unexpected token: ACCOUNT in the statement [select C1_ACCOUNT "" account]

    I have a very simple flat file similar to the below:

    Account, Resource, Period, Data
    Active, Na_Resource, Jan, 10
    Active, Na_Resource, Feb, 12

    With the same flat file, I am able to load the data using load rules.


    I use Essbase 9.3.1.0 and ODI 10.1.3.4.0. I use ODI to load members and data into Planning without any problem.


    Thank you

    Hello

    It seems it is generating an extra set of quotation marks in the SQL; in my interface it generates:

    SQL = "select C1_ACCOUNT 'Account', C2_PERIOD 'Period', C3_RESOURCE 'Resource', C4_DATA 'Data' from "C$_0TestApp_testData" where (1=1)"

    Note the single quotes around the account.

    If you go to Topology Manager, on the Physical Architecture tab, right-click 'Hyperion Planning' > 'Edit'.
    Select the 'Language' tab, and for the 'JYTHON' line make sure the 'Object Delimiter' field has no quotes; if it does, remove them, then apply and save.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Automate loading data from Excel into a table

    Hello Experts,

    I have one of the oldest questions there is: loading an Excel file into a table. I searched a bit here but couldn't find what I was looking for.

    I have an Excel file as follows:
    Product/Retailer     Net Sales     Net Adjustments     Cancels Count     Cancels Amount     Cashes Count     Cashes Amount     Claims Count     Claims Amount     Returns Count     Returns Amount     Free Prize Count     Free Prize Amount     Free Promo Count     Free Promo Amount     Promo Credit Count     Promo Credit Amount     Return Commission     Net Discounts     Total Fees     Sales Commission     Cash Commission     Tkt Charge     Subscription Commission     Interim Sweeps     Net Due     Retailer     Name
    1               0          0          0          0          0          0          0          0          0          0          0               0               0               0               0               0               0               0          0          0               0          0          0               0          0     1          Pseudo Outlet                 
                                                                                                                                                          
    2               0          0          0          0          0          0          0          0          0          0          0               0               0               0               0               0               0               0          0          0               0          0          0               0          0     2          Subscription Outlet           
                                                                                                                                           
    4               0          0          0          0          0          0          0          0          0          0          0               0               0               0               0               0               0               0          0          0               0          0          0               0          0     4          Syndicate Terminal Outlet     
                                                                                                                                           
    10000               0          0          0          0          0          0          0          0          0          0          0               0               0               0               0               0               0               0          0          0               0          0          0               0          0     10000          Keno Draw PC Default Location 
                                                                                                                                                
    Loto               29760          0          0          0          69          -9495          0          0          0          0          0               0               0               0               0               0               0               0          0          -1488               -95          0          0               0          18682     200101          Triolet Popular Store         
    Inst Tk               26000          0          0          0          207          -12220          0          0          0          0          0               0               0               0               0               0               0               0          0          -1300               -48          0          0               0          12432     200101          Triolet Popular Store         
    200101               55760          0          0          0          276          -21715          0          0          0          0          0               0               0               0               0               0               0               0          0          -2788               -143          0          0               0          31114     200101          Triolet Popular Store         
                                                                                                                                                
    200102               0          0          0          0          0          0          0          0          0          0          0               0               0               0               0               0               0               0          0          0               0          0          0               0          0     200102          Friends Fast Food & Take Away 
                                                                                                                                           
    Loto               48440          0          0          0          68          -14689          2          -12129          0          0          0               0               0               0               0               0               0               0          0          -2422               -147          0          0               0          31182     200103          Le Cacharel Snack             
    Inst Tk               26000          0          0          0          230          -14600          0          0          0          0          0               0               0               0               0               0               0               0          0          -1300               -67          0          0               0          10033     200103          Le Cacharel Snack             
    200103               74440          0          0          0          298          -29289          2          -12129          0          0          0               0               0               0               0               0               0               0          0          -3722               -214          0          0               0          41215     200103          Le Cacharel Snack             
         
    I need to automate loading this file's data into my table.

    Any ideas on how to start the process?

    Thank you
    Kevin

    Published by: Kevin CK on April 22, 2010 12:51 AM

    Any idea why?

    ORA-30653: reject limit reached
    Cause: The reject limit has been reached.
    Action: Clean up the data, or increase the reject limit.

    The errors are listed in your log/bad/discard files.
    Have you checked those?

    You can try:

    create table invoice_excel_temp
    (
    Product_name                varchar2(50)
    -- rest of columns...
    )
    ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER
                            DEFAULT DIRECTORY GTECHFILES
                            ACCESS PARAMETERS (FIELDS TERMINATED BY ',')
                            -- ...etc... (you edited your example while I was typing ;) )
                            LOCATION ('invoice_excel_c00472.csv')
                           )
    REJECT LIMIT UNLIMITED;
    

    and see what happens next...

    Published by: hoek on April 22, 2010 10:19

  • How to load a multi-column data file into Essbase using a load rule

    Hello

    I need to load a data file into Essbase which includes data in several columns.

    Here is a sample file.


    Year, Department, Account, Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec
    FY10, Department1, Account1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
    FY10, Department2, Account1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
    FY10, Department3, Account1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12


    Thank you
    Sirot

    But this isn't an error as such; it means that no data values were changed, probably because they already exist in the database.
    If there were any rejections, they would be in an error file.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Loading Oracle data to Essbase using ODI fails

    I am trying to load data from an Oracle table to Essbase, but I get the following error.

    org.apache.bsf.BSFException: exception of Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData (unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (unknown Source)
    at java.lang.reflect.Method.invoke (unknown Source)
    at org.python.core.PyReflectedFunction.__call__ (PyReflectedFunction.java)
    at org.python.core.PyMethod.__call__ (PyMethod.java)
    at org.python.core.PyObject.__call__ (PyObject.java)
    at org.python.core.PyInstance.invoke (PyInstance.java)
    at org.python.pycode._pyx3.f$0(<string>:23)
    at org.python.pycode._pyx3.call_function(<string>)
    at org.python.core.PyTableCode.call (PyTableCode.java)
    at org.python.core.PyCode.call (PyCode.java)
    at org.python.core.Py.runCode (Py.java)
    at org.python.core.Py.exec (Py.java)
    at org.python.util.PythonInterpreter.exec (PythonInterpreter.java)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
    at com.sunopsis.dwg.codeinterpretor.k.a (k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession (SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute (DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i (e.java)
    at com.sunopsis.dwg.cmd.h.y (h.java)
    at com.sunopsis.dwg.cmd.e.run (e.java)
    at java.lang.Thread.run (unknown Source)
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns (unknown Source)
    ... 32 more

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.k.a (k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession (SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute (DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i (e.java)
    at com.sunopsis.dwg.cmd.h.y (h.java)
    at com.sunopsis.dwg.cmd.e.run (e.java)
    at java.lang.Thread.run (unknown Source)


    Also, I have the following doubts:

    1. What should the LKM be for the source? Is it LKM SQL to SQL?

    2. I have 6 standard dimensions in the Oracle table, but Currency and HSP_Rates also exist in Essbase. How can I ignore these two dimensions when loading data? Or is there a way to set default values for these two dimensions?

    3. Can I load data from several tables into Essbase? If yes, please let me know the procedure.

    Hello

    In the interface, on the target datastore, you can enter a value in the mapping field.
    For example, if you have a column called Year in your target datastore and you always want to load FY09, you would enter "FY09" in the mapping field.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Loading data on the Essbase server

    I'm writing a set of MaxL commands to automate a metadata update. Here's what I did:

    1. export database 'App'.'Db' level0 data to server data_file 'ABC.txt';   -> to preserve the level-0 data

    2. import database 'App'.'Db' dimensions ...   -> to load the dimensions

    3. import database 'App'.'Db' data from server text data_file 'ABC.txt';   -> to reload the level-0 data

    Now I get 'Syntax error near end of statement' on this last step, even after trying 'server data_file' too. I had read some examples using '$ARBORPATH', but that doesn't seem to help. This is on 11.1.2.4.

    Perhaps you are missing the error clause on your import; the following works:

    import database sample.basic data from server data_file 'level0.txt' on error abort;

    Also, environment variables will be picked up from the machine that is running the MaxL session.

    Cheers

    John

  • Data load: converting a timestamp

    I am using APEX 5.0 and the Data Load wizard to transfer data from Excel.

    I defined Update_Date as timestamp(2), and the data in Excel is 2015/06/30 12:21:57.

    NLS_TIMESTAMP_FORMAT is DD/MM/RR HH12:MI:SSXFF AM

    I created the transformation rule for update_date as a PL/SQL function body, as below:

    declare
      l_input_from_csv varchar2(25) := to_char(:update_date, 'MM/DD/YYYY HH12:MI:SS AM');
      l_date timestamp;
    begin
      l_date := to_timestamp(l_input_from_csv, 'MM/DD/YYYY HH12:MI:SS AM');
      return l_date;
    end;

    I keep getting a transformation rule error.  If I don't create a transformation rule, it fails with "invalid month".

    Please help me to fix this.

    Thank you very much in advance!

    Hi DorothySG,

    DorothySG wrote:

    Please test the data load on the demo application.

    I pasted a few examples into the page instructions; you can use the copy and paste function, and it will give the same result.

    Please change the comma separator ',' or you will get a "do not load" message.

    Check your application 21919. I changed the transformation rule:

    Data is loading properly.

    Kind regards

    Kiran

  • Ignoring zeros and missing values in ASO data loads

    Hello

    There is an option to ignore zero values and missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the Tech Reference.

    I have 12 months across the columns in my data file. At least a quarter of my data is zeros. Ignoring zeros keeps the cube smaller and faster.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is, it's hidden in the Alter Database (Aggregate Storage) command, where you create the data load buffer.  If you are not sure what a data load buffer is, see Loading Data Using Buffers.
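
    The MaxL shape of that, with the ignore properties set when the buffer is created (application, file, and rule names are placeholders):

```maxl
alter database ASOApp.ASODb initialize load_buffer with buffer_id 1
    property ignore_missing_values, ignore_zero_values;

import database ASOApp.ASODb data
    from server text data_file 'data.txt'
    using server rules_file 'rul' to load_buffer with buffer_id 1
    on error write to 'load.err';

import database ASOApp.ASODb data from load_buffer with buffer_id 1;
```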
