Error loading data ODI

Hi John,

I tried to load data from a .csv file into Essbase, but the following error occurs:

AttributeError: class 'com.hyperion.odi.common.ODIConstants' has no attribute 'PRE_LOAD_MAXL_SCRIPT '.

I left the pre-load MaxL script option at its default value in the properties of the IKM.
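
Note the trailing space inside the quoted attribute name: in Jython, that alone is enough to raise this AttributeError. A minimal plain-Python illustration (the class here is just a stand-in, not the real com.hyperion.odi.common.ODIConstants):

# Stand-in class, illustration only.
class ODIConstants(object):
    PRE_LOAD_MAXL_SCRIPT = "PRE_LOAD_MAXL_SCRIPT"

print(getattr(ODIConstants, "PRE_LOAD_MAXL_SCRIPT"))   # works
print(getattr(ODIConstants, "PRE_LOAD_MAXL_SCRIPT "))  # AttributeError: name ends in a space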

Please suggest.

Thank you
Sravan

All that help and only one answer marked as helpful; maybe you could be a bit more generous, seeing as I spend a lot of time trying to help :)

See you soon

John
http://John-Goodwin.blogspot.com/

Tags: Business Intelligence

Similar Questions

  • ODI error - running MaxL in an ODI data load

    I have an ODI interface that works very well to load data into an Essbase cube. However, I had to add a step that runs a calc script before loading, to clear data for the current year and period. I built the MaxL script, tested it successfully, and it works OK. So in the options on my target in the flow section, I added this entry:

    PRE_LOAD_MAXL_SCRIPT: C:\ODI_Data\Scripts\MaxL\clr_act.mxl

    When I try to run it I get the error below. Any ideas what it could be? It asks for the full path of the MaxL script, so I thought that's what it wanted. Is the problem that I haven't referenced it correctly?

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
      File "<string>", line 89, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)
    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error occurred while running MaxL script. Error message is:
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
      File "<string>", line 89, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)
    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error occurred while running MaxL script. Error message is:
    at org.python.core.PyException.fillInStackTrace(PyException.java:70)
    at java.lang.Throwable.<init>(Throwable.java:181)
    at java.lang.Exception.<init>(Exception.java:29)
    at java.lang.RuntimeException.<init>(RuntimeException.java:32)
    at org.python.core.PyException.<init>(PyException.java:46)
    at org.python.core.PyException.<init>(PyException.java:43)
    at org.python.core.Py.JavaError(Py.java:455)
    at org.python.core.Py.JavaError(Py.java:448)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
    at org.python.core.PyObject.__call__(PyObject.java:355)
    at org.python.core.PyMethod.__call__(PyMethod.java:215)
    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
    at org.python.core.PyMethod.__call__(PyMethod.java:206)
    at org.python.core.PyObject.__call__(PyObject.java:397)
    at org.python.core.PyObject.__call__(PyObject.java:401)
    at org.python.pycode._pyx0.f$0(<string>:89)
    at org.python.pycode._pyx0.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java:165)
    at org.python.core.PyCode.call(PyCode.java:18)
    at org.python.core.Py.runCode(Py.java:1204)
    at org.python.core.Py.exec(Py.java:1248)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
    ... 19 more
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Error occurred while running MaxL script. Error message is:
    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)
    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
    ... 33 more
    Caused by: com.essbase.api.base.EssException: Error occurred while running MaxL script. Error message is:
    at com.hyperion.odi.essbase.wrapper.EssbaseConnection.executeMaxl(Unknown Source)
    ... 40 more

    Log in to Oracle Support and search for document 1152893.1.

    See you soon

    John

    http://John-Goodwin.blogspot.com/

  • Error loading Planning data with ODI

    Hello

    I am trying to load data into Planning from a table. I get the following error at the 'Load data into planning' step:

    com.hyperion.odi.planning.ODIPlanningException: com.hyperion.odi.planning.ODIPlanningException: java.lang.RuntimeException: HyperionPlanningBean::beginLoad: could not derive local systemCfg format for loading data.

    Code:

    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    #
    # Get the select statement on the staging area:
    #
    sql = """select ltrim(rtrim(C2_NODE)) "Project", ltrim(rtrim(C3_PARENT)) "Parent",
    case when ltrim(rtrim(C4_ALIAS_2)) is null then '#missing' else ltrim(rtrim(C4_ALIAS_2)) end "Alias: Alias 2",
    case when ltrim(rtrim(C5_ALIAS_1)) is null then '#missing' else ltrim(rtrim(C5_ALIAS_1)) end "Alias: Default",
    '#missing' "Valid For Consolidations", ltrim(rtrim(C6_DATA_STORAGE)) "Data Storage",
    ltrim(rtrim(C7_TWO_PASS_CALCULATION)) "Two Pass Calculation",
    case when ltrim(rtrim(C8_DESCRIPTION)) is null then '#missing' else ltrim(rtrim(C8_DESCRIPTION)) end "Description",
    '#missing' "Formula", '#missing' "UDA", '#missing' "Smart List", C1_DATA_TYPE "Data Type",
    '#missing' "Operation", ltrim(rtrim(C9_AGGREGATION_1)) "Aggregation (F_AdSale)"
    from "C$_0Project" where (1=1)"""

    srcCx = odiRef.getJDBCConnection("SRC")

    stmt = srcCx.createStatement()

    srcFetchSize = 30

    stmt.setFetchSize(srcFetchSize)

    rs = stmt.executeQuery(sql)

    # load data (pWriter is created in an earlier step of the KM)
    stats = pWriter.loadData(rs)

    # close the database result set and statement
    rs.close()
    stmt.close()

    Any help will be appreciated

    -Kash

    I have not seen that error before, but there is a doc on Oracle Support that highlights the error 'could not derive local systemCfg format for loading data' [ID 1479461.1].
    It is not directly related to ODI or your problem, but the workaround may help in your case.

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Should we drop ODI data loads and move to Essbase load rules?

    Hello
    We use ODI to load data into Planning 11.1.1.3 Essbase cubes.
    Should we move our data loading from ODI to Essbase load rules?
    We see no advantage of ODI over MaxL scripts calling Essbase load rules and calc scripts.
    One advantage it does have over Essbase is that it can refresh Planning. Can it load supporting detail and cell text?

    Say you have data from multiple sources - some data sets are file based, some are stored in warehouses or ledgers - and you want to collect that data, transform it and apply mappings, load it into Essbase, and then add automation around it all. That is one example of using ODI to load data into Essbase; there are many others.
    Yes, you can do it with MaxL, but you will eventually have to put in more effort to add the automation, mappings, error handling and so on than you would if you had gone down the ODI route (a sketch of that hand-rolled automation follows).
    I am not saying that you must use it; if you don't see any benefit, there is no point.
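
    For a flavour of the MaxL route, here is a minimal wrapper of the kind you end up writing around essmsh; the script path and error handling are assumptions, not from the original thread:

    # Sketch: hand-rolled automation around essmsh (all names/paths assumed).
    import subprocess
    import sys

    result = subprocess.run(
        ["essmsh", "/scripts/load_actuals.mxl"],  # hypothetical MaxL script
        capture_output=True,
        text=True,
    )
    # The essmsh output has to be scanned for errors as well as the exit code,
    # which is exactly the kind of plumbing ODI gives you for free.
    if result.returncode != 0 or "ERROR" in result.stdout:
        sys.exit("MaxL load failed:\n" + result.stdout + result.stderr)
    print("MaxL load completed")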

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Error loading data

    Hi all

    I am getting the error below:

    ERROR - 1003007 - Data Value [0] Encountered Before All Dimensions Selected, [1] Records Completed. ERROR - 1241101 - Unexpected Essbase error 1003007. 

    Please suggest

    Check the data load file. If any dimension in your database is not specified in the data load file, make sure that you specify it in a header.

    Possible solutions

    Make sure that the data source is valid.

    Is a member of every dimension properly specified in the data source or rules file?

    Is the numeric data field at the end of the record? If not, move it to the end in the data source, or move it using the rules file.

    Are members that could be read as numbers (e.g. "100") surrounded by quotes in the data source?

    If you use a header, is it properly configured? Remember that you can add missing dimension names to the header; see the example after this list.

    Does the data source contain extra spaces or tabs?

    Was the outline update saved?
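
    For example, with a Sample.Basic-style outline (purely a sketch, not the poster's data): if Scenario and Measures never appear in the data records, a header record naming those members makes every dimension known before the first data value is reached:

    Actual Sales
    Jan "New York" "100-10" 678
    Feb "New York" "100-10" 645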

    I hope that helps!

    Good luck!

  • Error loading data from Oracle 10g to Excel

    Hello gurus,

    I get an error when loading data from an Oracle 10g database table to Excel.
    I am using the IKM SQL to SQL Append.
    The staging area is different from the target, and I use the same logical schema for the source and the staging area.
    Please help me...

    Error is

    0: S1000: sun.jdbc.odbc.JdbcOdbcBatchUpdateException: [Microsoft][ODBC Excel Driver] Operation must use an updateable query.
    sun.jdbc.odbc.JdbcOdbcBatchUpdateException: [Microsoft][ODBC Excel Driver] Operation must use an updateable query.

    at sun.jdbc.odbc.JdbcOdbcPreparedStatement.emulateExecuteBatch (unknown Source)

    at sun.jdbc.odbc.JdbcOdbcPreparedStatement.executeBatchUpdate (unknown Source)

    at sun.jdbc.odbc.JdbcOdbcStatement.executeBatch (unknown Source)

    at com.sunopsis.sql.SnpsQuery.executeBatch (SnpsQuery.java)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders (SnpSessTaskSql.java)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (SnpSessTaskSql.java)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (SnpSessTaskSqlI.java)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (SnpSessTaskSql.java)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (SnpSessStep.java)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession (SnpSession.java)

    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (DwgCommandSession.java)

    at com.sunopsis.dwg.cmd.DwgCommandBase.execute (DwgCommandBase.java)

    at com.sunopsis.dwg.cmd.e.i (e.java)

    at com.sunopsis.dwg.cmd.g.y (g.java)

    at com.sunopsis.dwg.cmd.e.run (e.java)

    at java.lang.Thread.run (unknown Source)


    Kindly help me...

    Thanks in advance.

    Samuel

    Hello

    When you created the DSN for the Excel worksheet, you did not uncheck the read-only box.
    Make sure that the read-only checkbox is unchecked; otherwise you will get an error when trying to insert data into the Excel worksheet (error: "[Microsoft][ODBC Excel Driver] Operation must use an updateable query."). A small sketch of the failure mode follows.
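
    To see the failure in isolation, here is a minimal Jython sketch over the same sun.jdbc.odbc bridge that appears in the stack trace above; the DSN and sheet names are assumptions, not from the thread:

    # Illustration only: an INSERT through a read-only Excel DSN fails with
    # "Operation must use an updateable query." (requires Java <= 7 for the bridge)
    from java.lang import Class
    from java.sql import DriverManager

    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver")
    conn = DriverManager.getConnection("jdbc:odbc:EXCEL_DSN")  # hypothetical DSN name
    stmt = conn.createStatement()
    # Fails if the DSN was created with the read-only box checked:
    stmt.executeUpdate("insert into [Sheet1$] (COL1) values ('x')")
    conn.close()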

    Thank you
    Fati

  • Error loading data with a load rule file in 11.1.2.1

    When I do the dim build, the parent/child and alias will not load.

    Reading rule SQL information for database [DP]
    Reading rules from rule object for database [DP]
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    DATAERRORLIMIT reached [1000]. Rejected records will no longer be logged
    No data values modified by load of this data file
    Data load elapsed time for [Accounts01.txt] with [ACCOUNT.rul]: [0.512] seconds
    There were errors, look in C:\Oracle\Middleware\EPMSystem11R1\products\Essbase\eas\client\dataload.err
    Database import completed ['RPIWORK'.'DP']
    Output columns prepared: [0]

    Outline = Measure dimension

    Measure
    Balance Sheet
    Profit & Loss

    Data file

    is a SQL query exported to Excel and saved as text (tab delimited)

    Account.txt

    Parent | Child | Alias
    Balance Sheet   10000   Total Assets
    Balance Sheet   20000   Total Liabilities
    Balance Sheet   30000   Total Owner's Equity
    Profit & Loss   40000   Revenues
    Profit & Loss   50000   Cost of Goods Sold
    Profit & Loss   60000   S.G.&A.
    Profit & Loss   70000   Other Operating Income
    Profit & Loss   80000   Other Expenses
    Profit & Loss   85000   Interest Income/Expenses
    Profit & Loss   90000   Taxes
    Profit & Loss   99000   Capitalized Contra
    10000   10100   Short-Term Assets
    10000   14000   Land, Timber and Roads
    10000   17000   Property, Plant and Equipment
    10000   19000   Other Long-Term Assets
    10000   19750   Deferred Tax Assets
    20000   20001   Short-Term Liabilities
    20000   21200   Other Long-Term Liabilities
    20000   22000   Timber Contracts
    20000   25100   Long-Term Debt
    20000   26000   Deferred Tax Liability
    30000   30001   Subsidiary Equity
    30000   30250   Voting Common Stock
    30000   30350   Common Stock
    30000   30450   Additional Paid-In Capital
    30000   30550   Retained Earnings
    30000   30610   Other Comprehensive Income Items
    30000   30675   Tax Distribution to Shareholde
    40000   40055   Wholesales
    40000   41500   Chip Revenues
    40000   43000   Power Sales
    40000   45000   Nursery Income
    40000   46000   Log Transfers
    40000   47000   Intraco Sales
    40000   49000   Sales Deductions
    50000   51000   Labor
    50000   52000   Raw Materials
    50000   52800   Rental
    50000   53000   Operating Expenses
    50000   53900   Forestry Supplies
    50000   53999   Capitalized Contra - Forestry
    50000   54000   Maintenance
    50000   55000   Fuels & Lubricants
    50000   56000   Utilities
    50000   57000   Direct Logging Cost
    50000   57500   Custom Services
    50000   57700   Depletion
    50000   58000   Cost of Goods Sold Allocations
    50000   59000   Fixed Costs
    50000   59510   Changes in Inventories
    60000   60100   Salaries
    60000   60300   PC Hardware Maintenance
    60000   60400   Other G & A
    60000   61000   Licenses/Fees/Charges
    60000   61400   Benefits
    60000   61550   Furniture/Fixtures
    60000   61750   Legal
    60000   62000   Office Expenses
    60000   62500   Professional Services
    60000   63000   Pre & Post Employment Activities
    60000   63200   Telecommunication Costs
    60000   63550   Employee Activities
    60000   63800   Sales & Promotions
    60000   63900   Banking Fees
    60000   64000   Admin Depreciation
    60000   64500   Insurance and Property Taxes
    60000   65000   S G & A Allocations
    60000   66000   Outside Management
    70000   70100   Rental Income
    70000   70200   Disposals of Fixed Assets
    70000   70400   Misc Income
    80000   80200   Idle Plant
    85000   85001   Interest Expense
    85000   85200   Interest Income
    90000   90100   Income Tax Expense

    error file

    ------ Member 20000 not found in the database
    20000   25100   Long-Term Debt

    ------ Member 20000 not found in the database
    20000   26000   Deferred Tax Liability

    ------ Member 30000 not found in the database
    30000   30001   Subsidiary Equity

    ------ Member 30000 not found in the database
    30000   30250   Voting Common Stock

    ------ Member 30000 not found in the database
    30000   30350   Common Stock

    ------ Member 30000 not found in the database
    30000   30450   Additional Paid-In Capital

    ------ Member 30000 not found in the database
    30000   30550   Retained Earnings

    ------ Member 30000 not found in the database
    30000   30610   Other Comprehensive Income Items

    ------ Member 30000 not found in the database
    30000   30675   Tax Distribution to Shareholde

    ------ Member 40000 not found in the database
    40000   40055   Wholesales

    ------ Member 40000 not found in the database
    40000   41500   Chip Revenues

    ------ Member 40000 not found in the database
    40000   43000   Power Sales

    ------ Member 40000 not found in the database
    40000   45000   Nursery Income

    ------ Member 40000 not found in the database
    40000   46000   Log Transfers

    ------ Member 40000 not found in the database
    40000   47000   Intraco Sales

    ------ Member 40000 not found in the database
    40000   49000   Sales Deductions

    ------ Member 50000 not found in the database
    50000   51000   Labor

    ------ Member 50000 not found in the database
    50000   52000   Raw Materials

    ------ Member 50000 not found in the database
    50000   52800   Rental

    ------ Member 50000 not found in the database
    50000   53000   Operating Expenses

    ------ Member 50000 not found in the database
    50000   53900   Forestry Supplies

    ------ Member 50000 not found in the database
    50000   53999   Capitalized Contra - Forestry

    ------ Member 50000 not found in the database
    50000   54000   Maintenance

    ------ Member 50000 not found in the database
    50000   55000   Fuels & Lubricants

    ------ Member 50000 not found in the database
    50000   56000   Utilities

    ------ Member 50000 not found in the database
    50000   57000   Direct Logging Cost

    ------ Member 50000 not found in the database
    50000   57500   Custom Services

    ------ Member 50000 not found in the database
    50000   57700   Depletion

    ------ Member 50000 not found in the database
    50000   58000   Cost of Goods Sold Allocations

    ------ Member 50000 not found in the database
    50000   59000   Fixed Costs

    ------ Member 50000 not found in the database
    50000   59510   Changes in Inventories

    ------ Member 60000 not found in the database
    60000   60100   Salaries

    ------ Member 60000 not found in the database
    60000   60300   PC Hardware Maintenance

    ------ Member 60000 not found in the database
    60000   60400   Other G & A

    ------ Member 60000 not found in the database
    60000   61000   Licenses/Fees/Charges

    ------ Member 60000 not found in the database
    60000   61400   Benefits

    ------ Member 60000 not found in the database
    60000   61550   Furniture/Fixtures

    ------ Member 60000 not found in the database
    60000   61750   Legal

    ------ Member 60000 not found in the database
    60000   62000   Office Expenses

    ------ Member 60000 not found in the database
    60000   62500   Professional Services

    ------ Member 60000 not found in the database
    60000   63000   Pre & Post Employment Activities

    ------ Member 60000 not found in the database
    60000   63200   Telecommunication Costs

    ------ Member 60000 not found in the database
    60000   63550   Employee Activities

    ------ Member 60000 not found in the database
    60000   63800   Sales & Promotions

    ------ Member 60000 not found in the database
    60000   63900   Banking Fees

    ------ Member 60000 not found in the database
    60000   64000   Admin Depreciation

    ------ Member 60000 not found in the database
    60000   64500   Insurance and Property Taxes

    ------ Member 60000 not found in the database
    60000   65000   S G & A Allocations

    ------ Member 60000 not found in the database
    60000   66000   Outside Management

    ------ Member 70000 not found in the database
    70000   70100   Rental Income

    ------ Member 70000 not found in the database
    70000   70200   Disposals of Fixed Assets

    ------ Member 70000 not found in the database
    70000   70400   Misc Income

    ------ Member 80000 not found in the database
    80000   80200   Idle Plant

    ------ Member 85000 not found in the database
    85000   85001   Interest Expense

    ------ Member 85000 not found in the database
    85000   85200   Interest Income

    ------ Member 90000 not found in the database
    90000   90100   Income Tax Expense


    This is how I build my load rule file:

    Create -> Rules file
    File -> Open data file ->

    account01.txt

    Field -> Dimension Build Properties

    Dimension = Measure
    Field 1: Type = Parent
    Field 2: Type = Child
    Field 3: Type = Alias

    Click Dimension Build Fields
    Click on Dimension Build settings - Parent/Child
    OK

    Validate - rule is correct

    Save as ACCOUNT

    Load the data file = Account.txt
    Rule file = ACCOUNT

    OK


    Load the data in EAS by right-clicking on the database and selecting Load Data. One question - is the member already in the outline? If it isn't, you will have problems; you would have to add it, or have lines at the top of the load file with something like Account, Balance Sheet in them. Also, in the load rule, have you changed the dimension build settings for the Accounts dimension to parent/child? Quite often people don't realize they have to double-click the dimension name to make sure it gets set as the dimension that gets changed.

    I'm sure your question is about loading the data and not the dim build, but the dim build might just be the first problem; see the sketch of the extra header rows below.
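
    Purely as an illustration (tab-separated, reusing the members from the outline described above): ancestor rows added at the top of Account.txt so each parent exists before its children are built:

    Measure         Balance Sheet
    Measure         Profit & Loss
    Balance Sheet   10000   Total Assets
    ...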

  • Loading data with ODI

    Hi all
    I need to load data into Planning with ODI.
    The data file contains duplicate elements (account) with different values for the same intersection, and at the moment the ODI load keeps the first item it finds and rejects the others.
    So I would like to know if it is possible with ODI to load the data using the sum of these duplicate elements.

    Thanks for the help.

    Best regards
    Eugenio Gualtieri

    Hello

    I think I see what you're saying: each record is getting overwritten in Planning rather than accumulated.

    You can get ODI to do the work for you. What you should do first:

    In your source datastore (which I assume is a file), make sure the W01 and W02 columns are numeric and not string.
    In your target datastore (the Planning dimension), make sure the W01 and W02 columns are set to numeric.

    In your interface, in the target for the W01 and W02 columns, you must surround what you currently have with SUM(column).

    For example, if your source is called FILE, the target for W01 would be SUM(FILE.W01) and for W02 it would be SUM(FILE.W02).
    Because you use the SUM function, ODI automatically adds a GROUP BY to the query, which summarizes the results; a sketch of the effect on the generated query follows.
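
    Following the pattern of the KM scripts quoted elsewhere in this thread, the Load data step then effectively runs something like this (column and datastore names are illustrative; odiRef and the C$ work table only exist inside an ODI session):

    # Illustrative only: note the GROUP BY that ODI adds once SUM() is used
    # on the target columns.
    sql = """select ACCOUNT "Account", sum(W01) "W01", sum(W02) "W02"
    from "C$_0LOAD_DATA" where (1=1)
    group by ACCOUNT"""

    srcCx = odiRef.getJDBCConnection("SRC")  # odiRef is supplied by the ODI agent
    stmt = srcCx.createStatement()
    rs = stmt.executeQuery(sql)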

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Error loading data to an ASO cube via MaxL

    Hi team,

    I have a huge data file and I am trying to load it into an ASO cube. I did it manually with a load rules file. To automate the same thing, I wrote the MaxL below, and I get errors like:

    "Query to the server failed with error code [1003086]. Unexpected Essbase error 1013295."

    MaxL query:

    alter database 'app'.'Cube' initialize load_buffer with buffer_id 1;   (statement executed successfully)

    import database 'app'.'Cube' data from server data_file '/home/hypadmin/Oracle/Middleware/user_projects/epmsystem2/EssbaseServer/essbaseserver1/app/abc/abccube/Dataload.txt' using server rules_file 'load.rul' to load_buffer with buffer_id 1 on error write to '/home/hypadmin/Oracle/Middleware/user_projects/epmsystem2/EssbaseServer/essbaseserver1/app/abc/abccube/exp_16thAug.err';

    "Query to the server failed with error code [1003086]."

    "Unexpected Essbase error 1013295."

    import database 'app'.'Cube' data from load_buffer with buffer_id 1;   (statement executed successfully)


    Additionally, I have another question.

    For ASO, can I use a MaxL statement like the one below to load the data without using a buffer?

    import database 'app'.'Cube' data from server data_file '/home/hypadmin/Oracle/Middleware/user_projects/epmsystem2/EssbaseServer/essbaseserver1/app/abc/abccube/Dataload.txt' using server rules_file 'Load.rul' on error write to '/home/hypadmin/Oracle/Middleware/user_projects/epmsystem2/EssbaseServer/essbaseserver1/app/abc/abccube/exp_16thAug.err';


    Thanks in advance





    Ah, since you had given the full path of the data file, I thought earlier that you were running on the Essbase server. That's why I asked you to omit the keyword "server".

    The right way to do it, if running the script on a local computer:

    (1) Copy the data file to the local computer and refer to that path in the script. In this case you don't need the keyword "server".

    (2) Use the keyword "server" and don't give the full path, just the file name. I'm a little rusty on this point because I have not used this option in ages, but I think the file must be in the database directory (which is what you have), the application directory (abc), or under the ARBORPATH (.../essbaseserver1/app/).

    Andy

  • Error loading data using SQL*Loader

    I get an error like "SQL*Loader-350: illegal combination of non-alphanumeric characters" while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:

    sqlldr userid=<username>/<password> control=data.ctl

    The control file data.ctl is:

    LOAD DATA

    INFILE '/home/oraprod/data.txt'

    APPEND INTO TABLE test

    {

    empid terminated by ',',

    fname terminated by ',',

    lname terminated by ',',

    salary terminated by whitespace

    }

    The data.txt file is:

    1, Kaushal, Hamad, 5000

    2, Chetan, Hamad, 1000

    Hopefully, my question is clear.

    Please get back with the answer to my query.

    Regards

    Replace "{" by "("dans votre fichier de contrôle) "

    LOAD DATA

    INFILE 'c:\data.txt'

    APPEND INTO TABLE emp_t

    (

    empid terminated by ',',

    fname terminated by ',',

    lname terminated by ',',

    salary terminated by whitespace

    )

    C:\>sqlldr user/pwd@database control = c.ctl

    SQL * Loader: release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013

    Copyright (c) 1982, 2005, Oracle.  All rights reserved.

    Commit the point reached - the number of logical records 1

    Commit the point reached - the number of logical records 2

    SQL> select * from emp_t;

    EMPID      FNAME                LNAME                SALARY

    ---------- -------------------- -------------------- ----------

    1 kone hamadi 5000

    2 Chetan Hamad 1000

    Best regards

    Mohamed Houri

  • Error loading from ODI into Essbase

    Hi all

    I tried to ETL data from a SQL table into Essbase, and I got the following error:

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
      File "<string>", line 26, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold: 1

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2457)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:662)

    It looks like you have set the KM option to ': 0'; you need to remove the ':' and just leave 0.

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Error loading data to Essbase through FDM with adapter ES9x-G4-A

    Hello
    When I try to load data to Essbase with the ES9x-G4-A adapter in FDM 9.3.1, it gives me an error in the Export step. Import and Validate are successful; only Export errors out.

    Error: export failed.
    Detail: Object required: 'API.IntBlockMgr.IntegrationMgr.PobjIntegrate.varCon'
    Line: 622


    When I checked through the FDM client (Workbench), I did not find the varCon variable in the FDM objects.
    Am I missing something? Let me know.

    Please answer soon.


    Thank you

    Resolution method:

    1. Go to the FDM Workbench client
    2. Log in to the appropriate application as a power user
    3. Navigate to the Adapters tab and expand the target system adapters and the Essbase adapter (ES9x-G4*)
    4. Expand the Actions folder and double-click the Export action to open the script
    5. Go to line 357 in the Export action script (see the bug for the exact code of that line)
    6. Place a single apostrophe (') at the beginning of the line to comment out the premature destruction of the object
    7. Save the script by clicking the diskette icon in the toolbar
    8. Test the export again
    9. If the issue is resolved, please migrate the fix to the other environments as appropriate

  • Error loading bitmap data to BitmapObject

    I get this message while trying to open the site I have worked on for weeks: "Error loading bitmap data to BitmapObject", and then Muse closes. Is it possible to fix this, or have I just lost weeks of work? I have uninstalled and reinstalled (including preferences) and inspected the individual Muse files several times, with the same result. (Not that this is driving me crazy or anything.)

    Hi CapnJack,

    I would like to check your .muse file.

    Please email it to me at [email protected] with the link to this thread in the subject line.

    If your file is more than 20 MB, you can use a file sharing service like Dropbox, Creative Cloud or WeTransfer and share the download link with me.

    Best regards

    Ankush

  • ODI 11g loading data to Hyperion - error ODI-1228

    Hi friends,

    I'm using ODI 11g to load data into Hyperion Essbase using the Sunopsis Memory Engine as the staging area. I created several interfaces, but some fail at the third step of the process, '3 - Loading - SrcSet0 - Load data'.

    A number of the interfaces I created worked without problems using the same configuration, topology and KM, so I thought the problem was caused by the file I am extracting from. That file failed in other interfaces too, but if I duplicate the last line of the file (copy and paste it at the end), the interfaces for that file complete.

    After the first appearance of the error I created other interfaces; most of them work without problems, but a couple of them generate exactly the same error as the one described, using different files.

    I use:

    ODI 11g 11.1.1.5.2.

    IKM: IKM SQL to Hyperion Essbase (DATA).

    HSQLDB: Version 2.0.0.

    Essbase: 11.1.2.1.

    Cube type: BSO.

    The error that ODI generates is:

    ODI-1228: Task SrcSet0 (Loading) fails on the target SUNOPSIS_ENGINE connection SUNOPSIS_MEMORY_ENGINE.

    Caused by: java.sql.SQLException: statement is not in batch mode

    at org.hsqldb.jdbc.Util.sqlException (unknown Source)

    at org.hsqldb.jdbc.Util.sqlException (unknown Source)

    at org.hsqldb.jdbc.Util.sqlExceptionSQL (unknown Source)

    at org.hsqldb.jdbc.JDBCPreparedStatement.executeBatch (unknown Source)

    at oracle.odi.runtime.agent.execution.sql.SQLCommand.end(SQLCommand.java:267)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.endExecution(SQLExecutor.java:156)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.endExecution(SQLExecutor.java:1)

    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:113)

    ...

    ...

    ...

    I hope that you can guide me to find a solution.

    Thanks for all, best regards.

    I had the same problem in a seemingly random way. I could not find any resolution, so the safest option is to choose a different staging technology.
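
    One pattern that would fit the "duplicate the last line and it works" workaround, offered only as a guess: if the number of rows is an exact multiple of the batch size, the final flush can call executeBatch() on an empty batch, and HSQLDB raises exactly this exception. A minimal Jython illustration against an in-memory HSQLDB (nothing here is ODI-specific; the table and URL are made up):

    # Guesswork illustration: HSQLDB complains "statement is not in batch mode"
    # when executeBatch() runs although nothing was ever added to the batch.
    from java.lang import Class
    from java.sql import DriverManager

    Class.forName("org.hsqldb.jdbc.JDBCDriver")
    conn = DriverManager.getConnection("jdbc:hsqldb:mem:test", "sa", "")
    stmt = conn.createStatement()
    stmt.execute("create table t (c integer)")

    ps = conn.prepareStatement("insert into t values (?)")
    # ps.addBatch() is never called, so the batch is empty:
    ps.executeBatch()  # raises java.sql.SQLException: statement is not in batch mode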

  • ODI: error loading data to HFM: invalid dimension name

    Hello

    I am fairly new to ODI and I was wondering if any guru could help me overcome an issue I am facing. I am trying to load data from a csv file into HFM. I chose the appropriate KMs (LKM File to SQL and IKM SQL to Hyperion Financial Management Data), with the Sunopsis Memory Engine as the staging area.

    To make things easier, the csv file has exactly the same structure as the HFM application's dimensions, and the columns have been mapped in the interface as shown below:

    Source file column - HFM target column

    Scenario - Scenario
    Year - year
    Display - display
    Entity - entity
    Value - value
    Account - account
    PIC - PIC
    CUSTOM1 - Custom1
    CUSTOM2 - Custom2
    Custom3 - Custom3
    Custom4 - Custom4
    Period - Period
    DataValue - Datavalue
    Description - Description (no source column, mapped as '')

    The csv file contains base-level members only. I set the error log file path, and when I run the interface I get an error. When I open the error log, I see the following messages:

    Line: 1, Error: Invalid dimension name
    !Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_CUSTOM1, C9_CUSTOM2, C10_CUSTOM3, C11_CUSTOM4, C12_PERIOD, C13_DATAVALUE
    C1_SCENARIO
    Line: 3, Error: A valid column order is not specified.
    Actual;2007;YTD;20043;<Entity Currency>;13040;[ICP None];[None];1000;[None];[None];Jan;512000;""
    >>>>>>



    I'm not sure how to solve this, as the interface mapping matches the dimensions on a 1:1 basis. In addition, the target column names correspond to the dimension names of the HFM application (from which the datastore was reverse-engineered).

    Help, please!

    Thank you very much
    Jorge


    Dear Experts,

    ODI: 11.1.1.6.0
    HFM: 9.3.3

    I have also run into a similar error to the OP's.

    In my case, the error occurs when I use SUNOPSIS_MEMORY_ENGINE as the staging area. If I simply change the staging to an Oracle schema, the interface loads the data into HFM successfully. So I am curious what stops SUNOPSIS from working as the staging area for an HFM load.

    This shows up in the IKM SQL to Hyperion Financial Management Data log file:

    Load data started: 3/14/2013 13:41:11.
    Line: 1, Error: Invalid dimension name
    !Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_PRODUCT, C9_CUSTOMERS, C10_CHANNEL, C11_UNITSFLOWS, C12_PERIOD, C13_DESCRIPTION
    
    
    C1_SCENARIO

    
    Line: 3, Error: A valid column order is not specified.
    Actual;2007;YTD;EastSales;;Sales;[ICP None];Comma_PDAs;Electronic_City;National_Accts;[None];February;;555
    >>>>>>
    
    Load data completed: 3/14/2013 13:41:11.
    

    It seems the generated query is not picking up the column alias names, but this only happens when I use SUNOPSIS_MEMORY_ENGINE as the staging area. With an Oracle schema as staging, the data load finishes successfully.

    This is the generated code from the KM

    Prepare for Loading (Using Oracle as Staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from ODI_TMP."C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Prepare for loading (using SUNOPSIS as staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from "C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Can anyone help on how to solve this?

    Thank you

