Error in HfmData.loadData

Hi Expert

I get the error below when I click on the Export fish icon in the Data Load Workbench. The first two steps, Import and Validate, complete successfully. Could you please let me know how I can navigate to the log file on the server? Thank you.

2014-04-30 02:56:27,268 INFO [AIF]: FDMEE Process Start, Process ID: 66
2014-04-30 02:56:27,269 INFO [AIF]: FDMEE Logging Level: 4
2014-04-30 02:56:27,269 INFO [AIF]: FDMEE Log File: AAESRF\outbox\logs\AAES_66.log
2014-04-30 02:56:27,269 INFO [AIF]: User: admin
2014-04-30 02:56:27,269 INFO [AIF]: Location: AAESLocation (Partitionkey:2)
2014-04-30 02:56:27,270 INFO [AIF]: Period Name: Apr-14 (Period Key: 4/1/14 12:00 AM)
2014-04-30 02:56:27,270 INFO [AIF]: Category Name: AAESGCM (Category Key: 2)
2014-04-30 02:56:27,270 INFO [AIF]: Rule Name: AAESDLR (Rule ID: 7)
2014-04-30 02:56:31,815 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2014-04-30 02:56:31,815 INFO [AIF]: Java Platform: java1.6.0_37
2014-04-30 02:56:37,219 INFO [AIF]: - START IMPORT STEP -
2014-04-30 02:56:37,334 INFO [AIF]: - END IMPORT STEP -
2014-04-30 02:56:37,402 INFO [AIF]: - START EXPORT STEP -
2014-04-30 02:56:37,593 INFO [AIF]:
Moving data for period "Apr-14"
Microsoft (R) Windows Script Host Version 5.8
Copyright (C) Microsoft Corporation. All rights reserved.

Created map.
Failed to initialize.  Check the Windows Application event logs
2014-04-30 02:56:38,457 FATAL [AIF]: Error in HfmData.loadData
Traceback (most recent call last):
  File "<string>", line 106, in loadData
RuntimeError: Error loading

2014-04-30 02:56:38,624 FATAL [AIF]: Error in HFM Load Data
2014-04-30 02:56:38,630 INFO [AIF]: FDMEE Process End, Process ID: 66

The reason was that the full directory path must be used for the Application Root Folder setting on the System Settings tab.

Tags: Business Intelligence

Similar Questions

  • QNetworkReply problem loading JSON data

    Hello

    I am a beginner with C++ and Qt, but so far I'm starting to love the Cascades NDK!

    I'm trying to load JSON data that is fetched via an HTTP request. Everything goes through, but my JSON data simply would not load into the QVariantList. After a few hours of poking around, I finally noticed that the JSON returned by the HTTP request is missing the two brackets [] (one at the beginning and one at the end).

    When I load the JSON data from a file with the two brackets included, the QVariantList loads properly and I can step through the records...

    Now my question is... how can I add those brackets [] in C++? See the code example below:

    void MyJSONReadClass::httpFinished()
    {
      JsonDataAccess jda;
      QVariantList myDataList;
    
      if (mReply->error() == QNetworkReply::NoError)
      {
        // Load the data using the reply QIODevice.
        qDebug() << mReply;
        myDataList = jda.load(mReply).value<QVariantList>();
      }
      else
      {
        // Handle error
      }
    
      if (jda.hasError())
      {
        bb::data::DataAccessError error = jda.error();
        qDebug() << "JSON loading error: " << error.errorType() << ": "
            << error.errorMessage();
        return;
      }
    
      loadData(myDataList);
    
      // The reply is not needed now so we call deleteLater() function since we are in a slot.
      mReply->deleteLater();
    }
    

    Also, I would have thought that jda.hasError() would have caught this issue... but I guess not!

    Am I using the wrong approach or the wrong classes? The basic example I used is the WeatherGuesser project.

    Thanks for your help...

    It is perhaps not related to that. Try retrieving the data from the QNetworkReply as a QByteArray and then loading it into JsonDataAccess using loadFromBuffer:

     myDataList = jda.loadFromBuffer(mReply->readAll()).value<QVariantList>();
    

    If this is insufficient, you can add the brackets this way (not tested, please see the documentation for the exact function names if it won't compile):

    QByteArray a = mReply->readAll();
    a.insert(0, '[');
    a.append(']');
    myDataList = jda.loadFromBuffer(a).value<QVariantList>();
    

    Note that if the response data is null-terminated (most likely it is not, but it is possible), you will need to check whether the last byte in the array is '\0' and insert the closing bracket before it.
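
    A minimal sketch of that check (untested; it reuses the same mReply, jda and myDataList objects from the snippets above):

    QByteArray a = mReply->readAll();
    int end = a.size();
    if (a.endsWith('\0'))       // response is null-terminated: keep '\0' as the last byte
        --end;
    a.insert(end, ']');         // closing bracket goes before the terminator (or at the very end)
    a.prepend('[');             // opening bracket at the start
    myDataList = jda.loadFromBuffer(a).value<QVariantList>();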

    QByteArray docs:

    http://Qt-project.org/doc/Qt-4.8/QByteArray.html

  • FDMEE attribute Setup in the Import Format

    I'm on 11.1.2.3.510 and I have attribute columns enabled in the Workbench.

    I then tried to use the attribute fields in my Import Format, and it fails on the export.

    Is there a particular way these fields need to be configured in the Import Format?

    My file is comma-delimited and loads fine without the attribute fields.

    Process detail log:

    2014-06-26 18:06:07,774 INFO [AIF]: Load data encountered the following errors:

    | Error: 3303 | 3105.9005.9114 | May 2305 56700 FY14 Final 3105.9005.9114 commitment 450 |

    2014-06-26 18:06:07,774 INFO [AIF]: Load data failed.

    2014-06-26 18:06:07,790 INFO [AIF]: Unlocking rules file AIF0011

    2014-06-26 18:06:07,790 INFO [AIF]: Successfully unlocked rules file AIF0011

    2014-06-26 18:06:07,790 INFO [AIF]: [HPLService] Info: Error: java.lang.Exception: Load data failed.

    2014-06-26 18:06:07,790 ERROR [AIF]: Error encountered

    2014-06-26 18:06:07,790 INFO [AIF]: [HPLService] Info: [loadData:228] END (java.lang.Exception: Load data failed.)

    2014-06-26 18:06:07,806 DEBUG [AIF]: AIFUtil.callOdiServlet - END

    2014-06-26 18:06:07,806 FATAL [AIF]: Error in CommData.loadData

    Traceback (most recent call last):

    File "<string>", line 4507, in loadData

    RuntimeError: java.lang.Exception: Load data failed.

    Hello

    Attribute dimensions specified in the import format do not affect the export. Attributes are used only within FDMEE and are not exported to the target EPM application.

    To import data into the attribute dimensions, simply configure your import format by adding them and assigning their source columns (Add > Attribute Dimension).

    Regarding your load problem, based on your error line it seems that member 3105.9005.9114 is not valid in your Planning application.

    | Error: 3303 | 3105.9005.9114 | May 2305 56700 FY14 Final 3105.9005.9114 commitment 450 |

    Can you check whether the member exists in your Planning application? If it does, has the cube been refreshed?

    Thank you

  • ODI 11.1.1.7 connecting to HFM 11.1.2.4 fails with the error 'An error occurred in the driver while connecting to Financial Management application [ApplicationName] on [HFMCluster] using user name [admin]'

    Hi experts,

    I'm trying to use ODI 11.1.1.7 to load data into HFM 11.1.2.4, and it fails while loading into the HFMData data store with the error message below:

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 3, in <module>
    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)


    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: An error occurred in the driver while connecting to Financial Management application [IFRSHFM] on [HFMCluster] using user name [admin].

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1930)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:580)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1073)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
    File "<string>", line 3, in <module>
    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)


    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: An error occurred in the driver while connecting to Financial Management application [IFRSHFM] on [HFMCluster] using user name [admin].

    at org.python.core.PyException.fillInStackTrace(PyException.java:70)
    at java.lang.Throwable.<init>(Throwable.java:181)
    at java.lang.Exception.<init>(Exception.java:29)
    at java.lang.RuntimeException.<init>(RuntimeException.java:32)
    at org.python.core.PyException.<init>(PyException.java:46)
    at org.python.core.PyException.<init>(PyException.java:43)
    at org.python.core.Py.JavaError(Py.java:455)
    at org.python.core.Py.JavaError(Py.java:448)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
    at org.python.core.PyObject.__call__(PyObject.java:355)
    at org.python.core.PyMethod.__call__(PyMethod.java:215)
    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
    at org.python.core.PyMethod.__call__(PyMethod.java:206)
    at org.python.core.PyObject.__call__(PyObject.java:397)
    at org.python.core.PyObject.__call__(PyObject.java:401)
    at org.python.pycode._pyx11.f$0(<string>:6)
    at org.python.pycode._pyx11.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java:165)
    at org.python.core.PyCode.call(PyCode.java:18)
    at org.python.core.Py.runCode(Py.java:1204)
    at org.python.core.Py.exec(Py.java:1248)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
    ... 19 more
    Caused by: com.hyperion.odi.common.ODIHAppException: An error occurred in the driver while connecting to Financial Management application [IFRSHFM] on [HFMCluster] using user name [admin].
    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
    ... 33 more
    Caused by: com.hyperion.odi.common.ODIHAppException: An error occurred in the driver while connecting to Financial Management application [IFRSHFM] on [HFMCluster] using user name [admin].
    at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:56)
    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:185)
    ... 38 more
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: An error occurred in the driver while connecting to Financial Management application [IFRSHFM] on [HFMCluster] using user name [admin].
    at com.hyperion.odi.hfm.wrapper.HFMConnection.<init>(HFMConnection.java:54)
    at com.hyperion.odi.hfm.wrapper.HFMServer.getConnection(HFMServer.java:87)
    at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:50)
    ... 39 more
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: Error loading string resource for code '103'. Error code: 1813 (0x715)
    at com.hyperion.odi.hfm.wrapper.HFMDriverJNI.getConnection(Native Method)
    at com.hyperion.odi.hfm.wrapper.HFMConnection.<init>(HFMConnection.java:48)
    ... 41 more

    I've referenced Doc ID 1379286.1, copied the files below to ODI_HOME\agent\drivers, renamed HFMDriver64_11.1.2.dll to HFMDriver.dll, and added the folder to the 'path' environment variable:

    odihapp_common.jar

    odi_hfm.jar

    HFMDriver64_11.1.2.dll

    msvcr100.dll

    MSVCP100.dll

    And I know that HFMDriver.dll is being accessed by ODI Studio (because I can't rename it while ODI Studio is open).

    So can someone please give me some advice? Thank you.

    Eric

    Hi CPR1,

    Sorry for the late reply (I probably missed the notification). Yes, I asked Oracle Support and got the answer: the KM (Knowledge Module) for HFM 11.1.2.4 is no longer supported. I can only choose to customize the interface using the Java API or use FDMEE.

    Thanks for all the help, but it seems that using ODI to load data into HFM 11.1.2.4+ is no longer supported.

    Eric

  • ODI error when loading classic HFM Application data

    Hi all

    I'm doing a one-step integration from the source (a flat file) to the HFM application. I created the integration, and while loading data I get the following error:

    org.apache.bsf.BSFException: exception from Jython:

    Traceback (most recent call last):

    File "<string>", line 3, in <module>

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    Caused by: Traceback (most recent call last):

    File "<string>", line 3, in <module>

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at org.python.core.PyException.fillInStackTrace(PyException.java:70)

    at java.lang.Throwable.<init>(Throwable.java:181)

    at java.lang.Exception.<init>(Exception.java:29)

    at java.lang.RuntimeException.<init>(RuntimeException.java:32)

    at org.python.core.PyException.<init>(PyException.java:46)

    at org.python.core.PyException.<init>(PyException.java:43)

    at org.python.core.Py.JavaError(Py.java:455)

    at org.python.core.Py.JavaError(Py.java:448)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)

    at org.python.core.PyObject.__call__(PyObject.java:355)

    at org.python.core.PyMethod.__call__(PyMethod.java:215)

    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)

    at org.python.core.PyMethod.__call__(PyMethod.java:206)

    at org.python.core.PyObject.__call__(PyObject.java:397)

    at org.python.core.PyObject.__call__(PyObject.java:401)

    at org.python.pycode._pyx15.f$0(<string>:6)

    at org.python.pycode._pyx15.call_function(<string>)

    at org.python.core.PyTableCode.call(PyTableCode.java:165)

    at org.python.core.PyCode.call(PyCode.java:18)

    at org.python.core.Py.runCode(Py.java:1204)

    at org.python.core.Py.exec(Py.java:1248)

    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)

    ... 19 more

    Caused by: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)

    ... 33 more

    Caused by: com.hyperion.odi.common.ODIHAppException: [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] properties are required.

    at com.hyperion.odi.hfm.ODIHFMUtil.validateRequiredProperties(ODIHFMUtil.java:87)

    at com.hyperion.odi.hfm.ODIHFMMetadataLoader$OptionsMetadataLoad.validate(ODIHFMMetadataLoader.java:66)

    at com.hyperion.odi.hfm.ODIHFMMetadataLoader.validateOptions(ODIHFMMetadataLoader.java:188)

    at com.hyperion.odi.hfm.ODIHFMAppStatement.validateLoadOptions(ODIHFMAppStatement.java:168)

    at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:195)

    ... 38 more

    Any help appreciated

    Thank you

    Pratik

    Hi all

    Thanks for the help!

    I solved the error

    Solution: I was using the duplicated model, i.e. a copy of the HFMData data store, so what I did was use the original data model, i.e. HFMData.

    Generally we work on a duplicate, but that is not the way to do it here.

    Thank you

    Pratik

  • Error while configuring Weblogic Portal (10.3.6) db connection

    Hi all

    I am a newbie to WebLogic Portal, so kindly correct me if I'm wrong anywhere...

    I'm trying to set up a new domain for WebLogic Portal 10.3.6.

    While creating the new domain, I got the below error:

    CFGFWK-60850: Test Failed!

    To get more detail, I later tried running the create_db.cmd command; below is the error generated.

    Kindly help me figure out where I went wrong.

    D:\Oracle\Middleware\user_projects\domains\domain1 > create_db.cmd

    Database.Properties=D:\Oracle\Middleware\user_projects\domains\domain1\database.properties

    BuildFile: D:\Oracle\Middleware\wlportal_10.3\p13n\db\build_createdatabase.xml

    init_props:

    appendDomainSqlAuth:

    init_tasks:

    setup_create_database_objects:

    create_database_objects:

    [echo] * Calling JDBCDataLoader * to create database objects and insert seed data

    Warning: Reference createdb.classpath has not been set at runtime, but was found during

    build file parsing, attempting to resolve. Future versions of Ant may support

    referencing ids defined in non-executed targets.

    Warning: Reference cwiz.jars has not been set at runtime, but was found during

    build file parsing, attempting to resolve. Future versions of Ant may support

    referencing ids defined in non-executed targets.

    [java] Loading database...

    [java]

    [java]

    [java] logFile = create_db.log

    [java] user = WEBLOGIC

    [java] password = *.

    [java] url = jdbc:derby:weblogic_eval;create=true

    [java] files=D:\Oracle\Middleware\wlportal_10.3/p13n/db/derby/jdbc_index/jdbc.index,D:\Oracle\Middleware\wlportal_10.3/content-mgmt/db/derby/jdbc_index/jdbc.index,D:\Oracle\Middleware\wlportal_10.3/portal/db/derby/jdbc_index/jdbc.index,D:\Oracle\Middleware\user_projects\domains\domain1/security/jdbc.index

    [java] driver = org.apache.derby.jdbc.EmbeddedDriver

    [java] saltFile=D:\Oracle\Middleware\user_projects\domains\domain1/security/SerializedSystemIni.dat

    [java] prodDir=D:\Oracle\Middleware\wlportal_10.3

    [java] Processing file "D:\Oracle\Middleware\wlportal_10.3\p13n\db\derby\jdbc_index\jdbc.index".

    [java] Files =

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/seq_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/rdbms_security_store_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/seq_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/p13n_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/er_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/er_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/bt_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/rdbms_security_store_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/bt_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/bt9_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/lease_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n9_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n9_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n9_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/lease_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n9_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n9_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/p13n9_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n102_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/p13n102_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/data/required/p13n102_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/p13n/db/derby/job_manager_create_tables.sql

    [java] Processing file "D:\Oracle\Middleware\wlportal_10.3\content-mgmt\db\derby\jdbc_index\jdbc.index".

    [java] Files =

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm10_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv10_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv9_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/required/cmv_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm9_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/required/cm9_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv9_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv9_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/required/cm_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/sample/cm_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm10_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv10_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cmv102_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/derby/cm102_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/sample/cm102_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/content-mgmt/db/data/sample/cm102_update_system_data.sql

    [java] Processing file "D:\Oracle\Middleware\wlportal_10.3\portal\db\derby\jdbc_index\jdbc.index".

    [java] Files =

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf10_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf10_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_drop_views.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_drop_views.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_drop_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_drop_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_drop_constraints.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_drop_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/wps_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/wps_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_views.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_create_indexes.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/comm_create_triggers.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/comm_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_drop_columns.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_create_fkeys.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf9_create_views.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/pf_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/pf9_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/bt_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/bt9_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf10_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/data/required/pf10_insert_system_data.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf102_create_tables.sql

    [java] file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf102_drop_tables.sql

    [java] Error = SQLException while running file file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_tables.sql

    [java] Failed to load

    [java] java.lang.Exception: SQLException while running file file:/D:/Oracle/Middleware/wlportal_10.3/portal/db/derby/pf_create_tables.sql

    [java] at com.oracle.cie.domain.jdbc.JDBCDataLoader.load(JDBCDataLoader.java:183)

    [java] at com.oracle.cie.domain.jdbc.JDBCDataLoader.main(JDBCDataLoader.java:1321)

    [java] Caused by: java.sql.SQLException: Table/View 'L10N_INTERSECTION' already exists in Schema 'WEBLOGIC'.

    [java] at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.Util.generateCsSQLException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.EmbedConnection.handleException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.ConnectionChild.handleException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement (unknown Source)

    [java] at org.apache.derby.impl.jdbc.EmbedStatement.execute (unknown Source)

    [java] at org.apache.derby.impl.jdbc.EmbedStatement.executeUpdate (unknown Source)

    [java] at com.oracle.cie.domain.jdbc.JDBCDataLoader.loadData(JDBCDataLoader.java:749)

    [java] at com.oracle.cie.domain.jdbc.JDBCDataLoader.load(JDBCDataLoader.java:167)

    [java]... 1 more

    [java] Caused by: java.sql.SQLException: Table/View 'L10N_INTERSECTION' already exists in Schema 'WEBLOGIC'.

    [java] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException (unknown Source)

    [java] at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA (unknown Source)

    [java]... 12 more

    [java] Caused by: ERROR X0Y32: Table/View 'L10N_INTERSECTION' already exists in Schema 'WEBLOGIC'.

    [java] at org.apache.derby.iapi.error.StandardException.newException (unknown Source)

    [java] at org.apache.derby.impl.sql.catalog.DataDictionaryImpl.duplicateDescriptorException (unknown Source)

    [java] at org.apache.derby.impl.sql.catalog.DataDictionaryImpl.addDescriptor (unknown Source)

    [java] at org.apache.derby.impl.sql.execute.CreateTableConstantAction.executeConstantAction (unknown Source)

    [java] at org.apache.derby.impl.sql.execute.MiscResultSet.open (unknown Source)

    [java] at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt (unknown Source)

    [java] at org.apache.derby.impl.sql.GenericPreparedStatement.execute (unknown Source)

    [java]... 6 more

    BUILD FAILED

    D:\Oracle\Middleware\wlportal_10.3\p13n\db\build_createdatabase.xml:97: The following error occurred while executing this line:

    D:\Oracle\Middleware\wlportal_10.3\p13n\db\build_createdatabase.xml:104: The following error occurred while executing this line:

    D:\Oracle\Middleware\wlportal_10.3\p13n\db\build_createdatabase.xml:68: Java returned: -1

    Total time: 48 seconds

    ====================================================================================================

    ERROR!

    CALL TO ant -cp D:\Oracle\MIDDLE~1\WLSERV~1.3\server\lib\weblogic.jar -f D:\Oracle\Middleware\wlportal_10.3\p13n\db\build_createdatabase.xml FAILED!

    ====================================================================================================

    Looking forward to some inputs...

    Thanks in advance,

    Varun Arora

    Strange: I ran the same command again after a few hours and the BUILD completed successfully.

  • ETL Sample Application Integrator deployment error

    Hi friends,

    I am trying to install the EID31_SamplePipeline app for EID from the OTN homepage.

    I am using Integrator ETL 3.1.1 on Windows 2008.

    The problem is that when I try to import data by running baseline.grf in Integrator ETL, I get exceptions such as:

    org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'

    Now, is this due to inaccurate data in the .csv files? Well, I checked records 336/337 in the Excel file and I found no problem there.

    Do I have to install MS Office on my server?

    Help, please.

    Kind regards.

    The detailed error log is attached:

    INFO  [WatchDog_0] - ---------------------------------** End of Log **--------------------------------
    ERROR [WatchDog_541] - Component [Date Dim:DATA_READER1] finished with status ERROR.
    DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
      Unparseable date: "6/1/2008 0:00" at position 13
    ERROR [WatchDog_541] - Error details:
    org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
        at org.jetel.graph.Node.createNodeException(Node.java:535)
        at org.jetel.graph.Node.run(Node.java:514)
        at java.lang.Thread.run(Thread.java:722)
    Caused by: org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
        at org.jetel.data.DateDataField.fromString(DateDataField.java:313)
        at org.jetel.data.parser.DataParser.populateField(DataParser.java:712)
        at org.jetel.data.parser.DataParser.parseNext(DataParser.java:560)
        at org.jetel.data.parser.DataParser.getNext(DataParser.java:184)
        at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:432)
        at org.jetel.component.DataReader.execute(DataReader.java:268)
        at org.jetel.graph.Node.run(Node.java:485)
        ... 1 more
    Caused by: java.lang.IllegalArgumentException: Unparseable date: "6/1/2008 0:00" at position 13
        at org.jetel.util.formatter.JavaDateFormatter.parseDate(JavaDateFormatter.java:75)
        at org.jetel.util.formatter.JavaDateFormatter.parseMillis(JavaDateFormatter.java:82)
        at org.jetel.data.DateDataField.fromString(DateDataField.java:308)
        ... 7 more
    
    INFO  [exNode_541_1307392988263_ENDECA_BULK_ADD_OR_REPLACE_RECORDS0] - Aborting run due to a signal from the user
    INFO  [WatchDog_541] - Execution of phase [0] finished with error - elapsed time(sec): 2
    ERROR [WatchDog_541] - !!! Phase finished with error - stopping graph run !!!
    INFO  [WatchDog_541] - -----------------------** Summary of Phases execution **---------------------
    INFO  [WatchDog_541] - Phase#            Finished Status         RunTime(sec)    MemoryAllocation(KB)
    INFO  [WatchDog_541] - 0                 ERROR                              2             81647
    INFO  [WatchDog_541] - ------------------------------** End of Summary **---------------------------
    INFO  [WatchDog_541] - WatchDog thread finished - total execution time: 2 (sec)
    ERROR [WatchDog_0] - Component [Run Steps:RUN_GRAPH1] finished with status ERROR.
    Execution of graph './graph/LoadData.grf' failed!
      Component [Date Dim:DATA_READER1] finished with status ERROR.
       DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
        Unparseable date: "6/1/2008 0:00" at position 13
      Inner exception: org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
          at org.jetel.graph.Node.createNodeException(Node.java:535)
          at org.jetel.graph.Node.run(Node.java:514)
          at java.lang.Thread.run(Thread.java:722)
      Caused by: org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
          at org.jetel.data.DateDataField.fromString(DateDataField.java:313)
          at org.jetel.data.parser.DataParser.populateField(DataParser.java:712)
          at org.jetel.data.parser.DataParser.parseNext(DataParser.java:560)
          at org.jetel.data.parser.DataParser.getNext(DataParser.java:184)
          at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:432)
          at org.jetel.component.DataReader.execute(DataReader.java:268)
          at org.jetel.graph.Node.run(Node.java:485)
          ... 1 more
      Caused by: java.lang.IllegalArgumentException: Unparseable date: "6/1/2008 0:00" at position 13
          at org.jetel.util.formatter.JavaDateFormatter.parseDate(JavaDateFormatter.java:75)
          at org.jetel.util.formatter.JavaDateFormatter.parseMillis(JavaDateFormatter.java:82)
          at org.jetel.data.DateDataField.fromString(DateDataField.java:308)
          ... 7 more
    
    ERROR [WatchDog_0] - Error details:
    org.jetel.exception.JetelRuntimeException: Component [Run Steps:RUN_GRAPH1] finished with status ERROR.
        at org.jetel.graph.Node.createNodeException(Node.java:535)
        at org.jetel.graph.Node.run(Node.java:514)
        at java.lang.Thread.run(Thread.java:722)
    Caused by: org.jetel.exception.JetelRuntimeException: Execution of graph './graph/LoadData.grf' failed!
        at org.jetel.component.RunGraph.logError(RunGraph.java:538)
        at org.jetel.component.RunGraph.runGraphThisInstance(RunGraph.java:517)
        at org.jetel.component.RunGraph.runSingleGraph(RunGraph.java:397)
        at org.jetel.component.RunGraph.execute(RunGraph.java:303)
        at org.jetel.graph.Node.run(Node.java:485)
        ... 1 more
    Caused by: org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
    DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
      Unparseable date: "6/1/2008 0:00" at position 13
    Inner exception: org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
        at org.jetel.graph.Node.createNodeException(Node.java:535)
        at org.jetel.graph.Node.run(Node.java:514)
        at java.lang.Thread.run(Thread.java:722)
    Caused by: org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
        at org.jetel.data.DateDataField.fromString(DateDataField.java:313)
        at org.jetel.data.parser.DataParser.populateField(DataParser.java:712)
        at org.jetel.data.parser.DataParser.parseNext(DataParser.java:560)
        at org.jetel.data.parser.DataParser.getNext(DataParser.java:184)
        at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:432)
        at org.jetel.component.DataReader.execute(DataReader.java:268)
        at org.jetel.graph.Node.run(Node.java:485)
        ... 1 more
    Caused by: java.lang.IllegalArgumentException: Unparseable date: "6/1/2008 0:00" at position 13
        at org.jetel.util.formatter.JavaDateFormatter.parseDate(JavaDateFormatter.java:75)
        at org.jetel.util.formatter.JavaDateFormatter.parseMillis(JavaDateFormatter.java:82)
        at org.jetel.data.DateDataField.fromString(DateDataField.java:308)
        ... 7 more
    
        at org.jetel.component.RunGraph.assembleException(RunGraph.java:533)
        ... 5 more
    
    INFO  [WatchDog_0] - Execution of phase [0] finished with error - elapsed time(sec): 55
    ERROR [WatchDog_0] - !!! Phase finished with error - stopping graph run !!!
    INFO  [WatchDog_0] - -----------------------** Summary of Phases execution **---------------------
    INFO  [WatchDog_0] - Phase#            Finished Status         RunTime(sec)    MemoryAllocation(KB)
    INFO  [WatchDog_0] - 0                 ERROR                             55             98782
    INFO  [WatchDog_0] - ------------------------------** End of Summary **---------------------------
    INFO  [WatchDog_0] - WatchDog thread finished - total execution time: 55 (sec)
    INFO  [main] - Freeing graph resources.
    ERROR [main] -
    ----------------------------------------------------------------------------------------------------------------------------------- Error details ------------------------------------------------------------------------------------------------------------------------------------
      Component [Run Steps:RUN_GRAPH1] finished with status ERROR.
       Execution of graph './graph/LoadData.grf' failed!
        Component [Date Dim:DATA_READER1] finished with status ERROR.
         DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
          Unparseable date: "6/1/2008 0:00" at position 13
        Inner exception: org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
            at org.jetel.graph.Node.createNodeException(Node.java:535)
            at org.jetel.graph.Node.run(Node.java:514)
            at java.lang.Thread.run(Thread.java:722)
        Caused by: org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
            at org.jetel.data.DateDataField.fromString(DateDataField.java:313)
            at org.jetel.data.parser.DataParser.populateField(DataParser.java:712)
            at org.jetel.data.parser.DataParser.parseNext(DataParser.java:560)
            at org.jetel.data.parser.DataParser.getNext(DataParser.java:184)
            at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:432)
            at org.jetel.component.DataReader.execute(DataReader.java:268)
            at org.jetel.graph.Node.run(Node.java:485)
            ... 1 more
        Caused by: java.lang.IllegalArgumentException: Unparseable date: "6/1/2008 0:00" at position 13
            at org.jetel.util.formatter.JavaDateFormatter.parseDate(JavaDateFormatter.java:75)
            at org.jetel.util.formatter.JavaDateFormatter.parseMillis(JavaDateFormatter.java:82)
            at org.jetel.data.DateDataField.fromString(DateDataField.java:308)
            ... 7 more
       Component [Date Dim:DATA_READER1] finished with status ERROR.
        DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
         Unparseable date: "6/1/2008 0:00" at position 13
       Inner exception: org.jetel.exception.JetelRuntimeException: Component [Date Dim:DATA_READER1] finished with status ERROR.
           at org.jetel.graph.Node.createNodeException(Node.java:535)
           at org.jetel.graph.Node.run(Node.java:514)
           at java.lang.Thread.run(Thread.java:722)
       Caused by: org.jetel.exception.BadDataFormatException: DimDate_FullDateAlternateKey (date) cannot be set to "6/1/2008 0:00" - doesn't match defined format "MM/dd/yyyy HH:mm" in record 336, field 2 ("DimDate_FullDateAlternateKey"), metadata "DimDate"; value: '6/1/2008 0:00'
           at org.jetel.data.DateDataField.fromString(DateDataField.java:313)
           at org.jetel.data.parser.DataParser.populateField(DataParser.java:712)
           at org.jetel.data.parser.DataParser.parseNext(DataParser.java:560)
           at org.jetel.data.parser.DataParser.getNext(DataParser.java:184)
           at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:432)
           at org.jetel.component.DataReader.execute(DataReader.java:268)
           at org.jetel.graph.Node.run(Node.java:485)
           ... 1 more
       Caused by: java.lang.IllegalArgumentException: Unparseable date: "6/1/2008 0:00" at position 13
           at org.jetel.util.formatter.JavaDateFormatter.parseDate(JavaDateFormatter.java:75)
           at org.jetel.util.formatter.JavaDateFormatter.parseMillis(JavaDateFormatter.java:82)
           at org.jetel.data.DateDataField.fromString(DateDataField.java:308)
           ... 7 more
    .
    .
    .
    .
    .
    .
    .
    .
    .
    .
    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    ERROR [main] - Execution of graph failed !
    
    
    

    What time zone are you in, and what is the default local time zone on the computer where you run Integrator? You wouldn't happen to be in Pakistan, would you?

    There was no midnight (0:00) on June 1, 2008 in Pakistan, because of a change to Daylight Saving Time (DST) from UTC+5 to UTC+6. Clocks moved from May 31, 2008 23:59 straight to June 1, 2008 01:00, making midnight and the hour following it invalid time values. So if your TZ is Pakistan, that could explain the error you see.

    You can change the value to a time outside this gap, or use a different TZ.
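
    If you want to verify the DST gap independently of Integrator, here is a small standalone sketch (it has nothing to do with the Jetel code above, and it assumes a C++20 toolchain whose bundled time-zone database includes Pakistan's 2008 DST rules, e.g. recent GCC or MSVC):

    // dst_gap_check.cpp - does local midnight on 2008-06-01 exist in Asia/Karachi?
    #include <chrono>
    #include <iostream>
    
    int main() {
        using namespace std::chrono;
    
        const time_zone* karachi = locate_zone("Asia/Karachi");
        auto midnight = local_days{2008y / June / 1} + 0h;   // local wall-clock 00:00
    
        // get_info() reports whether this local time is unique, ambiguous,
        // or nonexistent (i.e. it falls into a spring-forward DST gap).
        local_info info = karachi->get_info(midnight);
        if (info.result == local_info::nonexistent)
            std::cout << "2008-06-01 00:00 never existed in Asia/Karachi (DST gap)\n";
        else
            std::cout << "2008-06-01 00:00 is a valid local time in Asia/Karachi\n";
    }

    If the record really was read under a Pakistan time zone, a parser that maps local times to instants has no valid instant for 0:00 on that date, which is consistent with the explanation above.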

  • Mapping for the Siebel OPA connector error

    Hello

    We are working on Siebel with OPA using Web Determinations. The web server we use is Apache Tomcat.

    Whenever the server is restarted, the first call to OPA fails, and it gives the following error in the logs.

    [November 17, 2014 07:24:06,945] 267752922 [http-bio-8080-exec-2] ERROR com.oracle.determinations.web.siebel.SiebelIOClient - error loading cases

    com.oracle.determinations.siebel.io.web.ServiceCallException: read timed out

    at com.oracle.determinations.siebel.io.web.SiebelServiceUtil.callService(SiebelServiceUtil.java:98)

    at com.oracle.determinations.siebel.io.web.SiebelGetIOService.call(SiebelGetIOService.java:114)

    at com.oracle.determinations.siebel.io.web.SiebelServiceFactory.call(SiebelServiceFactory.java:41)

    at com.oracle.determinations.web.siebel.SiebelIOClient.load(SiebelIOClient.java:128)

    at com.oracle.determinations.web.siebel.SiebelIOClient.loadSession(SiebelIOClient.java:80)

    at com.oracle.determinations.web.siebel.SiebelDataAdapter.load(SiebelDataAdapter.java:139)

    at com.oracle.determinations.interview.engine.local.LocalInterviewSession.loadData(LocalInterviewSession.java:246)

    at com.oracle.determinations.web.platform.controller.actions.StartSessionAction.getResource(StartSessionAction.java:161)

    at com.oracle.determinations.web.platform.servlet.WebDeterminationsServlet.doGet(WebDeterminationsServlet.java:112)

    at javax.servlet.http.HttpServlet.service(HttpServlet.java:620)

    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)

    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)

    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)

    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)

    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)

    at com.oracle.determinations.web.platform.util.CharsetFilter.doFilter(CharsetFilter.java:46)

    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)

    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)

    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)

    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)

    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)

    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)

    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)

    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)

    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)

    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)

    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040)

    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607)

    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:315)

    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)

    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)

    at java.lang.Thread.run(Unknown Source)

    Caused by: java.net.SocketTimeoutException: Read timed out

    at java.net.SocketInputStream.socketRead0(Native Method)

    at java.net.SocketInputStream.read(Unknown Source)

    at java.net.SocketInputStream.read(Unknown Source)

    at java.io.BufferedInputStream.fill(Unknown Source)

    at java.io.BufferedInputStream.read1(Unknown Source)

    at java.io.BufferedInputStream.read(Unknown Source)

    at sun.net.www.http.HttpClient.parseHTTPHeader(Unknown Source)

    at sun.net.www.http.HttpClient.parseHTTP(Unknown Source)

    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)

    at com.oracle.determinations.siebel.io.web.SiebelServiceUtil.callService(SiebelServiceUtil.java:95)

    ... 32 more

    [November 17, 2014 07:24:06, 986] 267752963 [http-bio-8080-exec-2] ERROR com.oracle.determinations.web.siebel.SiebelIOClient - error loading session from Siebel

    This only happens when we call OPA for the first time. From the second call onwards, everything works correctly.

    Please suggest a solution for this. Is it possible to increase the timeout in the Siebel OPA connector?

    Thank you

    Bharat maklouf

    Looks like the Siebel inbound web services take a while to warm up the first time. Ideally, it would be best to find a way to get them to start as soon as possible. I don't know how to do that in Siebel, but you might have luck asking on the Siebel forums.

    For the OPA to Siebel connector, you can add the property "serviceTimeout=" (case-sensitive) to the Siebel data adapter .properties file for Web Determinations. This timeout property only applies to IO mappings, but it looks like that is what you are using.
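    For illustration only - the exact properties file name and the unit of the value are not confirmed in this thread, so treat both as assumptions to check against the connector documentation - the entry would look something like this:

    # hypothetical example; the property name is case-sensitive, the value is a placeholder
    serviceTimeout=120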

  • Error in AIFUtil.callOdiServlet - error connecting to the ODI Servlet

    Hi John & all,

    We have EPM 11.1.2.3 set up on a 2-node cluster using a shared Linux mount.

    The ERPI managed servers in the WL domain are as below:

    ErpIntegrator0 Server A 6550

    ErpIntegrator1 Server B 6551

    The oraclediagent is available at

    http://ServerA:6550/oraclediagent

    Or

    http://ServerB:6551/oraclediagent

    However, when we run the data load rule (in the Data Management workbench) it fails at the export step, and we get the error below:

    "CommData.updateWorkflow - END."

    2014-01-17 12:24:56, 438 [AIF] DEBUG: AIFUtil.callOdiServlet - START

    2014-01-17 12:24:56, 450 FATAL [AIF]: error in AIFUtil.callOdiServlet - connection to the ODI Servlet error: url = http://localhost:6551/aif/ODIServlet , class="HPLService", method=loadData, params=[170, u'YXaa0YHQfx8rIJlwC40l9%2BxbO8sEEgKHBWsamkRbZZRoUkatmfYLpfdw2rA%2BI73XtsD9z5AW1vqU9XRlB5WByL03O18g9Yk2TZtc6sgKJG4MzV48jMuIYC7bTiwQ0r6jwWlCRThoPRiy450ScFygchYQG5%2FAHJyZAPzPTDcFlismBoSYiWOl4vk%2Bka5tX45aYjCsHtvVRn8tUvcnsHvRPIBWls2FaHhN3CvY9L0yOhZ8FUFmNNE0ltJ4dgMVh7MyDCVStT8LVBTv5fRIPvOrsWNJhlyvNHNzvyqqN3t677Gw4B%2Fa70TxDJYDb5eb83NZe%2F3GW%2BCKxRsHve6O8JLADO%2B80TIKSrEiLoialbrG%2B6mZApB78sg7h5PwuMZAcFzXxxmnPjDM1KnaK5RRrqddMQXKAxVCg8mhaZtuzFgd3C4AdzsJVGcLM8siaZOLhdjk']"

    Traceback.

    For some reason, it thinks that ODI runs at localhost:6551.

    We tried to manually change the ODI agent URL on the Data Management configuration page to one of the two correct URLs shown above, and the connection test succeeded. But after we ran the load, it still pointed to "localhost:6551". We then thought that after changing the setting we probably had to restart the ErpIntegrator WL managed server. After the restart, we see the default URL "http://localhost:6551/oraclediagent" again.

    Not sure how this ODI Servlet URL gets generated. It is certainly not picking up the correct server/port combination. Please help on how to solve this problem.

    Regards

    VARA

    Hi Craig,

    It was actually http://localhost:6551/oraclediagent. The issue has now been resolved. In a clustered environment, the ports are expected to be the same on both nodes. In our case it was 6550 on one node and 6551 on the other, so the registry kept being updated with the latest port value, which was the port on node 2.

    It resolves localhost to whichever node hosts the ErpIntegrator managed server. To test this, I kept both managed servers up (ErpIntegrator0 & ErpIntegrator1) and checked connectivity from the Data Management page: it failed. Then I stopped ErpIntegrator1 and kept ErpIntegrator0 up: no luck. Then I stopped ErpIntegrator0 and brought ErpIntegrator1 up: with localhost:6551 the ODI agent connectivity works fine.

    So I conclude that the port must be the same on both nodes. Another finding was that if we install a separate standalone ODI agent on the server and try to use it from the Data Management page, it does not work in a clustered environment, whereas in a single-node configuration we can update the host & port of the oracledi agent.

    Regards

    VARA
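    As a quick sanity check along the lines described above (a rough plain-Python sketch; the host names and ports are the ones quoted in this thread and should be replaced with your own), something like this can confirm whether the agent port answers on each node:

    # Verify that the ODI agent port is reachable on every cluster node.
    import socket

    nodes = [("ServerA", 6550), ("ServerB", 6551)]  # adjust to your environment

    for host, port in nodes:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(5)
        try:
            s.connect((host, port))
            print("%s:%d is reachable" % (host, port))
        except Exception as e:
            print("%s:%d is NOT reachable: %s" % (host, port, e))
        finally:
            s.close()

    If the two nodes do not answer on the same port, you are in the mismatched situation described above.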

  • ODI-17517: Error during task interpretation

    Hi gurus,

    I am trying to run my interface and ODI gives me this error. Does anyone know what it is? I'm trying to load an Essbase cube with ODI.

    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation.

    Task: 2

    java.lang.Exception: The application script threw an exception: com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Exception getObjectName("L", "FACT_CUR", "LOG_ORACLE_DATABASE_PERO", "ESSBASE_CONTEXT", "D") : SnpPschemaCont.getObjectByIdent : SnpPschemaCont does not exist BSF info: Load data into essbase at line: 0 column: columnNo

    at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:489)

    at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:737)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:465)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:724)

    Caused by: java.lang.Exception: The application script threw an exception: com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Exception getObjectName("L", "FACT_CUR", "LOG_ORACLE_DATABASE_PERO", "ESSBASE_CONTEXT", "D") : SnpPschemaCont.getObjectByIdent : SnpPschemaCont does not exist BSF info: Load data into essbase at line: 0 column: columnNo

    at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:476)

    ... 11 more

    Caused by: org.apache.bsf.BSFException: The application script threw an exception: com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Exception getObjectName("L", "FACT_CUR", "LOG_ORACLE_DATABASE_PERO", "ESSBASE_CONTEXT", "D") : SnpPschemaCont.getObjectByIdent : SnpPschemaCont does not exist BSF info: Load data into essbase at line: 0 column: columnNo

    at bsh.util.BeanShellBSFEngine.eval(Unknown Source)

    at bsh.util.BeanShellBSFEngine.exec(Unknown Source)

    at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)

    ... 11 more

    Text: from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    #
    # Get the select statement on the staging area:
    #
    sql = """select FACT_CUR."ACCOUNT" "Account", FACT_CUR."PERIOD" "Period", FACT_CUR."YEAR" "Year", FACT_CUR."SCENARIO" "Scenario", FACT_CUR."VERSION" "Version", FACT_CUR."ENTITY" "Entity", FACT_CUR."PRODUCT" "Product", FACT_CUR."CURRENCY" "Currency", FACT_CUR."SEGMENT" "Segment", FACT_CUR."DATA" "Data" from <?=snpRef.getObjectName("L", "FACT_CUR", "LOG_ORACLE_DATABASE_PERO", "", "D")?> FACT_CUR where (1=1)"""
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize = <? if ("".equals(odiRef.getInfo("SRC_FETCH_ARRAY"))) {out.print("1000");} else {out.print(odiRef.getInfo("SRC_FETCH_ARRAY"));} ?>
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "query executed"
    #load data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "finished loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()

    at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:764)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:465)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:724)

    You need a context in which both of your physical schemas are mapped to their logical schemas.

  • ODI: Error loading data into HFM: invalid dimension name

    Hello

    I am fairly new to ODI and I was wondering if any guru could help me overcome an issue I'm facing. I am trying to load data from a csv file into HFM. I chose the appropriate KMs (LKM File to SQL and IKM SQL to Hyperion Financial Management Data), with the Sunopsis Memory Engine as the staging area.

    To keep things simple, the csv file has exactly the same structure as the HFM application dimensions and has been mapped in the interface as shown below:

    Source file column - Target HFM column

    Scenario - Scenario
    Year - Year
    View - View
    Entity - Entity
    Value - Value
    Account - Account
    ICP - ICP
    Custom1 - Custom1
    Custom2 - Custom2
    Custom3 - Custom3
    Custom4 - Custom4
    Period - Period
    DataValue - DataValue
    Description - (no source column, mapped as '')

    The csv file contains base-level members only. I set the error log file path, and when running the interface I get an error. When I open the error log, I see the following messages:

    Line: 1, Error: Invalid dimension name
    !Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_CUSTOM1, C9_CUSTOM2, C10_CUSTOM3, C11_CUSTOM4, C12_PERIOD, C13_DATAVALUE
    C1_SCENARIO
    Line: 3, Error: A valid column order is not specified.
    Actual;2007;YTD;20043;<Entity Currency>;13040;[ICP None];[None];1000;[None];[None];Jan;512000;""
    >>>>>>



    I'm not sure how to solve this, as the interface mapping matches dimensions on a 1:1 basis. In addition, the target column names correspond to the dimension names of the HFM application (from which the datastore was reverse-engineered).

    Help, please!

    Thank you very much
    Jorge

    Edited by: 993020 on March 11, 2013 05:06

    Dear Experts,

    ODI: 11.1.1.6.0
    HFM: 9.3.3

    I also encountered a similar error to the OP's.

    In my case, the error occurs when I use the SUNOPSIS_MEMORY_ENGINE as the staging area. If I simply change the staging area to an Oracle schema, the interface loads data to HFM successfully. So I'm curious why the SUNOPSIS engine cannot be used as the staging area for an HFM load.

    This is what shows up in the IKM SQL to Hyperion Financial Management Data log file:

    Load data started: 3/14/2013 13:41:11.
    Line: 1, Error: Invalid dimension name
    !Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_PRODUCT, C9_CUSTOMERS, C10_CHANNEL, C11_UNITSFLOWS, C12_PERIOD, C13_DESCRIPTION
    
    
    C1_SCENARIO

    
    Line: 3, Error: A valid column order is not specified.
    Actual;2007;YTD;EastSales;;Sales;[ICP None];Comma_PDAs;Electronic_City;National_Accts;[None];February;;555
    >>>>>>
    
    Load data completed: 3/14/2013 13:41:11.
    

    It seems like the generated query is not picking up the column alias names, but this only happens if I use the SUNOPSIS_MEMORY_ENGINE as the staging area. With an Oracle schema as staging, the data load finishes successfully.

    This is the generated code from the KM

    Prepare for Loading (Using Oracle as Staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from ODI_TMP."C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Prepare for loading (using SUNOPSIS as staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from "C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Can anyone help with how to solve this?

    Thank you

    Edited by: user10620897 on March 14, 2013 14:28
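    One way to narrow this down (a rough diagnostic sketch, not from the original thread; it assumes it is pasted into the "Prepare for Loading" Jython step above, where odiRef and srcQuery are already defined) is to print the column labels the staging connection actually returns:

    # Inspect the JDBC metadata for the generated source query.
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    rs = stmt.executeQuery(srcQuery)
    meta = rs.getMetaData()
    for i in range(1, meta.getColumnCount() + 1):
        # getColumnLabel() should return the alias (e.g. "Scenario");
        # getColumnName() may return the underlying C$ column (e.g. "C1_SCENARIO").
        print("%s / %s" % (meta.getColumnLabel(i), meta.getColumnName(i)))
    rs.close()
    stmt.close()

    If the labels come back as the C$ column names rather than the HFM dimension names when the memory engine is the staging area, that would explain why the writer cannot match the column map.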

  • Execution error when running an Essbase data load interface

    When executing an Essbase data load interface I get this error:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 26, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)


    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for the data column [ActivityPurpose]

    The datastore was reversed using the RKM Hyperion Essbase, and I did not make any changes to it.
    I'm loading from a table identical to the target datastore.
    I am using the IKM SQL to Hyperion Essbase (DATA).

    Does anyone have any idea what this might be?

    Regards
    Hans-Petter

    Might be useful to have a read of the following Oracle Support doc - "How to avoid the 'ODIEssbaseException: Invalid column type specified for the data column' message when loading Essbase using ODI? [ID 885608.1]"

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Error loading data into Planning from ODI

    Hello

    I am trying to load data into Planning from a table. I get the following error at the load data into Planning step:

    com.hyperion.odi.planning.ODIPlanningException: com.hyperion.odi.planning.ODIPlanningException: java.lang.RuntimeException: HyperionPlanningBean::beginLoad: could not derive local systemCfg format for loading data.

    Code:

    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    #
    # Get the select statement on the staging area:
    #
    sql = """select ltrim(rtrim(C2_NODE)) "Project", ltrim(rtrim(C3_PARENT)) "Parent", case when ltrim(rtrim(C4_ALIAS_2)) is null then '#missing' else ltrim(rtrim(C4_ALIAS_2)) end "Alias: Alias 2", case when ltrim(rtrim(C5_ALIAS_1)) is null then '#missing' else ltrim(rtrim(C5_ALIAS_1)) end "Alias: Default", '#missing' "Valid for Consolidations", ltrim(rtrim(C6_DATA_STORAGE)) "Data Storage", ltrim(rtrim(C7_TWO_PASS_CALCULATION)) "Two Pass Calculation", case when ltrim(rtrim(C8_DESCRIPTION)) is null then '#missing' else ltrim(rtrim(C8_DESCRIPTION)) end "Description", '#missing' "Formula", '#missing' "UDA", '#missing' "Smart List", C1_DATA_TYPE "Data Type", '#missing' "Operation", ltrim(rtrim(C9_AGGREGATION_1)) "Aggregation (F_AdSale)" from "C$_0Project" where (1=1)"""

    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize = 30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)

    #load data
    stats = pWriter.loadData(rs)

    #close the database result set, connection
    rs.close()
    stmt.close()

    Any help will be appreciated

    -Kash

    I have not seen that error before, but there is an Oracle Support doc that covers the error "could not derive local systemCfg format for loading data" [ID 1479461.1].
    It is not directly related to ODI or your problem, but the workaround may help in your case.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Error loading from ODI into Essbase

    Hi all

    I am trying to ETL data from a SQL table into Essbase. I got the following error:

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 26, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)


    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 1

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2457)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:662)

    Looks like you have set the KM option to ":0"; you need to remove the ":" and just leave 0.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Failed to load with error [3303]

    Hi, guys.

    Can someone help me?

    When I run the ODI interface to load data from Oracle 11g into the Essbase 11 demo database,

    the error is:

    Failed to load with error [3303]

    C1_YEAR, C2_MARKET, C3_PRODUCT, C4_ACCOUNTS, C5_SCENARIO, C6_DATA, Error_Reason
    'Qtr2','East','Audio','Margin','Actual',1,'Failed to load with error [3303: Qtr2,East,Audio,Margin,Actual,1 Member not found in database]'

    and the log is

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 26, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 1

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
    at java.lang.Thread.run(Thread.java:662)

    Thank you!

    Are you using a load rule? It looks to me like you don't have the correct separator between ODI and Essbase.
    It looks like all the members are being loaded as a single string when they should be separate; check the options in the IKM against the load rule and make sure they use the same delimiter.
    By default the IKM will be set to the comma; in the load rule it may be set to the tab. Open the load rule, go to Options > Data Source Properties and make sure that the field delimiter is the comma.

    Cheers

    John
    http://John-Goodwin.blogspot.com/
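    To see why a delimiter mismatch produces the single-string rejection above, here is a tiny plain-Python sketch (the record values are taken from the error output in this thread):

    # If the load rule expects tab-delimited fields but the data is comma-separated,
    # the whole row arrives as one long "member name", which Essbase then rejects.
    record = "Qtr2,East,Audio,Margin,Actual,1"

    print(record.split("\t"))  # ['Qtr2,East,Audio,Margin,Actual,1'] -> one unparsed field
    print(record.split(","))   # ['Qtr2', 'East', 'Audio', 'Margin', 'Actual', '1']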
