Loading 16.5 crores of data using ODI

Hi Experts,

I'm loading 16.5 crore rows from source to target within a single Oracle DB (source and target are two different schemas), and the target tablespace has 180 GB. While running the ODI interface, I got a TEMP tablespace extension error, but I do not have enough space to increase the TEMP tablespace in my DB.
Please suggest any other solution to this problem.

Thank you in advance,

Kind regards
Chaitanya.

Hi Chaitanya,

Please add a separate column (a sequence number column) to the source table, populate it with sequence numbers, and create an index on the sequence column.
Then add a filter condition on the sequence column in the interface and run the interface batch-wise to transmit the data between the source and target.

For example, add the condition to the filter and transmit the values in ranges like below.

Command: sequence_column_name between 1 and 100000

then sequence_column_name between 100001 and 1000000, next sequence_column_name between 1000001 and 10000000, and so on

Continue the runs, sizing each batch based on the run time of your previous run.

This will also work in case the table has no date columns such as last_updated_date, created_date, created_by and updated_by.
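For illustration, a minimal SQL sketch of this approach (the table, column and index names are hypothetical):

-- add and populate a sequence column on the source table
ALTER TABLE src_big_table ADD (seq_no NUMBER);
UPDATE src_big_table SET seq_no = ROWNUM;
COMMIT;

-- index it so each batch filter does not full-scan the table
CREATE INDEX src_big_table_seq_idx ON src_big_table (seq_no);

-- then, in the ODI interface filter, run one batch at a time:
-- SRC_BIG_TABLE.SEQ_NO BETWEEN 1 AND 100000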

Kind regards
Phanikanth

Tags: Business Intelligence

Similar Questions

  • While loading data into the target table, ODI performance degrades?

    Hello

    I am trying to load some data from the source (Oracle DB1) to the target (Oracle DB2) table on the same server.

    The source table has 45 lakhs of rows. These rows must be loaded into the target table.

    I created an interface. I took LKM SQL to SQL & IKM SQL Incremental Update.

    Yesterday I executed the interface and checked the status in the Operator navigator. It displays a loading status.

    However, the interface is still in the loading state.

    Sometimes ODI is unresponsive. Please help me.

    Do I need to adjust the fetch sizes?

    Please solve this problem.

    Thank you and best regards,

    A.Kavya.

    Hello

    As you said, I tried different knowledge modules. I used IKM Oracle Incremental Update for the DB-to-DB data loading.

    It really increased performance: within 10 minutes the data was correctly loaded into the target table. Previously I used LKM SQL to SQL with IKM SQL to SQL Control Append, and it took a very long time; even after two days the data was not loaded into the target table. Really bad performance.

    For better performance I used DB loading: LKM SQL to Oracle & IKM Oracle Incremental Update.
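    Regarding the fetch-size question above: in ODI Topology, each data server definition exposes an Array Fetch Size (source side) and a Batch Update Size (target side). As a hedged illustration only, raising the Array Fetch Size from its small default to something like 500 or 1000 is a common tuning step for bulk loads of this size; the best value depends on row width and available memory.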

  • Question about loading data using SQL*Loader into staging tables, and then into the main tables!

    Hello

    I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in separate pipe-delimited csv files.

    I have developed a shell script to load the data, and it works fine except for one thing.

    Here are the details of the data to re-create the problem.

    Structure of the staging tables into which the data will be loaded using SQL*Loader:

    create table stg_cmts_data (cmts_token varchar2(30), cmts_ip varchar2(20));

    create table stg_link_data (dhcp_token varchar2(30), cmts_to_add varchar2(200));

    create table stg_dhcp_data (dhcp_token varchar2(30), dhcp_ip varchar2(20));

    Data in the csv files -

    for stg_cmts_data -

    cmts_map_03092015_1.csv

    WNLB-CMTS-01-1|10.15.0.1

    WNLB-CMTS-02-2|10.15.16.1

    WNLB-CMTS-03-3|10.15.48.1

    WNLB-CMTS-04-4|10.15.80.1

    WNLB-CMTS-05-5|10.15.96.1

    for stg_dhcp_data -

    dhcp_map_03092015_1.csv

    DHCP-1-1-1|10.25.23.10, 25.26.14.01

    DHCP-1-1-2|56.25.111.25, 100.25.2.01

    DHCP-1-1-3|25.255.3.01, 89.20.147.258

    DHCP-1-1-4|10.25.26.36, 200.32.58.69

    DHCP-1-1-5|80.25.47.369, 60.258.14.10

    for stg_link_data -

    cmts_dhcp_link_map_0309151623_1.csv

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2

    DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5

    DHCP-1-1-3|WNLB-CMTS-01-1

    DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3

    DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7

    WNLB-DHCP-1-13|WNLB-CMTS-02-2

    Now, after loading these data into the staging tables, I have to fill the main database tables:

    create table subntwk (subntwk_nm varchar2(20), subntwk_ip varchar2(30));

    create table link (link_nm varchar2(50));

    The SQL scripts that I created to load the data are as follows.

    spool load_cmts.log
    set serveroutput on

    DECLARE
      CURSOR c_stg_cmts IS
        SELECT * FROM stg_cmts_data;
      TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
      l_stg_cmts t_stg_cmts;
      l_cmts_cnt NUMBER;
      l_cnt      NUMBER;
      l_cnt_1    NUMBER;
    BEGIN
      OPEN c_stg_cmts;
      FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
      FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
      LOOP
        SELECT COUNT(1)
          INTO l_cmts_cnt
          FROM subntwk
         WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
        IF l_cmts_cnt < 1 THEN
          INSERT INTO subntwk (subntwk_nm)
          VALUES (l_stg_cmts(i).cmts_token);
          DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
        ELSE
          DBMS_OUTPUT.put_line('token is already present');
        END IF;
        EXIT WHEN l_stg_cmts.COUNT = 0;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for dhcp


    spool load_dhcp.log
    set serveroutput on

    DECLARE
      CURSOR c_stg_dhcp IS
        SELECT * FROM stg_dhcp_data;
      TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
      l_stg_dhcp t_stg_dhcp;
      l_dhcp_cnt NUMBER;
      l_cnt      NUMBER;
      l_cnt_1    NUMBER;
    BEGIN
      OPEN c_stg_dhcp;
      FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
      FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
      LOOP
        SELECT COUNT(1)
          INTO l_dhcp_cnt
          FROM subntwk
         WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
        IF l_dhcp_cnt < 1 THEN
          INSERT INTO subntwk (subntwk_nm)
          VALUES (l_stg_dhcp(i).dhcp_token);
          DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
        ELSE
          DBMS_OUTPUT.put_line('token is already present');
        END IF;
        EXIT WHEN l_stg_dhcp.COUNT = 0;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    for link -

    spool load_link.log
    set serveroutput on

    DECLARE
      l_cmts_1      VARCHAR2(4000 CHAR);
      l_cmts_add    VARCHAR2(200 CHAR);
      l_dhcp_cnt    NUMBER;
      l_cmts_cnt    NUMBER;
      l_link_cnt    NUMBER;
      l_add_link_nm VARCHAR2(200 CHAR);
    BEGIN
      FOR r IN (
        SELECT dhcp_token, cmts_to_add || ',' cmts_add
          FROM stg_link_data
      )
      LOOP
        l_cmts_1   := r.cmts_add;
        l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
        SELECT COUNT(1)
          INTO l_dhcp_cnt
          FROM subntwk
         WHERE subntwk_nm = r.dhcp_token;
        IF l_dhcp_cnt = 0 THEN
          DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
        ELSE
          WHILE l_cmts_add IS NOT NULL
          LOOP
            l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
            SELECT COUNT(1)
              INTO l_cmts_cnt
              FROM subntwk
             WHERE subntwk_nm = TRIM(l_cmts_add);
            SELECT COUNT(1)
              INTO l_link_cnt
              FROM link
             WHERE link_nm = l_add_link_nm;
            IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
              INSERT INTO link (link_nm)
              VALUES (l_add_link_nm);
              DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
            ELSIF l_link_cnt > 0 THEN
              DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
            ELSIF l_cmts_cnt = 0 THEN
              DBMS_OUTPUT.put_line('no CMTS found for device to create the link: ' || l_cmts_add);
            END IF;
            l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
            l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
          END LOOP;
        END IF;
      END LOOP;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    output

    control files -

    LOAD DATA
    INFILE 'cmts_data.csv'
    APPEND
    INTO TABLE stg_cmts_data
    WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
    AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (cmts_token "RTRIM(LTRIM(:cmts_token))",
    cmts_ip "RTRIM(LTRIM(:cmts_ip))")

    for dhcp -

    LOAD DATA
    INFILE 'dhcp_data.csv'
    APPEND
    INTO TABLE stg_dhcp_data
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
    AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token "RTRIM(LTRIM(:dhcp_token))",
    dhcp_ip "RTRIM(LTRIM(:dhcp_ip))")

    for link -

    LOAD DATA
    INFILE 'link_data.csv'
    APPEND
    INTO TABLE stg_link_data
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
    AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (dhcp_token "RTRIM(LTRIM(:dhcp_token))",
    cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")

    SHELL SCRIPT -

    if [ ! -d log ]
    then
        mkdir log
    fi

    if [ ! -d done ]
    then
        mkdir done
    fi

    if [ ! -d bad ]
    then
        mkdir bad
    fi

    nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_cmts.sql

    nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_dhcp.sql

    nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &

    nohup time sqlplus username/password@SID @load_link.sql

    mv *.log ./log

    The problem I encounter is in loading data into the link table. I check whether the DHCP is present in the subntwk table; if not, I log an error and continue. If the CMTS is present I create the link, otherwise I log another error.

    Note that multiple CMTS can be associated with a single DHCP.

    So the links get created in the link table, but for the last of the comma-separated CMTS values I extract from the stg_link_data table, the script logs "CMTS not found".

    for example

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2

    Here I expect to link dhcp-1-1-1 with wnlb-cmts-01-1 and wnlb-cmts-02-2.

    All these data are present in the subntwk table, but it still logs that wnlb-cmts-02-2 could not be found, even though it has already been loaded into the subntwk table.

    The same thing happens with all the CMTS values that come last in their stg_link_data rows (I think you see what I'm trying to explain).

    But when I run the SQL scripts separately in SQL Developer, they insert all the valid links into the link table.

    It should create 9 rows in the link table, whereas now it creates only 5 rows.

    I use COMMIT in my script too, but that does not help.

    Run these scripts on your machine and let me know if you get the same behavior.

    And please give me a solution; I have tried many things since yesterday, but it's always the same.

    This is the link table log:

    link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1

    no CMTS found for device to create the link: wnlb-cmts-02-2

    link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4

    no CMTS found for device to create the link: wnlb-cmts-05-5

    no CMTS found for device to create the link: wnlb-cmts-01-1

    no CMTS found for device to create the link: wnlb-cmts-05-8
    no CMTS found for device to create the link: wnlb-cmts-05-6
    no CMTS found for device to create the link: wnlb-cmts-05-0

    no CMTS found for device to create the link: wnlb-cmts-03-3

    link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4

    no CMTS found for device to create the link: wnlb-cmts-05-7

    device not found: wnlb-dhcp-1-13

    IF YOU NEED MORE INFORMATION, PLEASE LET ME KNOW

    Thank you

    I realized later that night that the files had been created on a Windows machine, so each line loaded into the staging tables carried a trailing carriage return. That is why the last CMTS on each line was never found. I ran a DOS-to-UNIX conversion on the files and everything started to work perfectly.

    It was a dos2unix issue!
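    For reference, a minimal sketch of that fix in the shell (using the file name above; the tr variant is an alternative if dos2unix is not installed):

    dos2unix cmts_dhcp_link_map_0309151623_1.csv

    # or strip carriage returns explicitly
    tr -d '\r' < cmts_dhcp_link_map_0309151623_1.csv > cmts_dhcp_link_map_clean.csv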

    Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.

  • Issue while loading data using an Essbase rules file

    Hi all

    I am facing a problem while loading data using a rules file. In the rules file, I reject several members in two fields (two dimensions). Now if I load the data using the rules file, I get errors for all members in the dataload.err file. If I reject multiple members in a single field, the data loads without any errors in the dataload.err file.

    I want to know how to reject members across several fields when loading data using an Essbase rules file. Is it possible?

    Okay, okay... I think you must set the global select/reject Boolean in the data load settings to 'Or':

  • ODI error - running MaxL in an ODI data load

    I have an ODI interface that works very well for loading data into an Essbase cube. However I had to add a step that runs a calc script before the load, to clear data for the current year and period. I built the MaxL script, tested it successfully, and it works ok. Then, in the options on my target in the flow section, I added this entry:

    PRE_LOAD_MAXL_SCRIPT: C:\ODI_Data\Scripts\MaxL\clr_act.mxl

    When I try to run it I get the error below. Any ideas what it could be? The error shows the full path of the MaxL script, so I thought that is the way it is expected. Is that the problem, that I am not referencing it correctly?

    org.apache.bsf.BSFException: exception from Jython:

    Traceback (most recent call last):

    File "<string>", line 89, in <module>

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)

    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)

    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)

    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    Caused by: Traceback (most recent call last):

    File "<string>", line 89, in <module>

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at org.python.core.PyException.fillInStackTrace(PyException.java:70)

    at java.lang.Throwable.<init>(Throwable.java:181)

    at java.lang.Exception.<init>(Exception.java:29)

    at java.lang.RuntimeException.<init>(RuntimeException.java:32)

    at org.python.core.PyException.<init>(PyException.java:46)

    at org.python.core.PyException.<init>(PyException.java:43)

    at org.python.core.Py.JavaError(Py.java:455)

    at org.python.core.Py.JavaError(Py.java:448)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)

    at org.python.core.PyObject.__call__(PyObject.java:355)

    at org.python.core.PyMethod.__call__(PyMethod.java:215)

    at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)

    at org.python.core.PyMethod.__call__(PyMethod.java:206)

    at org.python.core.PyObject.__call__(PyObject.java:397)

    at org.python.core.PyObject.__call__(PyObject.java:401)

    at org.python.pycode._pyx0.f$0(<string>:89)

    at org.python.pycode._pyx0.call_function(<string>)

    at org.python.core.PyTableCode.call(PyTableCode.java:165)

    at org.python.core.PyCode.call(PyCode.java:18)

    at org.python.core.Py.runCode(Py.java:1204)

    at org.python.core.Py.exec(Py.java:1248)

    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)

    ... 19 more

    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: error occurred while running script maxl. Error message is:

    at com.hyperion.odi.essbase.ODIEssbaseConnection.executeMaxl(Unknown Source)

    at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)

    ... 33 more

    Caused by: com.essbase.api.base.EssException: error occurred while running script maxl. Error message is:

    at com.hyperion.odi.essbase.wrapper.EssbaseConnection.executeMaxl(Unknown Source)

    ... 40 more

    Log in to Oracle Support, and then search for document 1152893.1.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • Failed to load data using the Outline Load utility

    I am unable to load data using the Outline Load utility.
    Data load dimension assigned as: Account
    Driver dimension assigned as: Period
    Driver member: Jan

    Load file:

    Account, Jan, Point-of-View, Data Load Cube Name
    Investment, 1234, "India, current, heck, FY13", Plan1


    Outline Load utility command:
    OutlineLoad /A:pract /U:admin /I:C:\test1.csv /D:Account /L:C:\lg.log /X:C:\ex.exc


    Exception file:

    [Thu Mar 28 18:05:39 GMT + 05:30 2013] Error loading data record 1: investments, 1234, '' ' India, common, project, FY14' ' ', Plan1
    [Thu Mar 28 18:05:39 GMT + 05:30 2013] com.hyperion.planning.InvalidMemberException: Member India, common, rough, FY14 does not exist or you do not have access to it.
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] Planning Outline data store load process finished with exceptions: an exception occurred, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were loaded successfully, 1 was rejected.


    Log file:


    Connected to application "Rlap", release 11.1.1.3, adapter Interface Version 5, Workforce supported and not enabled, CapEx supported and not enabled, CSS Version 3
    [Thu Mar 28 18:05:38 GMT + 05:30 2013] Input file located and opened successfully "C:\load.csv".
    [Thu Mar 28 18:05:38 GMT + 05:30 2013] Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
    [Thu Mar 28 18:05:38 GMT + 05:30 2013] Located and using "Account" dimension for loading data into application "Rlap".
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] Load dimension "Account" has been successfully opened.
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] A cube refresh operation will not be performed.
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] Create security filters operation will not be performed.
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] Examine the Essbase log files for status on whether Essbase data was loaded.
    [Thu Mar 28 18:05:40 GMT + 05:30 2013] Planning Outline data store load process finished with exceptions: an exception occurred, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were loaded successfully, 1 was rejected.



    In fact, the members do exist in the outline.
    Any help would be appreciated.

    Can you double-check your csv file? Open it in a text editor, because the error is showing additional quotes:

    [Thu Mar 28 18:05:39 GMT + 05:30 2013] Error loading data record 1: investments, 1234, '' ' India, common, project, FY14' ' ', Plan1

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • How to automate the data load process using a load file & Task Scheduler

    Hello

    I am automating the process of loading data into a Hyperion Planning application using a Data_Load.bat file & the Task Scheduler.

    I have created the Data_Load.bat file, but I cannot complete the rest of the process.

    Could you help me: how do I automate the data load process using the Data_load.bat file & Task Scheduler, and what other files are required to achieve this?

    Thank you

    In response to your question: are you using MaxL scripts for the loading?

    If yes, an issue I've seen with batch files (e.g. load_data.bat) is that if you do not have the full path to the MaxL script in the batch, then when it runs through the Task Scheduler the task itself will "work", but the log and/or error file will not be created. That means the Task Scheduler reports the task as successful, although it did not do what you needed it to.

    If you use MaxL, use this in the batch:

    "essmsh C:\data\DataLoad.mxl" - or you can also use the full path to essmsh. The only other reason I can think of that the MaxL would not work is if the batch has not been updated to call the MaxL script with its full path, or if you need to update your environment variables for the essmsh command to work at a command prompt.

  • Automate the process of loading data using Task Scheduler

    Hello

    I am automating the process of loading data into a Hyperion Planning application with the help of the Task Scheduler.

    I have created the Data_Load.bat file, but I cannot complete the rest of the process.

    So could you help me: how do I automate the data load process using the Data_load.bat file & Task Scheduler?

    Thank you

    Windows 2008? Open the Task Scheduler, Action > Create Basic Task, give it a name, select the frequency, select "Start a program", browse to the batch script, and you're pretty much done for a basic task.
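    For reference, a command-line equivalent sketch (the task name, schedule and path are hypothetical):

    schtasks /create /tn "PlanningDataLoad" /tr "C:\scripts\Data_Load.bat" /sc daily /st 02:00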

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Exporting data from one RDBMS table to another RDBMS table using ODI user functions

    Hello
    I am facing a problem while exporting data from one RDBMS table to another RDBMS table using ODI user functions.
    Name: User_Func
    Group: Training
    Syntax: User_Func($(SrcField))

    Implementation syntax:

    (CASE
    WHEN $(SrcField) > 40000 THEN 'HIGH'
    WHEN $(SrcField) BETWEEN 30000 AND 40000 THEN 'AVERAGE'
    ELSE 'LOW'
    )
    Technology: Oracle

    To map the GRADE column of my TARGET_EMPTABLE I write
    User_Func(SRC_TABLENAME.SALARY)
    using the Expression Editor.
    I got the following error:

    ODI-1227: Task ODI_FUNC_INTERFACE (Export) fails on the source ORACLE connection Source_DataServer.
    Caused by: java.sql.SQLSyntaxErrorException: ORA-00905: missing keyword

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
    at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:947)
    at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3769)
    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3823)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1671)
    at oracle.odi.query.JDBCTemplate.executeQuery(JDBCTemplate.java:189)
    at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:89)
    at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:1)
    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:70)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    and in the code tab it is:

    select
    SRC_FUNC_TABLE.E_NUMBER E_NUMBER,
    SRC_FUNC_TABLE.E_NAME E_NAME,
    SRC_FUNC_TABLE.E_LOC E_LOC,
    (CASE
    WHEN SRC_FUNC_TABLE.E_SAL > 40000 THEN 'HIGH'
    WHEN SRC_FUNC_TABLE.E_SAL BETWEEN 30000 AND 40000 THEN 'AVERAGE'
    ELSE 'LOW'
    ) E_GRADE
    from SOURCE_SCHEMA.SRC_FUNC_TABLE SRC_FUNC_TABLE
    where (1 = 1)


    Help, please

    Anindya Chatterjee wrote:
    Hello
    I am facing a problem while exporting data from one RDBMS table to another RDBMS table using ODI user functions.
    Name: User_Func
    Group: Training
    Syntax: User_Func($(SrcField))

    Implementation syntax:

    (CASE
    WHEN $(SrcField) > 40000 THEN 'HIGH'
    WHEN $(SrcField) BETWEEN 30000 AND 40000 THEN 'AVERAGE'
    ELSE 'LOW'
    )

    Your syntax for the CASE statement is not correct:
    the END keyword is missing.
    It should be

    (CASE
    WHEN $(SrcField) > 40000 THEN 'HIGH'
    WHEN $(SrcField) BETWEEN 30000 AND 40000 THEN 'AVERAGE'
    ELSE 'LOW'
    END)

    Technology: Oracle

    To map the GRADE column of my TARGET_EMPTABLE I write
    User_Func(SRC_TABLENAME.SALARY)
    using the Expression Editor.
    I got the following error:

    ODI-1227: Task ODI_FUNC_INTERFACE (Export) fails on the source ORACLE connection Source_DataServer.
    Caused by: java.sql.SQLSyntaxErrorException: ORA-00905: missing keyword

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
    at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:947)
    at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3769)
    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3823)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1671)
    at oracle.odi.query.JDBCTemplate.executeQuery(JDBCTemplate.java:189)
    at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:89)
    at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:1)
    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:70)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    and in the code tab it is:

    select
    SRC_FUNC_TABLE.E_NUMBER E_NUMBER,
    SRC_FUNC_TABLE.E_NAME E_NAME,
    SRC_FUNC_TABLE.E_LOC E_LOC,
    (CASE
    WHEN SRC_FUNC_TABLE.E_SAL > 40000 THEN 'HIGH'
    WHEN SRC_FUNC_TABLE.E_SAL BETWEEN 30000 AND 40000 THEN 'AVERAGE'
    ELSE 'LOW'
    ) E_GRADE
    from SOURCE_SCHEMA.SRC_FUNC_TABLE SRC_FUNC_TABLE
    where (1 = 1)

    Help, please

  • Load SQL data using EAS

    Hello

    I've been loading data using a text file; now I want to try loading from SQL Server directly, but in the EAS load data screen, "Data Source" is grayed out as soon as I click SQL as the data source.

    What am I missing here? I have configured the ODBC at the server level.

    First of all, have you configured the SQL source and ODBC in the load rule? Open the load rule and select menu -> Open SQL. Set up the SQL statement in the form, then click OK/Retrieve to test.

    Once the load rule looks good, to actually load the data, bring up the data load screen as you did. Select SQL. The data source should be blank, because that field is for the flat file. Select the load rule (it has the SQL in it). If you look at the right of the screen (you may need to widen it to see), you enter the SQL ID and password there. Enter those and click OK. It should load as usual.

    If you want to use MaxL instead, there is similar syntax to tell the import statement it is a SQL load and give it the ID and the password.
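    For reference, a minimal MaxL sketch (the application/database, credentials and rules file name are hypothetical):

    import database sample.basic data connect as sqluser identified by 'sqlpass' using server rules_file 'ld_sql' on error write to 'dataload.err';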

  • Create a target table by using the ODI model

    Hi all

    I am new to ODI. I'm trying to export a source RDBMS table to a target RDBMS table.

    Should we create the structure of the target table in the target schema, or can we create it using the ODI model by selecting a new data store?

    In OWB the target tables are created while loading, and the data gets loaded.

    Please help me with this.

    Thanks in advance.

    Hello
    You can create it in both ways. If you create it using an ODI data store, you must select the "Create target table" option in the flow tab when loading the target.

  • Deleting data loaded into Planning

    We loaded data into Planning at an intersection using ODI. I need to remove that data and reload Planning at the same intersection. How can I delete a single load and reload the data? I don't want to clear the whole database and reload from scratch. I just need to clear one intersection and load again.


    Thank you for your help

    Create a calc script to clear that area of the database and run it before you load the data again.
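    For illustration, a minimal clear-script sketch (the member names are hypothetical; FIX on the intersection you loaded):

    FIX ("Actual", "FY13", "Jan")
        CLEARDATA "Investment";
    ENDFIX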

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Trying to get quasi-static load data from the 9234 module with IEPE sensors

    I am trying to acquire load data that should essentially be a function of square-wave loading. The current configuration I have is a cDAQ 9234 module on LV 8.5. I started with a 9233 module I had, but after searching here it seemed to me that the 9234, with DC coupling, would allow me to capture the quasi-static load (the load is applied for 2.5 seconds) and not only the change whenever the load is applied or removed. So now I have the 9234 connected and I get almost no change in output regardless of whether I have DC or AC coupling selected in MAX. In addition, it still seems to register only the initial load change, as before; that is the biggest concern I have.

    I had previously configured it as a voltage signal and just read across the input, which at least gave me reasonable load changes when I used the 9233. I expected at least to see the same or similar values with the 9234 configured in AC mode. If anyone can shed some light on this I would be very happy.

    Here are the technical details of load cell: http://www.dytran.com/img/products/1203V.pdf

    I have also attached my code, in case it might shed some light on the question (please be gentle; I know there is probably a better way to code this, but the different methods I tried did not clean up the cycle properly).

    Thanks in advance for any help!

    John,

    Both the sine portion and the exponential decay are indications of providing IEPE excitation to, or using, a sensor that uses AC coupling. Looking at the datasheet you linked and the Dytran web site, I can't determine whether the load cell has any kind of internal AC coupling. I would recommend you contact Dytran and explain the behavior you are seeing with a constant load, to find out whether this is expected behavior or not. It is possible that the cell is intended primarily for transient or dynamic loading and is not meant to measure static loads. Let me know what you hear from Dytran and we can continue troubleshooting from there.

  • "windows failed to load because the nls data is missing or corrupted"

    I'm currently using Vista, and when I turn on my computer
    the following message appears: "windows failed to load because the nls data is missing or corrupted"

    What does this message mean?

    What does "nls" stand for?
    Is the information on my hard drive in danger?
    How can I fix it?

    Recovery discs would hold the Windows Vista installation files. They would be safe to use if they are needed during the sfc /scannow scan.

    A recovery disc is a general term for media containing a backup of the initial factory state, or of a known-good condition, of a computer as configured by an OEM manufacturer or an end user. OEM-supplied recovery media usually comes with most computers, to allow the user to reformat the hard drive and reinstall the operating system and preloaded software as it was when shipped.

    http://en.Wikipedia.org/wiki/Recovery_disc

  • Journalized data is not loaded into the target table with CDC-consistent?

    Hello

    I tried the CDC-simple concept on a single table.

    I loaded the journalized table (only the changed records), and the rows were inserted into the target table with CDC-simple. It's working fine.

    When I work with CDC-consistent, the journalized data is not loaded into the target table.

    For this, I used a data model with 3 data stores. Without the journalized-data option, it works very well.

    When I try to load the journalized tables into the target table in an interface, it runs fine.

    To do this, I chose "Journalized data only".

    But the changed records do not reach the target table; the target table is empty after the interface is executed.

    err1.png

    err4.png

    err2.png

    err3.png

    I chose the insert option in the IKM. But the journalized data is not inserted into the target table.

    Please help me.

    Thanks in advance,

    A.Kavya.

    Hello

    You must EXTEND WINDOW and LOCK SUBSCRIBERS before consuming the CDC data:

    http://docs.Oracle.com/CD/E14571_01/integrate.1111/e12643/data_capture.htm#ODIDG283

    Afterwards, you UNLOCK SUBSCRIBERS and PURGE JOURNAL.

    It is best to put this in a package to automate the whole thing, as sketched below.
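    A typical package sequence for CDC-consistent (a sketch; these step types come from the journalizing options offered when you drop the model into a package):

    1. Model step with "Extend Window" and "Lock Subscribers" checked
    2. The interface(s) with "Journalized Data Only" selected
    3. Model step with "Unlock Subscribers" and "Purge Journal" checked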
