ODI data loading speed

Hi all

The speed of loading data through the ODI staging area is too slow.
I use:
LKM = LKM SQL to Oracle,
CKM = CKM SQL,
IKM = IKM SQL Incremental Update (for SQL Server to Oracle replication).

I use no flow control in the interfaces.
SQL Server and the Oracle database are installed on the same server.

How can I make it faster?

If the two database servers are on the same machine and you're dealing with bulk data, you should use an LKM that uses bulk methods (BCP to extract data from the SQL Server tables, and an external table to load the data into Oracle) - something like this KM: https://s3.amazonaws.com/Ora/KM_MSSQL2ORACLE.zip (it is packaged as an IKM, but the LKM approach would be no different).
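To illustrate the bulk approach, here is a minimal sketch; the server, table, column, directory, and file names are all hypothetical, and a real KM would generate the equivalent steps:

-- 1) Extract from SQL Server in character mode (run from the OS shell):
--    bcp MyDb.dbo.SRC_TABLE out C:\stage\src_table.dat -c -t"|" -S MYSERVER -T

-- 2) Expose the flat file to Oracle as an external table:
CREATE OR REPLACE DIRECTORY stage_dir AS 'C:\stage';

CREATE TABLE src_table_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
  )
  LOCATION ('src_table.dat')
);

-- 3) Load set-wise into the target:
INSERT /*+ APPEND */ INTO target_table (id, name)
SELECT id, name FROM src_table_ext;

Both the extract (BCP) and the load (external table plus a set-based INSERT) work in bulk, which avoids the row-by-row JDBC path of LKM SQL to Oracle.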
I hope it helps

Tags: Business Intelligence

Similar Questions

  • ODI data loading error checking logic

    Hi, I don't know if this relates to ODI or to our command script. We found that ODI's data load error checking works record by record, so as soon as we get an error in a batch data load (for example, a missing member), the whole batch turns very slow because of the error-checking process.

    Is there anything we can do to fine-tune the ODI data load error-checking process?

    Thank you!

    Hello

    Have a read here.
    It was a bug, but it is now resolved.

    Ok?

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • EAS data load vs ODI Essbase data load: which is faster?

    To load data (not metadata) from a flat file into Essbase, we have two options: one is ODI, the other is an EAS data load. Which is faster for loading data?

    Hello

    If you use the no-load-rule method in ODI, EAS will be faster.
    If you have ODI patched to 10.1.3.5.2_02 or newer (I recommend being on the latest patch) and use the load-rule method, then the speed should be similar to EAS, since it uses the same data load mechanism.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • ODI - SQL for Hyperion Essbase data loading

    Hello

    We have created a view in SQL Server that contains our data. The view currently has every year, with periods from Jan 2011 to present. Each period is about 300,000 records. I want to load only one period at a time, for example May 2013. Currently we use ODBC through a data load rule, but the customer wants to use ODI to be consistent with the dimension metadata versions. Here's the SQL on the view, which works very well. Is there a way I can run this SQL in the ODI interface so it pulls only what I declare in the Where clause? If yes, where can I do it?

    SELECT CATEGORY, YEAR, LOCATION, SCENARIO, DEPT, PROJECT, EXPCODE, PERIOD, ACCOUNT, AMOUNT
    FROM PS_LHI_HYP_PRJ_ACT
    WHERE YEAR >= '2013' AND PERIOD = 'MAY'
    ORDER BY CATEGORY ASC, FISCAL_YEAR ASC, LOCATION ASC, PERIOD ASC, EXPCODE ASC,
             PROJECT ASC, DEPT ASC, SCENARIO ASC, ACCOUNT ASC;

    Hello

    Simply use the following KM to load the data - IKM SQL to Hyperion Essbase (DATA) - in an ODI interface that has the view you created as the source model. You can add filters on the source, driven dynamically by ODI variables, to build the Where clause from the month and year. Make sure you specify a load rule as the data load method in the KM options.
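    For example, a filter placed on the source datastore could reference ODI variables (the variable names here are hypothetical); at run time ODI substitutes their values into the generated Where clause:

    PS_LHI_HYP_PRJ_ACT.YEAR >= '#PROJECT.P_YEAR'
    AND PS_LHI_HYP_PRJ_ACT.PERIOD = '#PROJECT.P_PERIOD'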

  • ODI: Error loading data to HFM: invalid dimension name

    Hello

    I am fairly new to ODI and I was wondering if any guru could help me overcome an issue I'm facing. I'm trying to load data from a csv file into HFM. I chose the appropriate KMs (LKM File to SQL and IKM SQL to Hyperion Financial Management Data), with the Sunopsis Memory Engine as the staging area.

    To keep things simple, the csv file has exactly the same structure as the HFM application's dimensions, and the columns have been mapped in the interface as shown below:

    Source file column - target HFM column

    Scenario - Scenario
    Year - Year
    View - View
    Entity - Entity
    Value - Value
    Account - Account
    ICP - ICP
    Custom1 - Custom1
    Custom2 - Custom2
    Custom3 - Custom3
    Custom4 - Custom4
    Period - Period
    DataValue - DataValue
    Description - (no source column, mapped as '')

    The csv file contains base-level members only. I set the error log file path, and when running the interface I get an error. When I open the error log, I see the following messages:

    Line: 1, error: invalid dimension name
    ! Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_CUSTOM1, C9_CUSTOM2, C10_CUSTOM3, C11_CUSTOM4, C12_PERIOD, C13_DATAVALUE
    C1_SCENARIO
    line: 3 error: a valid column order is not specified.
    Actual; 2007; YTD; 20043; <Entity Currency>; 13040; [ICP None]; [None]; 1000; [None]; [None]; Jan; 512000; ""
    > > > > > >



    I'm not sure how to solve this, since the interface mapping matches the dimensions on a 1:1 basis. In addition, the dimension names in the target columns correspond to the HFM application's dimension names (the application the datastore was reverse-engineered from).

    Help, please!

    Thank you very much
    Jorge

    Published by: 993020 on March 11, 2013 05:06

    Dear Experts,

    ODI: 11.1.1.6.0
    HFM: 9.3.3

    I also ran into a similar error to the OP's.

    In my case, the error occurs when I use SUNOPSIS_MEMORY_ENGINE as the staging area. If I simply change the staging to an Oracle schema, the interface loads the data into HFM successfully. So I'm curious what prevents the Sunopsis engine from working as the staging area for HFM loads.

    This shows up in the IKM SQL to Hyperion Financial Management Data log file:

    Load data started: 3/14/2013 13:41:11.
    Line: 1, Error: Invalid dimension name
    !Column_Order = C1_SCENARIO, C2_YEAR, C3_VIEW, C4_ENTITY, C5_VALUE, C6_ACCOUNT, C7_ICP, C8_PRODUCT, C9_CUSTOMERS, C10_CHANNEL, C11_UNITSFLOWS, C12_PERIOD, C13_DESCRIPTION
    
    
    C1_SCENARIO

    
    Line: 3, Error: A valid column order is not specified.
    Actual;2007;YTD;EastSales;;Sales;[ICP None];Comma_PDAs;Electronic_City;National_Accts;[None];February;;555
    >>>>>>
    
    Load data completed: 3/14/2013 13:41:11.
    

    It seems the generated query is not picking up the column alias names, but this only happens if I use SUNOPSIS_MEMORY_ENGINE as the staging area. With an Oracle schema as staging, the data load finishes successfully.

    This is the generated code from the KM

    Prepare for Loading (Using Oracle as Staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from ODI_TMP."C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Prepare for loading (using SUNOPSIS as staging)

    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.hfm import ODIHFMConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    
    # Target HFM connection properties
    clusterName   = "demo92"
    userName      = "admin"
    password      =  "<@=snpRef.getInfo("DEST_PASS") @>"
    application   = "COMMA"
    
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,clusterName)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    
    # Load options
    consolidateOnly    = 0
    importMode            = "Merge"
    accumulateWithinFile  = 0
    fileContainsShareData = 0
    consolidateAfterLoad  = 0
    consolidateParameters = ""
    logEnabled             = 1
    logFileName           = r"C:\Temp\ODI_HFM_Load.log"
    tableName             = r"HFMData"
    columnMap            = 'SCENARIO=Scenario , YEAR=Year , VIEW=View , ENTITY=Entity , VALUE=Value , ACCOUNT=Account , ICP=ICP , PRODUCT=Product , CUSTOMERS=Customers , CHANNEL=Channel , UNITSFLOWS=UnitsFlows , PERIOD=Period , DATAVALUE=DataValue , DESCRIPTION=Description '
    srcQuery= """select   C1_SCENARIO    "Scenario",C2_YEAR    "Year",C3_VIEW    "View",C4_ENTITY    "Entity",C5_VALUE    "Value",C6_ACCOUNT    "Account",C7_ICP    "ICP",C8_PRODUCT    "Product",C9_CUSTOMERS    "Customers",C10_CHANNEL    "Channel",C11_UNITSFLOWS    "UnitsFlows",C12_PERIOD    "Period",555    "DataValue",C13_DESCRIPTION    "Description" from "C$_0HFMData"  where      (1=1)     """
    srcCx                    = odiRef.getJDBCConnection("SRC")
    srcQueryFetchSize=30
    
    loadOptions = HashMap()
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEONLY, Boolean(consolidateOnly))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_IMPORTMODE, importMode)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_ACCUMULATEWITHINFILE, Boolean(accumulateWithinFile))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_FILECONTAINSSHAREDATA, Boolean(fileContainsShareData))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEAFTERLOAD, Boolean(consolidateAfterLoad))
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_IKMDATA_CONSOLIDATEPARAMS, consolidateParameters)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_TABLENAME, tableName);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_COLUMNMAP, columnMap);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCECONNECTION, srcCx);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERY, srcQuery);
    loadOptions.put(ODIHFMConstants.OPTIONS_NAME_SOURCEQUERYFETCHSIZE, Integer(srcQueryFetchSize));
    
    # Get the writer
    hfmWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_HFM, targetProps);
    
    # Begin load
    hfmWriter.beginLoad(loadOptions)
    

    Can anyone help me solve this?

    Thank you

    Published by: user10620897 on March 14, 2013 14:28

  • ODI interface stopped but still loading data into Essbase

    Hi John,

    I stopped one of the interfaces midway, but the data load process is still ongoing and it has generated the log file.

    I tried to kill the session from EAS, but the session is not killed; it just disconnects and creates a new session for the same process.

    If you are using an agent, restart the ODI agent.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • The data load has finished in ODI but some interfaces still show as running in the Operator tab

    Hi Experts,

    I'm working on an OBIEE customization; we run the incremental load every day. The data load completes successfully, but the status icons in the Operator tab show some interfaces as still running.

    Could you please explain what is behind the interfaces still showing as running in the Operator tab? Thanks in advance; your valuable suggestions are much appreciated.

    Kind regards

    REDA

    Those are what we call stale sessions, and they can be removed by restarting the agent.

    You can also manually clean up expired sessions in the Operator.

  • Forcing errors when loading non-leaf data into Essbase

    Hi all

    I was wondering if anyone has experience forcing data load errors when a load rule tries to push data into non-leaf members of an Essbase cube.

    I obviously have error handling at the ETL level to prevent non-leaf members from reaching the load, but I would like additional error handling on top of that (not just fixing the errors).

    I'd much prefer the errors to be rejected and shown in the log, rather than being silently overwritten by aggregation in the background.

    Have you tried creating a security filter for the user that runs the load, allowing write access only at level 0 and nothing higher?
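    A rough MaxL sketch of that idea (application, database, filter, and user names are hypothetical, and the member specification must be adapted to your own outline):

    create or replace filter Sample.Basic.flt_lev0_write
        write on '@LEVMBRS("Accounts",0)';

    grant filter Sample.Basic.flt_lev0_write to load_user;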

  • Converting a timestamp during data load

    I'm using APEX 5.0, and the Data Load wizard to transfer data from Excel.

    I defined Update_Date as TIMESTAMP(2), and the data in Excel looks like 2015/06/30 12:21:57.

    NLS_TIMESTAMP_FORMAT is DD/MM/RR HH12:MI:SSXFF AM.

    I created a transformation rule for update_date as a PL/SQL function body, as below:

    declare
      l_input_from_csv varchar2(25) := to_char(:update_date, 'MM/DD/YYYY HH12:MI:SS AM');
      l_date timestamp;
    begin
      l_date := to_timestamp(l_input_from_csv, 'MM/DD/YYYY HH12:MI:SS AM');
      return l_date;
    end;

    I keep getting a transformation rule error, and without the transformation rule it complains about an invalid month.

    Please help me to fix this.

    Thank you very much in advance!

    Hi DorothySG,

    DorothySG wrote:

    Please test the data loading on the demo application. I pasted a few examples into the page instructions; you can use copy and paste, and it will give the same result.

    If you change the comma separator ',' to anything else, you will get a "do not load" message.

    Check your application 21919. I changed the transformation rule:
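    Given that the incoming Excel value looks like 2015/06/30 12:21:57, a rule whose mask matches the source text would look something like this (a sketch; the exact rule used in application 21919 is not shown in the post):

    declare
      l_date timestamp;
    begin
      l_date := to_timestamp(:update_date, 'YYYY/MM/DD HH24:MI:SS');
      return l_date;
    end;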

    Data is loading properly.

    Kind regards

    Kiran

  • Move rejected records to a table during a data load

    Hi all

    When I run my interfaces I sometimes get errors caused by invalid records; I mean, some field contents are not valid.

    I would like to move these invalid records to another table while the data load keeps running, so the load is not interrupted.

    How can I do this? Are there any examples I could follow?

    Thanks in advance

    concerning

    Hi Alvaro,

    Here you can find different ways to achieve this; choose according to your requirements:

    https://community.Oracle.com/thread/3764279?SR=Inbox
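    One common ODI pattern (a sketch, not necessarily what the linked thread recommends) is to enable flow control with a CKM so that rows violating your declared constraints are diverted to the E$_ error table instead of stopping the load; you can then inspect or re-route them. The work schema and table name below are hypothetical; CONS_NAME, ERR_TYPE and ERR_MESS are standard columns of the CKM-generated error table:

    SELECT cons_name, err_type, err_mess
    FROM   odi_work.e$_customer;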

  • Data Load Wizard

    In Oracle APEX, is it feasible to change the default functionality?

    1. Can we change the Data Load Wizard from insert-only to insert/update functionality based on the source table?

    2. Is a validation possible: if the count of records is lower than the target table's, the user should get a choice to continue with the insert or cancel the data load process.

    I use APEX 5.0.

    Please advise on these 2 points.

    Hi Sudhir,

    I'll answer your questions below:

    (1) Yes, the data load can insert/update.

    It's the default behavior: if you choose the right columns for detecting duplicate records, you will be able to see which records are new and which are updates.

    (2) That will be a little tricky, but you can get there using the underlying collections. The data load uses several collections to perform its operations, and on the first step we load all of the user's records into the collection "CLOB_CONTENT". By checking this against the number of records in the underlying table, you can easily add a new validation before moving from step 1 to step 2.
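    A minimal sketch of such a validation (a PL/SQL Function Returning Boolean), assuming, per the note above, that the uploaded rows land in the CLOB_CONTENT collection; the target table name is hypothetical:

    declare
      l_uploaded pls_integer;
      l_existing pls_integer;
    begin
      -- rows staged by the Data Load Wizard
      select count(*) into l_uploaded
        from apex_collections
       where collection_name = 'CLOB_CONTENT';

      -- rows already in the target table
      select count(*) into l_existing
        from my_target_table;

      -- fail the validation when fewer rows were uploaded than already exist
      return l_uploaded >= l_existing;
    end;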

    Kind regards

    Patrick

  • ASO - ignoring zeros and missing values in data loads

    Hello

    There is an option to ignore zero values and missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the technical reference.

    I have 12 months in the columns of the data feed. At least 1/4 of my data is zeros. Ignoring zeros keeps the cube size small and the load faster.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is, it's hidden in the Alter Database (Aggregate Storage) command, at the point where you create the data load buffer. If you are not sure what a data load buffer is, see Loading Data Using Buffers.
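    In MaxL the sequence looks something like this (application, database, and file names are hypothetical):

    alter database ASOsamp.Sample initialize load_buffer with buffer_id 1
        property ignore_missing_values, ignore_zero_values;

    import database ASOsamp.Sample data from server data_file 'load.txt'
        to load_buffer with buffer_id 1 on error abort;

    import database ASOsamp.Sample data from load_buffer with buffer_id 1;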

  • Data load failed with error 10415 when exporting data to an Essbase EPMA app

    Hi Experts,

    Can someone help me solve this issue I am facing in FDM 11.1.2.3?

    I'm trying to export data from FDM to an EPMA Essbase application.

    Import and validate worked fine, but when I click export it fails.

    I am getting the error below:

    Failed to load data

    10415 - data loading errors

    Essbase API procedure: [EssImport] threw code: 1003029 - 1003029

    Encountered formatting error in the spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv

    I have these dimension members:

    1. Account
    2. Entity
    3. Scenario
    4. Year
    5. Period
    6. Regions
    7. Products
    8. Acquisitions
    9. Servicesline
    10. Functionalunit

    When I click the export button it fails.

    One more thing I checked: the .DAT file, but this file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem.

    In my case I was loading data to a classic Planning application. When all the dimension members are ignored in the mapping for the combination you are trying to load, then when you click Export you get the same message, and an empty .DAT file is created.

    You can check this.

    Thank you

    Praveen

  • FDMEE data loaded successfully to Planning but data not visible in Planning - export shows a gold fish in FDMEE

    Hi all

    We loaded data from FDMEE to Planning; the data loaded successfully, but we are not able to see it in the Planning application.

    In the process log, I can see it says the data was loaded into the cube. Please advise on this.

    Thank you

    Roshi

    Two things:

    - I wasn't talking about the method you use to import data, but the one used to export data. You are using the SQL method. Go to Target Applications, select your Planning/Essbase application, and set the load method to file. Save your settings.

    2014-06-19 12:26:50,692 [AIF] INFO: Rules file AIF0028 locked successfully
    2014-06-19 12:26:50,692 [AIF] INFO: Loading data into the cube by launching the rules file...
    2014-06-19 12:26:50,692 [AIF] INFO: Loading data into the cube using sql...
    2014-06-19 12:26:50,801 [AIF] INFO: The data has been loaded by the rules file
    2014-06-19 12:26:50,801 [AIF] INFO: Unlocking rules file AIF0028
    2014-06-19 12:26:50,801 [AIF] INFO: Successfully unlocked rules file AIF0028

    - Then export again and review the .DAT file in the Outbox folder. Is it empty?

    - You need to add a new dimension to your import format (Add Dimension > Currency). Then add Local as the expression.

    - Import, validate and export the data.

  • Data loader detects NUMBER instead of VARCHAR2

    Hello

    In my database, I have a table that stores information about vehicle components. I created a new data load process with the manufacturer and reference fields as the unique key.

    (For example: manufacturer 'BOSCH', reference '0238845')

    When the system runs the data mapping, it detects the reference column as a NUMBER and drops the leading zero character: '0238845' -> 238845.

    How can I solve this problem?

    Kind regards

    Have you tried a transformation in the data loader to turn it into a character value?
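    For example, a transformation along these lines (a sketch; the column name and the fixed width of 7 are assumptions) would restore the leading zeros:

    lpad(to_char(:REFERENCE), 7, '0')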

    Thank you

    Tony Miller

    Software LuvMuffin
