Loading data mapping

Hi all

Please help me with importing maps using the MapLoader template. I can't import the Excel map file; I've read the documentation and searched many sources for help.

I've successfully used MapLoader.xls several times with both FDM and FDMEE, across different versions of Excel. The trick is NOT to change the defined name ('Define a Name') that marks the area to be uploaded. Column F, 'Data rule name', contains a formula that should NOT be changed. To the right of the table there are additional formula columns that should NOT be touched either.

I don't have access to a MapLoader.xls file right now, but I remember the defined name covers roughly 10,000 rows. If you need more rows, INSERT rows inside the named area and Excel updates the defined name automatically.

Good luck.

Alex Liu

EPM freelance architect

Tags: Business Intelligence

Similar Questions

  • Failed to load data into the staging area using an ERP SAP ABAP mapping

    Hello

    I am new to ODI and facing some challenges. Using ODI 11g (11.1.1.6.0), I want to move data from an SAP EHP6 system to an Oracle warehouse. I reverse-engineered the table I need using the SAP metadata browser, then created the interface. On execution it fails at task/session step 13 - Loading - SrcSet0 - Load data into staging. I use a shared directory, and I set FTP_TRANSFER_METHOD = FSMOUNT_DIRECT. The error message I get is:

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 14, in <module>
    Load error: see /home/dwsap/ZODI_13001_11001_GLOBAL.log for more details

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:744)
    Caused by: Traceback (most recent call last):
    File "<string>", line 14, in <module>
    Load error: see /home/dwsap/ZODI_13001_11001_GLOBAL.log for more details

    at org.python.core.PyException.doRaise(PyException.java:219)
    at org.python.core.Py.makeException(Py.java:1166)
    at org.python.core.Py.makeException(Py.java:1170)
    at org.python.pycode._pyx0.f$0(<string>:50)
    at org.python.pycode._pyx0.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java:165)
    at org.python.core.PyCode.call(PyCode.java:18)
    at org.python.core.Py.runCode(Py.java:1204)
    at org.python.core.Py.exec(Py.java:1248)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         ... 19 more


    I checked the log file and it contains the following


    $ more ZODI_13001_11001_GLOBAL.log

    SQL*Loader: Release 11.2.0.4.0 - Production on Thu Mar 27 11:19:17 2014

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
    ORA-12504: TNS:listener was not given the SERVICE_NAME in CONNECT_DATA


    I checked my tnsnames.ora and it seems to be OK


    # tnsnames.ora Network Configuration File: /u01/app/oracle/product/11.2.0/dbhome_1/network/admin/tnsnames.ora
    # Generated by Oracle configuration tools.

    XXXXXX =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxxx)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = xxxxxx)
          (SID = xxxxxx)
          (GLOBAL_NAME = xxxxxx)
        )
      )


    Here's my listener.ora


    # listener.ora Network Configuration File: /u01/app/oracle/product/11.2.0/dbhome_1/network/admin/listener.ora
    # Generated by Oracle configuration tools.

    LISTENER =
      (DESCRIPTION_LIST =
        (DESCRIPTION =
          (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
          (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxx)(PORT = 1521))
        )
      )

    ADR_BASE_LISTENER = /u01/app/oracle



    What could be the problem?

    Regards

    Thanks a lot for your help. The problem was that I had entered the incorrect instance name.
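    For reference, a minimal sketch of a clean entry for this case (all names are placeholders; the point is that CONNECT_DATA must carry the service name the listener actually knows, which is what ORA-12504 complains about):

    XXXXXX =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxxx)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = xxxxxx)
        )
      )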

  • Error 14: could not load data space type map

    Hello

    I wrote a program in LabVIEW 8.5 on one computer, saved it, and copied it to another computer. There, I got this error message when opening the VI: "LabVIEW: Memory is full. LabVIEW load error code 14: could not load data space type map." I tried to start the VI on the first computer again and it crashed. That crash then corrupted every VI associated with the main VI, even ones I had saved previously and which were not open at the time (I hadn't closed LabVIEW in between, so they were probably still in LabVIEW's memory). I have encountered this error several times, and each time the VI used the IMAQ package. Is there a solution?

    Hi Alex,

    Unfortunately there seems to be another, similar problem that prevents LV2009 from saving back to version 8.5 or earlier if the VI contains an event structure (CAR 183005).

    The solution is to use the 2009 SP1 version, where this bug has been fixed.

    The fix for the other CAR I mentioned has not been released yet, so upgrading to SP1 and re-saving to version 8.2 is the way to do it for the moment.

    I apologise for the inconvenience this is causing. Let me know whether that fixes it for you.

    Kind regards

  • Accessing the APEX collection that is created as part of the data load wizard

    Hi all

    I am trying to figure out how to access the APEX collection that stores the data related to the data load.

    Please point me to any documentation that may help in understanding the process.

    Hi all

    In the data load wizard we have 4 pages/screens, and each screen has an associated APEX collection.

    Below are the APEX collections associated with the pages:

    "SPREADSHEET_CONTENT" - page 1

    "PARSE_COL_HEAD" - Page 2

    "LOAD_CONTENT" - Page 3

    "LOAD_COL_HEAD" - Page 3

    "FIN_LOAD_CONTENT" - Page 4

    What interested me was PARSE_COL_HEAD, which lists the columns left unmapped on the 2nd page (data mapping).

    So, now that we know the APEX collection, I can query it:

    SELECT COUNT(c002) INTO cnt FROM apex_collections WHERE collection_name = 'PARSE_COL_HEAD' AND c002 = 'DO_NOT_LOAD';
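    As a self-contained sketch of the same idea (the variable name l_cnt is mine, and 'DO_NOT_LOAD' is assumed to be the marker value the wizard stores for unmapped columns):

    DECLARE
      l_cnt NUMBER;
    BEGIN
      -- count the columns flagged as not-to-load on the mapping page
      SELECT COUNT(c002)
        INTO l_cnt
        FROM apex_collections
       WHERE collection_name = 'PARSE_COL_HEAD'
         AND c002 = 'DO_NOT_LOAD';
    END;
    /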

    If you want to see it working, I created an example on the link below:

    https://Apex.Oracle.com/pls/Apex/f?p=14281:6:116138945234588:no:

    User: DONOTMAP

    Password: 12345

    Please let me know if you need more details on the implementation.

  • ODI-1228: Task Load Data - LKM File to SQL - fails on the target connection: table or view does not exist

    While executing a mapping (contained in a package) that loads file data into a table, the mapping fails at the LKM File to SQL step with the above-mentioned SQL error.

    This task runs for about 30 minutes, loading roughly 30 to 40 million rows into the ODI C$ temporary table.

    Before the task completes it fails, and the C$ table also gets dropped.

    Is there any possible resolution for the above-mentioned issue?

    The problem has been solved.

    In our case, every data store name was prefixed with SRC_, so the alias of every data store became SRC, and the C$ table name depends on the data store alias.

    So when two mappings executed at the same time, each mapping's C$ table was being dropped by the other mapping because both used the same C$ table name.

    Changing the alias to give each data store a unique name solved the problem.

  • How to load data into the MVDEMO sample schema app

    Hi all

    I'm doing a POC on Oracle MapViewer and trying to build some reports in OBIEE using MapViewer.

    For this POC I use the Oracle MVDEMO sample data (11g). I think this sample data covers a few countries, like the USA.

    I need to build maps for Brazil. I downloaded Brazil map data as shapefiles from the site below:

    http://GADM.org/country

    In this Brazil data I got 4 files, with the extensions .csv, .dbf, .shp and .shx.

    I need to know how I can load these files into my Oracle 11g DB. Should I load the data into the same MVDEMO schema, and if yes, into which table?

    Any help will be much appreciated.

    Thank you

    Amit

    Use the Java shapefile converter utility (http://www.oracle.com/technetwork/database/options/spatialandgraph/downloads/index-093371.html),

    GDAL (gdal.org), FME (Safe Software), or Map Builder.

    Specify the to-SRID (i.e. the SRID the geometries are loaded with in Oracle) as 4326 or 8307.

    Load into a new table named anything you want, for example BRAZIL_GADM, with the geometry column named GEOMETRY.

    Once it's loaded, verify that there is an entry for the table and column (BRAZIL_GADM, GEOMETRY) in USER_SDO_GEOM_METADATA.

    Create a spatial index on BRAZIL_GADM.GEOMETRY if the tool has not created one (a sketch of these two steps follows below).

    Add theme definitions for country, state, or whatever admin areas exist in the dataset.

    Import them as layers in OBIEE.
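    A minimal sketch of the metadata and index steps above, assuming the table/column names used in this thread and SRID 4326 (the dimension bounds and 0.05 tolerance are common defaults, not something specified here):

    -- register the geometry column (skip if the loader already did this)
    INSERT INTO user_sdo_geom_metadata (table_name, column_name, diminfo, srid)
    VALUES ('BRAZIL_GADM', 'GEOMETRY',
            SDO_DIM_ARRAY(
              SDO_DIM_ELEMENT('X', -180, 180, 0.05),
              SDO_DIM_ELEMENT('Y',  -90,  90, 0.05)),
            4326);
    COMMIT;

    -- create the spatial index if the tool did not
    CREATE INDEX brazil_gadm_sidx ON brazil_gadm (geometry)
      INDEXTYPE IS MDSYS.SPATIAL_INDEX;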

  • ODI - loading only the most recent version of the data

    Hello

    I have a flat file with the structure below:

    MRN NO.   DATE OF BIRTH
    12345     12/04/1988
    12345     13/06/1980
    12345     21/05/1989

    The requirement is to load the data into Oracle tables taking only the last row (meaning the date of birth from the last record for an MRN number must be picked; essentially the latest record).

    There is no versioning information available in the data, i.e. no insert_date or version number of any kind.

    What is the best way to achieve this?

    Any help will be appreciated.

    Thanks and greetings

    Reshma

    In this case you are treating the last incoming row as the most recent. You should be able to pick the latest row from the file in 2 ways (using the Oracle sequence generated by default, or manually, by creating a sequence).

    In the second case, while loading data into the staging table:

    Create a sequence SNO_MRN_SEQ in the Oracle DB:

    CREATE SEQUENCE SNO_MRN_SEQ
    START WITH 1
    INCREMENT BY 1
    NOCACHE;

    Add an extra field (SNO_SEQ) to the staging data store, and in the interface mapping map it to the sequence (schema_name.SNO_MRN_SEQ.NEXTVAL).

    Check the staging table data to confirm the sequence column is populated as expected.

    While loading to the target table, add a filter on MRN_NO and use the query below, which essentially picks the row with the max sequence number:

    STGTABLE.SNO_SEQ =
      (SELECT MAX(B.SNO_SEQ)
         FROM STAGING_TABLE B
        WHERE STGTABLE.MRN_NO = B.MRN_NO)
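    An alternative sketch once SNO_SEQ is populated: rank the rows analytically and keep the last one per MRN (names follow this thread; SNO_SEQ is assumed to reflect the file order):

    SELECT mrn_no, date_of_birth
      FROM (SELECT s.*,
                   -- newest row per MRN gets rn = 1
                   ROW_NUMBER() OVER (PARTITION BY mrn_no
                                      ORDER BY sno_seq DESC) rn
              FROM staging_table s)
     WHERE rn = 1;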

    Let me know if that helps!

  • FDM 11.1.2.2 - impact on data after loading data

    Hello

    I'm loading data from FDM to a Planning application for 30/04/12 to 31/3/12 (April) of FY13, but it lands in 30/04/13 to 31/03/13 (April) of FY14 in Planning, and the FY13 data is deleted automatically.

    I use the Replace option when exporting the data.

    I assumed that FDM could not remove data already loaded into the Planning application.

    Can someone help me solve this problem?

    Thank you

    Answer: The data is in fact loaded to April of FY14.

    Have you reviewed your period control tables? Maybe you are mapping Apr-2013 to APR-FY14.

    Answer: Yes, April 2013 is mapped to APR-FY14. > That is the reason the data is loaded to FY14. You need to update your period control table.

    As for your FY13 data being deleted in Planning when you use Replace: when you load in Replace mode, FDM runs a clear script before loading the data, which erases all intersections for the category (scenario), period, and entity (all descendants of your target entity).

    Question: why is the FY13 data deleted in Planning if we chose the Replace option? > By default, the CLEAR in the load action script does not include the Year dimension; this is why the data is deleted for all years. You must include it in the script.

  • How do we get OLAP to skip unnecessary aggregations when loading data?

    Hello

    I am trying to create a relatively simple two-dimensional OLAP cube (you could call it an "OLAP square"). My current environment is 11.2 EE with AWM for workspace management.

    One dimension is date (year -> month -> day), the other is production unit, implemented as a hierarchy with individual machines at the lowest level. A fact is identified by a pair of low-level values of these dimensions; for example, a measure is taken once per day for each machine. I want to store these detailed facts in the cube as well as the aggregates, so they can easily be drilled into without querying the original fact table.

    The aggregation rules are left at "aggregate from level = default" (which is day and machine, respectively) for both of my dimensions; the cube is mapped to the fact table with dimension tables, the data loads, and everything works as expected.

    The problem is with the load itself: I noticed it is too slow for my amount of sample data. After some research into the issue, I found in the CUBE_BUILD_LOG table the query with which the data is actually loaded:

    <SQL>
    <![CDATA[
    SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      MAX(T7_TEMPERATURE) ALIAS_39,
      MAX(T7_TEMPERATURE) ALIAS_40,
      MAX(T7_METAL_HEIGHT) ALIAS_41
    FROM
      (SELECT /*+ no_rewrite */
         T1."DATE_TRUNC"    T7_DATE_TRUNC,
         T1."METAL_HEIGHT"  T7_METAL_HEIGHT,
         T1."TEMPERATURE"   T7_TEMPERATURE,
         T1."POT_GLOBAL_ID" T7_POT_GLOBAL_ID
       FROM POTS."POT_BATH" T1) T7,
      (SELECT /*+ no_rewrite */
         T1."ID_DIM" T4_ID_DIM,
         T1."ID_DAY" T4_ID_DAY
       FROM LAUGHED."DIM_DATES" T1) T4,
      (SELECT /*+ no_rewrite */
         T1."ID_DIM" T1_ID_DIM,
         T1."ID_POT" T1_ID_POT
       FROM LAUGHED."DIM_POTS" T1) T1
    WHERE
      ((T4_ID_DIM = T7_DATE_TRUNC)
       AND (T1_ID_DIM = T7_POT_GLOBAL_ID)
       AND (T7_DATE_TRUNC IN (/* a long list of dates for the currently processed cube partition, clipped */)))
    GROUP BY
      (T1_ID_POT, T4_ID_DAY)
    ORDER BY
      T1_ID_POT ASC NULLS LAST,
      T4_ID_DAY ASC NULLS LAST
    ]]>
    </SQL>


    Note T4_ID_DAY and T1_ID_POT at the top of the column list - these are the low-level identifiers of my dimensions, which means the query is not really aggregating anything here, because there is only one fact per (ID_DAY, ID_POT) pair.

    What I want is to somehow load the data without this (in my case totally useless) intermediate aggregation. Basically I want it to be something like:

    SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      T7_TEMPERATURE ALIAS_39,
      T7_TEMPERATURE ALIAS_40,
      T7_METAL_HEIGHT ALIAS_41
    etc...


    without the aggregations. In fact I could live even with this load query, since the amount of data isn't that great, but I want things to work the right way (more or less).

    Is there any way to do this?

    Thank you.

    I figured it out. There was a mistake in my mapping: I had specified a dim_table.dimension_id column in the source column field of my mapping section, rather than the fact_table.dimension_id column, so the build (presumably) tried to group by the dimension tables' keys. After fixing that, defining a primary key did the trick (a unique index alone, however, was not enough).
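    A sketch of that last step, assuming the fact table and key columns implied by the build query above (the constraint name and exact columns are not confirmed by the thread):

    -- declare the fact table's key so the build can rely on one row per (day, pot)
    ALTER TABLE pots.pot_bath
      ADD CONSTRAINT pot_bath_pk PRIMARY KEY (date_trunc, pot_global_id);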

  • Data load: date format for a timestamp column

    Hi all

    I have a table with a timestamp column named CREATED_DATE. I want to upload data to this table using the data load page, but there is a problem during the upload: I have a CSV file in which the CREATED_DATE column holds data in two different formats, as follows,

    03/09/2013-03:33

    02/09/2013-15:24

    The above data throws the error ORA-01821: date format not recognized.

    On the Data / Table Mapping page I tried MM/DD/YYYY HH12:MI:SS AM. What format should I use for am and pm?


    Please help me solve this...

    Thanks in advance

    Lacombe

    I solved it by using the format MM/DD/YYYY HH:MIAM.
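    A quick way to sanity-check a mask before retrying the wizard is to run it through TO_TIMESTAMP directly; a minimal sketch (the sample value is adapted from the CSV above, and the mask is the one reported working):

    SELECT TO_TIMESTAMP('03/09/2013 03:33AM', 'MM/DD/YYYY HH:MIAM') AS created_date
      FROM dual;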

    Thank you

    Lacombe

  • Member not found when loading data with SQL

    Hello everyone:

    I built a cube, extracting all the dimension information with SQL statements, and it works perfectly. Now I'm trying to load data the same way, but I can't.

    I created a view that returns data in this format:

    member of dimension 1 <tab> member of dimension 2 <tab> ... <tab> member of dimension 5 <tab> measure 1 <tab> measure 2 <tab> measure 3

    I designed a new rule, indicating the dimension for each column, and for each measure which specific member of the Accounts dimension must be used. I have checked it and everything looks OK, but when I try to load data it does not work and gives me this error:

    + "Data value [3.5] encountered before all selected dimensions, [1] records completed.
    + Unexpected Essbase error 1003007" +

    If I extract the member names with quotes in the SQL statement (because they contain spaces, which is perhaps the reason for the previous error, although the rule validates correctly), Essbase deletes these quotes on import. I must use another symbol instead, and in the rule change that other symbol back to quotes. Is this normal? I knew of this issue when importing formulas, but not here.

    Why don't I have this problem with quotes in 'Dimension build'?

    And when changing the quote symbols, this error occurs:

    + "Member x not found in the database" +. But I checked the member, and it does exist in the outline. What's wrong?


    Thanks in advance

    Regards

    Javier

    Published by: Javi M on 26-mar-2011 05:52

    Yes, SQL data loads (of all kinds) support more than one column of data. As you noted, you just point each column to the member it represents.

    That said, I bet that if you compare your view/table and load rule against your outline, you will find a dimension being mismapped, e.g. you think column 4 points to Scenario but it really points to Product, which was supposed to be column 1, or you missed a dimension. Happens to me all the time.

    Kind regards

    Cameron Lackpour

  • Should we drop ODI for data loading and move to Essbase load rules?

    Hello,
    We use ODI to load data into Planning 11.1.1.3's Essbase cubes.
    Should we move data loading from ODI to Essbase load rules?
    We see no advantage of ODI over a MaxL script calling Essbase load rules and calc scripts.
    One advantage it does have over Essbase is that it can refresh Planning. Can it load supporting detail and cell text?

    Say you have data from multiple sources (some data sets are file-based, some are stored in warehouses or ledgers) and you want to collect that data, transform it, apply mappings, load it into Essbase, and add automation around it: that is one example of using ODI to load data into Essbase, and there are many others.
    Yes, you can do it with MaxL, but you will eventually spend more effort adding the automation, mappings, error handling, etc. than you would if you had gone down the ODI path.
    I am not saying you must use it; if you don't see any benefit, there is no point.

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • Built-in photo editor (Photos) does not work. It says "Cannot load data. Check your connection."

    Can someone please help? When I tap a photo to edit it and choose the second editor (Photos), it loads a bit, but then says it cannot load data and to check my connection. But it used to work before, and I am connected to Wi-Fi!

    Try force-stopping the app and erasing its data. It might be a little tricky because the app does not appear under Settings - Applications - All. However, you should be able to open the app from the app menu, then press Home. After that, open the recent applications manager and long-press on Photos; you should get its App info screen.

  • Loading data from an external XML file

    Hello people!

    I have an XML file on my server, updated by cron every 10 minutes, and I want to load its data into my WebWorks application, but the jQuery AJAX function does not support cross-domain requests. So here's my question: how should I get this information? Can I somehow download this file first and then use AJAX on it?

    Welcome!

    In your config.xml file, add an authorization to access your server:

    <access uri="http://www.yourserver.com" subdomains="true" />

    This will get you past the cross-origin issues.

  • FDMEE data load rule shows RUNNING status

    We are NOT able to reset the status of an FDMEE data load task. It has shown RUNNING since yesterday; we tried restarting Foundation and FDMEE.

    I tried updating the AIF_PROCESSES and AIF_PROCESS_DETAILS tables to SUCCESS for the corresponding job and restarted FDMEE, but in vain. Surprisingly, the Process Monitor section shows it as successful, and on the back end in ODI all these jobs are listed as 'SUCCESS', but the FDMEE data load rule page still shows it as running.

    This blocks us from reloading the data, since the rule appears to still be running.

    Version: 11.1.2.3.700

    Application: HFM

    Take a look at the AIF_BALANCE_RULES table and its STATUS column.
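    A minimal sketch of that check, plus the manual reset people usually pair with it (the STATUS values and the direct update are assumptions from this thread, not a documented API; back up the repository and make sure no job is actually running first):

    -- inspect the stuck rule(s)
    SELECT rule_id, status
      FROM aif_balance_rules
     WHERE status = 'RUNNING';

    -- hypothetical manual reset, at your own risk
    UPDATE aif_balance_rules
       SET status = 'FAILED'
     WHERE status = 'RUNNING';
    COMMIT;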

    See you soon

    John
