FDMEE vs Planning data loads

I'd like your comments on loading data through FDMEE vs loading directly into Planning (Administration > Import and Export > Import Data from File).

What are the advantages and disadvantages of each?

I appreciate all input.

Hello

We could write many lines here.

For me, they can't really be compared: FDMEE is a product, while importing a file in Planning is just a Planning feature. Although they have one thing in common (they both load Planning data), you must take into account all the processing that happens before the data is loaded.

In FDMEE, you can build a set of metadata/data integration flows to ensure the quality of your data (FDQM): import from many sources, map data, validate, etc.

If you just want to load a flat file with no mappings, where all the data already corresponds to your Planning model, FDMEE is really unnecessary. You can simply import your file through Hyperion Planning.

The only drawback of FDMEE compared to Hyperion Planning is that FDMEE is designed for financial data, i.e. numbers. If you want to load non-financial data (textual data), you would have to customize it.

Hope that clarifies things.

Tags: Business Intelligence

Similar Questions

  • Schema name is not displayed in the data loading

    Hi all

    I'm trying to load a CSV file using the Oracle APEX data loading option, uploading a new (.csv) file into a new table. On the data load page, the schema list does not show my current schema, so I cannot upload the CSV file.
    Can someone please help with that?
    Can someone please help with that?


    I'm using Oracle APEX 4.1.1.

    Regards
    Rajendrakumar.P

    Raj,

    If it works on apex.oracle.com (4.2) and not in your instance (4.1.1), my suspicion is that this is a bug that has been fixed in APEX 4.2. Apart from upgrading to APEX 4.2, I'm not sure there is really a viable alternative.

    Thank you

    -Scott-

    http://spendolini.blogspot.com
    http://www.enkitec.com

  • Deleting data loaded into Planning

    We loaded data into Planning at an intersection using ODI. I need to remove that data and reload Planning at the same intersection. How can I delete a single load and reload the data? I don't want to clear the whole database and start the load over. I just need to clear one intersection and load again.


    Thank you for your help

    Create a calc script that clears that area of the database, and run it before you load the data again.
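
    For illustration, a minimal calc script along those lines might look like this (the member names here are placeholders; substitute whatever defines the intersection you loaded):

    /* Clear only the loaded intersection, not the whole database */
    FIX ("Actual", "FY15", "E1")
        CLEARBLOCK ALL;
    ENDFIX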

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • The data load ran in ODI, but some interfaces still show as running in the Operator tab

    Hi Experts,

    I'm working on a customization, and we run the incremental load every day. The data load completed successfully, but the status icons in the Operator tab show some interfaces as still running.

    Could you please explain why the interfaces still show as running in the Operator tab? Thanks in advance; your valuable suggestions are much appreciated.

    Kind regards

    REDA

    These are what we call stale sessions, and they can be removed by restarting the agent.

    You can also manually clean up expired sessions in the Operator.

  • Data source for Planning

    Hello

    Can we use one data source to build different Planning applications? What is the use of a data source, and what does it hold with respect to the Planning RDBMS?

    A data source defines the connection details; it holds the connection information for the relational database schema where the Planning application is stored. It also has the login information for Essbase.
    You need a separate data source for each Planning application, as the relational schema must be separate for each application.
    The RDBMS holds all the metadata for the Planning application, such as configuration, dimensions, structure, forms, task lists, etc.

    See you soon

    John
    http://John-Goodwin.blogspot.com/

  • path for the data loader

    Hello

    When I try to start the data loader as a user in the DEV instance,

    I get an error about a bad dbc file.

    I checked it,

    and it shows a bad path for my dbc file.

    Any idea where to change this path?

    OS: RHEL 5
    DB 11.2.0.1
    EBS 12.1.2

    Hello

    Is the value of the 's_fnd_secure' variable in the $CONTEXT_FILE the same as the value of the $FND_SECURE environment variable?

    Probably, but it can't hurt to check.

    Have you tried stopping the application tier services (and making sure there are no related processes still running at the OS level), and then restarting the web server?

    This test may be useful...

    1) http://server:port/OA_HTML/jsp/fnd/aoljtest.jsp

    2) Click "Enter AOL/J Setup Test"

    3) Run these two tests:
    a. Locate DBC file
    b. Verify DBC file settings

    Regards
    Frank

  • When I try to upload an Excel (.xls) file using the data loading page in Oracle APEX, I get a screen (as in the attached screenshot) and the load fails. Is there a solution? Thanks in advance.

    screenshot.jpg

    Saving the file in .csv format and then uploading it solved the problem. Thanks to you all.

  • Automating data loads from a single FDM application to different target applications

    Friends, my query is somewhat complex. I have 6 applications (two in Hyperion Planning, two in HPCM and two in Hyperion Essbase). I copied the adapter in the Workbench and renamed it accordingly to load data for each of them. Now the problem is that I want to automate the data load for each of these applications, but I don't know how it's done. I've been through many forums to get a better understanding, but no luck!

    A humble request to all the FDQM experts for their valuable advice on how to automate loads to all the targets from one FDM application.

    Thanks in advance!

    You would automate this process via the Batch Loader integrated with FDM. The process is exactly the same whether you have one or more target applications. The ultimate target application is determined by the location name embedded in the batch file naming convention. Each of your different target adapters will be associated with one or more locations in your FDM location metadata configuration.

  • Ignoring zeros and missing values in ASO data loads

    Hello

    There is an option to ignore zero values and missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the Technical Reference.

    I have 12 months in the columns of the data feed. At least a quarter of my data is zeros. Ignoring zeros keeps the cube small and fast.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is that it's hidden in the Alter Database (Aggregate Storage) command, where you create the data load buffer. If you are not sure what a data load buffer is, see "Loading Data Using Buffers".
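
    A hedged MaxL sketch of that approach (the application, database, and file names are placeholders):

    alter database ASOApp.ASODb initialize load_buffer with buffer_id 1
        property ignore_zero_values, ignore_missing_values;

    import database ASOApp.ASODb data
        from data_file '/data/monthly.txt'
        to load_buffer with buffer_id 1
        on error write to '/data/monthly.err';

    import database ASOApp.ASODb data from load_buffer with buffer_id 1;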

  • Generic procedure to load data from source tables to target tables

    Hi all

    I want to create a generic procedure to load data from X number of source tables to X number of target tables.

    such as:

    Source1-> Target1

    Source2-> Target2

    Source3 -> Target3

    Each target table has the same structure as the source table.

    The indexes are the same as well. Constraints are not defined on the source or target tables. There is no business logic involved in loading the data.

    It is simply an append.

    This procedure will be scheduled during off hours and probably only once in a month.

    I created a procedure that does this, along the following lines:

    1) Take the source and target table names as input to the procedure.

    2) Find the indexes on the target table.

    3) Get the DDL metadata of the target table's indexes and save it.

    4) Drop the above indexes.

    5) Load the data from source to target (append).

    6) Re-create the indexes on the target table using the collected metadata.

    7) Delete the records in the source table.

    Sample procedure (error logging is missing):

    CREATE OR REPLACE PROCEDURE pp_load_source_target (p_source_table IN VARCHAR2,
                                                       p_target_table IN VARCHAR2)
    IS
       TYPE v_varchar_tbl IS TABLE OF VARCHAR2 (32);
       l_varchar_tbl    v_varchar_tbl;

       TYPE v_clob_tbl_ind IS TABLE OF VARCHAR2 (32767) INDEX BY PLS_INTEGER;
       l_clob_tbl_ind   v_clob_tbl_ind;

       g_owner  CONSTANT VARCHAR2 (10) := 'STG';
       g_object CONSTANT VARCHAR2 (6)  := 'INDEX';
    BEGIN
       -- 2) find the indexes on the target table
       SELECT DISTINCT index_name
         BULK COLLECT INTO l_varchar_tbl
         FROM all_indexes
        WHERE table_name = p_target_table
          AND owner = g_owner;

       -- 3) save the DDL of each index
       FOR k IN l_varchar_tbl.FIRST .. l_varchar_tbl.LAST
       LOOP
          SELECT DBMS_METADATA.get_ddl (g_object, l_varchar_tbl (k), g_owner)
            INTO l_clob_tbl_ind (k)
            FROM DUAL;
       END LOOP;

       -- 4) drop the indexes
       FOR i IN l_varchar_tbl.FIRST .. l_varchar_tbl.LAST
       LOOP
          EXECUTE IMMEDIATE 'DROP INDEX ' || l_varchar_tbl (i);
          DBMS_OUTPUT.put_line ('INDEX DROPPED: ' || l_varchar_tbl (i));
       END LOOP;

       -- 5) direct-path append from source to target
       EXECUTE IMMEDIATE 'INSERT /*+ APPEND */ INTO ' || p_target_table
                      || ' SELECT * FROM ' || p_source_table;
       COMMIT;

       -- 6) re-create the indexes from the saved DDL
       FOR s IN l_clob_tbl_ind.FIRST .. l_clob_tbl_ind.LAST
       LOOP
          EXECUTE IMMEDIATE l_clob_tbl_ind (s);
       END LOOP;

       -- 7) clear the source table
       EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || p_source_table;
    END pp_load_source_target;

    I want to know:

    1) Has anyone implemented a similar solution? If yes, what kind of challenges did you face?

    2) Is this a good approach?

    3) How can I minimize data load failures?

    Why not just

    create table checkin as
    select 'SOURCE1' source, 'TARGET1' target, 'Y' flag from dual union all
    select 'SOURCE2', 'TARGET2', 'Y' from dual union all
    select 'SOURCE3', 'TARGET3', 'Y' from dual union all
    select 'SOURCE4', 'TARGET4', 'Y' from dual union all
    select 'SOURCE5', 'TARGET5', 'Y' from dual

    SOURCE   TARGET   FLAG
    SOURCE1  TARGET1  Y
    SOURCE2  TARGET2  Y
    SOURCE3  TARGET3  Y
    SOURCE4  TARGET4  Y
    SOURCE5  TARGET5  Y

    declare
      the_command varchar2 (1000);
    begin
      for r in (select source, target from checkin where flag = 'Y')
      loop
        the_command := 'insert /*+ append */ into ' || r.target || ' select * from ' || r.source;
        dbms_output.put_line (the_command);
        -- execute immediate the_command;
        the_command := 'truncate table ' || r.source || ' drop storage';
        dbms_output.put_line (the_command);
        -- execute immediate the_command;
        dbms_output.put_line (r.source || ' table processed');
      end loop;
    end;

    insert /*+ append */ into TARGET1 select * from SOURCE1
    truncate table SOURCE1 drop storage
    SOURCE1 table processed

    insert /*+ append */ into TARGET2 select * from SOURCE2
    truncate table SOURCE2 drop storage
    SOURCE2 table processed

    insert /*+ append */ into TARGET3 select * from SOURCE3
    truncate table SOURCE3 drop storage
    SOURCE3 table processed

    insert /*+ append */ into TARGET4 select * from SOURCE4
    truncate table SOURCE4 drop storage
    SOURCE4 table processed

    insert /*+ append */ into TARGET5 select * from SOURCE5
    truncate table SOURCE5 drop storage
    SOURCE5 table processed

    Regards

    Etbin

  • Data loader detects NUMBER instead of VARCHAR2

    Hello

    In my database, I have a table that stores information about vehicle components. I created a new data load process with the manufacturer and reference fields as the unique key.

    (For example: manufacturer 'BOSCH', reference '0238845')

    When the system runs the data mapping, it detects the reference column as a number and drops the leading zero character: '0238845' -> 238845.

    How can I solve this problem?

    Kind regards

    Have you tried a transformation in the data loader to cast it to a character type?
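
    Also note that if the target column itself was created as NUMBER, the leading zero is lost before any mapping runs, so the column needs to be character data. A hedged sketch (table and column names here are illustrative, not from the original post):

    -- Store references as VARCHAR2 so '0238845' keeps its leading zero;
    -- a NUMBER column would silently store it as 238845.
    CREATE TABLE vehicle_component (
      manufacturer VARCHAR2 (50),
      reference    VARCHAR2 (20)
    );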

    Thank you

    Tony Miller

    Software LuvMuffin

  • Cannot delete data load rules and source system in ERPI 11.1.1.3

    Hello

    I am trying to remove a source system in the ERPI screen in Workspace (one we no longer use in our UAT application), but it throws an error message: "Could not delete the Source System. There are target Applications associated with this Source System. Please remove all associated target Applications."

    I tried to delete the data load rules, but they are all invalid and the delete icon (trash) is disabled.

    Can someone tell me how to clean up these invalid instances?

    Version: 11.1.1.3

    Thank you

    Jehanne

    Figured it out: remove the application under "Target Application Registration" and your metadata and data load rules are deleted along with it.

  • Accessing data loaded by an XMLLoader class

    Hi guys,

    I have trouble accessing data loaded by an external class.

    Here is my code:

    Main class:

    package {

        import flash.display.MovieClip;
        import base.XMLLoader;

        public class Main extends MovieClip {

            var projectSetupMainMenuXML:Boolean = true;

            public function Main() {
                if (projectSetupMainMenuXML) {
                    var mainMenuXML = new XMLLoader("menu.xml");
                }
            }
        }
    }

    XMLLoader class:

    package base {

        import flash.display.*;
        import flash.events.*;
        import flash.net.*;

        public class XMLLoader {
            private var mainMenu:XML;
            private var urlLoader:URLLoader;

            public function XMLLoader(mainMenuPath:String) {
                var urlRequest:URLRequest = new URLRequest(mainMenuPath);
                urlLoader = new URLLoader();
                urlLoader.addEventListener(Event.COMPLETE, completeListener);
                urlLoader.load(urlRequest);
            }

            private function completeListener(e:Event):void {
                mainMenu = new XML(urlLoader.data);
                e.target.removeEventListener(Event.COMPLETE, completeListener);
            }
        }
    }

    Now, I want to create another external class (called MainMenu) which will be launched from the Main class.

    This class should create the menu based on the loaded XML.

    My question is: how can I access the XML content loaded by the XMLLoader class from within the MainMenu class?

    Thank you.

    I think you should make XMLLoader a singleton, with static properties and methods. That way, you load the XML only once and make the XML data available to any object in the application.
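
    A minimal sketch of that idea (the static load/get shape and the callback are one possible design, not the only one):

    package base {

        import flash.events.*;
        import flash.net.*;

        public class XMLLoader {

            // Parsed XML, shared by the whole application.
            private static var _data:XML;

            // Load once; call the supplied callback when the XML is parsed.
            public static function load(path:String, onReady:Function):void {
                var loader:URLLoader = new URLLoader();
                loader.addEventListener(Event.COMPLETE, function(e:Event):void {
                    _data = new XML(loader.data);
                    onReady(_data);
                });
                loader.load(new URLRequest(path));
            }

            // Any class (e.g. MainMenu) can read the parsed XML after loading.
            public static function get data():XML {
                return _data;
            }
        }
    }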

  • FDMEE planning

    Hello

    We have upgraded to FDMEE 11.1.2.3.0.26.

    Source: comma-delimited file

    Now, for drill-through to the source, I have a requirement to display the detailed data in the drill-through while loading only a summary into Hyperion Planning.

    Let's say I have:

    Account1, Account2, Entity, Data

    Ac1111, Ac1001, E1, 1000

    Ac1111, Ac1001, E2, 2000

    When I load the data, I take Account2 and Entity and load 3000 into Planning. When we drill through, we see Account2, Entity and Data, but not Account1.

    I tried to add Account1 as a lookup dimension, but it does not show in the drill-through page.

    What I am asking for is drill-through from each loaded value to the respective lines of data in the file.

    Please provide any idea to achieve this.

    Thank you

    Upgrade to the latest patch (.530); there were some problems with drill-through and attribute dimensions in the first release of FDMEE.

    Also check the options

  • Moving partition data to a new tablespace - table loaded directly by ETL

    Hello

    We have a large composite range/hash partitioned table (using 11.2.0.3 interval range partitioning).

    There is one partition for each month, and each such partition has 4 subpartitions.

    We want to move some older data to cheaper storage, but the table is also loaded directly by inserts every day as part of the data warehouse, and as it is a large table we want to ensure that the operations below do not cause errors in the data loading procedure.

    It also should not take too long to run: 40 million rows or more each month.

    select *
    from user_tab_partitions
    where table_name = 'CUSTOMER_TRANSACTION'
    and partition_name like 'SYS%';

    1) Alter table retailer_transaction rename partition <partition_name> to CUST_PART_<YYYYMM>

    2) Create a dedicated tablespace for the month of the affected partition

    3) Alter table retailer_transaction move subpartition <SYS...> tablespace <tablespace created in 2 above>

    Alter table retailer_transaction modify default attributes for partition CUST_PART_<YYYYMM> tablespace <tablespace created in 2 above>

    4) Rebuild afterwards any global indexes the move leaves unusable.

    There are also several local bitmap indexes belonging to the partitions.

    Thoughts?

    Thank you

    I agree that in a good data warehouse everything should follow ETL best practices and guidelines. I guess this is a FACT table, since it's big enough. A method I can share (followed in real-world big DWs) is "PEL", i.e. Partition Exchange Loading. You can find the basic architecture on the Internet.

    The basic architecture is:

    Source (in your case partitioned) table --> intermediate (unpartitioned) table --> eventual target (partitioned) table --> cleanup activities (over time).

    You can use any ETL tool for this (I won't name one, for other reasons).

    Later, you can drop the original table and use the new table as your FACT table, so your application stays available the whole time. There are drawbacks: it uses almost double the space, and beyond normal ETL this work takes a certain amount of resources away from other jobs, SLAs etc. In addition, if you load the current few days during these activities, you have to plan/schedule accordingly.

    Your approach is also fine, but the regular ETL may fail because you rename the partitions (some ETL tools use the partition name, in case you ever use PEL), and global indexes left unusable can also cause failures (do you have local ones too?).
