Importing data from HFM to FDM

Hi all

What would be the best approach to import data from HFM to FDM?

Use the HFM API to run an HFM EA extract so that FDM can import the data from a database table?

Thank you

Hello

It is difficult to comment on what a "best practice" is, as what your organization defines as best practice might not be valid somewhere else.

There are two options:
1. EA jobs
2. HFM data extracts

Each requires some custom scripts and mappings, depending on what you're trying to do.

If you choose #1, you will probably use an integration script to read from a specific database table; a sketch of such a script follows below.
If you choose #2, you will use the HFM API to generate a flat file, and then bring that flat file into FDM for processing.
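
For option #1, the integration script takes the place of FDM's normal file import: FDM calls a VBScript function and expects it to fill the open work table from your source. Below is a minimal sketch modeled on the standard FDM integration-script pattern, assuming the EA job has landed data in a staging table named EA_FACT with ACCOUNT, ENTITY and AMOUNT columns; the connection string and the staging table/column names are assumptions, and the work-table field list is abbreviated.

Function HFM_EA_Import(strLoc, lngCatKey, dblPerKey, strWorkTableName)
    'FDM integration import script (sketch): fills the work table from a staging table
    Dim cnSS        'ADO connection to the staging database
    Dim rs          'Source recordset
    Dim rsAppend    'FDM work table recordset
    Dim strSQL      'Source query

    'Connect to the staging database (connection string is an assumption)
    Set cnSS = CreateObject("ADODB.Connection")
    cnSS.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=Staging;Integrated Security=SSPI;"

    'Open the FDM work table for appending
    Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)

    'Read the rows the EA job wrote to the staging table (hypothetical table/columns)
    strSQL = "SELECT ACCOUNT, ENTITY, AMOUNT FROM EA_FACT"
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open strSQL, cnSS

    'Fail the import with a message if there is nothing to load
    If rs.BOF And rs.EOF Then
        RES.PlngActionType = 2
        RES.PstrActionValue = "No records returned from EA_FACT"
        HFM_EA_Import = False
        Exit Function
    End If

    'Copy each staging row into the work table
    Do While Not rs.EOF
        rsAppend.AddNew
        rsAppend.Fields("PartitionKey") = RES.PlngLocKey
        rsAppend.Fields("CatKey") = lngCatKey
        rsAppend.Fields("PeriodKey") = dblPerKey
        rsAppend.Fields("DataView") = "YTD"
        rsAppend.Fields("CalcAcctType") = 9
        rsAppend.Fields("Account") = rs.Fields("ACCOUNT").Value
        rsAppend.Fields("Entity") = rs.Fields("ENTITY").Value
        rsAppend.Fields("Amount") = rs.Fields("AMOUNT").Value
        rsAppend.Update
        rs.MoveNext
    Loop

    'Clean up and signal success
    rs.Close
    cnSS.Close
    HFM_EA_Import = True
End Function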

Thank you

Tags: Business Intelligence

Similar Questions

  • SQL Server 2008 R2 CPU and RAM requirement for HFM/FDM

    Summary of the problem

    ---------------------------------------------------

    Requirement of SQL Server 2008 R2 for HFM/FDM

    Description of the problem

    ---------------------------------------------------

    We are installing a new EPM 11.1.2.3 environment - HFM and FDM on one server that also hosts the Foundation and FR services. Our question is about the CPU and RAM needed on the SQL Server machine. We have 2 users with 4 GB of data, but we need good performance from SQL Server.

    Please provide me with a white paper link or just tell me what it takes.

    Thanks in advance.

    4 CPUs and 16 GB of RAM should be more than enough to support your needs.

  • Creation date versus import date

    When you import photos or videos into Photos from a folder, the application uses the import date rather than the original creation date. The result is that imports are all presented together under "Today." The photos and videos were taken on many different dates, so I would like them listed according to creation date rather than grouped under the import date. I went to 'View' and checked "date of creation". "Sort" does not work in Photos because it is always greyed out. Any help would be greatly appreciated!

    If you look in the Photos window, photos and videos are sorted by date with the oldest items at the top. This sort order cannot be changed.

    In the photos window, items are grouped by the date they were imported into the library, with the oldest at the top. The sort order cannot be changed there either.

    So, you can use a smart album to include all the photos in the library and then sort them in one of these ways.

    The smart album could be created with a criterion such as "Date is after [an early date]", which would include all the photos in the library. Now you can sort them as you like. Just make sure that the date is early enough to catch all the photos in the library.

    Moments in Photos are the new Events, i.e. groups of photos sorted by capture date.

    When an iPhoto library is first migrated to Photos, a folder titled iPhoto Events is created in the sidebar, and all migrated iPhoto Events (which are now Moments) are represented by an album in that folder. Use the Command + Option + S key combination to open the sidebar if it is not already open.

    NOTE: several users have reported that if the event albums are moved out of the iPhoto Events folder in the sidebar, they disappear. It is not widespread, but several users have reported this problem. Therefore, if you want to make sure you keep these event albums, do not move them out of the iPhoto Events folder.

    Is there a way to simulate Events in Photos?

    When new photos are imported into the Photos library, go to the Last Import smart album, select all the photos and use the File ➙ New Album menu option or the Command + N key combination. Call it what you want. It appears just above the iPhoto Events folder, where you can drag it into the iPhoto Events folder.

    When you click on the iPhoto Events folder, you will get a simulated iPhoto Events window.

    Albums and smart albums can be sorted by title, by date with the oldest first, or by date with the most recent first.

    Tell Apple what missing features you want restored or new features added in Photos via Apple's feedback page for Photos.

  • Windows camera import gives videos the import date, not the shooting date!

    When I import pictures using Windows camera import, it gives all my videos the date I imported them and not the date I took them. They come out fine on my camera, so if I copy and paste instead, the dates are correct. But I want to use Windows import, not just copy and paste! Help, please!

    If you import them that way, the file date shown is the import date.

    I use another program to copy pictures from the camera to the PC; it records both the import date and the device date.

  • From LR6.1 to LR6.3: folders created on import are named after the import date rather than the capture date!

    I started using LR with version 2... My import workflow was the same up to 6.1, and all my files have been organized the same way for more than 10 years.

    I skipped LR6.2 given the changes made to the import dialog, but upgraded to LR6.3 thinking the problem had been resolved.

    But now I have noticed that all my photos are renamed and placed in folders corresponding to the import date rather than the capture date... I rename my files from a custom template and have gone through that dialog, but cannot see how to get LR back to importing my photos using the capture date. This creates havoc on my hard drives and in LR. I use the capture date a lot to find my photos, since I often neglect to add keywords.

    I really need to be able to get the capture date back into the names of my photos and have them copied into the folder for the correct date.

    Thanks in advance for your help.

    BTW: something has changed in the way the forum appears. Search is not very powerful... it did not turn up anything, yet I know the import dialog was discussed at length when 6.2 was introduced. In addition, I cannot browse past the few most recent messages. I don't like the new interface.

    Well, I answered my own question. Yes, I was looking at the settings in the Destination panel... There is a line that shows what the name would change to... and it showed the import date as the new name... but actually it was only showing what the rename template would look like. Because I had an image selected, I thought it was showing the rename for that image. But it is not related to the image... The import worked as expected... My apologies for the noise.

  • Data Pump import error

    Hi all

    I get the errors below when trying to use Data Pump to import a table from a dump file (taken from a pre-production database) into a different database (prod_db) of the same version. I used expdp to generate the dump file.

    Getting these errors:
    ORA-39083
    ORA-00959
    ORA-39112

    Any suggestions or advice would be appreciated.

    Thank you


    Import: Release 10.2.0.1.0 - 64bit Production

    Copyright (c) 2003, 2005, Oracle. All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace "OXFORD_DATA_01" does not exist
    Failing sql is:
    CREATE TABLE "xxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_IDX1" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_PK" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33


    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Either create the tablespace beforehand, or use the REMAP_TABLESPACE clause for the import.
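
    For example (a sketch; the target tablespace name USERS is an assumption - use whichever tablespace should hold the table):

    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:USERS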

  • Data pump import only indexes

    Hello. I have a dump file containing 100 million records. I was loading it onto another server using the Data Pump import utility. It loads the table data in half an hour or less, but it takes 7 hours when importing the indexes. I finally had to kill the job, and it turned out only the B-tree indexes had been created successfully on the tables. Plan B is to create the rest now: I have the dump file, which holds all the data and metadata. Is it possible with the Data Pump utility to import only the missing indexes, or to import all the indexes alone, without the tables and everything else? Can I simply use the INCLUDE parameter with INDEX? Please suggest a solution.

    Oracle Studnet wrote:
    Right, but is there a way to solve the problem the way I want? Can I extract only the indexes from the dump file and import them rather than the table and other objects?

    impdp hr/hr directory=data_pump_dir dumpfile=hrdp.dmp include=INDEX content=metadata_only
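
    If you would rather review or edit the index DDL before running it yourself, the SQLFILE parameter writes the statements to a script instead of executing them (same hypothetical dump file as above):

    impdp hr/hr directory=data_pump_dir dumpfile=hrdp.dmp include=INDEX sqlfile=create_indexes.sql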

  • Sharing data between Planning and HFM

    Hi all

    How can we share data and metadata between HFM and Planning? Can you help me list the possible ways? Can EPMA be used to do this?

    Thank you.

    Check these as well:
    http://www.SlideShare.NET/Ranzal/HFM-integration-collaborate-2010
    http://www.Network54.com/Forum/58296/thread/1250272593/HFM+data+to+Essbase

    Cheers...!!!

  • Data pump import

    I cannot get the Data Pump import tool to work. I am working with Oracle 10.2.4.0 on Windows Server 2008. I am trying to import a large amount of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since there is so much data and it will take so long to import, I am trying to make sure I can make it work for one table or one schema before attempting the rest. So I am trying to import the TEST_USER.TABLE1 table using the following command:

    impdp.exe tables=test_user.table1 dumpfile=big_file.dmp logfile=big_file_import.log remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'

    I provide sys as sysdba to connect when it prompts me (not sure how to put this info in the original command). However, I get the following error:

    the "TEST_USER" user does not exist

    My understanding was that the Data Pump utility would create all the necessary schemas for me. Is that not the case? The target database is a fresh install, so the source schemas do not exist there.

    Even if I create the test_user schema by hand, I then get an error message indicating that the tablespace does not exist:

    ORA-00959: tablespace "TS_1" does not exist

    I don't want to have to create that by hand first either, but even doing so does not work: it then gives me something about the user having no privileges on the tablespace.

    Isn't the Data Pump utility supposed to do this sort of thing automatically? Do I really need to create all the schemas and tablespaces by hand? That will take a long time. Am I doing something wrong here?

    Thank you
    Dave

    tables=test_user.table1

    TABLES mode does NOT create database accounts.

    FULL mode creates the tablespaces and database accounts before importing the data.

    SCHEMAS mode creates the database accounts before importing data - but it expects the tablespaces to exist beforehand so that tables and indexes can be created in the correct tablespaces.
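
    For reference, the mode is selected by the parameter you pass (a sketch; credentials, file and schema names are placeholders):

    impdp system/password directory=DATA_PUMP_DIR dumpfile=big_file.dmp full=y
    impdp system/password directory=DATA_PUMP_DIR dumpfile=big_file.dmp schemas=test_user
    impdp system/password directory=DATA_PUMP_DIR dumpfile=big_file.dmp tables=test_user.table1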

    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • "Resume" a Data Pump import Execution failure

    First of all, here is the "failed" Data Pump import I want to "resume":

    $ impdp "'/ as sysdba'" dumpfile=mySchema%U.dmp directory=DATA_PUMP_DIR logfile=import.log schemas=MYSCHEMA parallel=2

    Import: Release 10.2.0.3.0 - 64bit Production on Tuesday, February 16, 2010 14:35:15

    Copyright (c) 2003, 2005, Oracle. All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Real Application Clusters, Partitioning, OLAP and Data Mining options
    Master table "SYS"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_SCHEMA_01": '/******** AS SYSDBA' dumpfile=mySchema%U.dmp directory=DATA_PUMP_DIR logfile=import.log schemas=MYSCHEMA parallel=2
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "MYSCHE"...
    ...
    ... 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [INDEX:"MYSCHEMA"."OBJECT_RELATION_I2"]
    SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "SYS"."SYS_IMPORT_SCHEMA_01" WHERE process_order BETWEEN :3 AND :4 AND processing_state <> :5 AND duplicate = 0 ORDER BY process_order
    ORA-06502: PL/SQL: numeric or value error
    ORA-06512: at "SYS.KUPW$WORKER", line 12280
    ORA-12801: error signaled in parallel query server P001, instance pace2.capitolindemnity.com:bondacc2 (2)
    ORA-30032: the suspended (resumable) statement has timed out
    ORA-01652: unable to extend temp segment by 128 in tablespace OBJECT_INDX
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.KUPW$WORKER", line 6272
    ----- PL/SQL Call Stack -----
    object handle    line number    object name
    0x1f9ac8d50      14916          package body SYS.KUPW$WORKER
    0x1f9ac8d50      6293           package body SYS.KUPW$WORKER
    0x1f9ac8d50      3511           package body SYS.KUPW$WORKER
    0x1f9ac8d50      6882           package body SYS.KUPW$WORKER
    0x1f9ac8d50      1259           package body SYS.KUPW$WORKER
    0x1f8431598      2              anonymous block
    Job "SYS"."SYS_IMPORT_SCHEMA_01" stopped due to fatal error at 23:05:45
    As you can see, this job stopped without processing statistics, constraints, PL/SQL, etc. What I want to do is run another impdp command but skip the objects that were imported successfully, as shown above. Is that possible (using the EXCLUDE parameter maybe?) with impdp? If so, what would the command be?

    Thank you

    John

    Why don't you just restart this job? It skips everything that has already been imported.

    impdp "'/ as sysdba'" attach=SYS.SYS_IMPORT_SCHEMA_01

    Import> start_job
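
    Once attached, the Import> prompt takes the standard Data Pump interactive commands, so you can check on the job before and after restarting it (shown here as a reference):

    Import> status
    Import> continue_client

    STATUS shows what the worker processes are doing; CONTINUE_CLIENT switches back to logging mode so the restarted job's output scrolls in your session.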

    Dean

  • HFM/FDM data loading

    I inherited the company's Hyperion environment; I am an Essbase user who is now working with HFM.

    A user wants load access to a specific entity in HFM. Do I:

    (a) give the user write access to the security group the entity is part of? or
    (b) simply go into FDM, create a location and link that to HFM?

    The entity the person wishes to load data to does not yet have a location in FDM (I added the new entity last week). (For reference: HFM/FDM 11.1.2.1, SQL db, Windows operating system)

    Help, please!

    RIM,

    (a) Yes. Even if they have FDM access to a location that corresponds to the HFM entity, they will not be able to actually export/load data to HFM without the right security access.
    (b) It depends a bit. If the new entity is loaded as part of an existing ledger/location, you just need to update the entity map for the existing location. If it is a totally new company, then you go the whole nine yards and create the FDM location, maps, HFM entity, etc.

  • How to do integration between two HFM applications with FDM

    Dear all,

    I need to create an FDQM application in order to transcode and transfer data from one HFM application to another HFM application. I am on System 11. So:

    1. I need a step-by-step guide to perform this task, because it is completely new to me.

    In addition, I would ask the following as well:

    2. Is it possible to extract only a part of the source HFM database? For example, I want to extract only the data for one Custom4 member, excluding every other member of that dimension;

    3. Is it possible to extract data from parent entities? How can I do this?

    I know that my questions are a bit generic, but again, I am new to this kind of implementation.

    Thank you very much.

    You can use the fDBExtract function of the HFM adapter to extract data from HFM to a flat file, and then import that file into FDM.

  • When data is loaded into FDM

    Hi all

    We have a requirement to have 2 FDM applications:

    1. CAPITAL

    2. INCOMESTATEMENT

    The data source we receive is a single file covering both the CAPITAL and INCOMESTATEMENT applications.

    1. When I load data into the CAPITAL application there will be some INCOMESTATEMENT data; those accounts should not block the validation and must be ignored when we click Validate.

    2. In the same way, when I load data into the INCOMESTATEMENT application there will be some CAPITAL data; those accounts should not block the validation and must be ignored when we click Validate.

    Is there any script so that the accounts that validate are loaded and the rest are ignored, without showing up as validation failures?

    Thank you

    Depending on your needs, you can map those elements to Ignore in FDM, or you can exclude them at import time, either directly in the import format or with a script in the import format.

    Regards

    JTF

  • HFM, FDM 11.1.2.2 to 9.3.1 rollback?

    Hi all

    Could someone please advise on rolling back (migrating) HFM and FDM from 11.1.2.2 to 9.3.1?

    > Is this possible? If so, kindly share the steps.

    There is no other possibility I can think of than the following:

    1. Have a backup of the 9.3.1 database.

    2. If you want to restore to 9.3.1 at some point, then just install the 9.3.1 software and connect it to that database.

    If you do not have the 9.3.1 database backup, or the 9.3.1 application artifacts such as the profile, metadata, rules, data and journals, then it is not at all possible to downgrade the 11.1.2.2 database to 9.3.1. Only Oracle could help with that.

  • Submitting data to HFM from a Smart View retrieve

    Hello everyone,

    It is time for next year's new 2016 budget, and we would like to use HFM to submit and consolidate the budget. I know how to use HFM to view data, but this is the first time I want to use HFM to send data directly from a Smart View retrieve.

    The idea is that each manager can submit the budget themselves, and also review it. I have already noticed the HsSetValue formula, but I don't really know how to use it, and I want to submit all the data at once.

    Please, has anyone done this kind of project before with Smart View?

    Thank you very much

    Hello. Yes, you can use the HsSetValue function to put data into HFM: write the function once with cell references, copy and paste it across the range, and then submit (a sketch follows below). Another option would be to build data forms within HFM, which can then be opened in Smart View and filled in. Less flexible, but faster for end users.
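
    A minimal sketch of the formula (the connection name and POV members here are placeholders for your own):

    =HsSetValue(B2,"MyHFMConnection","Scenario#Budget;Year#2016;Period#Jan;View#YTD;Entity#MyEntity;Value#<Entity Currency>;Account#Sales;ICP#[ICP None];Custom1#[None];Custom2#[None];Custom3#[None];Custom4#[None]")

    Put the amount in B2, adjust the POV string to your dimensions, copy the formula across the range, and then use Submit Data to push the values to HFM.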

    As the data is intended for building a budget, you can also build calcs to help: for example, have people enter headcount and then calculate an HR expense based on the number of people.

    Eric
