Loading data into a table in the order of a timestamp series

I need to load data into the target table in the order of a timestamp series. How can I do this?

ex:

2015-12-10 02:14:15

2015-12-10 03:14:15

2015-12-10 04:14:15

Once you follow the Direct-Path INSERT ordered by your "timestamp column" series described above, you can have ODI sort the rows (ORDER BY) this way, as sketched below:
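A minimal sketch of that pattern, assuming a hypothetical target table TARGET_TAB fed from SRC_TAB with a timestamp column named TS (the APPEND hint requests a direct-path insert, and the ORDER BY drives the order in which the rows are written):

    -- hypothetical table and column names; adapt to your own model
    INSERT /*+ APPEND */ INTO target_tab (id, ts, payload)
    SELECT id, ts, payload
    FROM   src_tab
    ORDER  BY ts;        -- rows are appended in timestamp-series order
    COMMIT;

In ODI this usually means customizing the IKM step that builds the final INSERT so it appends an ORDER BY on the timestamp column (or otherwise sorting the source dataset just before the insert).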

Tags: Business Intelligence

Similar Questions

  • loading dynamic data into the component

    Hello
    I'm having a problem with loading text into the component. Until the new item came along, I was loading the data of a component called SlideShowPro into a dynamic text field. My script looked like this (with t_txt being the dynamic text field):

    import net.slideshowpro.slideshowpro.*;

    function onImageData(event:SSPDataEvent):void {
        if (event.type == "imageData") {
            t_txt.htmlText = event.data.caption;
        }
    }

    my_ssp.addEventListener (SSPDataEvent.IMAGE_DATA, onImageData);


    I now want to load the data into a text layout component of the same name (t2_text). How would I change the script above to flow the data into the text layout component rather than the dynamic text field? Thanks.

    You might look at the ImportMarkup example, which shows how to convert text from markup. TLF (Labs) doesn't have HTML conversion. If it's plain text, look at the HelloWorld example.

  • Load xml data into the table

    Hi all

    I have an XML (emp.xml) with data below:

    <root>
      <row>
        <lastname>steve</lastname>
        <Age>30</Age>
      </row>
      <row>
        <lastname>Paul</lastname>
        <Age>26</Age>
      </row>
    </root>

    I would like to create a stored procedure to store the xml data into the EMP table.

    EMP
    LastName age
    Steve 30
    Paul 26

    I tried to look through all the related threads in this forum, but cannot find the right one. Any help is greatly appreciated. Thank you.

    With

    SQL> select * from xmltable('root/row' passing xmltype('<root>
           <row><lastname>steve</lastname><Age>30</Age></row>
           <row><lastname>Paul</lastname><Age>26</Age></row>
         </root>') columns lastname path 'lastname',
                           Age path 'Age')
      /
    LASTNAME   AGE
    ---------- ----------
    steve      30
    Paul       26   
    

    You can now just do an INSERT INTO emp ... SELECT over that XMLTable query.
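    A minimal sketch, assuming EMP has LASTNAME and AGE columns and that the XML is read from a file on the server through a directory object (the directory name XML_DIR and file name emp.xml are hypothetical):

    insert into emp (lastname, age)
    select x.lastname, x.age
    from   xmltable('root/row'
             passing xmltype( bfilename('XML_DIR','emp.xml'), nls_charset_id('AL32UTF8') )
             columns lastname varchar2(30) path 'lastname',
                     age      number       path 'Age'
           ) x;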

  • Load xml data into an Oracle table

    Hello

    I went through some threads in the forum, but nothing comes close to my requirement, so I am writing my request. I have an XML like this:
    <?xml version="1.0"?>
    <ACCOUNT_HEADER_ACK>
      <HEADER>
        <STATUS_CODE>100</STATUS_CODE>
        <STATUS_REMARKS>check</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>2</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>3</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic administration</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>4</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>5</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
      <HEADER>
        <STATUS_CODE>500</STATUS_CODE>
        <STATUS_REMARKS>process exception</STATUS_REMARKS>
      </HEADER>
      <DETAILS>
        <DETAIL>
          <SEGMENT_NUMBER>20</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic basic</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>30</SEGMENT_NUMBER>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>40</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic basic finance</REMARKS>
        </DETAIL>
        <DETAIL>
          <SEGMENT_NUMBER>50</SEGMENT_NUMBER>
          <REMARKS>rp polytechnic basic logistics</REMARKS>
        </DETAIL>
      </DETAILS>
    </ACCOUNT_HEADER_ACK>

    This is the XML structure, with a master (header) and child (detail) structure, and I want to insert the data into Oracle tables using SQL*Loader. I initially tried to create a single control file, but I did not know how to terminate the records in it, so I created two control files:

    load data
    infile 'acct.xml' "str '</DETAIL>'"
    truncate
    into table xxrp_acct_detail
    trailing nullcols
    (
      dummy filler terminated by "<DETAIL>",
      SEGMENT_NUMBER enclosed by "<SEGMENT_NUMBER>" and "</SEGMENT_NUMBER>",
      REMARKS enclosed by "<REMARKS>" and "</REMARKS>"
    )


    load data
    infile 'acct.xml' "str '</HEADER>'"
    truncate
    into table xxrp_acct_header
    fields terminated by "<HEADER>"
    trailing nullcols
    (
      dummy filler terminated by "<HEADER>",
      STATUS_CODE enclosed by "<STATUS_CODE>" and "</STATUS_CODE>",
      STATUS_REMARKS enclosed by "<STATUS_REMARKS>" and "</STATUS_REMARKS>"
    )

    I refer to the same XML file in both control files. With the first control file I was able to load the detail records, but with the second one, which I use for the header table, I am not able to load the remaining records. I get the error below.

    Record 2: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length
    Record 3: Rejected - Error on table XXRP_ACCT_HEADER, column DUMMY.
    Field in data file exceeds maximum length

    In fact, if it is possible to do this with a single control file, that would be very useful for me. I am also open to the external table option. Please help me in this regard.

    Thanks in advance.

    Regards,
    Nagendra

    Here are two possible solutions:

    (1) reading the headers and the details separately, using two XMLTables:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
     from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
     ) x1,
     xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
     ) x2
     ;
    
    END;
    

    The first XMLTable (alias x1) retrieves all the headers as separate rows. The generated HEADER_NO column keeps track of the position of each header within the document.
    Then we join to a second XMLTable (x2), passing HEADER_NO, so that we can access the corresponding DETAIL items.

    (2) reading with a single XMLTable, but a somewhat more complex XQuery:

    DECLARE
    
     acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    
    BEGIN
    
     insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
     select x.*
     from xmltable(
      'for $i in /ACCOUNT_HEADER_ACK/HEADER
       return
        for $j in $i/following-sibling::DETAILS[1]/DETAIL
        return element r {$i, $j}'
      passing acct_doc
      columns status_code    number        path 'HEADER/STATUS_CODE',
              status_remarks varchar2(100) path 'HEADER/STATUS_REMARKS',
              segment_number number        path 'DETAIL/SEGMENT_NUMBER',
              remarks        varchar2(100) path 'DETAIL/REMARKS'
     ) x
     ;
    
    END;
    

    Here, we use an XQuery expression to extract the information we need.
    Basically it's the same logic as above, but with two nested loops that access each header and then each DETAIL element immediately following it, in document order.

    Here is the link to the XMLTable and XQuery documentation in Oracle:
    http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28369/xdb_xquery.htm#CBAGCBGJ
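    Note that both PL/SQL blocks above read acct.xml through bfilename('TEST_DIR', ...), so a directory object must already exist and be readable by the executing schema. A minimal sketch (the path and grantee are hypothetical):

    create or replace directory TEST_DIR as '/path/to/xml/files';
    grant read on directory TEST_DIR to scott;   -- hypothetical schema that runs the PL/SQL blocks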

  • LOAD CSV DATA INTO A NEW TABLE

    I have a basic CSV file: 12 columns, 247 rows. I tried to import or paste it into a new table, and with each try not all records get loaded into the new table. Using the "Load Data" tool I tried the text data load, the spreadsheet data load, the .csv import, and copy/paste. I put the records in order of the table's PK ID, and I added dummy entries so that no fields are null (that is, I added the word "None" for empty fields). But nothing works; I get only about half of the existing records.

    Why?

    The maximum size for VARCHAR2 is 4000. If you have text that is longer, you must use a CLOB column.

    If you want to upload the entire file, you need a BLOB column, and you need to implement an upload script like this: http://www.dba-oracle.com/t_easy_html_db_file_browse_item.htm
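    As a minimal sketch of the CLOB advice (table and column names are hypothetical), the long text column is simply declared as CLOB instead of VARCHAR2:

    -- hypothetical staging table for the CSV import
    create table csv_import_stage (
      id        number primary key,
      short_col varchar2(4000),   -- fine up to 4000 bytes
      long_col  clob              -- use CLOB for values longer than 4000 bytes
    );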

  • How to load HFM data into Essbase

    Hello

    How can we bring HFM data into an Essbase cube without using EAL, since we have performance problems using EAL-DSS as a source for OBIEE reporting?

    With Extended Analytics, I heard we can only bring level 0 HFM data into Essbase, and we would need to write the currency conversion and ICP elimination calculations in Essbase to roll up to the parent members. Is this true?

    Also, how can we convert HFM security to Essbase security?

    Please suggest me on this.

    Thank you
    Vishal

    Security will be a bit tricky, as Essbase generally uses filters while HFM uses security classes. You can potentially use shared groups in Shared Services, but licensing issues can arise depending on provisioning. Your best bet is maybe to look at LCM artifact exports to handle it.

  • Load table data into Oracle Essbase

    Hi all

    I have a problem.

    An Oracle table has 50 million records, and Essbase loads data from it.

    ODI error message:

    Caused by: java.sql.BatchUpdateException: Java heap space

    at org.hsqldb.jdbc.JDBCPreparedStatement.executeBatch(Unknown Source)

    at oracle.odi.runtime.agent.execution.sql.BatchSQLCommand.execute(BatchSQLCommand.java:44)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)

    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)

    at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:87)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)

    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)

    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)

    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)

    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)

    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)

    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)

    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)

    at java.lang.Thread.run(Thread.java:662)

    I think the agent is loading so much data that it cannot fit in memory.

    How to fix?

    Please give me a solution.

    Thank you

    As Craig said, first move the staging area off the SUNOPSIS MEMORY ENGINE. It should only be considered when you are moving/transforming small amounts of data (which 50 million rows is not :-)). Why not set the staging area to your source Oracle DB? That way you remove an unnecessary data movement (the LKM) and don't rely on the in-memory engine.

  • Error loading data into Essbase from SQL Server

    Hello experts!

    I have another urgent and confusing issue. I am loading data from a SQL Server view into Essbase (the view is pivoted, with several data columns) using ODI 11.1.1.5, and I get the following error at the "3-Loading-SrcSet0-Load Data" step:

    ODI-1227: Task SrcSet0 (Loading) fails on the source connection MICROSOFT_SQL_SERVER.
    Caused by: java.sql.SQLException: [FMWGEN][SQLServer JDBC Driver][SQLServer] Incorrect syntax near the keyword 'View'.

    where 'View' is the name of the dimension (data) column in the SQL view.


    Please help me with advice! Thank you very much!

    Have you checked the generated SQL code? You can run the generated SQL directly in SQL Server Management Studio.
    Are you using a column called View while also selecting from a view? If so, that can be the issue, as sketched below.
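    For reference, a hedged sketch of why that collides: in T-SQL a column whose name matches a reserved keyword has to be bracket-quoted, so a generated statement that selects it unquoted fails. The view and column names below are hypothetical:

    -- assumes a view dbo.ESSBASE_DATA_V with a data column literally named "View"
    SELECT [View], ACCOUNT, PERIOD     -- [View] must be escaped
    FROM dbo.ESSBASE_DATA_V;
    -- SELECT View, ... would raise: Incorrect syntax near the keyword 'View'.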

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • help to load xml data into a loop

    Hello everyone, I'm writing a small application that shows the files for my new download area. It uses a liquid layout (thanks for the slideshows posted here) and loads the file information from an XML file.
    I have a movieclip loaded with addChild, and inside it I also load several panels with addChild using a loop. The thing is, I want that same loop, each time it adds a new panel, to also load the data from the XML file, but I can't do it: every time I get error 1009 saying the object is null. Here is part of the code, in case any of you can help me with this.
    Thanks in advance for the help, and here is a sample of the app: http://nesmothdesign.com/Media/home.swf

    Set up the XML:

    var imgLoader:Loader;
    var xml:XML;
    var xmlList:XMLList;
    var xmlLoader:URLLoader = new URLLoader();
    xmlLoader.load(new URLRequest("listado.xml"));
    xmlLoader.addEventListener(Event.COMPLETE, xmlLoaded);
    function xmlLoaded(event:Event):void
    {
        xml = XML(event.target.data);
        xmlList = xml.children();
        trace(xmlList.length());
    }

    Add the container for the panels:
    var miContenedor:contenedor;
    miContenedor = new contenedor();
    addChild(miContenedor);
    Tweener.addTween(miContenedor, {x:stage.stageWidth/2 - 465, time:1, transition:"easeIn"});
    miContenedor.y = body_mc.y + 10;
    Add the container's children:
    var miPartA:panelTipoA;
    var miPartB:panelTipoB;
    for (var a:int = 0; a <= 3; a++)
    {
        miPartA = new panelTipoA();
        miPartB = new panelTipoB();
        miContenedor.addChild(miPartA);
        miContenedor.addChild(miPartB);
        miPartA.y = a * miPartA.height + (a * 10);
        miPartB.y = a * miPartB.height + (a * 10);
        miPartB.x = miPartB.width + 15;
        imgLoader = new Loader();
        imgLoader.load(new URLRequest(xmlList[a].attribute("thumb")));
        miContenedor.miParteA.addChild(imgLoader);
    }

    Note: those 3 lines of code should add the respective XML file data to the panel.

    miContenedor.miParteA.addChild(imgLoader);

    is the problem. There is no miContenedor.miParteA. Use:

    miParteA.addChild(imgLoader);

  • Loading the data into Essbase is slow

    Loading data into Essbase is slow: about 10 seconds per record.

    The standard KM is being used.

    How can the data load into Essbase be optimized?

    Thank you

    --
    Gelo

    Just to let you know, the patch has been released:

    ORACLE DATA INTEGRATOR 10.1.3.5.2_02 ONE-OFF PATCH

    Patch ID - 8785893

    8589752: IKM SQL to Hyperion Essbase (Data) - bulk load instead of row-by-row processing mode when an error occurs during the load

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Cannot load the metadata of the EBS

    Hi guys!

    I'm using 11.1.2.3.500 and trying to pull metadata from EBS 12.

    Everything is well configured, (so far).

    I am trying to load 3 dimensions (Account, Entity and Product, in this order) from EBS, and I got a green OK on the Entity dim and a yellow triangle for the others (Account and Product).

    It seems that it loses the connection to Planning. But then why is it OK for the Entity dim?

    [Wed Jul 02 19:46:05 EST 2014] Performing cube refresh...

    [Wed Jul 02 19:46:05 EST 2014] An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    Thank you

    RMarques

    Full log:

    ACCOUNT

    2014-07-02 19:45:20, 399 [AIF] INFO: beginning of the process FDMEE, process ID: 30

    2014-07-02 19:45:20, 399 [AIF] INFO: recording of the FDMEE level: 4

    2014-07-02 19:45:20, 399 [AIF] INFO: FDMEE log file: /u01/app/oracle/product/epm/EPMSystem11R1/HPL/EPRI/outbox/logs/HPL_DEV_30.log

    2014-07-02 19:45:20, 400 INFO [AIF]: user: admin

    2014-07-02 19:45:20, 400 INFO [AIF]: place: EBS_ACT_DL_AUD (Partitionkey:2)

    2014-07-02 19:45:20, 400 INFO [AIF]: period name: NA (period key: null)

    2014-07-02 19:45:20, 400 INFO [AIF]: category name: NA (category key: null)

    2014-07-02 19:45:20, 400 INFO [AIF]: rule name: 1 (rule ID:1)

    2014-07-02 19:45:23, 681 [AIF] INFO: Jython Version: 2.5.1 (Release_2_5_1:6813, September 26 2009, 13:47:54)

    [JRockit (R) Oracle (Oracle Corporation)]

    2014-07-02 19:45:23, 682 INFO [AIF]: Java platform: java1.6.0_37

    2014-07-02 19:45:33, 283 [AIF] INFO: COMM Dimension pre-processing - Multi process Validation - START

    2014-07-02 19:45:33: 293 [AIF] INFO: COMM Dimension pre-processing - Multi process Validation - END

    2014-07-02 19:45:33, 378 [AIF] INFO: LKM EBS/FS extracted members Table VS - process value Sets - START

    2014-07-02 19:45:33, 396 [AIF] INFO: LKM EBS/FS extracted members Table VS - process value Sets - END

    2014-07-02 19:45:33, 681 INFO [AIF]: extract members Ind VS EBS - fill step members Dim - START

    2014-07-02 19:45:33, 745 INFO [AIF]: extract from BSE members Ind VS - fill step members Dim - END

    2014-07-02 19:45:34, INFO 079 [AIF]: extract members Ind VS EBS - fill step members Dim - START

    2014-07-02 19:45:34, INFO 186 [AIF]: extract from BSE members Ind VS - fill step members Dim - END

    2014-07-02 19:45:34, 470 INFO [AIF]: extract members Ind VS EBS - fill step members Dim - START

    2014-07-02 19:45:34, 532 INFO [AIF]: extract from BSE members Ind VS - fill step members Dim - END

    2014-07-02 19:45:36, 383 [AIF] INFO: EBS/FS load Concat Sun - load Concat Dim members - START

    2014-07-02 19:45:36, 386 [AIF] INFO: EBS/FS load Concat Dim members - load Concat Dim member - END

    2014-07-02 19:45:41, 815 [AIF] INFO: EBS hierarchy processing Ind VS - process Insert details Ind VS - START

    2014-07-02 19:45:41, 824 [AIF] INFO: EBS hierarchy VS. IND. - Insert into process details Ind VS - END of treatment

    2014-07-02 19:45:42, 270 INFO [AIF]: extract EBS hierarchies Ind VS - fill step hierarchy Dim - START

    2014-07-02 19:45:42, 331 [AIF] INFO: from hierarchies Ind VS EBS - fill step hierarchy Dim - END

    2014-07-02 19:45:42, 897 INFO [AIF]: extract EBS hierarchies Ind VS - fill step hierarchy Dim - START

    2014-07-02 19:45:43, 044 [AIF] INFO: from hierarchies Ind VS EBS - fill step hierarchy Dim - END

    2014-07-02 19:45:43, 531 INFO [AIF]: extract EBS hierarchies Ind VS - fill step hierarchy Dim - START

    2014-07-02 19:45:43, 657 [AIF] INFO: from hierarchies Ind VS EBS - fill step hierarchy Dim - END

    2014-07-02 19:45:43, 803 INFO [AIF]: extract EBS hierarchies VS Table - fill step hierarchy Dim - START

    2014-07-02 19:45:43, 805 INFO [AIF]: extract EBS hierarchies VS Table - fill step hierarchy Dim - END

    2014-07-02 19:45:43, 891 INFO [AIF]: load COMM hierarchies unique Sun - hierarchies unique load Dim - START

    2014-07-02 19:45:45, 883 [AIF] INFO: COMM load single Dim hierarchies - load hierarchies only Dim - END

    2014-07-02 19:45:45, 989 [AIF] INFO: COMM load Concat Dim hierarchies - load Concat Dim hierarchies - START

    2014-07-02 19:45:45, 993 [AIF] INFO: COMM load Concat Dim hierarchies - load Concat Dim hierarchies - END

    2014-07-02 19:45:48, INFO 177 [AIF]: Member of Dimension COMM - treatment of the PKI attribute - attributes START

    2014-07-02 19:45:48, INFO 179 [AIF]: Member of Dimension COMM - treatment of the PKI attribute - attributes END

    2014-07-02 19:45:56, 815 INFO [AIF]: metadata HPL - Application support: HPL_DEV

    2014-07-02 19:45:56, 818 [AIF] INFO: number of dimensions to load in HPL_DEV: 3

    2014-07-02 19:45:56, 818 INFO [AIF]: query SQL building for the dimension account...

    2014-07-02 19:45:56, INFO 819 [AIF]: number of tables in HPL_DEV aliases: 1

    2014-07-02 19:45:56, 825 INFO [AIF]: loading of the dimension members: account

    2014-07-02 19:45:56, 825 INFO [AIF]: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Account.log

    2014-07-02 19:46, 571 INFO [AIF]: property file arguments:-C /RIC: * /D:Account /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    No argument came from the command line. Subject (merged) command line:

    -C /RIC: * /D:Account /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    [Wed Jul 02 19:45:56 EST 2014] The input specified with the /RIQ switch did not match a key in the "null" command properties file, so it is being run directly as a SQL query:

    SELECT m.NAME "Account"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
         , m.MemberValidForPlan1 "Plan Type (FinPlan)"
         , m.MemberValidForPlan2 "Plan Type (RevPlan)"
         , m.MemberValidForPlan3 "Plan Type (CapProj)"
         , m.AccountType "Account Type"
         , m.TimeBalance "Time Balance"
         , m.VarianceReporting "Variance Reporting"
         , m.SourcePlanType "Source Plan Type"
    FROM (
      SELECT mem.NAME
           , CASE hier.PARENT WHEN '#root' THEN NULL ELSE hier.PARENT END PARENT
           , hier.CHILD_DEPTH_NUM
           , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
           , mem.TimeBalance
           , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
           , hier.MemberValidForPlan1
           , hier.MemberValidForPlan2
           , hier.MemberValidForPlan3
           , hier.MemberValidForCapex
           , hier.MemberValidForWorkforce
           , hier.SourcePlanType
           , CASE hier.DataStorage
               WHEN 'NeverShare' THEN 'Never Share'
               WHEN 'StoreData'  THEN 'Store'
               WHEN 'ShareData'  THEN 'Shared'
               ELSE hier.DataStorage
             END DataStorage
           , hier.HIERARCHY_ID
           , hd.BASE_HIERARCHY_FLAG
           , (SELECT prop.VALUE
                FROM AIF_HS_DIM_PROPERTYARRAY prop
               WHERE prop.LOADID    = mem.LOADID
                 AND prop.DIMENSION = mem.DIMENSION
                 AND prop.PROPERTY  = 'Alias'
                 AND prop.NAME      = mem.NAME
                 AND prop.KEY       = 'Default') VALUE1
        FROM AIF_HS_DIM_MEMBER mem
             INNER JOIN AIF_HS_DIM_HIERARCHY hier
                ON hier.LOADID    = mem.LOADID
               AND hier.DIMENSION = mem.DIMENSION
               AND hier.CHILD     = mem.NAME
             LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
                ON hd.HIERARCHY_ID = hier.HIERARCHY_ID
       WHERE mem.LOADID    = 30
         AND mem.DIMENSION = 'ACCOUNT1'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
           , m.HIERARCHY_ID
           , m.CHILD_DEPTH_NUM
           , m.PARENT
           , m.NAME

    [Wed Jul 02 19:45:56 IS 2014] Trying to establish the connection of rdb entry.

    Source B.I. "EPM_REPO" on jdbc:oracle:thin:@server:1521/instace logged in successfully.

    [Wed Jul 02 19:45:56 IS 2014] Sign at the entrance to RDB successfully completed.

    [Wed Jul 02 19:45:56 EST 2014] Record header fields: Account, Parent, Data Storage, Alias: Default, Plan Type (FinPlan), Plan Type (RevPlan), Plan Type (CapProj), Account Type, Time Balance, Variance Reporting, Source Plan Type

    [Wed Jul 02 19:45:56 EST 2014] Located and using 'Account' dimension for loading the data into application 'HPL_DEV'.

    [Wed Jul 02 19:45:56 IS 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 886045942

    [Wed Jul 02 19:45:59 EST 2014] com.hyperion.planning.HspRuntimeException: Exchange Rate must be None if the Data Type is Non-Currency, Percentage, Smart List, Date or Text.  Member: ATRDAY

    [Wed Jul 02 19:45:59 EST 2014] com.hyperion.planning.HspRuntimeException: Exchange Rate must be None if the Data Type is Non-Currency, Percentage, Smart List, Date or Text.  Member: ATRDAY

    [Wed Jul 02 19:46:00 IS 2014] com.hyperion.planning.HspRuntimeException: an alias with the name of Trading member account already exists.

    [Wed Jul 02 19:46:00 IS 2014] com.hyperion.planning.HspRuntimeException: an alias with the name of Trading member account already exists.

    [Wed Jul 02 19:46:00 IS 2014] Loading dimension 'Account' has been successfully opened.

    [Wed Jul 02 19:46:00 IS 2014] A refresh of the cube operation will not be run.

    [Wed Jul 02 19:46:00 IS 2014] Create filters for safe operation will not be performed.

    Planning Outline data store load process finished. 1099 data records were read, 1099 data records were processed, 1085 were accepted for loading (verify actual load with Essbase log files), 14 were rejected.

    [Wed Jul 02 19:46:00 EST 2014] Planning Outline data store load process finished. 1099 data records were read, 1099 data records were processed, 1085 were accepted for loading (verify actual load with Essbase log files), 14 were rejected.

    2014-07-02 19:46:00,571 INFO [AIF]: Account dimension load completed - records read: 1099, records rejected: 14, records processed: 1099.

    ENTITY

    2014-07-02 19:46, 574 [AIF] INFO: building for the dimension entity SQL query...

    2014-07-02 19:46, 579 [AIF] INFO: loading of the dimension members: entity

    2014-07-02 19:46, 579 [AIF] INFO: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Entity.log

    2014-07-02 19:46:04, 247 INFO [AIF]: property file arguments:-C /RIC: * /D:Entity /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    No argument came from the command line. Subject (merged) command line:

    -C /RIC: * /D:Entity /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    [Wed Jul 02 19:46:00 EST 2014] The input specified with the /RIQ switch did not match a key in the "null" command properties file, so it is being run directly as a SQL query:

    SELECT m.NAME "Entity"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
         , m.MemberValidForPlan1 "Plan Type (FinPlan)"
         , m.MemberValidForPlan2 "Plan Type (RevPlan)"
         , m.MemberValidForPlan3 "Plan Type (CapProj)"
    FROM (
      SELECT mem.NAME
           , CASE hier.PARENT WHEN '#root' THEN NULL ELSE hier.PARENT END PARENT
           , hier.CHILD_DEPTH_NUM
           , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
           , mem.TimeBalance
           , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
           , hier.MemberValidForPlan1
           , hier.MemberValidForPlan2
           , hier.MemberValidForPlan3
           , hier.MemberValidForCapex
           , hier.MemberValidForWorkforce
           , hier.SourcePlanType
           , CASE hier.DataStorage
               WHEN 'NeverShare' THEN 'Never Share'
               WHEN 'StoreData'  THEN 'Store'
               WHEN 'ShareData'  THEN 'Shared'
               ELSE hier.DataStorage
             END DataStorage
           , hier.HIERARCHY_ID
           , hd.BASE_HIERARCHY_FLAG
           , (SELECT prop.VALUE
                FROM AIF_HS_DIM_PROPERTYARRAY prop
               WHERE prop.LOADID    = mem.LOADID
                 AND prop.DIMENSION = mem.DIMENSION
                 AND prop.PROPERTY  = 'Alias'
                 AND prop.NAME      = mem.NAME
                 AND prop.KEY       = 'Default') VALUE1
        FROM AIF_HS_DIM_MEMBER mem
             INNER JOIN AIF_HS_DIM_HIERARCHY hier
                ON hier.LOADID    = mem.LOADID
               AND hier.DIMENSION = mem.DIMENSION
               AND hier.CHILD     = mem.NAME
             LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
                ON hd.HIERARCHY_ID = hier.HIERARCHY_ID
       WHERE mem.LOADID    = 30
         AND mem.DIMENSION = 'ENTITY1'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
           , m.HIERARCHY_ID
           , m.CHILD_DEPTH_NUM
           , m.PARENT
           , m.NAME

    [Wed Jul 02 19:46:00 IS 2014] Trying to establish the connection of rdb entry.

    Source B.I. "EPM_REPO" on jdbc:oracle:thin:@server:1521/instace logged in successfully.

    [Wed Jul 02 19:46:00 IS 2014] Sign at the entrance to RDB successfully completed.

    [Wed Jul 02 19:46:00 EST 2014] Record header fields: Entity, Parent, Data Storage, Alias: Default, Plan Type (FinPlan), Plan Type (RevPlan), Plan Type (CapProj)

    [Wed Jul 02 19:46:00 EST 2014] Located and using 'Entity' dimension for loading the data into application 'HPL_DEV'.

    [Wed Jul 02 19:46:00 IS 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 872481248

    [Wed Jul 02 19:46:04 IS 2014] Loading dimension 'Entity' has been successfully opened.

    [Wed Jul 02 19:46:04 IS 2014] A refresh of the cube operation will not be run.

    [Wed Jul 02 19:46:04 IS 2014] Create filters for safe operation will not be performed.

    Planning Outline data store load process finished. 881 data records were read, 881 data records were processed, 881 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

    [Wed Jul 02 19:46:04 EST 2014] Planning Outline data store load process finished. 881 data records were read, 881 data records were processed, 881 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

    2014-07-02 19:46:04,247 INFO [AIF]: Entity dimension load completed - records read: 881, records rejected: 0, records processed: 881.

    PRODUCT

    2014-07-02 19:46:04, 249 [AIF] INFO: building SQL query for dimension product...

    2014-07-02 19:46:04, 253 [AIF] INFO: loading of the dimension members: product

    2014-07-02 19:46:04, 253 [AIF] INFO: [HPLService] Info: OLU log file name is: /u01/app/oracle/product/epm/user_projects/epmdevexa1/tmp/aif_30_Product.log

    2014-07-02 19:46:05, 556 INFO [AIF]: property file arguments: /C /RIC: * /D:Product /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    No argument came from the command line. Subject (merged) command line:

    /C /RIC: * /D:Product /RIU: * /RIR: * / RID: * / RIP: * /RIQ: * /IR

    [Wed Jul 02 19:46:04 EST 2014] The input specified with the /RIQ switch did not match a key in the "null" command properties file, so it is being run directly as a SQL query:

    SELECT m.NAME "Product"
         , m.PARENT "Parent"
         , m.DataStorage "Data Storage"
         , m.VALUE1 "Alias: Default"
    FROM (
      SELECT mem.NAME
           , CASE hier.PARENT WHEN '#root' THEN NULL ELSE hier.PARENT END PARENT
           , hier.CHILD_DEPTH_NUM
           , CASE mem.AccountType WHEN 'SavedAssumption' THEN 'Saved Assumption' ELSE mem.AccountType END AccountType
           , mem.TimeBalance
           , CASE mem.VarianceReporting WHEN 'NonExpense' THEN 'Non-Expense' ELSE mem.VarianceReporting END VarianceReporting
           , hier.MemberValidForPlan1
           , hier.MemberValidForPlan2
           , hier.MemberValidForPlan3
           , hier.MemberValidForCapex
           , hier.MemberValidForWorkforce
           , hier.SourcePlanType
           , CASE hier.DataStorage
               WHEN 'NeverShare' THEN 'Never Share'
               WHEN 'StoreData'  THEN 'Store'
               WHEN 'ShareData'  THEN 'Shared'
               ELSE hier.DataStorage
             END DataStorage
           , hier.HIERARCHY_ID
           , hd.BASE_HIERARCHY_FLAG
           , (SELECT prop.VALUE
                FROM AIF_HS_DIM_PROPERTYARRAY prop
               WHERE prop.LOADID    = mem.LOADID
                 AND prop.DIMENSION = mem.DIMENSION
                 AND prop.PROPERTY  = 'Alias'
                 AND prop.NAME      = mem.NAME
                 AND prop.KEY       = 'Default') VALUE1
        FROM AIF_HS_DIM_MEMBER mem
             INNER JOIN AIF_HS_DIM_HIERARCHY hier
                ON hier.LOADID    = mem.LOADID
               AND hier.DIMENSION = mem.DIMENSION
               AND hier.CHILD     = mem.NAME
             LEFT OUTER JOIN AIF_MAP_HIERARCHIES hd
                ON hd.HIERARCHY_ID = hier.HIERARCHY_ID
       WHERE mem.LOADID    = 30
         AND mem.DIMENSION = 'DIM1'
    ) m
    ORDER BY m.BASE_HIERARCHY_FLAG DESC
           , m.HIERARCHY_ID
           , m.CHILD_DEPTH_NUM
           , m.PARENT
           , m.NAME

    [Wed Jul 02 19:46:04 IS 2014] Trying to establish the connection of rdb entry.

    Source B.I. "EPM_REPO" on jdbc:oracle:thin:@server:1521/instace logged in successfully.

    [Wed Jul 02 19:46:04 IS 2014] Sign at the entrance to RDB successfully completed.

    [Wed Jul 02 19:46:04 EST 2014] Record header fields: Product, Parent, Data Storage, Alias: Default

    [Wed Jul 02 19:46:04 EST 2014] Located and using 'Product' dimension for loading the data into application 'HPL_DEV'.

    [Wed Jul 02 19:46:04 IS 2014] HspOutlineLoad::dateFormatSpecified is set to false, SessionHalDateFormat stored on session: null, sessionId: 677666057

    [Wed Jul 02 19:46:04 EST 2014] java.lang.RuntimeException: com.hyperion.planning.DuplicateObjectException: An object with the name WULC1 already exists.

    [Wed Jul 02 19:46:04 EST 2014] java.lang.RuntimeException: com.hyperion.planning.DuplicateObjectException: An object with the name WULC1 already exists.

    [Wed Jul 02 19:46:05 EST 2014] com.hyperion.planning.InvalidDimensionMemberNameException: Dimension member name 'ORDER' is a Report Script command.

    [Wed Jul 02 19:46:05 EST 2014] com.hyperion.planning.InvalidDimensionMemberNameException: Dimension member name 'ORDER' is a Report Script command.

    [Wed Jul 02 19:46:05 IS 2014] Dimension 'Product' load has been successfully opened.

    [Wed Jul 02 19:46:05 EST 2014] Performing cube refresh...

    [Wed Jul 02 19:46:05 EST 2014] An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 EST 2014] An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 EST 2014] Unable to obtain dimension information and/or perform a data load: An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 EST 2014] Unable to obtain dimension information and/or perform a data load: An error occurred during the cube refresh operation: com.hyperion.planning.HspRuntimeException: Unable to connect to Hyperion Essbase.

    Check that Hyperion Essbase is running and check your network connections.

    [Wed Jul 02 19:46:05 IS 2014] Trace of information: com.hyperion.planning.utils.HspOutlineLoad::parseAndLoadInputFile:1912, com.hyperion.planning.utils.HspOutlineLoad::halAdapterInfoAndLoad:304, com.hyperion.planning.utils.HspOutlineLoad::loadAndPrintStatus:4667, com.hyperion.planning.utils.HspOutlineLoad::outlineLoadAsyncImpl:3752, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3692, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3810, com.hyperion.aif.webservices.HPLService::loadMetadata:500, sun.reflect.NativeMethodAccessorImpl::invoke0:-2, sun.reflect.NativeMethodAccessorImpl::invoke:39, sun.reflect.DelegatingMethodAccessorImpl::invoke:25, java.lang.reflect.Method::invoke:597, com.hyperion.aif.servlet.ODIServlet::doPost:97, javax.servlet.http.HttpServlet::service:727, javax.servlet.http.HttpServlet::service:820,. weblogic.servlet.internal.StubSecurityHelper$ ServletServiceAction::run:227, weblogic.servlet.internal.StubSecurityHelper::invokeServlet:125, weblogic.servlet.internal.ServletStubImpl::execute:301, weblogic.servlet.internal.TailFilter::doFilter:26, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.security.jps.ee.http.JpsAbsFilter$ 1::run:119, oracle.security.jps.util.JpsSubject::doAsPrivileged:324, oracle.security.jps.ee.util.JpsPlatformUtil::runJaasMode:460, oracle.security.jps.ee.http.JpsAbsFilter::runJaasMode:103, oracle.security.jps.ee.http.JpsAbsFilter::doFilter:171, oracle.security.jps.ee.http.JpsFilter::doFilter:71, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.dms.servlet.DMSServletFilter::doFilter:163, weblogic.servlet.internal.FilterChainImpl::doFilter:56,. weblogic.servlet.internal.RequestEventsFilter::doFilter:27, weblogic.servlet.internal.FilterChainImpl::doFilter:56, weblogic.servlet.internal.WebAppServletContext$ ServletInvocationAction::wrapRun:3730, weblogic.servlet.internal.WebAppServletContext$ ServletInvocationAction::run:3696, weblogic.security.acl.internal.AuthenticatedSubject::doAs:321, weblogic.security.service.SecurityManager::runAs:120, weblogic.servlet.internal.WebAppServletContext::securedExecute:2273, weblogic.servlet.internal.WebAppServletContext::execute:2179, weblogic.servlet.internal.ServletRequestImpl::run:1490, weblogic.work.ExecuteThread::execute:256, weblogic.work.ExecuteThread::run:221

    [Wed Jul 02 19:46:05 IS 2014] Trace of information: com.hyperion.planning.utils.HspOutlineLoad::parseAndLoadInputFile:1912, com.hyperion.planning.utils.HspOutlineLoad::halAdapterInfoAndLoad:304, com.hyperion.planning.utils.HspOutlineLoad::loadAndPrintStatus:4667, com.hyperion.planning.utils.HspOutlineLoad::outlineLoadAsyncImpl:3752, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3692, com.hyperion.planning.utils.HspOutlineLoad::outlineLoad:3810, com.hyperion.aif.webservices.HPLService::loadMetadata:500, sun.reflect.NativeMethodAccessorImpl::invoke0:-2, sun.reflect.NativeMethodAccessorImpl::invoke:39, sun.reflect.DelegatingMethodAccessorImpl::invoke:25, java.lang.reflect.Method::invoke:597, com.hyperion.aif.servlet.ODIServlet::doPost:97, javax.servlet.http.HttpServlet::service:727, javax.servlet.http.HttpServlet::service:820,. weblogic.servlet.internal.StubSecurityHelper$ ServletServiceAction::run:227, weblogic.servlet.internal.StubSecurityHelper::invokeServlet:125, weblogic.servlet.internal.ServletStubImpl::execute:301, weblogic.servlet.internal.TailFilter::doFilter:26, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.security.jps.ee.http.JpsAbsFilter$ 1::run:119, oracle.security.jps.util.JpsSubject::doAsPrivileged:324, oracle.security.jps.ee.util.JpsPlatformUtil::runJaasMode:460, oracle.security.jps.ee.http.JpsAbsFilter::runJaasMode:103, oracle.security.jps.ee.http.JpsAbsFilter::doFilter:171, oracle.security.jps.ee.http.JpsFilter::doFilter:71, weblogic.servlet.internal.FilterChainImpl::doFilter:56, oracle.dms.servlet.DMSServletFilter::doFilter:163, weblogic.servlet.internal.FilterChainImpl::doFilter:56,. weblogic.servlet.internal.RequestEventsFilter::doFilter:27, weblogic.servlet.internal.FilterChainImpl::doFilter:56, weblogic.servlet.internal.WebAppServletContext$ ServletInvocationAction::wrapRun:3730, weblogic.servlet.internal.WebAppServletContext$ ServletInvocationAction::run:3696, weblogic.security.acl.internal.AuthenticatedSubject::doAs:321, weblogic.security.service.SecurityManager::runAs:120, weblogic.servlet.internal.WebAppServletContext::securedExecute:2273, weblogic.servlet.internal.WebAppServletContext::execute:2179, weblogic.servlet.internal.ServletRequestImpl::run:1490, weblogic.work.ExecuteThread::execute:256, weblogic.work.ExecuteThread::run:221

    Planning Outline data store load process finished. 707 data records were read, 707 data records were processed, 674 were accepted for loading (verify actual load with Essbase log files), 33 were rejected.

    [Wed Jul 02 19:46:05 EST 2014] Planning Outline data store load process finished. 707 data records were read, 707 data records were processed, 674 were accepted for loading (verify actual load with Essbase log files), 33 were rejected.

    2014-07-02 19:46:05,556 INFO [AIF]: Product dimension load completed - records read: 707, records rejected: 33, records processed: 707.

    2014-07-02 19:46:05,558 INFO [AIF]: HPL metadata load completed with return status: true

    2014-07-02 19:46:05,653 INFO [AIF]: End of FDMEE process, process ID: 30

    Hi all,

    It works now.

    The Oracle guy suggested changing some variables in setCustomParamErpIntegrator.sh, and it worked.

    As I got some time, I tried to find the exact change, and it turned out to be LD_LIBRARY_PATH, which must include the Planning LD_LIBRARY_PATH.

    Francisco Amores

    Cheers,
    Safiya

  • load data into hyperion application

    Hello
    I have data in an Essbase cube and want to load this data into my Hyperion Planning application.
    What tools should I use, and are they free?

    1. If you have data in your Essbase cube, and assuming that cube was created from the Planning application (create/refresh database), then you should be able to see the data in your Planning data forms without any sort of data copy.

    2. If the data is in a completely separate Essbase cube, then you can export the data from that cube using DATAEXPORT (http://docs.oracle.com/cd/E12825_01/epm.111/esb_techref/dataexport.htm) and import the result into the Essbase cube that was created from the Planning application.

    Doubts?

  • Loading data into Web Forms

    Quick question: can we load data into Planning directly using web forms? I believe we can. Apart from creating a web form and starting to key in numbers, is there anything else to do?

    Web forms are for data input, so the answer is yes - you really are entering data into Essbase through Planning.
    Just design your web form and you are ready to enter data.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • QNetworkReply problem loading JSON data

    Hello

    I am a beginner with C++ and Qt, but so far I'm starting to love the Cascades NDK!

    I'm trying to load a JSON data file that is fetched via an HTTP request. Everything goes through, but my JSON data simply would not load into the QVariantList. After a few hours of poking around, I finally noticed that the JSON data returned by the HTTP request is missing two brackets [] (one at the beginning and one at the end).

    When I load the JSON data from a file with the two brackets included, the QVariantList loads properly and I can step through the records...

    Now my question is... how can I add those brackets [] in C++? See the code example below:

    void MyJSONReadClass::httpFinished()
    {
      JsonDataAccess jda;
      QVariantList myDataList;
    
      if (mReply->error() == QNetworkReply::NoError)
      {
        // Load the data using the reply QIODevice.
        qDebug() << mReply;
        myDataList = jda.load(mReply).value<QVariantList>();
      }
      else
      {
        // Handle error
      }
    
      if (jda.hasError())
      {
        bb::data::DataAccessError error = jda.error();
        qDebug() << "JSON loading error: " << error.errorType() << ": "
            << error.errorMessage();
        return;
      }
    
      loadData(myDataList);
    
      // The reply is not needed now so we call deleteLater() function since we are in a slot.
      mReply->deleteLater();
    }
    

    Also, I would have thought that jda.hasError() would have caught this issue... but I guess not!

    Am I using the wrong approach or the wrong classes? The base example I used is the WeatherGuesser project.

    Thanks for your help...

    It is perhaps not related to the brackets. Try retrieving the data from the QNetworkReply as a QByteArray, then load it into JsonDataAccess using loadFromBuffer:

     myDataList = jda.loadFromBuffer(mReply->readAll()).value<QVariantList>();
    

    If this is not enough, you can add the brackets this way (not tested, please check the documentation for the exact function names if it won't compile):

    QByteArray a = mReply->readAll();
    a.insert(0, '[');
    a.append(']');
    myDataList = jda.loadFromBuffer(a).value<QVariantList>();
    

    Note that if the response data is null-terminated (most likely it is not, but there is a possibility), you will need to check whether the last byte in the array is '\0' and insert the closing bracket before it.

    QByteArray docs:

    http://qt-project.org/doc/qt-4.8/qbytearray.html

  • How to load data into the MVDEMO sample app schema

    Hi all

    I'm doing a POC on Oracle MapViewer and trying to build some reports in OBIEE using MapViewer.

    For this POC I am using the Oracle MVDEMO sample data (11g). I think this sample data covers a few countries, like the USA.

    I need to do the same for Brazil, so I downloaded Brazil map data as shapefiles from this site:

    http://GADM.org/country

    In this Brazil data I got files with 4 extensions: .csv, .dbf, .shp and .shx.

    I need to know how I can load these files into my Oracle 11g DB. Should I load the data into the same mvdemo schema, and if yes, into which table?

    Any help will be appreciated a lot.

    Thank you

    Amit

    Use the Java shapefile converter utility (http://www.oracle.com/technetwork/database/options/spatialandgraph/downloads/index-093371.html),

    or GDAL (gdal.org), FME (Safe Software), or MapBuilder.

    Specify the target SRID (i.e. the SRID the geometries are loaded with in Oracle) as 4326 or 8307.

    Load into a new table named anything you want, for example BRAZIL_GADM, with the geometry column named GEOMETRY.

    Once it's loaded, verify that there is an entry for the table and column (BRAZIL_GADM, GEOMETRY) in user_sdo_geom_metadata.

    Create a spatial index on brazil_gadm.geometry if the tool has not created one (see the sketch after these steps).

    Add theme definitions for the country, state, or whatever admin areas exist in the dataset.

    Import them as layers in OBIEE.
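    A minimal sketch of the metadata check and spatial index steps above, assuming the hypothetical BRAZIL_GADM table, its GEOMETRY column, and SRID 4326 (adjust the dimension bounds and tolerance to your data):

    -- check for an existing geometry metadata entry
    select table_name, column_name, srid
    from   user_sdo_geom_metadata
    where  table_name = 'BRAZIL_GADM';

    -- insert one only if the conversion tool did not create it
    insert into user_sdo_geom_metadata (table_name, column_name, diminfo, srid)
    values ('BRAZIL_GADM', 'GEOMETRY',
            sdo_dim_array(sdo_dim_element('X', -180, 180, 0.05),
                          sdo_dim_element('Y',  -90,  90, 0.05)),
            4326);

    -- create the spatial index if it does not exist yet
    create index brazil_gadm_sidx on brazil_gadm (geometry)
      indextype is mdsys.spatial_index;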
