ODI timestamp question

Hello

We have a source column of data type DATE (Oracle 10g).
When we select the date column from the table, the output looks like 17-SEP-10 08:02:32.

In the target, the data type of the date column is also DATE (Oracle 10g)...
It is a one-to-one mapping... after the execution of the interface, when we check the date column we get values like 17-SEP-10 12:00, with that same 12:00 time for all records.


What could be the problem... the time portion is not identical to the source; instead we get 12:00 as the time for every record...


Are any changes necessary here? Please let me know.


Thank you
Ananth

I don't know if this is your case, but
a while ago we saw a problem with our dates between source and target databases (both 10g).
In the source, HH24:MI:SS was filled in correctly; in the target, the time was always 00:00:00.
It happened only when we were loading data from one server to the other (never when the source and target were on the same server).

It really looks like your question...

After some research, it seemed to come from our JDBC (Thin) driver...

To fix this, I add a property in Topology wherever I use this JDBC URL.
For each data server I add the following property on the Properties tab:
Key = oracle.jdbc.V8Compatible
Value = true

Maybe you can try with this...
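
If it helps, here is a quick sanity check to run after adding the property and re-executing the interface, just to confirm that the time portion now survives the load (the table and column names below are placeholders, not taken from your interface):

    -- Compare a few rows on each side; the HH24:MI:SS part should now match.
    SELECT TO_CHAR(src_date_col, 'DD-MON-YYYY HH24:MI:SS') AS source_value
    FROM   source_table
    WHERE  ROWNUM <= 5;

    SELECT TO_CHAR(tgt_date_col, 'DD-MON-YYYY HH24:MI:SS') AS target_value
    FROM   target_table
    WHERE  ROWNUM <= 5;

If the target still shows 00:00:00 everywhere, the truncation is happening somewhere other than the JDBC driver and the mapping itself would need a second look.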

Hope this helps

Kind regards

Brice

Tags: Business Intelligence

Similar Questions

  • Tektronix 3052B timestamp question

    I think this question will apply to many, if not all, scopes with a LabVIEW interface, but this is the one I tend to use.  I have managed to connect to the scope from the LabVIEW application over the network and can run the demo applications quite well.

    The particular test I am running waits for a specific incoming trigger, after which it captures 10,000 samples at 1 microsecond increments.

    If I bypass LabVIEW and export the data file directly, it has a timestamp value that is right; recording starts at -40 microseconds (the point where the scope begins to record samples) and runs through the following ten milliseconds.

    The timestamp acquired by LabVIEW, however, is formatted like this:

    7:00:00.009 PM
    31/12/1903

    I was able to fiddle around with the waveform graph's date/time parameters and get the on-screen display to show microseconds by entering "%6u" in the date/time format string.  However, there are two outstanding issues:

    1. The string that is exported to Excel ends up displaying the calendar information.  No amount of playing with settings in Excel lets me reduce the string to a single integer representing the number of microseconds elapsed.

    2. The starting point, t0, always defaults to 0, which means the data are not aligned to the correct microseconds in the waveform graph.

    I've seen examples in the online support files, and that's how I got to the point of being able to display microseconds.  No one seems to have tackled, however, the problem of simply passing the number output by the scope as t0 through to the spreadsheet file.  Anyone got any ideas?  I'm running out of them...

    Thank you

    Danielle

    Thought this might help: the graph on the screen updates correctly, but the spreadsheet contains too much junk and is not in the right format or starting point in time (see attachment).

    You can have any desired accuracy. Just change the format specifier.

    If you don't know what that means, turn on context-sensitive help (which you should have on at all times anyway) and hover your mouse over the spreadsheet-file entry.

  • ODI upgrade questions

    I am doing an upgrade of ODI 10g to 11g, but have a few questions.

    I intend to proceed in this way; please let me know if I missed something, or suggest a better way if you have one.

    (1) Back up the master and work repositories (just by exporting them to a location).

    (2) Copy all databases that are associated with ODI.

    And then follow this link:

    Oracle Data Integrator 11g - upgrade from 10g to 11g - part 4 - ODIExperts.com

    (3) Do I need to follow the same process for all environments, such as DEV, QC and PROD?

    (4) And other than upgrading the repositories, should I also upgrade anything else, such as the agent or client installations? If so, what is the reason?

    Please advise...

    Thanks in advance.

    Hi, you are welcome

    (1) Actually, clone means asking a DBA to clone the entire schema, i.e. all the tables and the data inside them. For example, say the schema you use for the ODI repository is named ODI_MASTERREP; if you ask the DBA to clone it, he can create another schema called ODI_MASTERREP_BKP in the database with all the data and objects of the ODI_MASTERREP schema. This is good because if anything happens it is faster to restore, and also, if for some reason you want to look at logs in the old repository with your ODI 10g, you can just point it at ODI_MASTERREP_BKP and there you go.

    (2) If you have privileges to create schemas, you can just do what I said above yourself; no need to involve a DBA in this operation. But that is only for the backup. Other than that, you don't need to create anything whatsoever; just follow the migration document to the end.

    (3) You do it in all envs, because otherwise you will not be able to deploy a version 11 scenario into a version 10 repository; everything should be consistent.

    (4) There is no way to update the client in place; you must download the client in the version you want and install it the same way you did before. Same thing for the agent.

    I love this blog:

    More to life than this...: ODI Series Part 1

    and I get this kind of stuff from devepm (for obvious reasons):

    www.devepm.com

  • ODI Installation questions - 11.1.1.7

    Hello

    While installing ODI version 11.1.1.7 on Windows Server 2012, I ran into a few problems:

    (1) I got pop-ups such as the ones below; however, I continued by ignoring them and the installation completed successfully.

    I don't know what these files are supposed to be. I downloaded the release version but do not know why they are missing; are they really required for a complete ODI installation?


    File not found: \ODI/od/disk1/install/win64/... E: /stage/Components/Oracle.OPSS.client/11.1.1.7/Data Filesfilegroup1.jar

    File not found: \ODI/od/disk1/install/win64/... E: /stage/Components/Oracle.OPSS.client/11.1.1.7/Data Filesfilegroup2.jar


    File not found: \ODI/od/disk1/install/win64/... E: /stage/Components/Oracle.OPSS.client/11.1.1.7/Data Filesfilegroup3.jar

    (2) Under the installation options I selected:

    Developer Installation (ODI Studio (with Local Agent), ODI SDK)

    Standalone Installation (ODI Standalone Agent)

    The issue I have here is that the ODI agent does not work; moreover, I could not find the AGENT folder structure under the oracledi folder. I also found an OraclParams.bat file.

    Can you advise why this is different from the ODI 11.1.1.5 version please?

    (3) Also, when I test the ODI agent, I am getting the error below:

    Caused by: ODI-1424: Agent host or port cannot be reached using http://Server:20910/oraclediagent.

    Caused by: java.net.ConnectException: connection refused: connect

    at oracle.odi.runtime.agent.invocation.RemoteRuntimeAgentInvoker.invoke(RemoteRuntimeAgentInvoker.java:278)

    ... 41 more

    Is it because of a port problem?

    Thank you

    UB.

    Try downloading the full version from Oracle Support.

    Cheers

    John

  • Timestamp question

    Currently having a few problems using a timestamp in Cascades and don't know how to overcome them.

    Initial implementation:

    listItemComponents: [
                        ListItemComponent {
                            type: "date"
                            Header {
                                title: Qt.formatDateTime(new Date(ListItemData * 1), "dddd, dd MMMM yyyy")
                            }
                        },
    
    ...
    
    function itemType(data, indexPath) {
        // Top-level items are date headers; children are regular items.
        if (indexPath.length == 1) {
            console.log("DATE RETURN")  // log before returning, otherwise it never runs
            return "date";
        }
        console.log("ITEM RETURN")
        return "item";
    }
    
    ...
    
    GroupDataModel {
                id: dataModel1
                sortingKeys: [ "entryTimestamp", "entryTitle" ]
                grouping: ItemGrouping.ByFullValue
                sortedAscending: true
    

    entryTimestamp is the field in the database that contains the timestamp (it looks like 1458478723892), and this timestamp comes from new Date().getTime(). The implementation above sorts the items in exact date order, but because every timestamp is different (the seconds, minutes and hours of each entry in the database differ), the list items are not grouped together; they end up as separate list items.

    ---

    I then tried adding another field to the database (the long date, e.g. Sunday, March 20, 2016), but because it is just text, the elements are not in the correct date order. So now I am not entirely sure what to do; any help with this would be great.

    The grouping is done by string comparison. You want to sort by year first, then month, then day, so format the date key that way (see the sketch at the end of this reply).

    I suspect that you are mixing up the sort key with the string that is displayed in the header.
    You can format the string header differently by defining your own header element:

    listItemComponents: [
        ListItemComponent {
            type: "header"
    
            Header {
                title: ListItem.view.headerFormat.format(ListItemData)
            }
        }
    

    headerFormat is a CustomDateFormatter with this skeleton: "yMMMMEEEEd".

    (I recommend this formatter class because it gives you a lot of options.)

    Do not forget to override itemType to distinguish between your list items and the header.
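
    As a sketch of the "year first, then month, then day" key described above: assuming the entries live in a SQLite table with a millisecond-epoch column named entryTimestamp (both assumptions on my part, not details from the post), you could derive a day-only grouping key directly in the query:

        -- Build a day-only key so rows from the same calendar day sort and group together.
        -- entryTimestamp is assumed to hold milliseconds since the Unix epoch.
        SELECT strftime('%Y-%m-%d', entryTimestamp / 1000, 'unixepoch') AS entryDay,
               entryTitle
        FROM   entries
        ORDER  BY entryDay, entryTitle;

    Feeding entryDay (instead of the raw millisecond value) to the GroupDataModel as the first sorting key lets ByFullValue grouping collapse all entries of the same day under one header.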

  • ODI Encoding question

    Hello everyone,

    I have a problem transferring data in ODI 11g. The problem is the encoding. I am importing a .csv file that contains some special characters (ç, à, é, etc.). These characters are fine in the file, but when I import them into my Oracle database they come out garbled ("A§" and the like)...  With LKM File to SQL this is the problem that occurs; when I try another LKM, File to Oracle (SQLLDR), I get an error because some fields in the first rows are too long...

    In any case... if I import using LKM File to SQL, how can I handle those special characters, or how can I fix the data?

    Thank you in advance...

    Kind regards...

    Hello

    What encoding is defined for the file?

    On the physical data server, try providing the encoding property in the JDBC URL.

    Judging by the output you got, I think the encoding is UTF-8. Try it and see. It should help.
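
    If it still looks wrong after the load, one quick way to see what actually landed in the database is Oracle's DUMP function (the table and column names below are made up for illustration):

        -- 1016 = show the column's character set name plus the stored bytes in hex.
        SELECT sample_text,
               DUMP(sample_text, 1016) AS stored_bytes
        FROM   imported_rows
        WHERE  ROWNUM <= 5;

    If the hex shows multi-byte UTF-8 sequences sitting in a column of a single-byte character set (or the reverse), the mismatch is between the encoding declared for the file and the database character set.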

  • Contract HRMS workflow approval - approval timestamp is wrong

    We have a problem in the HRMS contract system... possibly related to workflow.  When a contract is approved, the tables are updated with the correct timestamp; however, when you navigate to the contract and view the approval history, the timestamp shown there is exactly 3 hours behind the actual approval time.  I suspect something to do with time zones.  I am very new to PeopleSoft!  Any help would be greatly appreciated.

    You can check your personal settings for the time zone

  • The old TIMESTAMP question on a query

    I have a SQL query

    DATE_COL = TIMESTAMP '2009-01-30 00:00:00'

    where the date is generated by a repository variable with a default initializer.

    Now, Oracle Metalink seems to suggest that we should be able to have a default initializer of DATE '30-JAN-2009', which I tried; it seemed to work and gave me

    DATE_COL = TO_DATE('30-JAN-2009')

    However after working once, subsequent changes to the repository caused the TO_DATE to disappear and I found myself with:

    DATE_COL = 30 JANUARY 2009"

    Has anyone managed to use DATE in the default initializer?


    I'm on OBIEE 10.1.3.3
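
    For what it is worth, outside of OBIEE the two unambiguous ways to write that predicate in plain Oracle SQL look like this (some_table is a placeholder; this is only a sketch of the target syntax, not the repository fix itself):

        -- ANSI date literal: always read as YYYY-MM-DD, with no NLS dependency.
        SELECT * FROM some_table WHERE DATE_COL = DATE '2009-01-30';

        -- TO_DATE with an explicit format mask, so it does not rely on NLS_DATE_FORMAT.
        SELECT * FROM some_table WHERE DATE_COL = TO_DATE('30-JAN-2009', 'DD-MON-YYYY');

    The bare DATE_COL = 30 JANUARY 2009" that the repository ends up generating fails precisely because it is neither of these forms.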


  • Does renaming a file change the last-accessed timestamp?

    I was in a heated debate with one of my instructors today, related to a test question about the last-access timestamp when you rename a file. Through my research and testing of renaming a file via the command prompt, it does not appear that renaming a file changes that file's last access date. Access and modify times change on the directory that contains the file, but not on the file itself.  So my understanding is that it does not, yet my instructor insists that it does. Can someone shed some light on this topic, please?

    Some more reading material - it's interesting.

    More information below:

    Windows NT keeps track of three time stamps associated with files and directories: creation, last access, and last write. When a file or directory is created, accessed, or changed, Windows NT updates the appropriate time stamp.

    http://support.Microsoft.com/kb/148126/a

    NTFS uses the change journal to track information about files that are added, deleted, and changed on each volume.

    http://TechNet.Microsoft.com/en-us/library/cc938919.aspx

    Working with file systems

    http://TechNet.Microsoft.com/en-us/library/bb457112.aspx

  • ODI 64-bit / 32-bit component reuse

    The manual says that 64-bit ODI cannot use components developed in 32-bit ODI.

    My question is

    Are the "components" things like interfaces, models, agents, etc.?

    Thank you

    Can you please provide the link where you saw this?

  • ODI data loading error checking logic

    Hi, I do not know if this is related to ODI or to our command script. We found that ODI's data load error checking goes record by record, so as soon as we get an error in a batch data load (for example, a member is missing), the whole batch runs very slowly because of the error checking process.

    Is there anything we can do to fine-tune the ODI data load error checking process?

    Thank you!

    Hello

    Have a read here.
    It was a bug, but it is now resolved.

    Ok?

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • URGENT: Need to flash back a table; how do I find the SCN for a point in time before the update happened?

    Hi all

    I have a new 11g database; I have not used flashback before, but my retention time is one day... A developer has just done a massive update that corrupted a table.

    How can I find the SCN for that time so I can try to flash back this table?

    Thank you

    Cayenne

    Thank you... I also found that this works:

    SQL> select timestamp_to_scn(to_timestamp('20140103152000','YYYYMMDDHH24MISS')) SCN from dual;

    SCN

    ----------

    98344246

    Although it does NOT work all the time. But that is just what Oracle does when you give it a timestamp to flash back to: it converts the timestamp to an SCN and then restores the data based on that SCN.

    See FLASHBACK TABLE in the SQL Language Reference:

    http://docs.Oracle.com/CD/B28359_01/server.111/b28286/statements_9012.htm

    TIMESTAMP CLAUSE

    Specify a timestamp value corresponding to the point in time to which you want to return the table. The expr must evaluate to a valid timestamp in the past. The table will be flashed back to a time within approximately 3 seconds of the specified timestamp.

    This precision "3 seconds" is because of this conversion to SNA - Oracle uses its internal table.

    The SCN is exact, if you know it, but you typically do not.

    Read and consider these warnings in the Advanced Application Developer's Guide:

    http://docs.Oracle.com/CD/B28359_01/AppDev.111/b28424/adfns_flashback.htm

    Guidelines for Oracle Flashback Query

    . . .

    General guidelines for Oracle Flashback Table

    . . .

    • To query past data at a precise time, use an SCN. If you use a timestamp, the actual time queried can be up to 3 seconds earlier than the time you specify. Oracle Database maps SCNs to timestamps at a granularity of 3 seconds.

    For example, suppose the SCN values 1000 and 1005 are mapped to the timestamps 8:41 and 8:46 AM, respectively. A query for a time between 8:41 and 8:45:59 AM is mapped to SCN 1000; an Oracle Flashback Query for 8:46 AM is mapped to SCN 1005.

    Because of this time-to-SCN mapping, if you specify a time slightly after a DDL operation (such as a table creation), Oracle Database may use an SCN that is just before the DDL operation, causing error ORA-1466.

    . . .
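
    Putting the above together, a minimal flashback sketch (ORDERS is a hypothetical table name, and the SCN is the one computed earlier in this thread; FLASHBACK TABLE needs row movement enabled first):

        -- Row movement must be enabled before a table can be flashed back.
        ALTER TABLE orders ENABLE ROW MOVEMENT;

        -- Flash back to an exact SCN...
        FLASHBACK TABLE orders TO SCN 98344246;

        -- ...or to a timestamp, which is only accurate to about 3 seconds, as noted above.
        FLASHBACK TABLE orders TO TIMESTAMP
          TO_TIMESTAMP('2014-01-03 15:20:00', 'YYYY-MM-DD HH24:MI:SS');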

  • Help understanding ERPi and Planning

    Hi all

    Planning/Essbase. Version is 11.1.2.

    Can you please help me understand the role of the ERPi adapter? I have a Hyperion Planning application that sits on an Essbase cube. We load data and metadata from E-Business Suite 11. From forms and reports, I want to drill through to the EBS data. I have installed and configured the ERPi adapter and the ODI scenarios.


    Question 1)

    Do I have to load both data and metadata for the drill-through to work?

    Question 2)
    I have set up my source and target systems; where can I configure the drill-through?

    Thanks in advance for any help offered.

    Let me clarify: you must run the metadata rule in ERPi, but you do not need to load the metadata into the EPM application. I think a lot of confusion exists around this point.

  • Simple Timestamp question

    Why does the output of this give me 12:00?

    select to_timestamp(systimestamp-1) from dual;

    I think timestamp values default to 12:00 when no time part is specified.  So is it the TO_TIMESTAMP part of my expression that is causing the 12:00 output?

    Also, how could I get this example to show the actual time of day... the current time minus one full day...?

    Thank you.

    Sorry for what may seem like a vague question; any opportunity to learn is what I'm after, and since I don't work with timestamp values much, this seems like a good opportunity.

    2776946 wrote:

    Why does the output of this give me 12:00?

    select to_timestamp(systimestamp-1) from dual;

    I think timestamp values default to 12:00 when no time part is specified.  So is it the TO_TIMESTAMP part of my expression that is causing the 12:00 output?

    Also, how could I get this example to show the actual time of day... the current time minus one full day...?

    Thank you.

    Sorry for what may seem like a vague question; any opportunity to learn is what I'm after, and since I don't work with timestamp values much, this seems like a good opportunity.

    Why are you applying to_timestamp to something that is already a timestamp?

    TO_TIMESTAMP converts a string to a timestamp.

    You need to_char to convert a timestamp into a string with the format required when you want to display it.

    select to_char(systimestamp-1, 'dd-mon-yyyy hh24:mi:ss') from dual;

    for example (or whatever other format string you need).
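
    As an aside (my reading of the behaviour, not part of the reply above): systimestamp-1 already returns a DATE, so TO_TIMESTAMP first implicitly converts it to a string using NLS_DATE_FORMAT, which usually carries no time part, and that is where the 12:00 AM comes from. To keep a real timestamp (fractional seconds included), subtract an interval instead:

        -- Subtracting an INTERVAL keeps the value a TIMESTAMP, fractional seconds intact.
        SELECT TO_CHAR(systimestamp - INTERVAL '1' DAY,
                       'dd-mon-yyyy hh24:mi:ss.ff3') AS a_day_ago
        FROM   dual;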

  • ODI 11g: question about creating multiple work repositories

    Hello

    I am creating a 'development' work repository and an 'execution' work repository, both attached to the same master repository, in ODI 11g (11.1.1.7). I could successfully create the master repository and the 'development' work repository, but when I tried to create the second ('execution') work repository, this message appeared: "A work repository already exists. Select Yes to reattach this work repository. Select No to overwrite the existing work repository." If I select 'Yes' it will not let me create a new work repository, and if I select 'No' it lets me create the new one (execution), but the former work repository (development) becomes unattached from the master repository. After completing the creation of both repositories, I can only connect to the 'execution' work repository; when I try connecting to the 'development' work repository, it throws the error below.

    Can anyone please suggest a way to successfully create two work repositories attached to the same master repository? I am trying to create the new execution work repository from ODI Topology.

    Error:

    oracle.odi.core.config.RepositoriesNotBoundException: ODI-10150: The work repository with ID 410 is not bound to the master:

    Master definition - ID:410, name: LOW_WORK_REP_D, Timestamp:1394707069282

    Work definition - ID:411, name: LOW_WORK_REP_E, Timestamp:1394707247161.

    Some details about the repository IDs:

    Master rep: 400

    Work rep (Dev): 410

    Work rep (Exe): 411

    Hi all

    I resolved it myself. There were two problems. When I tried to create the work repository with the RCU, it was not creating the DB user for some reason. The other problem is that I was foolishly providing the master repository ID when creating the work repository. So here are the steps I took to solve this.

    1. Create the DB user for the work repository manually, using the script below.

    2. Then connect to ODI (using a 'Master Repository Only' connection) -> Topology -> Repositories panel -> right-click 'Work Repositories' -> New Work Repository. The wizard opens; in the first screen, enter the connection details of the DB user created above (ODI_WORK_REP) and follow it through.

    Scripts:

    create user ODI_WORK_REP
    identified by <password>               -- password omitted in the original post
    default tablespace ODI_WORK_USER
    temporary tablespace ODI_WORK_TEMP
    quota unlimited on ODI_WORK_USER;

    grant connect, resource, SELECT_CATALOG_ROLE to ODI_WORK_REP;

    So I have now successfully created two work repositories (Dev/Exe) attached to the same master repository. Mission accomplished.

    Thank you

    Vikas.
