Waveform graph data export problem

Hi all!

I have a problem with my waveform chart:

I set the chart history length to 5 x 24 x 60 x 60 = 432 000 samples. I draw a new point every minute, so that I can view the last 5 days. The problem is that when I try to export the chart data to Excel, the data are not in time order (the same thing happens if I export the data to the clipboard). The data starts with the second day. And yes, before I export the data the x-axis is set to autoscale, so I don't see all 5 days of curves at once. But after the export, when the table is opened in Excel (Office 2010, LabVIEW 2011, silver chart), the data are really mixed up...

Anyway, the workaround is simple: 2 clicks in Excel and the data is back in order, but I'm curious to know why it happens... I guess I'm doing something wrong when building the waveform arrays?

Is this a bug?

Thanks in advance!

PS: I have attached a JPG export of the graph showing what I see, the Excel table with the "mixed up" data sequence, and the subVI that I use to generate and send the three chart points every minute.

Hey,

As far as I know, there was a change in the ActiveX interface between Office 2007 and Office 2010.

You wouldn't have this bug in Excel 2007.

Try using the 'Export Waveforms to Spreadsheet File' VI, or use a TDM/TDMS file instead.

Regards,

Sebastian

Tags: NI Software

Similar Questions

  • Problem with exporting data from one application and importing it into another application

    Hello

    I need to feed data from one application (the source) to another application (the target). So what I did was export the data needed by the target application to text files using the DATAEXPORT function, and now I want to import the data from those text files into the target application using rules files.

    I'm trying to create a separate rules file for each exported text file, but I'm not able to do this successfully.
    I have mapped all the members from the source to the target application, but something is still going wrong that either keeps the import from succeeding or keeps the data from going to the right places.

    Are there any specifics to using the DATAEXPORT function, such as the format or the kind of DATAEXPORTOPTIONS I should use, to make this work?

    Here is the first part of my calc script. For all the other data I wanted to export, I wrote similar statements, simply FIXing on the individual members.

    SET DATAEXPORTOPTIONS
    {
    DATAEXPORTLEVEL ALL;
    DATAEXPORTCOLFORMAT ON;
    DATAEXPORTOVERWRITEFILE ON;
    };

    FIX ("ST", "BV", "SSV", "ASV", "ESV", "W", "HSP_InputValue", "SÜSS", "TBSD", "ABU", "AC", "LB", "LT", "VAP", "Real", "FY11", @REMOVE (@LEVMBRS (period, 0), @LIST (BegBalance)))
    DATAEXPORT "File" "," "...\export Files\D3.txt";
    ENDFIX;


    Please let me know your opinion on this. I really need help with that.
    ~ Hervé

    So, you have 12 periods across the columns of the export data?

    If so, you must assign each data column to a period. The Data column property is only used when all of the dimensions have been fully defined through the other fields or the header, and the only thing left for a column to be is data. That does not work when you have several columns of data, so just assign each column to the right period; doing it this way means there is no generic data column.

    Kind regards

    Cameron Lackpour

  • Export data to Excel does not work in the build (.exe)

    I'm new to LabVIEW 2010, but I made a small application which works as expected when run as a VI in the LabVIEW environment. It is a simple data acquisition where the results are displayed in a waveform graph. I used the "Export Data To Excel" method to have the application transfer the graph results to an Excel file programmatically.

    However, when I build the application and run the exe, everything works fine except the export to Excel. Excel does not start, and when I right-click the graph to "Export Data To Excel" manually, it doesn't work either. The data can be exported to the clipboard and then pasted into Excel, which is doable, but not a very elegant solution...

    Any thoughts?

    Hi CFNoergaard,

    This issue has been corrected in LabVIEW 2010 SP1.

    If you upgrade to this version, it should do the trick.

  • failure to export a wav (damaged codec?)

    I have been using energyXT 2.5 (on Win7), a recording program, for a good while now without any problem; I use it to record old cassettes and records to CD and to finalize the audio as WAV files. Everything was fine until around November 24, 2014, when this started happening: recording and saving work fine, but when the export to WAV finishes, the resulting file is as big as it should be (4.3 GB) yet plays only 4 min 01 sec (it should be around 48 min). I tried another recording (11 altogether different recording sessions): same thing (a correctly sized file is produced, but again only the first few minutes of it play). I have tried every save method, different destination folders, updating drivers, system diagnostics, HP support (no help), energyXT support (no response), and uninstalling/reinstalling energyXT, all with no solution. I found similar problems people have described, and it sounds like a corrupted audio codec may be to blame, which would explain why the uninstall/reinstall didn't work. If so, what could have done this? I did not change any settings (or even reopen the program) between the last successful export (Nov 21) and the first failure (Nov 24); the only cause I can think of is that an automatic update of some kind screwed something up pretty badly, but I don't know (my last resort is to set it up on a dedicated, disconnected computer restored to a version of Windows I KNOW works). How can I fix this before it comes to that?

    System recovery was my first idea, but the restore history has filled up and been emptied, and I cannot go back far enough. I had hopes for Audacity; it came with energyXT on a disc with a few other freeware audio tools, but the older version would not handle the sampling rate of my audio interface, so I could not use it. The newer version appears comparable to, if not better than, energyXT, and yet it still fails in exactly the same way (it produces a playable track of 4 min 01 sec from a 4.3 GB file after recording the same record at the same 192 kHz rate and exporting lossless WAV). Something is wrong with the way Windows renders 32-bit float WAV files (16-bit works, but I have wasted enough of my time). So I will either do the backup I need to do anyway and restore the factory image from the manufacturer, which may be a bit drastic but will definitely work, or go with the idea mentioned before of just using a different machine, depending on whether I can acquire one and how long my patience lasts. I very much appreciate your suggestions in my digital tribulations! Thank you

  • Is it possible to export data from fillable form fields into InDesign?

    Just the basics:

    I'm on a mac, running CC

    We circulate PDF forms in which users type information. Instead of copying and pasting the data into InDesign, we wanted to see if there is a way to export the entered data directly into InDesign. I tried exporting the data to Excel, but we only have Excel for Mac 2011 and I wonder if there is a compatibility problem with the exports (which constantly have to be repaired). I also find that when the files are repaired, the entire form is included.

    If I cannot do a direct export to InDesign, is there a way I can just export the data to Excel and create my own .csv for a data merge in InDesign?

    Thank you

    Rich

    Yes, JavaScript is case-sensitive, so a field name must be used exactly as it was defined in the form. If you have a field named "Field 1", you cannot access it via "field 1"; those two names are different.

  • Data load error 10415 when exporting data to an Essbase EPMA app

    Hi Experts,

    Can someone help me solve this issue I am facing in FDM 11.1.2.3?

    I'm trying to export data from FDM to the EPMA Essbase application.

    Import and validate worked fine, but when I click Export it fails.

    I am getting the error below:

    Failed to load data

    10415 - data loading errors

    Essbase API procedure: [EssImport] threw code: 1003029 - 1003029

    Formatting error encountered in the spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv

    My dimension members are:

    1. Account
    2. Entity
    3. Scenario
    4. Year
    5. Period
    6. Regions
    7. Products
    8. Acquisitions
    9. Servicesline
    10. Functionalunit

    When I click the Export button, it fails.

    I checked one more thing, the .DAT file, but this file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem too.

    In my case I was loading data to a classic Planning application. When all the dimension members are ignored in the mapping for the combination you are trying to load, and you click Export, you get the same message and an empty .DAT file is created.

    You can check this

    Thank you

    Praveen

  • Error in Activities > Export while exporting data as admin

    Hello

    I am facing a problem with Activities > Export.

    The details are as below:

    1. The user has all access (Prodikaadmin).

    2. The EnvironmentSetting.config entry is as below:

    <DataExchangeService configChildKey="name" refscope="Application" factory="Class:Xeno.Prodika.Services.DataExchange.DataExchangeServiceFactory, PlatformExtensions">
      <varenv name="DexConfiguration" handler="Class:Xeno.Prodika.Services.DataExchange.Configuration.DexConfigSectionHandlerFactory, PlatformExtensions">
        <DataExchangeConfig system="staged" NotifierEmail="@@VAR:Prodika.DataExchangeService.Notifier.EmailAddress@" EncryptionFilter="Class:Xeno.Prodika.Services.DataExchange.Serialization.RijndaelEncryptionStreamFilterFactory, PlatformExtensions">
          <TargetSystems>
            <system>Production</system>
          </TargetSystems>
          <SourceSystems></SourceSystems>
        </DataExchangeConfig>
      </varenv>
    </DataExchangeService>

    3. "Generate token" is not visible in the system under Activities.

    4. But I am able to access "Generate token" with the '~/portal/DataAdmin/DataAdmin.aspx?ContentKey=GenerateToken' link.

    5. I am able to generate tokens and save them.

    6. I browsed to the same token and clicked the "Upload token" button.

    7. I got the error "Import is not targeted to this system expected real GetByteArray, staged".

    Kindly let me know if I'm missing something.

    Questions:

    1. How do I turn on "Generate token" in the user interface?

    2. How do I fix the above error?

    Thank you!!

    With the configuration file above, the server can only export data. If you want a server configured for import, please see "How to set up an IMPORT environment?"

  • How to export data to Excel from 2 tables that have the same number of columns and the same column names?

    Hi everyone, once again I have ended up with a problem.

    After trying many things myself, I finally decided to post here...

    I created a form in Forms Builder 6i in which, on clicking a button, the data gets exported to an Excel sheet.

    It works very well with a single table. The problem now is that I cannot do the same with 2 tables, because the tables have the same number of columns and the same column names.

    Here are the 2 tables with column names:

    Table-1 (MONTHLY_PART_1)    Table-2 (MONTHLY_PART_2)
    SL_NO                       SL_NO
    MODEL                       MODEL
    END_DATE                    END_DATE
    U-1                         U-1
    U-2                         U-2
    U-4                         U-4
    ...                         ...
    ...                         ...
    U-20                        U-20
    U-25                        U-25

    Given that the tables have the same column names, I get the following error :

    Error 402 at line 103, column 4:

    aliases required in SELECT list of cursor to avoid duplicate column names.

    So how do I export data to Excel from 2 tables that have the same number of columns and the same column names?

    Should I paste the code? Should I post this question in the 'SQL and PL/SQL' forum?

    Help me with this please.

    Thank you.

    Wait a second... is this some kind of homemade partitioning? Shouldn't it be a union of the two tables instead of a join?
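
    If a join really is what you need, the usual fix is simply to alias the duplicated columns so the cursor's select list is unique; otherwise stack the rows with a UNION. A minimal sketch (only a few of the columns are shown, and the join condition on SL_NO is an assumption):

    DECLARE
      -- Option 1: alias the duplicated columns so the select list is unique
      CURSOR c_joined IS
        SELECT a.sl_no  a_sl_no,  b.sl_no  b_sl_no,
               a.model  a_model,  b.model  b_model
        FROM   monthly_part_1 a, monthly_part_2 b
        WHERE  a.sl_no = b.sl_no;
      -- Option 2: stack the rows instead, tagging which table each row came from
      CURSOR c_stacked IS
        SELECT 'PART_1' src, sl_no, model, end_date FROM monthly_part_1
        UNION ALL
        SELECT 'PART_2' src, sl_no, model, end_date FROM monthly_part_2;
    BEGIN
      FOR r IN c_stacked LOOP
        NULL;  -- replace with the existing "write a row to Excel" logic
      END LOOP;
    END;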

    see you soon

  • When exporting L0 data from Essbase, shared members also get exported

    Hi, when exporting L0 data from Essbase, shared members also get exported.

    But I don't want to export the alternate hierarchy!

    Example:

    Project
      XYZ (+)
        ABCD (+)
          P00001 (+)
          P00002 (+)
        GFE (+)
          P00003 (+)
          P00004 (+)
      All XYZ (Label only)
        ABCD (shared member)
        EFG (shared member)

    I want data to be exported only for P00001, 2, 3, 4.

    I think that the OP cannot use Level0 in DATAEXPORTOPTIONS, because not all dimensions are zero-level.

    Cameron, thanks. Yes, that's the behavior the OP is seeing (and disliking). The shared "level zero" member, which is in fact not level zero (in the primary hierarchy), is included in the output. I think that is the problem, rather than replication/repetition.

  • Import and export data between XE 10.2 and XE 11.2

    This Oracle documentation reference link:
    http://docs.Oracle.com/CD/E17781_01/install.112/e18802/TOC.htm#autoId22

    Does this process migrate all schemas, storage, data, and the APEX applications?
    The process seems to skip the schemas associated with APEX... see the exclusions...

    [excerpt]
    d. Export the data of the 10.2 XE database to the dump folder.

    expdp system/system_password full=Y
    EXCLUDE=SCHEMA:\"LIKE \'APEX_%\'\",SCHEMA:\"LIKE \'FLOWS_%\'\"
    directory=DUMP_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log
    expdp system/system_password TABLES=FLOWS_FILES.WWV_FLOW_FILE_OBJECTS$
    directory=DUMP_DIR dumpfile=DB10G2.dmp logfile=expdpDB10G2.log

    I'm trying to migrate from APEX 4.1 on 10g XE on one server to APEX 4.1 on 11g XE on another server.
    Any suggestions on the best way to achieve this?

    Thank you
    Rich

    Has anyone else tried this?

    I know a few people who have migrated successfully this way. But there are many users who have failed, either because they didn't follow the documentation carefully, they ran into a charset conversion problem, or they had done "non-standard stuff" in their database. See below.

    Does this process migrate all schemas, storage, data, and the APEX applications?

    In general: yes. But there can be problems.
    * Charset problems:
    If you have the "Unicode edition" of 10.2 XE installed right now, it probably won't be a problem. If you have the ISO charset, the UTF-8 conversion can raise several issues...

    * Documentation issues:
    It is not just the expdp you mentioned above. You also need the result of 'geninst.sql', including 'gen_apps.sql', for your APEX applications.

    * Non-standard stuff:
    There are some restrictions not documented in the XE guide, because they would be "non-standard" from the XE point of view. For example, you cannot downgrade your APEX applications; that is, if you have already upgraded the APEX in your 10.2 XE to 4.1, you must upgrade your 11.2 XE to 4.1 before you can import your applications.
    Another problem could be synonyms and grants on APEX objects that your schemas may have: if your 10.2 XE had an older APEX version than the 4.0 delivered with 11.2 XE, you will have to re-generate these for the 11.2 XE flows schema manually.
    And whatever other kind of "non-standard" things you might have done in your 10.2 XE instance... you will have to take care of those yourself. I hope you have a list of things to watch. ;)

    -Udo

  • Export data from tables to XML files

    Hello

    I am starting this thread to get your opinion on how to export table data into one XML file containing the data and another file (XSD) that contains the structure of the table.
    For example, I have a datamart with 3 dimensions and a fact table. The idea is to have an XML file with the data of the fact table, an XSD file with the structure of the fact table, an XML file that contains the data of the 3 dimensions, and an XSD file that contains the definition of all 3 dimensions. So: one XML file for the fact table, a single XML file combining all of the dimensions, one XSD file for the fact table, and one XSD file combining all of the dimensions.

    I don't yet have an idea of how to do it, but I would like your advice on how you would approach it.

    Thank you in advance.
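
    (For reference, one possible starting point for the XML-data part is DBMS_XMLGEN, which turns any query into a ROWSET/ROW XML document. A minimal sketch; FACT_TABLE is just a placeholder name and the XSD part is not covered here:)

    DECLARE
      v_xml CLOB;
    BEGIN
      -- build one XML document containing every row of the fact table
      v_xml := DBMS_XMLGEN.GETXML('SELECT * FROM fact_table');
      -- print the first part of it; in practice you would write it out with UTL_FILE
      DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(v_xml, 2000, 1));
    END;
    /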

    You are more or less in the same situation as me, I guess, regarding the "ORA-01426 numeric overflow". I tried to export, through UTL_FILE, the contents of a relational table with 998 columns. In that case you very quickly run into these ORA- errors, even if you work with CLOB solutions, while trying to concatenate the column data into a CSV string. Oracle has the nasty habit in some of its packages / code of "assuming" intelligent solutions and implicitly converting data types temporarily while trying to concatenate the column data into one string.

    The second part, in the realm of PL/SQL, is that it tries to put everything into a buffer, which has a maximum of 65k or 32k, so you have to break things up. In the end I just solved it by treating everything as a BLOB and writing it to file as such. I'm guessing that the ORA- error is related to these buffer / implicit datatype conversion problems in the official Oracle DBMS packages.

    The fun part is that this 998-column table came from an XML source (aka "how SOA can make things very complicated and non-performing"). I now have 2 different "write data to CSV" solutions in my packages; I use this one for the 998-column situation (though I have no idea if I will ever get it performing well; using table collections in this scenario, for example, would blow up the PGA). The only solution that would really work in my case is a better physical design of the environment, but I am currently not, shall we say, engaged as an architect, so I am not in a position to impose that.

    -- ---------------------------------------------------------------------------
    -- PROCEDURE CREATE_LARGE_CSV
    -- ---------------------------------------------------------------------------
    PROCEDURE create_large_csv(
        p_sql         IN VARCHAR2 ,
        p_dir         IN VARCHAR2 ,
        p_header_file IN VARCHAR2 ,
        p_gen_header  IN BOOLEAN := FALSE,
        p_prefix      IN VARCHAR2 := NULL,
        p_delimiter   IN VARCHAR2 DEFAULT '|',
        p_dateformat  IN VARCHAR2 DEFAULT 'YYYYMMDD',
        p_data_file   IN VARCHAR2 := NULL,
        p_utl_wra     IN VARCHAR2 := 'wb')
    IS
      v_finaltxt CLOB;
      v_v_val VARCHAR2(4000);
      v_n_val NUMBER;
      v_d_val DATE;
      v_ret   NUMBER;
      c       NUMBER;
      d       NUMBER;
      col_cnt INTEGER;
      f       BOOLEAN;
      rec_tab DBMS_SQL.DESC_TAB;
      col_num NUMBER;
      v_filehandle UTL_FILE.FILE_TYPE;
      v_samefile BOOLEAN      := (NVL(p_data_file,p_header_file) = p_header_file);
      v_CRLF raw(2)           := HEXTORAW('0D0A');
      v_chunksize pls_integer := 8191 - UTL_RAW.LENGTH( v_CRLF );
    BEGIN
      c := DBMS_SQL.OPEN_CURSOR;
      DBMS_SQL.PARSE(c, p_sql, DBMS_SQL.NATIVE);
      DBMS_SQL.DESCRIBE_COLUMNS(c, col_cnt, rec_tab);
      --
      FOR j IN 1..col_cnt
      LOOP
        CASE rec_tab(j).col_type
        WHEN 1 THEN
          DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,4000);
        WHEN 2 THEN
          DBMS_SQL.DEFINE_COLUMN(c,j,v_n_val);
        WHEN 12 THEN
          DBMS_SQL.DEFINE_COLUMN(c,j,v_d_val);
        ELSE
          DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,4000);
        END CASE;
      END LOOP;
      -- --------------------------------------
      -- This part outputs the HEADER if needed
      -- --------------------------------------
      v_filehandle := UTL_FILE.FOPEN(upper(p_dir),p_header_file,p_utl_wra,32767);
      --
      IF p_gen_header = TRUE THEN
        FOR j        IN 1..col_cnt
        LOOP
          v_finaltxt := ltrim(v_finaltxt||p_delimiter||lower(rec_tab(j).col_name),p_delimiter);
        END LOOP;
        --
        -- Adding prefix if needed
        IF p_prefix IS NULL THEN
          UTL_FILE.PUT_LINE(v_filehandle, v_finaltxt);
        ELSE
          v_finaltxt := p_prefix||p_delimiter||v_finaltxt;
          UTL_FILE.PUT_LINE(v_filehandle, v_finaltxt);
        END IF;
        --
        -- Creating a separate header file if requested
        IF NOT v_samefile THEN
          UTL_FILE.FCLOSE(v_filehandle);
        END IF;
      END IF;
      -- --------------------------------------
      -- This part outputs the DATA to file
      -- --------------------------------------
      IF NOT v_samefile THEN
        v_filehandle := UTL_FILE.FOPEN(upper(p_dir),p_data_file,p_utl_wra,32767);
      END IF;
      --
      d := DBMS_SQL.EXECUTE(c);
      LOOP
        v_ret := DBMS_SQL.FETCH_ROWS(c);
        EXIT WHEN v_ret = 0;
        v_finaltxt := NULL;
        FOR j      IN 1..col_cnt
        LOOP
          CASE rec_tab(j).col_type
          WHEN 1 THEN
            -- VARCHAR2
            DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
            v_finaltxt := v_finaltxt || p_delimiter || v_v_val;
          WHEN 2 THEN
            -- NUMBER
            DBMS_SQL.COLUMN_VALUE(c,j,v_n_val);
            v_finaltxt := v_finaltxt || p_delimiter || TO_CHAR(v_n_val);
          WHEN 12 THEN
            -- DATE
            DBMS_SQL.COLUMN_VALUE(c,j,v_d_val);
            v_finaltxt := v_finaltxt || p_delimiter || TO_CHAR(v_d_val,p_dateformat);
          ELSE
            v_finaltxt := v_finaltxt || p_delimiter || v_v_val;
          END CASE;
        END LOOP;
        --
        v_finaltxt               := p_prefix || v_finaltxt;
        IF SUBSTR(v_finaltxt,1,1) = p_delimiter THEN
          v_finaltxt             := SUBSTR(v_finaltxt,2);
        END IF;
        --
        FOR i IN 1 .. ceil( LENGTH( v_finaltxt ) / v_chunksize )
        LOOP
          UTL_FILE.PUT_RAW( v_filehandle, utl_raw.cast_to_raw( SUBSTR( v_finaltxt, ( i - 1 ) * v_chunksize + 1, v_chunksize ) ), TRUE );
        END LOOP;
        UTL_FILE.PUT_RAW( v_filehandle, v_CRLF );
        --
      END LOOP;
      UTL_FILE.FCLOSE(v_filehandle);
      DBMS_SQL.CLOSE_CURSOR(c);
    END create_large_csv;
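
    For what it is worth, a minimal usage sketch of the procedure above (the directory object EXPORT_DIR, the query, and the file name are hypothetical, and the procedure is assumed to be compiled in the current schema):

    BEGIN
      -- with p_data_file left NULL, the data rows are written to p_header_file;
      -- header generation is left off (p_gen_header defaults to FALSE)
      create_large_csv(
          p_sql         => 'SELECT * FROM my_998_column_table',
          p_dir         => 'EXPORT_DIR',
          p_header_file => 'big_table.csv',
          p_delimiter   => '|');
    END;
    /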
    
  • Export data then reload

    Hello

    I have Scenario members such as Actual, Forecast, Budget, Forecast 2, FY10 Budget, etc. I want to export data for only Forecast 2 and FY10 Budget, so here is my script:

    ESS_LOCALE English_UnitedStates.Latin1@Binary
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel ALL;
    DataExportColFormat ON;
    DataExportDynamicCalc ON;
    };
    FIX ("FY10 Budget", "Forecast 2")
    DATAEXPORT "File" "," "D:\testexport.txt";
    ENDFIX;

    The file is generated. Then I clear the FY10 Budget and Forecast 2 blocks and try to load the data back into the cube, which has the same dimensions, and I get an error such as:

    «Unknown Item ["Feb", "Mar", "CalQ1", "Apr"...»

    What am I doing wrong here?

    Thank you

    Edited by: Alain on November 15, 2010 07:50

    Edited by: Alain on November 15, 2010 07:51
    I think it is my delimiter: I use a comma in my calc script, but it seems the load expects a tab or a space, which is the problem here. How can I specify a tab or a space as the delimiter in my DATAEXPORT options?

    ^^^ I'm not sure the separator is the issue, but it is certainly worth trying. If you put " " (a space between the quotes) you get a space as the separator. If you put an actual tab character between the quotes (not spaces; it has to be entered in EAS or the editor of your choice, since I can't type a tab in this window) you get a tab as the delimiter. That's all there is to it. There was a thread on this last week; I learn new things all the time on this forum.

    Kind regards

    Cameron Lackpour

  • Export data from the database to an Excel file using a procedure

    Hello

    I need to export data from an Oracle 10g database to an Excel file. I tried this code:

    First, I created a directory as the SYS user and granted permission to the user I am working with:
    create or replace directory PALPROV_REPORTS as 'c:\temp';
    
    grant read, write on directory PALPROV_REPORTS to user12 ;
    Then I run this code:
    declare
        output utl_file.file_type;
    begin
        output := utl_file.fopen( 'user12' , 'emp1.slk', 'w',32000 );
        utl_file.put_line(output, 'line one: some text');
        utl_file.fclose( output );
    end;
    The problem appears as:
    ORA-29280: invalid directory path ORA-06512: at "SYS.UTL_FILE", line 29 ORA-06512: at "SYS.UTL_FILE", line 448 ORA-06512: at line 4

    Note that I am using Windows as the client operating system and Linux for the database server.

    The file will be written on the database server, your GNU/Linux machine, and I am quite sure there is no folder named 'c:\temp' on Linux. It will probably be '/tmp' on a Unix server.

    And to open the file, you must give the logical directory name 'PALPROV_REPORTS' instead of the user name 'user12'.
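
    In other words, a corrected sketch of the block above (assuming the PALPROV_REPORTS directory object has been recreated to point at a path that actually exists on the Linux database server, e.g. '/tmp'):

    declare
        output utl_file.file_type;
    begin
        -- the first argument is the directory OBJECT name, not a user name or an OS path
        output := utl_file.fopen( 'PALPROV_REPORTS' , 'emp1.slk', 'w', 32000 );
        utl_file.put_line(output, 'line one: some text');
        utl_file.fclose( output );
    end;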

  • Export data from a database table to a CSV file with an OWB mapping

    Hello

    Is it possible to export data from a database table into a CSV file with an OWB mapping? I think it should be possible, but I have not managed it yet. Can someone give me some tips on how to handle this? Does someone have a good article on the internet, or a book, where such a problem is described?

    Thank you

    Greetings Daniel

    Hi Daniel,

    But how do I set the variable data file names in the mapping?

    Have a look at this article on the OWB blog:
    http://blogs.Oracle.com/warehousebuilder/2007/07/dynamically_generating_target.html

    Kind regards
    Oleg

  • SQL Developer 2.1.0.63.73 exports DATE as TIMESTAMP

    In my view, this is a bug. When I export a table to an XLS file, the values in DATE columns are saved as if they were TIMESTAMPs:
    e.g. December 31 09 12:00:00 AM is exported as 31 December 09 12.00.00.000000000 AM

    Not really a huge deal until you try to import, in which case you cannot import a TIMESTAMP into a DATE column. First of all, you will get an error about the missing AM/A.M. or PM/P.M. You cannot explicitly specify the date format on import either, as Oracle rejects it since it is not supported. The proper way is to cast to a date, but you cannot do that through the import feature.
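
    For example, outside the import wizard you could stage the spreadsheet values as text and convert them back to DATEs yourself. A minimal sketch (table and column names are placeholders, and the format mask has to match whatever string actually ends up in the exported file):

    -- assumes the spreadsheet was first loaded into a staging table whose
    -- date values are still plain text in a VARCHAR2 column
    INSERT INTO target_table (date_col)
    SELECT CAST(TO_TIMESTAMP(date_text, 'DD-MON-RR HH.MI.SSXFF AM') AS DATE)
    FROM   staging_table;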

    Nevertheless, I think the export function should export DATEs according to the NLS Date Format settings, but it does not.

    If it makes any difference, I am using the 64-bit Windows version of SQL Developer on 64-bit Windows 7 with the 64-bit Oracle client.

    Hello

    I don't know if it is related to my previous problem.
    My SQL Developer gives the correct date format when exporting to Excel, but fails on an insert export.

    Verdin, a member of the SQL Developer team, gave this workaround, which solved my problem:
    >
    You can add the following to sqldeveloper.conf to ensure that the driver does not report DATE columns as the TIMESTAMP column type.

    AddVMOption -Doracle.jdbc.mapDateToTimestamp=false
    >

    as suggested in this thread
    Re: 2.1 EA1: problems with Date type columns

    Hope this helps,

    Buntoro
