Import ASCII data with different date formats

Hello

I need to import data from an ASCII CSV file.

The problem is the date format in this particular data file (.csv).

The date format for days < 10 and months from Jan to Sep is:

YYYY/_M/_D (the character '_' is a single space, not a literal underscore)

The date format for days >= 10 and months from Jan to Sep is:

_M/DD/YYYY

The date format for days >= 10 and months from Oct to Dec is:

YYYY/MM/DD

The date format for days < 10 and months from Oct to Dec is:

YYYY/MM/_D

Possible solution:

I have already created a routine that recognizes the date in the header and uses one of the four (.stp) filters for data processing. That part works.

Next problem:

Each day is split across 2 files: one file from 09:00 to 21:00 of one day, and the other from 21:00 to 09:00 of the following day. This is how I lose data when the day changes from the 9th to the 10th of each month, and when the month changes from Sep to Oct and from Dec to Jan.

I just need to delete the awkward blank space before the dates change to 2 digits, but I do not know how to process the CSV before it is imported into DIAdem (10.2).

My other idea is to recognize such a file and import it twice (once with each filter), but then I need the import positions to line up exactly so that the NoValue data is not a problem. (So far I have made things complicated over a simple blank-space mistake.)

I hope that you have ideas...

Thanks in advance...

I'm not sure I understand; the steps you describe are not entirely clear to me. Here's what I think you want to do:

(1) You want to load data from different files into DIAdem.

To do this, use a call like DataFileLoad("E:\Customer_Requests\caracasnet\log(111231).csv","caracasnet_log","Load").

You call DataFileLoad for each of the files.

(2) You want to concatenate the groups.
This should be no different from what you have done up to now.

(3) You want to store the data in a file (TDM).
To do this, call DataFileSave(...).

Let me know if you have any other questions...

Tags: NI Software

Similar Questions

  • How to read data with different XML schemas within a single connection?

    • I have an Oracle 11g database.
    • I access it via the JDBC thin driver, version 11.2.0.3, same for XDB.
    • I have several tables, each with an XMLType column, all schema-based.
    • There are three different XML schemas registered in the DB.
    • I may need to read XML data from multiple tables.
    • If all the XMLTypes use the same XML schema, there is no problem.
    • If the schemas are different, the second read throws a BindXMLException.
    • If I reset the connection between reads of XMLType columns with different schemas, it works.

    The question is: how can I configure the driver, or the connection, to be able to read data with different XML schemas without resetting the connection (which is expensive)?

    The code to get data from the XMLType is the textbook implementation:

     ResultSet resultSet = statement.executeQuery( sql );
     String result = null;
     while (resultSet.next()) {
         SQLXML sqlxml = resultSet.getSQLXML(1);
         result = sqlxml.getString();
         sqlxml.free();
     }
     resultSet.close();
     return result;

    It turns out that I needed to serialize the XML on the server side and read it as a BLOB, like this:

     final Statement statement = connection.createStatement();
     final String sql = String.format("select xmlserialize(content xml_content_column as blob encoding 'UTF-8') from %s where key='%s'", table, key);
     ResultSet resultSet = statement.executeQuery( sql );
     String result = null;
     while (resultSet.next()) {
         Blob blob = resultSet.getBlob(1);
         InputStream inputStream = blob.getBinaryStream();
         result = new Scanner( inputStream ).useDelimiter("\\A").next();
         inputStream.close();
         blob.free();
     }
     resultSet.close();
     statement.close();

     System.out.println( result );
     return result;
    

    Then it works. Still, I can't get it to work with XMLType in the result set. On the client side the XML unwrapping blows up when switching to another XML schema. A JDBC/XDB problem?

  • Data acquisition with different high sampling rates

    I have a few questions on using the OMB-DAQ-3005 with different high sampling rates.

    For our application, we have 8 analog inputs. Two of them are fast-response signals and should be sampled frequently. We have a quadrature encoder (1000 CPR, running at 1800 rpm) and plan to sample the encoder at 4X. For the fast-response analog inputs, we want to trigger a sample on each pulse or every few pulses, creating a timestamp with the encoder position relative to the index position along with the two fast analog inputs, so that we have data correlating the analog inputs with the encoder position. The other analog inputs we want to measure relatively slowly (for example once every 5 seconds or so).

    How do I go about configuring two (or more) different sampling rates so that I can sample inputs at different frequencies? Also, is there a way to reset the encoder count when the index fires, so that I have the encoder position relative to the index?

    Maybe you'll find someone here who uses the OMB-DAQ-3005, but this forum is really more designed for LabVIEW programming issues.

    I've never used the OMB-DAQ-3005, but out of curiosity I took a glance at the OMB-DAQ-3005 manual. The answers to both of your questions are:

    1. You cannot run multiplexed DAQ hardware (like this one) at independent per-channel rates.

    2. The OMB-DAQ-3005 supports a Z-index feature to reset the counter; look in the documentation for how to configure it in whatever software interface you are using. If you get stuck, try the appropriate support channel for the instrument.

    Best regards

  • Different users accessing the data with different rights

    Hello
    I have just started with Oracle and I don't know if
    this is the right way to do what I intend.

    I want to allow different users to access the data (tables, views, ...) with different rights.
    For example, one user with select-only rights and another with select and insert rights.

    The database is Oracle 10g Express Edition.

    Here are the steps I did:

    -This user will be allowed to select only
    create user UserSelectOnly...;

    -This user will have select and insert privileges
    create user UserSelectAndInsert...;

    -This user only contains data
    create user BaseDB...;

    -creation of test in BaseDB data
    create table tdTest (...);

    -definition of role in BaseDB for right to select
    create role RoleSelectOnly;

    -granting SELECT privilege for RoleSelectOnly
    Grant select on tdTest to RoleSelectOnly;

    -select grant (role) to the user UserSelectOnly
    grant RoleSelectOnly to UserSelectOnly;

    -definition of role in BaseDB to select and insert privileges
    create role RoleSelectAndInsert;

    -grant select and insert privilege for RoleSelectAndInsert
    Grant select, insert on tdTest to RoleSelectAndInsert;

    -grant select and insert (role) to the user UserSelectAndInsert
    grant RoleSelectAndInsert to UserSelectAndInsert;

    My problem is that the user definition (BaseDB) effectively acts as its own database (schema).
    Are there more effective ways to do this in Oracle 10g Express?

    Thank you
    Wilfried

    Hello Wilfried and welcome to the forum.

    First of all, it is commendable that you have taken the time to think about database security.

    My immediate comments:

    You have 3 users:

    -This user only contains data
    create user BaseDB...;

    Very good. Use it as the administrative / data-owner account.
    You might consider locking it when you have no administrative tasks to do in the schema.

    -This user will be allowed to select only
    create user UserSelectOnly...;

    -This user will have select and insert privileges
    create user UserSelectAndInsert...;

    OK, but there is no need to encode their privileges in the names. You could simply call them

    SomeUser
    AnotherUser

    You have two roles

    -definition of role in BaseDB for right to select
    create role RoleSelectOnly;

    -definition of role in BaseDB to select and insert privileges
    create role RoleSelectAndInsert;

    Seems OK; here it is a good thing to describe the privileges in the role name itself.

    Maybe I would call them something like
    RoleSelect or RoleRead or BaseDBSelectRole
    RoleInsert or RoleWrite or BaseDBUpdateRole

    Should UPDATE (and DELETE) be covered as well?

    Another approach would be to separate between the tables.

    For a group of tables, select, insert, update, delete are granted to a role.

    For another group of tables, different privileges are granted to another role.

    That is, there is no single correct general answer; it depends on your application.

    Roles are often used for ease of maintenance. In other words, if you have several users who need
    the same set of privileges, those privileges are maintained through a role.

    Maybe you don't want to use roles at all, but I like them.
    But be careful not to simply grant:

    Select on all tables to a single role
    Insert on every table to another role
    Update on all tables to a third role.

    Unless, of course, that makes sense for your scenario.

    My problem is that the user definition (BaseDB) effectively acts as its own database (schema).
    Are there more effective ways to do this in Oracle 10g Express?

    Not sure what you mean. In Oracle, 'schema' and 'user' are often used to mean the same thing.

    In your case, BaseDB is the owner of (all) the tables; no problem.
    As mentioned, you might choose to lock the account.
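
    For reference, a minimal sketch of locking and unlocking the data-owner account (illustrative only; run as a DBA):

     alter user BaseDB account lock;
     -- ... later, when schema maintenance is needed:
     alter user BaseDB account unlock;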

    The main thing is that no application or ad-hoc user should log in as BaseDB,
    and the users of your system should have exactly the privileges they need to do their jobs,
    nothing more, nothing less.

    Regards,
    Peter

  • Order by date with a format mask column

    10g - 10.2.0

    Hello

    This is my query:
    select to_char(exp_Date,'Mon-YYYY') dt, count(*) from exp_main
    where exp_type like 'Income%Photo%'
    group by to_char(exp_Date,'Mon-YYYY')
    order by exp_date
    
     
    When I run this I get: not a GROUP BY expression

    If I remove the order by, it works fine.

    Is it possible to order by the date, given that the output of the query is a character string and therefore sorts as text?

    I searched online and most suggestions were to use ORDER BY column_name, but that does not work for me.

    Thank you!
    Ryan

    Hello

    Keep the date information as a DATE, except (if necessary) for display. That means GROUP BY and ORDER BY the DATE, not a string:

    SELECT    TO_CHAR ( TRUNC (exp_Date, 'MONTH')
                  , 'Mon-YYYY'
                )          AS dt
    ,       COUNT (*)           AS cnt
    FROM        exp_main
    WHERE        exp_type     LIKE 'Income%Photo%'
    GROUP BY  TRUNC (exp_Date, 'MONTH')
    ORDER BY  TRUNC (exp_date, 'MONTH')
    ;
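
    Side note, not from the original reply: aggregate functions are allowed in the ORDER BY clause of a grouped query, so keeping the original GROUP BY expression and ordering by MIN(exp_date) also avoids the error:

     SELECT   TO_CHAR (exp_Date, 'Mon-YYYY')  AS dt
     ,        COUNT (*)                       AS cnt
     FROM     exp_main
     WHERE    exp_type LIKE 'Income%Photo%'
     GROUP BY TO_CHAR (exp_Date, 'Mon-YYYY')
     ORDER BY MIN (exp_date)
     ;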
    
  • Extracting SQL data in a specific format

    All,

    I have Oracle database version 10.2.0.4, and I need to extract data from a table in the format below:

    1. the first 3 positions of the postal code, followed by
    2. the first letter of the name, followed by
    3. the first 2 consonants in the street name, followed by
    4. the digit 2 (for companies)

    I can use the SUBSTR function for 1 and 2 above.
    For 3, I tried REGEXP_SUBSTR, but the output includes vowels too. I don't know how to proceed. Is there another function, or another way to do it?

    I am concatenating the 3 columns and displaying the data as below, for example:
    for example
    Col1            Col2         Col3     
    FLOOR INC  24540       67 FRANKLIN TPKE STE 114 
    
    Displayed as 
    245FFR2     
    234AAS2 
    Help, please.
    -- testdata:
    with yourtable as
    (
      select 'abcdefg' name from dual union all
      select '1aBexea' name from dual
    )
    -- query for first two consonants:
    select substr(regexp_replace(regexp_replace(name,'[^a-zA-Z]',''),'[aeiouAEIOU]',''),1,2)
    from yourtable;
    

    Explanation:
    - The inner regexp removes everything except letters.
    - The outer regexp removes vowels and keeps the consonants.
    - SUBSTR selects the first two consonants.
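
    For completeness, here is a sketch of the full concatenation; the column names zip_code, name and street are assumptions, not from the original post:

     select substr(zip_code, 1, 3)
            || substr(name, 1, 1)
            || substr(regexp_replace(regexp_replace(street, '[^a-zA-Z]', ''), '[aeiouAEIOU]', ''), 1, 2)
            || '2' as formatted_key
     from   yourtable;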

  • Validation of numeric data with a format mask

    Hello

    I have the following situation:

    a field 'Amount', COMP (11.0), in a table;
    a form for this amount with a P8_AMOUNT item and a format mask, e.g. 1234 becomes 1,234.

    I entered a large amount of data.

    Now I want to build a validation,
    but I get a conversion error (character to number): APEX takes the amount field value 1234 and makes it 1.234,
    and that is why I can no longer use TO_NUMBER (it causes the conversion error).

    How do I convert the parameter to a number?

    Any response will be appreciated.

    Thanks in advance.

    Leoa

    Hello

    TO_NUMBER(string) can have problems if there are non-numeric characters; however, you can supply a format string so that they can be handled in the conversion:

    TO_NUMBER(string, '9G999')
    

    Andy
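
    As an aside (a sketch that is not part of Andy's reply; the explicit NLS parameter and the sample literal are illustrative assumptions), the format mask can be combined with NLS_NUMERIC_CHARACTERS so the group separator is unambiguous:

     -- assumes '.' as the decimal character and ',' as the group separator
     select to_number('1,234', '9G999', 'NLS_NUMERIC_CHARACTERS=''.,''') as amt
     from   dual;
     -- returns 1234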

  • regexp: how to deal with different date formats?

    The date comes as a VARCHAR2 with '-' and ':' as separators for the date part and the time part respectively, but it may come in different formats, such as:

    1993-05-17
    1993-05-17 13:04:23
    1993-05-17 13:04
    1993-05-17 13:4
    1993-05-17 13:04
    1993-05-17 13:4:2
    1993-5-17-13:4:2
    1993-5-7-13:4:2

    and so on. The final date format should be 'YYYY-MM-DD HH24-SS'. Is there an intelligent way to deal with the different date formats and separators above and convert them to the final format using regexp, so that it is as compact and universal as possible given the assumptions/examples above?

    Thank you

    As others have said, the smartest way is to keep your input in one fixed format and store your data in a DATE column.
    But here is a simple way to deal with this mess, without regexp:

    with test as (
    select '1993-05-17' d from dual
    union all select '1993-05-17 13:04:23' from dual
    union all select '1993-05-17 13:04' from dual
    union all select '1993-05-17 13:4' from dual
    union all select '1993-05-17 13:4:2' from dual
    union all select '1993-5-17 13:4:2' from dual
    union all select '1993-5-7 13:4:2' from dual
    union all select '1993-05-17 1:4 PM' from dual
    )
    select d
         , to_date( translate( d, 'xampAMP', 'x' )
                  , 'yyyy-mm-dd hh24:mi:ss'
                  )
         + case when instr( upper( d ), 'P' ) > 0 then 0.5 else 0 end cd
    from test
    
  • Trial of LR 6 CC: import goes well with the various cards, but one card does not work; USB HDD with only images, card with 9571 photos / 115 GB, sometimes only 300 photos, then wait all night, nothing; tried the separate subfolders, only two (including su subfo

    Trial of LR 6 CC: import goes well with different cards, but one card does not work. It is a USB HDD with only images, a card with 9571 photos / 115 GB; sometimes only 300 photos import, then I wait all night and nothing. I tried separate subfolders; only two subfolders (including a sub-subfolder) imported well, 175 and 125 pics, the other subfolders do not work. I opened LR as administrator: no difference. i7, import stops at about 60-70%. On import only about 300 pictures are visible, the others are empty squares. Any solution?

    Hi willemm,

    Trial LR 6 CC import goes well with different cards, but one card does not work.

    Could you please elaborate on the issue you are facing?

    What is the image format you import?

    USB hard drive with only images, card with 9571 photos / 115 GB, sometimes only 300 photos

    I would not recommend importing a 115 GB library at once.

    Kind regards

    Assani

  • Import from a full dump, but data-only

    Dear Experts,

    When I import a dump successfully, I drop the existing schema (SQL> drop user username cascade;) and import the dump with "impdp system/...". I would now like to import a dump into an existing instance, but import only the data and leave the current metadata, packages and other objects intact and unchanged on that existing instance.

    1. Do I need to drop the user prior to the import, given the requirements above?

    2. If I have to drop the user, what should the script be?

    3. For the import itself, what parameters should I use?

    4. What do I need to take into account before doing the import?

    Your help and advice is greatly appreciated.

    ARO
    Hades

    TheHades0210 wrote:
    Hello

    Thank you for the answers.

    My concern is that the data in the dump is up to date but not the metadata, while the destination instance is used for development, where the metadata is up to date but not the data.

    Is there a way to update the data on that instance without changing its metadata?

    Thank you for your time.

    Kind regards
    Hades

    I already mentioned using the 'CONTENT' option; see below. It will import only the data when you use the "DATA_ONLY" keyword.

    $impdp help=y
    
    Import: Release 11.2.0.2.0 - Production on Fri Feb 15 01:46:43 2013
    
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    
    .....
    
    "CONTENT"
    Specifies data to load.
    Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
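
    For illustration only, a data-only import could look like the sketch below; the directory, dump file and schema names are assumptions, not values from this thread:

     $ impdp system DIRECTORY=dp_dir DUMPFILE=full_dump.dmp SCHEMAS=scott CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND LOGFILE=data_only_imp.log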
    
  • Showing a value in two different formats

    Hello
    I need to show two different formats based on the value selected by the user. Is it possible to show the same field with different formats?

    my code is like this

    <?choose:?>

    <?when:$Measures="Killo"?> <?price*.001?> <?end when?>

    <?otherwise:?> <?Price?> <?end otherwise?> <?end choose?>


    For example, if I have a value like 40.42, it should display as-is unless the measure is Killo;
    if it is Killo, then it should display 0.04042.

    Please let me know,

    Thank you
    ANU

    Published by: user7498756 on Sep 6, 2012 12:43


    I tried that already; it still shows only the value from the condition that is true.

  • FTP adapter to poll multiple files (different formats)

    Hello

    We have a requirement to poll and process multiple files from an FTP location (either process all of them, or reject all of them if any one of the files fails).
    Using the FTP Get File (opaque schema) adapter, we are able to read three different file formats and can get the file name (using the fileName header variable).

    We then want to process the contents of each file based on the file name we get from the step above.

    (1) How do we convert the opaque file contents to the specific schema (e.g. file1.csv, file2.csv and file3.csv with different formats)?
    (2) Given that the requirement is to process all files at the same time, can we use correlation in the poller process to route to a single controller process (one controller process instance for all files)?


    Any help would be appreciated.



    Thank you...

    Hello

    If you go that route, only one branch will be triggered at a time. To process all files in a single instance, you may have to use a synchronous read operation in the file adapter for the remaining 2 files, and the instance will be triggered by the file adapter's first read operation.

  • How can I get uniform date and time output from Get-Stat on servers located in regions with different regional settings?

    Hello everyone,

    I use a PowerShell script against different vCenters all over the world to sample performance data. Unfortunately, the Windows locale settings for time and date differ between the vCenter servers. Because of this I get different date and time output in the CSV. Here is an example that reads a virtual machine's CPU performance:

    Get-Stat -Entity $vm.name -Stat "cpu.usage.average" -IntervalMins 5 -Start (Get-Date).AddDays(-1).AddMinutes(-5) -Finish (Get-Date) -MaxSamples 288 | Select-Object Timestamp, Value, Unit | Export-Csv

    A vCenter Server located in the United States, with US time settings (12-hour AM/PM) and date settings (day/month/year):

    "Timestamp","Value","Unit"
    "2010-06-27 12:00 PM","2.75","%"
    ...

    A vCenter Server located in Europe, with European regional settings (24-hour time) and date settings (day.month.year):

    Timestamp, Value, Unit
    "21.06.2010 20:00:00", "3.06", "%"
    ...

    To import these data into our central database, the formats must be uniform.

    Does anyone have an idea how to get a uniform format?

    Thanks in advance

    Regards,

    Oliver

    Why don't you use the Use-Culture solution: Use-Culture -Culture culture -Script {scriptblock}?

    You do all the writing to your database in one selected culture (for example the en-US culture).

    That way all your date and numeric formats have the same layout.

    ____________

    Blog: LucD notes

    Twitter: lucd22

  • Import data from different DBs to HFM Application using FDQM

    Hi gurus

    1. How do we import data from different databases into an HFM application using FDQM?
    Do we need to write integration scripts in FDQM, or is there an alternative method we should use?

    Regards,
    Dev

    Data import takes a while to explain.

    To be more precise:
    1. You would establish a connection between your source systems and the destination, from which you pull the data.
    You can do this by registering integration adapters for the source systems.
    2. You would then set up some import formats, maps and control tables.
    Please go through the Administrator's Guide.

    One of the experts will help you better understand.

    Kind regards

    David Martin

  • Convert to Date format data imported from MS SQL Server

    I imported data from MS SQL Server. The 'Date' column was received in numeric format, e.g. 41017.6361109954. How can I convert this to a date in Oracle SQL?

    If I import the same data into Excel and change the column type to Date, it works. But in Oracle I tried the TO_DATE function with various parameters and it did not work.

    Published by: XAVER 22 April 2012 02:31
    select timestamp '1970-01-01 00:00:00' + numtodsinterval(41017.6361109954,'day') from dual;
    
    TIMESTAMP'1970-01-0100:00:00'+NUMTODSINTERVAL(41017.6361109954,'DAY')
    ---------------------------------------------------------------------------
    20-APR-82 03.15.59.990002560 PM
    
    SQL> 
    

    SY.
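
    If the value is actually an Excel / SQL Server style day serial counted from 1899-12-30 (an assumption, not something stated in the thread), the same pattern with that base date gives a result consistent with the April 2012 posting date:

     select timestamp '1899-12-30 00:00:00' + numtodsinterval(41017.6361109954, 'day')
     from   dual;
     -- roughly 18-APR-12 03.15.59 PM, under that assumption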
