UTL_FILE import csv file

Hello

I am writing a procedure that uses UTL_FILE.GET_LINE to read CSV files and then does some processing before inserting the data into a table.

The CSV file has four columns. The data in the third column contains a newline, so when I read the file I get a wrong value.

The procedure cannot determine the length of the 3rd field; it cannot find the 3rd comma because this column contains a line break.

To work out the length of each column between the ',' delimiters, part of the code I use is below.

DECLARE
  starts          UTL_FILE.FILE_TYPE;
  input_String    VARCHAR2(32767);
  delChar         VARCHAR2(1) := ',';
  v_Pos           PLS_INTEGER;
  v_startPos      PLS_INTEGER;
  v_lenString     PLS_INTEGER;
  v_compname      VARCHAR2(4000);
  v_comptype      VARCHAR2(4000);
  v_notes         VARCHAR2(4000);
  v_comptypemodel VARCHAR2(4000);
BEGIN
  starts := UTL_FILE.FOPEN('D:\TEST', 'CT.CSV', 'R', 32765);

  LOOP
    UTL_FILE.GET_LINE(starts, input_String);

    -- first field
    v_Pos       := INSTR(input_String, delChar, 1, 1);
    v_lenString := v_Pos - 1;
    v_compname  := SUBSTR(input_String, 1, v_lenString);

    v_startPos := v_Pos + 1;

    -- second field
    v_Pos       := INSTR(input_String, delChar, 1, 2);
    v_lenString := v_Pos - v_startPos;
    v_comptype  := SUBSTR(input_String, v_startPos, v_lenString);

    v_startPos := v_Pos + 1;

    -- third field
    v_Pos       := INSTR(input_String, delChar, 1, 3);
    v_lenString := v_Pos - v_startPos;
    v_notes     := SUBSTR(input_String, v_startPos, v_lenString);

    v_startPos := v_Pos + 1;

    -- last field - there is no delimiter after the last field
    v_Pos           := LENGTH(input_String) + 1;
    v_lenString     := v_Pos - v_startPos;
    v_comptypemodel := SUBSTR(input_String, v_startPos, v_lenString);

    -- (processing and the INSERT into the target table happen here)
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    UTL_FILE.FCLOSE(starts);
END;

My CSV file looks like this when it is opened with Notepad++:

CA COOLER VSM 89 E, MSM 89 E, 'EL. AC-MOTOR UNITS.

ABB WADA 500L2L BSMH 11000V/1720KW/105 A/3584 RPM 60 HZ S1
COOLING WATER FLOW 12.3 m3/h
AMBIENT TEMP. 45 DEG", 123421
ABB WADA 500L2L BSMH 11000, V/1720KW/105 A/3584 RPM 60 HZ, EL S1. AC-MOTOR UNITS. ABB WADA, EL. AC-MOTOR UNITS.

The value of v_Pos is 0 and v_lenString is -30 for the 3rd field.

Hope someone can point me in the right direction on how to solve this problem.

Thanks in advance.

OK, my apologies, I didn't see the real problem you're trying to solve in the first post (it helps us if you format code/data using a monospaced font such as Courier in the Advanced Editor).

Here is the response from Tom Kyte...

https://asktom.Oracle.com/pls/Apex/f?p=100:11:0:P11_QUESTION_ID:2818047000346046084

While there is an option in 11gR2 for an external table preprocessor that lets a program process the file before it is loaded (which can be useful for decompressing files etc. - see Preprocessing: Preprocess External Tables), it probably won't help in your case, because you won't easily be able to determine which 'new line' characters to substitute with something else so that you can tell the difference when loading the data.

Can whatever creates the CSV data be modified to replace the line-break characters inside the text with something else, or to use a different delimiter for the end of a record? That would be the ideal solution.
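Failing that, a workaround on the PL/SQL side is to keep reading lines and gluing them together until the buffer contains a complete record before parsing it. Below is a minimal sketch, assuming a complete record always contains exactly three commas (i.e. the notes text itself never contains a comma); the directory and file names are taken from the question and the variable names are just placeholders.

DECLARE
  f        UTL_FILE.FILE_TYPE;
  v_line   VARCHAR2(32767);
  v_record VARCHAR2(32767);
BEGIN
  f := UTL_FILE.FOPEN('D:\TEST', 'CT.CSV', 'R', 32767);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(f, v_line);

      -- glue continuation lines back onto the current record with a space
      IF v_record IS NULL THEN
        v_record := v_line;
      ELSE
        v_record := v_record || ' ' || v_line;
      END IF;

      -- assumed rule: a complete record has exactly 3 commas;
      -- only parse (and then clear the buffer) once they are all present
      IF REGEXP_COUNT(v_record, ',') >= 3 THEN
        -- parse v_record with INSTR/SUBSTR as in your code, then:
        v_record := NULL;
      END IF;
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      NULL;  -- end of file reached
  END;
  UTL_FILE.FCLOSE(f);
END;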

Tags: Database

Similar Questions

  • Not possible to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the VM creation date

    I am not able to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the VM creation date. My question is whether the comparison in the variable $VmCreated7DaysAgo is correct: $_.CreatedOn -lt $CDate7.

    # #SCRIPT_START

    $file = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\VM-Repo.csv"

    $Import = Import-Csv $file

    $VMCreatedLast7RDayRepoFile = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\Last7Days.csv"

    $start7 = (Get-Date).AddMonths(-1)

    $CDate7 = $start7.ToString('MM/dd/yyyy')

    $VmCreated7DaysAgo = $Import | Select-Object -Property Name, Powerstate, vCenter, VMHost, Cluster, File, Application, CreatedBy, CreatedOn, NumCpu, MemoryGB | Where-Object {$_.CreatedOn -lt $CDate7} | Sort-Object CreatedOn

    $TotalVmCreated7DaysAgo = $VmCreated7DaysAgo.Count

    $VmCreated7DaysAgo | Export-Csv -Path $VMCreatedLast7RDayRepoFile -NoTypeInformation -UseCulture

    Write-Host "$TotalVmCreated7DaysAgo VMs created in 7 days" -BackgroundColor Magenta

    Invoke-Item $VMCreatedLast7RDayRepoFile

    # #SCRIPT_END

    You can use the New-TimeSpan cmdlet in the Where clause; it returns the time difference between two DateTime objects.

    An example of this cmdlet:

    New-TimeSpan -Start (Get-Date).AddDays(-7) -End (Get-Date) | Select-Object -ExpandProperty Days

    In your case, you could do

    Where-Object {(New-TimeSpan -Start ([DateTime]$_.CreatedOn) -End $start7).Days -gt 7}

    But beware of negative numbers.

  • Numbers does not import CSV files correctly

    Using Numbers v3.6.2 under OS X El Capitan v10.11.6, CSV files are not imported correctly. A file with 8 columns appears in Numbers with just two columns: the first containing the data of 7 columns and the second containing the remaining column's data. This does not happen with Numbers v2.1, which I have on another Mac running OS X 10.6.8.

    Well, I know one answer would be to keep doing this on the older Mac, but I'm trying to stop using it, since all the data I need is available on the other, later Mac.

    If Apple could once provide a program that worked properly, why on Earth can't a later version do the same thing? It is a fundamental requirement and I don't want to fiddle around just to get it to work as it should.

    It might be useful to provide an example of the file, or perhaps to share the data... Nobody here can change Numbers; we are all users like you.  The only options open to us are solutions using what is already there, or working around issues by other means.

    You can still provide feedback to Apple using the menu item "Numbers > Provide Numbers Feedback".

  • Import CSV file in contacts

    I followed the procedure to import contacts into Hotmail to the letter, but I keep getting the same error message: "There was a problem importing your contacts, try to re-import." I tried again and again. Two questions for you: 1. What is the problem? 2. How do I fix it? It's frustrating me, and your phone technical support was no help either.

    This is a question that should really be addressed in the Windows Live forum, since the problem occurs with your Hotmail account.

    A thread in the WindowsLiveHelp forum that may be of particular interest relates to how the field names are defined in the CSV file (apparently, according to various responses, the CSV field names must match the fields in Hotmail).

    How to import contacts from a .CSV file?
    http://www.windowslivehelp.com/thread.aspx?ThreadId=f5a002fc-02FC-489F-958b-cab40ebfb721&page=1

  • Import CSV file cell text in Memo field in FDM

    Hello world

    I am trying to import data into FDM Memo fields. I have a source CSV file that is imported via the batch loader. I suspect that I have to adjust a script but I don't know which one. Can someone give me some best-practice advice regarding:

    - What should the layout of the source CSV file be? (It is an extract from an HFM app, so not very flexible.)

    - Which script will need to be adjusted to import data into Memo fields?

    Some information:

    The source application is HFM, which produces a CSV file for import into FDM. The source data contains the amounts and the cell text for those amounts. FDM is used as an ETL tool to load the data into another HFM application. I understand that the LOADB action script exports Memo field data to cell text, provided that integration is enabled. So the second part of this data load should work.

    Hello

    There are different options to achieve this.

    You can import your column with the cell information into an attribute dimension. Then, before the data is committed to the main table (ImportAction event script / PostWorkToMainProcess sub-event), you can create memo items based on this attribute and the data being loaded. You can use SQL to insert into the memo item tables or use fMemoAddItem (ArchiveMgr class).

    You must enable "Text the loading cell" of the adapter in the order that these memorandum items are loaded in the form of texts of cell in HFM.

    Hope that clarifies.

  • Need help importing a CSV file with commas inside the data

    I have a Linux script that gets the CSV files from a remote server and imports the data into a table using SQL*Loader.

    The problem I have is that the data in the CSV files sometimes contains commas in the text field, which interferes with the import, and I have no control over what is in the file.

    my table:

    CREATE TABLE my_tab
    (TIME_STAMP     DATE,
     REQUEST_IP     VARCHAR2(30),
     USER_ID        VARCHAR2(30),
     FACILITY_ID    VARCHAR2(255),
     SUBFACILITY_ID NUMBER,
     DETAIL         VARCHAR2(255)
    );

    example of a CSV file:

    27032011 232708,162.108.20.61,user123,cstmr_view_hlr_history,0,Viewed history with 10 results per page for 1234567890
    27032011 232737,162.108.20.61,user123,cstmr_view_customer,0,Facility [display] selected
    27032011 232744,162.108.20.76,user123,cstmr_add_gprs,0,Facility [Add x] selected
    27032011 232759,162.108.20.94,user456,cstmr_hlr_request,0,Facility [x orders] selected
    27032011 232806,162.108.20.94,user123,cstmr_hlr_request,2,Customer note added: note [x], MSISDN [1234567890]
    27032011 232806,162.108.22.96,user789,cstmr_hlr_request,0,Queue update: Action [46], IMSI [1234567890] old [, MSISDN IMSI [1234567890]
    27032011 232815,162.108.20.67,user789,cstmr_view_customer,0,Facility [display] selected
    27032011 232822,162.108.20.67,user123,cstmr_view_customer,5,Screen 'display customer details': MSISDN [1234567890]
    27032011 232702,162.108.20.57,user456,cstmr_hlr_request,0,Queue update: Action [45], IMSI [1234567890] old [, MSISDN IMSI [12345678901], AFN [], [PDPREC]
    27032011 232825,162.169.22.108,user456,adm_login,1,Successful connection: user [user_name]
    27032011 232829,162.169.22.108,user456,cstmr_view_customer,0,Facility [display] selected

    How can I get around the commas in the text?

    Edited by: cinnamon on April 4, 2011 08:07

    There is a way to do it, but (a) it depends on a few assumptions, and (b) you have to jump through hoops to do it.

    First of all, this only works if there are no commas in request_ip, user_id or facility_id. (If there are, then you're stuck, because there is no way to determine which commas are part of the data and which are separators.)

    Second, you must know of some particular character - say | - that never appears in the detail field.

    If you can make both of these assumptions, then what you have to do is:
    (1) Create a table that consists of a single VARCHAR2 column large enough to hold the longest line of your input data.
    (2) Import your data into this table.
    (3) Replace the first five commas in each row of the table with | (or whatever your "unused" character is).
    For example:

    UPDATE temp_data_table
    SET row_text = REPLACE(SUBSTR(row_text, 1, INSTR(row_text, ',', 1, 5)), ',', '|') || SUBSTR(row_text,  INSTR(row_text, ',', 1, 5) + 1);
    COMMIT;
    

    (4) Write a SQL INSERT command to populate your target table from the values separated by the | characters, for example as sketched below.
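    A minimal sketch of step 4, assuming the staging table is temp_data_table with its single column row_text (as in the UPDATE above), that no field is empty, and that the leading timestamp really is in DDMMYYYY HH24MISS format:

    INSERT INTO my_tab (time_stamp, request_ip, user_id, facility_id, subfacility_id, detail)
    SELECT TO_DATE(REGEXP_SUBSTR(row_text, '[^|]+', 1, 1), 'DDMMYYYY HH24MISS'),
           REGEXP_SUBSTR(row_text, '[^|]+', 1, 2),
           REGEXP_SUBSTR(row_text, '[^|]+', 1, 3),
           REGEXP_SUBSTR(row_text, '[^|]+', 1, 4),
           TO_NUMBER(REGEXP_SUBSTR(row_text, '[^|]+', 1, 5)),
           REGEXP_SUBSTR(row_text, '[^|]+', 1, 6)   -- everything after the 5th |, commas intact
    FROM   temp_data_table;
    COMMIT;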

    -Don

  • Import CSV file and convert it to an array

    Hi all

    I'm working on a site that imports a published CSV (comma separated values) file via ActionScript 3.0 URLLoader().  Right now I'm just trying to get ActionScript to successfully take in the data imported from the CSV file as an array; the CSV file has a single cell that contains "athleticMaroon, charcoal, colonialBlue, kellyGreen, fullColor".

    Here is the code I use:

    // create the array
    var shirtLiveIntense_btn_Colors:Array = new Array();

    // run the CSV data import
    URLLoaderCSV();

    shirtLiveIntense_btn.addEventListener(MouseEvent.CLICK, selectingLogo);

    function selectingLogo(e:MouseEvent):void {
        trace("current logo");
        var colorButtons:Array = this[e.currentTarget.name + "_Colors"];
        for (var i:uint = 0; i < colorButtons.length; i++) {
            colorButtons[i].ivar = i;
            colorButtons[i].addEventListener(MouseEvent.CLICK, shirtColorOption);
        }
    }

    // CSV data import function
    function URLLoaderCSV():void {
        var loader:URLLoader = new URLLoader();
        configureListeners(loader);
        var request:URLRequest = new URLRequest("https://docs.google.com/spreadsheet/pub?hl=en_US&hl=en_US&key=0AlJnOKOffTSxdFk0RVlEUTVHeF9DMHZfZ0JzSkJjZFE&single=true&gid=1&output=csv");
        try {
            loader.load(request);
        } catch (error:Error) {
            trace("Unable to load requested document.");
        }
    }

    function configureListeners(dispatcher:IEventDispatcher):void {
        dispatcher.addEventListener(Event.COMPLETE, completeHandler);
        dispatcher.addEventListener(Event.OPEN, openHandler);
        dispatcher.addEventListener(ProgressEvent.PROGRESS, progressHandler);
        dispatcher.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
        dispatcher.addEventListener(HTTPStatusEvent.HTTP_STATUS, httpStatusHandler);
        dispatcher.addEventListener(IOErrorEvent.IO_ERROR, ioErrorHandler);
    }

    function completeHandler(event:Event):void {
        var loader:URLLoader = URLLoader(event.target);
        trace("completeHandler: " + loader.data);
        shirtLiveIntense_btn_Colors.push(loader.data);
    }

    function openHandler(event:Event):void {
        trace("openHandler: " + event);
    }

    function progressHandler(event:ProgressEvent):void {
        trace("progressHandler loaded: " + event.bytesLoaded + " total: " + event.bytesTotal);
    }

    function securityErrorHandler(event:SecurityErrorEvent):void {
        trace("securityErrorHandler: " + event);
    }

    function httpStatusHandler(event:HTTPStatusEvent):void {
        trace("httpStatusHandler: " + event);
    }

    function ioErrorHandler(event:IOErrorEvent):void {
        trace("ioErrorHandler: " + event);
    }

    Here is the result:

    openHandler: [event type = "open" bubbles = false cancelable = false eventPhase = 2]

    progressHandler loaded: 57 total: 0

    httpStatusHandler: [HTTPStatusEvent type = "httpStatus" bubbles = false cancelable = false eventPhase = 2 status = 200]

    completeHandler: athleticMaroon, colonialBlue, kellyGreen, charcoal, fullColor

    current logo

    ReferenceError: Error #1056: Cannot create property ivar on String.

    at main_fla::MainTimeline/selectingLogo()

    Reviewing the output, I can see that it is clearly loading the data from the CSV file correctly, but what I think it is doing is importing the data as a single string, i.e. 'athleticMaroon, charcoal, colonialBlue, kellyGreen, fullColor', and pushing that onto shirtLiveIntense_btn_Colors:Array = new Array().  But, as I see from the error, the selectingLogo(e:MouseEvent) function cannot handle the array because it contains a single string.

    If I replace shirtLiveIntense_btn_Colors.push(loader.data); with shirtLiveIntense_btn_Colors.push(charcoal, colonialBlue, kellyGreen, athleticMaroon, fullColor); everything works like a charm, but I need the array to be populated dynamically from the CSV file data.

    Can anyone help me get the imported CSV data pushed into the array as individual, accessible strings?

    Thank you!


    The error is caused by:

    var colorButtons:Array = this[e.currentTarget.name + "_Colors"];

    for (var i:uint = 0; i < colorButtons.length; i++) {
        colorButtons[i].ivar = i;
        colorButtons[i].addEventListener(MouseEvent.CLICK, shirtColorOption);
    }

    your array is an array of strings.  If you're trying to use those strings to reference objects on the timeline that contains your code, use array notation:

    var colorButtons:Array = this[e.currentTarget.name + "_Colors"];

    for (var i:uint = 0; i < colorButtons.length; i++) {
        this[colorButtons[i]].ivar = i;
        this[colorButtons[i]].addEventListener(MouseEvent.CLICK, shirtColorOption);
    }

  • Error importing CSV files with "hidden" characters using the external Table

    Hi folks,

    Bit of a strange one here.

    We are used to using the external table method to load data from CSV files into the database, but a recent event presented us with a problem.

    We have received some CSV files that "look like" regular CSV files, but Oracle will not load them.

    When we looked at the CSV using VIM on a UNIX machine, we saw the following 'hidden' character between each regular character in the file.
    ^@
    So a string that looks like this when opened in Excel/WordPad etc.
    "TEST","TEXT"
    looks like this when examined with VIM:
    ^@"^@T^@E^@S^@T^@"^@,^@"^@T^@E^@X^@T^@"
    Has anyone encountered this before?

    Thank you very much

    Simon Gadd
    Oracle 11g 11.2.0.1.0

    Hi Simon,

    ^@ represents the NUL character (0x00).
    So, most likely, you have a Unicode-encoded file.

    You need to specify the character set in the record specification (and if necessary the byte order mark), for example:

    CREATE TABLE ext_table
    (
      col1 VARCHAR2(10),
      col2 VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dump_dir
      ACCESS PARAMETERS
      (
       RECORDS DELIMITED BY '
    ' CHARACTERSET 'UTF16'
      FIELDS TERMINATED BY ','
      )
      LOCATION ('dump.csv')
    )
    REJECT LIMIT UNLIMITED;
    

    http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/et_params.htm#i1009499

  • How to create a tab-delimited CSV file using Oracle UTL_FILE?

    How do I create a tab-delimited CSV file using Oracle UTL_FILE? Please provide a code sample.

    This isn't a problem with Oracle, it is a problem with the way you open the data in Microsoft Excel.

    In Excel, you want to do the following (it may vary slightly depending on your version)...

    Office 2010...

    1. go in the Ribbon "Data".

    2. click on 'text '.

    3. Locate and select your file, and then click "import."

    4 step 1 of the wizard - choose "Delimited", then click on "next >".

    5. step 2 of the wizard - choose "Tab" as the delimiter and click on "next >".

    6. step 3 of the wizard - define types of column as needed (if necessary) and click on "Finish".

    7. check where you want the data in the worksheet.

    The data now loads into individual cells as you expect.

    If you just double-click the CSV, Excel apparently assumes that it will be comma-separated and does not recognize tabs as separators, unlike when you rename the file with a .xls extension, where it examines the file, complains that the content is not .xls, asks you to confirm that you want to continue loading, and then intelligently recognizes the tabs and formats it for you.

    As I said, not a problem with Oracle, just a problem with the MS Excel software.
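    For the UTL_FILE side of the question, a minimal sketch of writing a tab-delimited file is below; the directory object name, file name and the queried table/columns are only illustrative placeholders, not anything from this thread.

    DECLARE
      f UTL_FILE.FILE_TYPE;
    BEGIN
      -- 'DUMP_DIR' is an assumed directory object; adjust to your environment
      f := UTL_FILE.FOPEN('DUMP_DIR', 'emp.tsv', 'W', 32767);
      FOR r IN (SELECT empno, ename, sal FROM emp) LOOP
        -- CHR(9) is the tab character used as the field delimiter
        UTL_FILE.PUT_LINE(f, r.empno || CHR(9) || r.ename || CHR(9) || r.sal);
      END LOOP;
      UTL_FILE.FCLOSE(f);
    END;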

  • Importing an address book CSV file: in the pop-up window, only the left list of columns is displayed

    See the attached pictures.
    A working window shows the 'left & associations' columns
    - the columns are displayed so they can be matched to the items.

    What I get is the 'left only' column
    - only the items are displayed, and the left column is empty.

    Continuing the process, there was a message that
    an error happened during import and that no address was imported.
    (Korean message translated)

    So I can't import the address file, which is a CSV file.

    I don't know whether the problem only occurs on the Korean version;
    I have only tried it on the Korean version.

    How can I solve this problem?

    No photos attached.

    Open your CSV in a spreadsheet and make sure that it really is a CSV file. Sounds to me like the first line doesn't have field names in it, or they are not set properly.

  • import a csv file into AddressBook

    I'm migrating from Outlook to Thunderbird. Importing all the emails and addresses does not work. So I exported my e-mail addresses from my server to a CSV file. I imported the CSV file into Thunderbird, but I have to OK each address on import. Is it possible to import them in one click?

    If you export all your contacts in a single csv file, you should be able to import the file in a single step, although you may need to edit the csv file before you do:

    https://support.Mozilla.org/en-us/questions/1012084

  • Address book import from a CSV file gives an empty address book

    I have a .csv file from a Windows 'Contacts' window. It contains four hundred entries, each consisting of a name and e-mail address. The first line of the .csv file says "Name, Email"; the entries are separated by returns and the fields within them by commas.

    When I import it into Thunderbird, I say to import an address book from the file and to import only the email address and display name fields. When the import runs, it is very fast and no errors are displayed. The new address book appears among the Thunderbird address books, but it is empty.

    Thank you
    Joe Nelan

    To import an address book, try the following steps:

    From Thunderbird (TB), click "Address Book" (or 3-bar menu -> Tools -> Address Book).

    In the list of address books in the left pane, click the one you want to import into (or use the "Personal Address Book").

    On the menu bar, select Tools -> Import. A new "Import" window opens.

    Click on "Address books" and "Next"

    Select "Text file", click "Next".

    Near the lower-right corner, change LDIF to "Comma Separated".

    Navigate to the folder containing your CSV file. Left-click on it once. Click "Open".

    For CSV, there is no standard for the number or order of the fields. The screen you see allows you to 'match' your input with TB's fields.
    With respect to mapping the fields, you have two columns: one with your CSV names and one with TB's. What you're trying to do is get first name opposite first name, last name opposite last name, etc. If you are lucky, they will already be matched up. But if not, you can click on one and move it up or down in the list until it is opposite the name of the corresponding field. This will get names, email, phone, etc. into the right places. Make sure the fields you want are checked and the ones you don't want are unchecked.
    Once you have everything set, click OK.

    Note 1: the file name of your CSV file becomes the name of the address book (for example AddrBook.csv will produce an address book named "AddrBook").
    Note 2: when you first look at the imported address book, it may appear empty. Click on another one (for example, "Personal Address Book"), and then return to it.

  • Importing several .csv files into the Data Portal

    I'm looking to import data from multiple .csv files into the Data Portal. I have working code that can clear all data from the Data Portal, open a dialog box to find new data, and open the new data from a .csv file. I want to improve this script so it can load multiple datasets into the Data Portal at a time, but I have limited knowledge of VBS coding. Can someone advise on how to reach this solution with minimal VBS coding experience? Thank you all in advance for your help.

    Hi dc13.

    The DIAdem FileDlgShow command has the ability to select several files at the same time. Please take a look at the DIAdem help for more details and an example.

    Greetings

    Walter

  • How to import a CSV file into a histogram

    Hi, I'm looking for a way to import data from a CSV file in LabVIEW 8.6 and have it create a histogram chart. I've seen this done before, so I know it's possible; I just need some resources or a starter VI to get me going in the right direction. I have thousands of entries, and importing them gets tedious. Does anyone know of a VI or toolkit that will give me the building blocks to do so? Appreciate any help, thanks, AJ

    Bob, the OP clearly says LabVIEW 8.6.

    AlphaDog,

    Do you have a shortened example of your data file?  Are there several channels or just one?

    No matter; if you look in the Mathematics -> Probability & Statistics palette, you will find a Histogram.vi that will take an array of data and make a histogram out of it.  To display it, use a graph and change the plot type to a bar chart.

  • Import contacts from a .csv file into the Windows 8.1 MAIL app

    I have a new laptop and have a problem with the MAIL app: I can't find how to import my contacts from the contact.csv file that I saved on my old PC running Vista. The mail client was Live Mail.

    I went to the CONTACTS app but could not find a way to import from a file, only from Facebook, Twitter and so on.

    Can anyone help?

    Thanks in advance, Tom

    Yes, I solved the problem, sorry for the late update
