UTL_SMTP line wraps in .csv files after 990 characters.
I have a package that does some validation. My requirement is to capture the validation result and send it to the user by e-mail.
For this I use UTL_SMTP.
In the package, I use a CLOB variable to collect all the required data, separated by commas.
Like this
IF g_error_tab.COUNT > 0
THEN
   FOR i IN 1 .. g_error_tab.COUNT
   LOOP
      l_msg_clob := l_msg_clob || TO_CHAR (g_error_tab (i));
   END LOOP;
END IF;
and then I call my UTL_SMTP package.
send_mail.send_mail (p_to          => '[email protected]',
                     p_from        => '[email protected]',
                     p_subject     => 'Test Message',
                     p_text_msg    => 'This is a test message.',
                     p_attach_name => 'test.csv',
                     p_attach_mime => 'text/plain; charset=us-ascii',
                     p_attach_clob => l_msg_clob,
                     p_smtp_host   => 'localhost');
I am able to send the email with the data I want, BUT I see a few lines get broken with an exclamation point. Like this:
1759 | 110 | 0 | 0 | 19926 | 0 | 0 | 00! |
I tried different MIME types, but it did not work.
Appreciate your help.
Hello
The problem here is the way the SMTP server treats the line size, not the utl_smtp package.
According to RFC 822, the maximum total length of a line of text, including the terminating CRLF, is 1000 characters.
Some mail servers have their own limit on line size and insert a space or '!' followed by a newline once that maximum line length is reached. The solution is to insert the UTL_TCP.CRLF newline character sequence programmatically before reaching the maximum line length allowed by the mail server. Note: normally one would concatenate UTL_TCP.CRLF (i.e. chr(13) || chr(10)) rather than chr(10) alone.
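To illustrate the workaround (a sketch in Python rather than PL/SQL, with a hypothetical helper name): emit a CRLF before any line of the attachment would exceed the server's cap. A single field longer than the cap is left unsplit here; real code would also have to break it up.

```python
MAX_LINE = 990  # stay under the RFC 822 limit of 1000 chars including CRLF

def wrap_csv_payload(fields, max_line=MAX_LINE):
    """Join fields with commas, starting a new CRLF-terminated line
    before any line would exceed max_line characters."""
    lines, current = [], ""
    for field in fields:
        piece = field if not current else "," + field
        if current and len(current) + len(piece) > max_line:
            lines.append(current)   # flush the full line, then start fresh
            current = field
        else:
            current += piece
    if current:
        lines.append(current)
    return "\r\n".join(lines)

payload = wrap_csv_payload(["x" * 400, "y" * 400, "z" * 400])
assert all(len(line) <= MAX_LINE for line in payload.split("\r\n"))
```

The same length check before each append is what the PL/SQL loop would do with UTL_TCP.CRLF.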
Reference: Extra spaces and punctuation marks in the e-mail message when you send mail with utl_smtp (Doc ID 461659.1)
Kind regards
Ravi
Tags: Database
Similar Questions
-
Parsing one row into two rows from a CSV file
Hello
I'm trying to read csv data from an external table into a target table.
the problem is, in a few rows I have two first and last names separated by two spaces (see ID2 and ID4)
the csv data have this format:
Source CSV file
ID1|"Max Miller"|"Lyonerstr 99"|"1000"|"Berlin"|"Germany"
ID2|"Hans Meyer  Heidi Meyer"|"Lyonerstr 100"|"1000"|"Berlin"|"Germany"
ID3|"Stefan Tek"|"Lyonerstr 200"|"1000"|"Berlin"|"Germany"
ID4|"José Acero  Maria Acero"|"Abcstr 111"|"2000"|"Hamburg"|"Germany"
Target table
ID1 | Max Miller  | Lyonerstr 99  | 1000 | Berlin  | Germany
ID2 | Hans Meyer  | Lyonerstr 100 | 1000 | Berlin  | Germany
ID2 | Heidi Meyer | Lyonerstr 100 | 1000 | Berlin  | Germany
ID3 | Stefan Tek  | Lyonerstr 200 | 1000 | Berlin  | Germany
ID4 | José Acero  | Abcstr 111    | 2000 | Hamburg | Germany
ID4 | Maria Acero | Abcstr 111    | 2000 | Hamburg | Germany
Thank you very much.
with
external_table as
  (select 'ID1' u_id, 'Max Miller' f_l_name, 'Lyonerstr 99' address, '1000' zip, 'Berlin' city, 'Germany' country from dual union all
   select 'ID2', 'Hans Meyer  Heidi Meyer', 'Lyonerstr 100', '1000', 'Berlin', 'Germany' from dual union all
   select 'ID3', 'Stefan Tek', 'Lyonerstr 200', '1000', 'Berlin', 'Germany' from dual union all
   select 'ID4', 'José Acero  Maria Acero', 'Abcstr 111', '2000', 'Hamburg', 'Germany' from dual
  )
select u_id, f_l_name, address, zip, city, country
  from (select u_id,
               case when instr(f_l_name, '  ') > 0
                    then case when level = 1
                              then substr(f_l_name, 1, instr(f_l_name, '  ') - 1)
                              else substr(f_l_name, instr(f_l_name, '  ') + 2)
                         end
                    else case when level = 1
                              then f_l_name
                         end
               end f_l_name,
               address, zip, city, country
          from external_table
       connect by level <= 2
              and prior u_id = u_id
              and prior address = address
              and prior zip = zip
              and prior city = city
              and prior country = country
              and prior sys_guid() is not null
       )
 where f_l_name is not null
U_ID F_L_NAME    ADDRESS       ZIP  CITY    COUNTRY
---- ----------- ------------- ---- ------- -------
ID1  Max Miller  Lyonerstr 99  1000 Berlin  Germany
ID2  Hans Meyer  Lyonerstr 100 1000 Berlin  Germany
ID2  Heidi Meyer Lyonerstr 100 1000 Berlin  Germany
ID3  Stefan Tek  Lyonerstr 200 1000 Berlin  Germany
ID4  José Acero  Abcstr 111    2000 Hamburg Germany
ID4  Maria Acero Abcstr 111    2000 Hamburg Germany

Regards
Etbin
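The same row-splitting logic can be sketched outside SQL; here is a minimal Python version (sample data only) that splits the name field on the two-space boundary, mirroring the instr/substr logic of the query:

```python
# Each input row whose name field holds two "first last" pairs separated
# by TWO spaces becomes two output rows; single names pass through as-is.
rows = [
    ("ID2", "Hans Meyer  Heidi Meyer", "Lyonerstr 100", "1000", "Berlin", "Germany"),
    ("ID3", "Stefan Tek", "Lyonerstr 200", "1000", "Berlin", "Germany"),
]

out = []
for u_id, names, *rest in rows:
    for name in names.split("  "):   # two spaces mark the pair boundary
        out.append((u_id, name, *rest))

for r in out:
    print(r)
```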
-
Attempting to send mass email from Thunderbird. I have the addresses in a CSV file in a column named Email. I put {{Email}} in the TO: field of the email, go through File > Mail Merge, and select the CSV file in the dialog box. When I click OK, I get the message "No recipients were determined...".
While we do not support the add-on here, as a convenience:
Most problems of this nature with the add-on come down to how the email header is defined in the CSV file. Make sure it is defined as the field name you think it is, and that it is surrounded by double quotes. Also make sure that none of your data contains stray quotes, which can send the CSV parsing into a spin by introducing a break randomly in the middle of the data. Failing that, email the author of the add-on; his email address is on the add-on's download page.
-
Error importing CSV files with "hidden" characters using the external Table
Hi people
Bit of a strange one here.
Well, we are accustomed to using the external table method to load data from CSV files into the database, but a recent event presented us with a problem.
We have received some CSV files that "look like" regular CSV files, but Oracle will not load them.
When we looked at the CSV using VIM on a UNIX machine, we saw the following "hidden" characters between each regular character in the file.
A string that looks like this when opened in Excel/Wordpad etc.:
"TEST","TEXT"
looks like this when examined with VIM:
^@"^@T^@E^@S^@T^@"^@,^@"^@T^@E^@X^@T^@"
Has anyone encountered this before?
Thank you very much
Simon Gadd
Oracle 11g 11.2.0.1.0
Hi Simon,
^@ represents the NUL character (0x00).
So, most likely, you have a file encoded in Unicode. You need to specify the character set in the record specification (and, if necessary, the byte order mark), for example:
CREATE TABLE ext_table
( col1 VARCHAR2(10),
  col2 VARCHAR2(10) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dump_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    CHARACTERSET 'UTF16'
    FIELDS TERMINATED BY ',' )
  LOCATION ('dump.csv') )
REJECT LIMIT UNLIMITED;
http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/et_params.htm#i1009499
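A quick way to confirm the suspicion before fixing the access parameters is to sniff the first bytes of the file for a UTF-16 BOM or embedded NUL bytes. A sketch in Python (the file name `dump.csv` is just for the demo):

```python
def sniff_encoding(path):
    """Guess whether a file is UTF-16 by checking for a BOM or NUL bytes."""
    with open(path, "rb") as f:
        head = f.read(4)
    if head.startswith(b"\xff\xfe"):
        return "utf-16-le"
    if head.startswith(b"\xfe\xff"):
        return "utf-16-be"
    if b"\x00" in head:
        return "utf-16 without BOM (guessed from NUL bytes)"
    return "single-byte or utf-8"

# A UTF-16LE file containing '"TEST","TEXT"' starts with the FF FE BOM,
# and every ASCII character is followed by a 0x00 byte (VIM's ^@):
with open("dump.csv", "wb") as f:
    f.write(b"\xff\xfe" + '"TEST","TEXT"'.encode("utf-16-le"))

print(sniff_encoding("dump.csv"))  # utf-16-le
```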
-
reading a specific line of a file
Is there a VI to read a specific line of a csv file? I have a csv file that is too large to load, and unfortunately, while Read From Spreadsheet File is useful in some cases, you cannot specify which row to start from; it always begins at the first line.
I saw a lot of other suggestions on recent discussions, but none of them are a convincing solution.
It is simply wrong to say that Read From Spreadsheet File can only begin at the first line. There is an input called "start of read offset".
Reading in blocks means using this input with a shift register. For example, if you wanted to read 1000 rows at a time, you would specify that as the number of rows to read, the "mark after read" output would be wired to the right shift register, and the left shift register would be wired to "start of read offset". The shift register can be initialized to 0.
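The shift-register pattern can be sketched in Python terms (hypothetical file name): carry the "mark after read" offset from one iteration to the next and feed it back in as the "start of read offset", initialised to 0.

```python
def read_in_chunks(path, rows_per_chunk=1000):
    offset = 0                          # shift register initialised to 0
    while True:
        with open(path) as f:
            f.seek(offset)              # "start of read offset"
            lines = [f.readline() for _ in range(rows_per_chunk)]
            lines = [l for l in lines if l]
            offset = f.tell()           # "mark after read" for next pass
        if not lines:
            break
        yield lines

# Demo: 5 rows read 2 at a time arrive as chunks of 2, 2 and 1 rows.
with open("big.csv", "w") as f:
    f.writelines(f"row{i}\n" for i in range(5))
chunks = list(read_in_chunks("big.csv", rows_per_chunk=2))
print([len(c) for c in chunks])  # [2, 2, 1]
```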
-
Several Variables import from CSV file
Hello PowerCLI Masters!
I've been dabbling in PowerCLI for about 6 months, especially for larger VM deployments, and I've been tweaking my scripts to automate more over time. I started by setting all the variables manually and just importing the virtual machine names from a txt file, then created several loops in a single script that would incorporate different text files depending on template, datastore, disk sizes, VLAN, etc.
I think the next step here is to import variables per VM (essentially, one line of a CSV file). We receive our build requests as a spreadsheet, so it would be relatively easy to fill in the correct fields in a spreadsheet and run a script that imports the values. I can't seem to get on the right track with a simple search, so I thought I'd post here and see if:
1. someone already did and can provide guidance
2. can someone put me on the right track to how I would go all this
3. someone tells me that it is not possible (hoping it doesn't go down like that, but it is surely possible)
THIS IS JUST A SAMPLE OF THE SCRIPTS - it may or may not have perfect syntax at this stage...
# SET VARIABLES #
$vmname = Get-Content C:\PowerShell\serverlist_1.txt
$vmname | ForEach-Object {
    $vmname = $_
    $templatename = "TMPLT-GSI"
    $folder = "GSI"
    $date = Get-Date -uformat "%Y%m%d"
    $expdate = "20110930"
    $reqno = "00034"
    $sysowner = "GSI"
    $esxhost = "esxhost1.test.com"
    $datastore = "DATASTORE_1"
    $cpu = 1
    $mem = 4096
    $datadrive1 = 26214400
    $datadrive2 = 52428800
    $pgname = "ECOM_902"
    #
    # BUILD/CONFIGURE VM #
    #
    New-VM -VMHost $esxhost -Name $vmname -Template $templatename -Datastore $datastore -DiskStorageFormat Thick -Location $folder
    New-HardDisk -VM $vmname -CapacityKB $datadrive1
    New-HardDisk -VM $vmname -CapacityKB $datadrive2
    Get-VM -Name $vmname | Set-VM -MemoryMB $mem -NumCPU $cpu -Confirm:$false
    Get-NetworkAdapter $vmname | Set-NetworkAdapter -NetworkName $pgname -Confirm:$false
    Get-VM -Name $vmname | Set-CustomField -Name "Creation Date" -Value $date -Confirm:$false
    Get-VM -Name $vmname | Set-CustomField -Name "Expiration Date" -Value $expdate -Confirm:$false
    Get-VM -Name $vmname | Set-CustomField -Name "Request Number" -Value $reqno -Confirm:$false
    Get-VM -Name $vmname | Set-CustomField -Name "System Owner" -Value $sysowner -Confirm:$false
}
So, rather than setting these values in the script, I want to pull them from a spreadsheet that looks like this (a partial list, of course):
$vmname        $templatename      $cpu  $mem  $pgname    $datadrive1  $datadrive2
Testdeploy001  TMPLT-RHEL6        1     4     ECOM_902   26214400     52428800
Testdeploy002  TMPLT-2008_ENT_64  2     8     ECOM_902   26214400     104857600
Testdeploy003  TMPLT-2008_ENT_64  2     8     ECOM_906   26214400     104857600
Thanks in advance for any help provided!-Brent
vmname         templatename       cpu  mem  pgname     datadrive1  datadrive2  folder
Testdeploy001  TMPLT-RHEL6        1    4    ECOM_902   26214400    52428800    GSI
Testdeploy002  TMPLT-2008_ENT_64  2    8    ECOM_902   26214400    104857600   GSI
Testdeploy003  TMPLT-2008_ENT_64  2    8    ECOM_906   26214400    104857600   GSI

I would do something like this:
$date = Get-Date -uformat "%Y%m%d"
$expdate = "20110930"
$reqno = "00034"
$sysowner = "GSI"
$esxhost = "esxhost1.test.com"
$datastore = "DATASTORE_1"
Import-Csv C:\PowerShell\serverlist.csv | Foreach {
    New-VM -VMHost $esxhost -Name $_.vmname -Template $_.templatename -Datastore $datastore -DiskStorageFormat Thick -Location $_.folder
    New-HardDisk -VM $_.vmname -CapacityKB $_.datadrive1
    New-HardDisk -VM $_.vmname -CapacityKB $_.datadrive2
    Set-VM -VM $_.vmname -MemoryMB $_.mem -NumCPU $_.cpu -Confirm:$false
    Get-NetworkAdapter $_.vmname | Set-NetworkAdapter -NetworkName $_.pgname -Confirm:$false
    Get-VM -Name $_.vmname | Set-CustomField -Name "Creation Date" -Value $date -Confirm:$false
    Get-VM -Name $_.vmname | Set-CustomField -Name "Expiration Date" -Value $expdate -Confirm:$false
    Get-VM -Name $_.vmname | Set-CustomField -Name "Request Number" -Value $reqno -Confirm:$false
    Get-VM -Name $_.vmname | Set-CustomField -Name "System Owner" -Value $sysowner -Confirm:$false
}
-
Load a CSV file into a table like in dataworkshop
The Data Workshop has a function to load a CSV file and create a table based on it; I want to create the same thing in my application.
I went through the forum http://forums.oracle.com/forums/thread.jspa?threadID=334988 & start = 60 & tstart = 0
but I was not able to download all the files (application, HTMLDB_TOOLS package and PAGE_SENTRY function); I could not find the PAGE_SENTRY function.
AND when I open this link http://apex.oracle.com/pls/apex/f?p=27746
I could not run the application. I've provided a CSV file and when I click on SEND, I get the error:
ORA-06550: line 1, column 7: PLS-00201: identifier 'HTMLDB_TOOLS.PARSE_FILE' must be declared
tried on the apex.oracle.com host as shown in the previous post.
any help pls..,.
Another method to load data into tables... (like Data Workshop)
Hello
I have checked; the app works very well.
Have you read the instructions?
Load a CSV file in a table
>
Create a small csv file as:
col1,col2,col3
VARCHAR2(10),Number,"Number(10,2)"
Cat,2,3.2
dog,99,10.4
>
The first row must have valid column names. Verify that your header row has no spaces and no reserved words.
The second line of the CSV file must have the table column data types. When you meet these requirements, the app works perfectly.
Kind regards
Jari -
Converting Out-File output to a .CSV file
Hi all
I was wondering if it would be possible to convert the output of the Out-File cmdlet to a .csv file. When I use Out-File it creates a text file that puts each item on its own line.
For example, the text file would be:
Server2
Server3
Server4
I would like to convert this .txt file to a .csv file, which would be:
Server2, Server3, Server4, etc.
I tried using the Export-Csv cmdlet instead of Out-File, but I can't seem to make it work, so instead I was wondering if it would be possible to convert the text using a pre-made PowerCLI command or some kind of one-line script to remove and replace characters and delimiters.
Thank you very much for any help or assistance that anyone can give.
Best
Oops, my mistake.
See if it works for you
(Get-Content 'C:\text.txt' | %{"'$_'"}) -join ',' | Out-File "C:\csv.csv"
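For comparison, the same transformation sketched in Python (the paths are hypothetical): one server name per line in, a single comma-separated line out.

```python
# Recreate the example input file, then join its lines with commas.
with open("text.txt", "w") as f:
    f.write("Server2\nServer3\nServer4\n")

with open("text.txt") as src:
    names = [line.strip() for line in src if line.strip()]
with open("csv.csv", "w") as dst:
    dst.write(",".join(names) + "\n")

print(open("csv.csv").read())  # Server2,Server3,Server4
```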
-
count the lines in a text/csv file.
Can you advise me how I can get the total number of lines in a text/CSV file using Java code?
I will get the contents of the text/csv file in a String variable, not as a file.
EX: String var = "123\n234\n123\n3456\nsdfsd\n"; the \n here is the newline.
For that I have to get the total row count, 5.
Please advise.
Thank you.
I think I would try a split of the string and see how big the array is, or simply count the number of newline characters. Since you said you have a huge file and your data is contained in one string, you should be able to do either conveniently.
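Both suggestions can be sketched in a couple of lines (Python here rather than Java, using the poster's sample string). Note the trailing newline: a naive split produces an empty final element that must not be counted.

```python
# Count "rows" held in one string: count the newlines, or split and
# drop the empty trailing element produced by the final "\n".
var = "123\n234\n123\n3456\nsdfsd\n"
print(var.count("\n"))                         # 5
print(len([r for r in var.split("\n") if r]))  # 5
```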
-
CSV files will not work in Excel after the Sierra update
Excel is no longer able to properly open CSV files. I have two Mac Minis at work with the same version of Excel/Office, but the one with macOS Sierra displays all the data in the first column.
It must be a problem where macOS Sierra messes up the delimiter. Very annoying.
Bare Bones TextWrangler can force CSVs to use a standard delimiter that Office can understand. Try opening your CSV with that and saving it as an MS-DOS text file.
-
Correct cell format after writing to a csv file?
I have tried, but am still struggling with what is probably simple enough to do. I write the data to a csv file, but my data does not look the way I need it to in the worksheet once it has been written. Each string from the panel data must be in its own column. Please see the attached csv for what I aim to do in the finished program. Right now, I can write to the csv file, but the data and the fields are not in their correct columns/rows. Can someone get me jump-started on this please? I am very new and green in LabVIEW. This is my first program. Thank you.
Wire a TRUE ("T") constant to "transpose"?
-
Hello
It is difficult for me to open the .csv file in LabVIEW. Can someone suggest a LabVIEW program or a hint for the attached file below?
It is really a challenge to open it in LabVIEW.
So feel free to try to solve this problem.
Thank you
Use Read From Spreadsheet File. You can then use Index Array to get the first column and search for an empty string. This will give you the break between the header data and the actual data. Take all the data after this line and convert it to a number (Fract/Exp String To Number).
-
How to scan more than 100 items of a csv file in LabWindows/CVI
Hello
I need a little help related to reading content from a .CSV file.
My code is as follows:
FP = OpenFile ("FileName.csv", VAL_READ_ONLY, VAL_OPEN_AS_IS, VAL_ASCII);
ReadLine (FP, Line,-1);
I use the Scan() function to store all these values in the separate variable.
But I am getting an error near the following Scan() format string.
Please help me in this regard.
Thank you
Herald
Hi Ruben,
the simplest and fastest method to scan more than 100 arguments from a line is probably to use the keyword "rep" in the format string, as you can see in the online documentation and in the following example taken from the CVI help: search for "String with Comma-Separated ASCII Values to Real Array" in the linked page. After reading the return values, you can decide how to divide your array of values into the variables that matter to you.
Another option would be to read the line and then parse it manually in a loop, using the strtok() function to split the line into single values.
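The strtok()-style alternative can be sketched in a language-neutral way (Python here rather than CVI): split the long line on commas instead of scanning 100+ format arguments, then convert each token.

```python
# Build a 120-value comma-separated line, then parse it token by token.
line = ",".join(str(i * 0.5) for i in range(120))
values = [float(tok) for tok in line.split(",")]
print(len(values), values[0], values[-1])  # 120 0.0 59.5
```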
-
SQL*Loader: loading data into two tables using a single CSV file
Dear all,
I have a requirement where I need to load the data into 2 tables using a single csv file.
So I wrote the following control file. But it loads only the first table, and there is also nothing in the debug log file.
Please suggest how to achieve this.
Examples of data
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Control file script
================
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION (1) char "ltrim (rtrim (:Source_system_code))",
Record_type char "ltrim (rtrim (:Record_type))",
Source_System_Vendor_number char "ltrim (rtrim (:Source_System_Vendor_number))",
Vendor_name char "ltrim (rtrim (:Vendor_name))"
)
into table table2
when 1 = 1
fields terminated by ',' optionally enclosed by '"'
(
Vendor_name char "ltrim (rtrim (:Vendor_name))",
Vendor_site_code char "ltrim (rtrim (:Vendor_site_code))",
Address_line1 char "ltrim (rtrim (:Address_line1))",
Address_line2 char "ltrim (rtrim (:Address_line2))",
Address_line3 char "ltrim (rtrim (:Address_line3))"
)

The problem here is that it loads into only the first table (table1).
Please guide me.
Thank you
Kumar
When you do not provide a starting position for the first field in table2, it starts with the field following the last field referenced in table1, so it starts with vendor_site_code instead of vendor_name. What you need to do instead is specify POSITION(1) for the first field in table2 and use FILLER fields. In addition, it dislikes WHEN 1 = 1, which isn't needed anyway. See the example, including the corrected control file, below.
Scott@orcl12c > HOST TYPE test.dat
Source_system_code, Record_type, Source_System_Vendor_number, Vendor_name, Vendor_site_code, Address_line1, Address_line2, Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Scott@orcl12c > HOST TYPE test.ctl
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION (1) char "ltrim (rtrim (:Source_system_code))",
Record_type char "ltrim (rtrim (:Record_type))",
Source_System_Vendor_number char "ltrim (rtrim (:Source_System_Vendor_number))",
Vendor_name char "ltrim (rtrim (:Vendor_name))"
)
into table table2
fields terminated by ',' optionally enclosed by '"'
(
source_system_code FILLER POSITION (1),
record_type FILLER,
source_system_vendor_number FILLER,
Vendor_name char "ltrim (rtrim (:Vendor_name))",
Vendor_site_code char "ltrim (rtrim (:Vendor_site_code))",
Address_line1 char "ltrim (rtrim (:Address_line1))",
Address_line2 char "ltrim (rtrim (:Address_line2))",
Address_line3 char "ltrim (rtrim (:Address_line3))"
)
Scott@orcl12c > CREATE TABLE table1
  2  (Source_system_code VARCHAR2 (13),
  3   Record_type VARCHAR2 (11),
  4   Source_System_Vendor_number VARCHAR2 (27),
  5   Vendor_name VARCHAR2 (11))
  6  /

Table created.

Scott@orcl12c > CREATE TABLE table2
  2  (Vendor_name VARCHAR2 (11),
  3   Vendor_site_code VARCHAR2 (16),
  4   Address_line1 VARCHAR2 (13),
  5   Address_line2 VARCHAR2 (13),
  6   Address_line3 VARCHAR2 (13))
  7  /

Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL = test.ctl DATA = test.dat LOG = test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used:      Conventional
Commit point reached - logical record count 1

Table TABLE1:
  1 Row successfully loaded.

Table TABLE2:
  1 Row successfully loaded.

Check the log file:
  test.log
for more information about the load.
Scott@orcl12c > SELECT * FROM table1
  2  /

SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER VENDOR_NAME
------------- ----------- --------------------------- -----------
Victor        New         Ven001                      Vinay

1 row selected.

Scott@orcl12c > SELECT * FROM table2
  2  /

VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
----------- ---------------- ------------- ------------- -------------
Vinay       Vin001           abc           def           xyz

1 row selected.

Scott@orcl12c >
-
Configure vSwitches from a .csv file - problem
I have a script that works very well for setting up my virtual switches by using the UpdateVirtualSwitch method. (Thanks to LucD, see here: http://communities.VMware.com/message/1556669#1556669 )
I now want to go further and feed my script with variables from a .csv file.
It works fine for everything, with the exception of the definition of the vmnics. Description of the problem:
Excerpt from my .csv file:
lannics; dmznic1; dmznic2; dmznic3; storagenics; vmotionnics;
@("vmnic0"); @("vmnic1", "vmnic2");
Then import the settings from the csv file, example:
$dmznic1 = $parameterfile.dmznic1
Now, if I check what's in $dmznic1, I have the good: @("vmnic1", "vmnic2")
But it seems to be a string, not a true array. Therefore, I cannot pass it to my updatevirtualswitch function:
function standardvswitch {
    Param ($esx, $vs, [string[]] $dmznic1)
    ....
    $ns.UpdateVirtualSwitch($vs, $vsSpec)
}
So the question is: how could I get my .csv file information, so that it can be used for a definition of vmnic compatible with the UpdateVirtualSwitch method?
Thanks for your help or ideas!
I've done a few more tests and saw that my previous solution does not work, but the one that follows does. I'll try to explain how it works. In the .csv file, a semicolon is used as the field separator. This means that you can use a comma inside a field to separate the members of an array. The Import-Csv cmdlet reads the .csv file, and the -Delimiter ";" parameter tells the cmdlet that a semicolon is the field delimiter. The output of the Import-Csv cmdlet is piped into a ForEach-Object cmdlet. In the ForEach-Object scriptblock, the string value of the dmznic1 property is split on the comma, so each string before, between and after the commas becomes a separate array member. This array is assigned back to the dmznic1 property. The for loop then loops through all the members of the array and displays them on separate lines, so you can see that it is really an array.
Import-Csv -Path LanNics.csv -Delimiter ";" | ForEach-Object {
    $_.dmznic1 = ($_.dmznic1).Split(",")
    for ($i = 0; $i -lt $_.dmznic1.Length; $i++) {
        Write-Output $_.dmznic1[$i]
    }
}
See the attached screenshot for output.
I think this solution is nicer than creating a separate column for each vmnic because, in my solution, you don't have to know in advance how many NICs you have.
Post edited by: RvdNieuwendijk
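For comparison, the same two-level-delimiter idea sketched in Python (the file layout is a simplified, hypothetical version of the one above): ';' separates fields, so ',' is free to separate the members of an array-valued field such as dmznic1.

```python
import csv

# Write a tiny sample file using ';' between fields, ',' within a field.
with open("LanNics.csv", "w", newline="") as f:
    f.write("lannics;dmznic1\n")
    f.write("vmnic0;vmnic1,vmnic2\n")

# DictReader handles the ';' level; a plain split handles the ',' level.
with open("LanNics.csv", newline="") as f:
    for row in csv.DictReader(f, delimiter=";"):
        dmznic1 = row["dmznic1"].split(",")   # now a real list, not a string
        print(dmznic1)  # ['vmnic1', 'vmnic2']
```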