Unable to display data from a csv file data store
Hi all
I'm using ODI 11g and trying to import metadata from a CSV file. To do this, I have created the corresponding physical and logical schemas. The context is Global.
Then I created a model and a datastore. After reverse-engineering the datastore I got the file headers, changed the column data types to my requirements, and then tried to view the data in the datastore. I am not getting any error, but I can't see any of the data; I can see only the headers.
Even when I run the interface that loads the data into a table, it completes without error, but no data is inserted...
But the data is present in the source file...
Can you please help me solve this problem?
Hi Phanikanth,
Thanks for your reply...
I did what you suggested...
In fact, I'm running ODI in a UNIX environment, so I selected the UNIX record separator option on the Files tab of the datastore, and now it works well...
In any case, thank you again for your response...
Thank you, best regards,
Vanina
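As a side note, the record-separator mismatch described above can be checked outside ODI. A minimal Python sketch (the function name and sample bytes are illustrative, not part of the thread) that reports which line endings a file's raw bytes use:

```python
def detect_line_endings(data: bytes) -> str:
    """Return which record separator a file's raw bytes use."""
    if b"\r\n" in data:
        return "windows (CRLF)"
    if b"\r" in data:
        return "mac-classic (CR)"
    if b"\n" in data:
        return "unix (LF)"
    return "none"

# A file produced on UNIX uses bare LF separators.
sample = b"EMPNO,NAME\n100,Smith\n200,Jones\n"
print(detect_line_endings(sample))  # unix (LF)
```

Open the file in binary mode (`open(path, "rb")`) before checking, otherwise Python's text mode silently normalizes the separators.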
Tags: Business Intelligence
Similar Questions
-
Configure vswitches from a .csv file - problem
I have a script that works very well for setting up my virtual switches using the updatevirtualswitch method (thanks to LucD, see here: http://communities.VMware.com/message/1556669#1556669 )
I now want to go further and feed my script with variables from a .csv file.
It works fine for everything except the definition of the vmnics. Description of the problem:
Excerpt from my .csv file:
lannics;dmznic1;dmznic2;dmznic3;storagenics;vmotionnics
@("vmnic0");@("vmnic1","vmnic2");
Then I import the settings from the csv file, for example:
$dmznic1 = $parameterfile.dmznic1
Now, if I check what's in $dmznic1, I get the expected value: @("vmnic1", "vmnic2")
But it seems to be a string, not a true array. Therefore, I cannot pass it to my updatevirtualswitch function:
function standardvswitch {
    Param ($esx, $vs, [string[]] $dmznic1)
    ....
    $ns.UpdateVirtualSwitch($vs, $vsSpec)
}
So the question is: how can I get the information from my .csv file so that it can be used for a vmnic definition compatible with the UpdateVirtualSwitch method?
Thanks for your help or ideas!
I've done a few more tests and saw that my previous solution does not work, but the one that follows does. I'll try to explain how it works. In the .csv file, a semicolon is used as the field separator. This means that you can use a comma inside a field to separate the members of an array. The Import-Csv cmdlet reads the .csv file, and the -Delimiter ";" parameter tells the cmdlet that a semicolon is the field delimiter. The output of Import-Csv is piped into a ForEach-Object cmdlet. In the ForEach-Object scriptblock, the string value of the dmznic1 property is split on the commas, so each substring before, between, and after the commas becomes a separate array member. This array is assigned back to the dmznic1 property. The for loop then runs through all the members of the array and displays them on separate lines, so you can see that it is really an array.
Import-Csv -Path LanNics.csv -Delimiter ";" | ForEach-Object {
    $_.dmznic1 = ($_.dmznic1).Split(",")
    for ($i = 0; $i -lt $_.dmznic1.Length; $i++) {
        Write-Output $_.dmznic1[$i]
    }
}
See the attached screenshot for output.
I think this solution is nicer than creating a separate column for each vmnic, because this way you don't have to know in advance how many NICs you have.
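For comparison, the same two-delimiter idea (semicolons between fields, commas inside a field) can be sketched with Python's standard csv module (the sample data is illustrative):

```python
import csv
import io

# Semicolon-delimited CSV; the dmznic1 field holds a comma-separated vmnic list.
raw = "lannics;dmznic1\nvmnic0;vmnic1,vmnic2\n"

rows = list(csv.DictReader(io.StringIO(raw), delimiter=";"))
for row in rows:
    # Split the field on commas so it becomes a real list, not a string.
    row["dmznic1"] = row["dmznic1"].split(",")

print(rows[0]["dmznic1"])  # ['vmnic1', 'vmnic2']
```

The key point carries over from the PowerShell answer: the CSV reader only ever yields strings, so nested lists must be split out explicitly in a second step.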
Post edited by: RvdNieuwendijk
-
Update of field Notes from a csv file
Hello
I want to update the Notes field for all of the virtual machines in my environment from a .csv file. To do this, I exported my environment to a .csv file using this command:
Get-VM | Select Name, Notes | Export-Csv -Path "c:\output\notes.csv" -NoTypeInformation
I now have a list of all virtual machines and their existing notes. I manually edited the Notes fields, keeping the existing descriptions and adding them for all the virtual machines that had none. I would now like to merge the changes back into vCenter, overwriting any descriptions in vCenter, but my code errors out on:
Import-Csv "c:\output\notes.csv" | % {Set-VM $_.VMName -Notes $_.Notes -Confirm:$false}
Any ideas?
Well, the field in your CSV is 'Name', not 'VMName'.
So, try:
Import-Csv "c:\output\notes.csv" | % {Get-VM $_.Name | Set-VM -Notes $_.Notes -Confirm:$false}
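The round trip the answer relies on (export, edit, then re-import keyed by Name) can be sketched in plain Python; the VM names and note values here are made up for illustration:

```python
import csv
import io

# Exported CSV with existing notes; edit the Notes column, then read it
# back and build a name -> notes mapping to push into the inventory.
exported = 'Name,Notes\nvm01,"web server"\nvm02,""\n'
rows = list(csv.DictReader(io.StringIO(exported)))
rows[1]["Notes"] = "db server"          # fill in the missing description

updates = {r["Name"]: r["Notes"] for r in rows}
print(updates["vm02"])  # db server
```

Keying the update on the Name column is exactly what the corrected PowerCLI one-liner does with Get-VM $_.Name.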
-
Does anyone have experience importing a CSV file and storing it in an MS SQL database?
Does anyone have experience importing a CSV file and storing it in an MS SQL database? Apart from SQL injection, are there any other security concerns? Draft of the steps:
1. Use <cffile action="upload"> to upload the file from the client to the server; see
http://www.dennismiller.TV/index.cfm/2007/12/26/file-upload-using-ColdFusion-and-Flex
2. Use <cffile action="read"> to read the contents of the file into a variable.
3. Loop over the content of the variable line by line, treating the variable as a list delimited by newlines rather than commas.
4. Use list functions on each line to get the data you need, then pass that data to SQL with CFQUERY or CFPROCPARAM.
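The security concern raised above (SQL injection) is usually handled with parameterized queries rather than string concatenation. A minimal Python/sqlite3 sketch of the same import flow; the table and sample data are illustrative, not from the thread:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")

raw = "name,age\nalice,30\nbob,25\n"
for row in csv.DictReader(io.StringIO(raw)):
    # Placeholders, never string concatenation, so CSV content
    # can never be interpreted as SQL.
    conn.execute("INSERT INTO people VALUES (?, ?)", (row["name"], int(row["age"])))

print(conn.execute("SELECT COUNT(*) FROM people").fetchone()[0])  # 2
```

In ColdFusion the equivalent protection is CFQUERYPARAM / CFPROCPARAM, as step 4 suggests.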
-
Hi, please help me with this query.
I'm trying to extract the data from a CSV file and I'm using the UTL_FILE package.
I have this query that reads the file and the first field, but if the field has a different length it does not work as it should. For example, if I had this .csv file:
1, book, laptop
2, pen, Eraser
3, notebook, paper
And in the table I need to insert it like this:
ID  description1  description2
1   book          laptop
2   pen           eraser
3   notebook      paper
For now, I have this query, which only displays the fields with DBMS_OUTPUT:
DECLARE
   -- Variables
   cadena VARCHAR2(32767);
   vfile  UTL_FILE.FILE_TYPE;
   dato   VARCHAR2(200);
   dato1  VARCHAR2(200);
   dato2  VARCHAR2(200);
   identificador VARCHAR2(5) := ','; -- field delimiter
   v_manejadorfichero UTL_FILE.FILE_TYPE; -- for the exception handlers
   -- Table variables
   i_status              GL_INTERFACE.STATUS%TYPE;
   i_ledger_id           GL_INTERFACE.LEDGER_ID%TYPE;
   i_user_je_source_name GL_INTERFACE.USER_JE_SOURCE_NAME%TYPE;
   i_accounting_date     GL_INTERFACE.ACCOUNTING_DATE%TYPE;
   i_period_name         GL_INTERFACE.PERIOD_NAME%TYPE;
   i_currency_code       GL_INTERFACE.CURRENCY_CODE%TYPE;
   i_date_created        GL_INTERFACE.DATE_CREATED%TYPE;
   i_created_by          GL_INTERFACE.CREATED_BY%TYPE;
   i_actual_flag         GL_INTERFACE.ACTUAL_FLAG%TYPE;
   i_code_combination_id GL_INTERFACE.CODE_COMBINATION_ID%TYPE;
   i_entered_dr          GL_INTERFACE.ENTERED_DR%TYPE;
   i_entered_cr          GL_INTERFACE.ENTERED_CR%TYPE;
   i_accounted_dr        GL_INTERFACE.ACCOUNTED_DR%TYPE;
   i_accounted_cr        GL_INTERFACE.ACCOUNTED_CR%TYPE;
   i_transaction_date    GL_INTERFACE.TRANSACTION_DATE%TYPE;
   i_reference1          GL_INTERFACE.REFERENCE1%TYPE;
   i_reference2          GL_INTERFACE.REFERENCE2%TYPE;
   i_reference3          GL_INTERFACE.REFERENCE3%TYPE;
   i_reference4          GL_INTERFACE.REFERENCE4%TYPE;
   i_reference5          GL_INTERFACE.REFERENCE5%TYPE;
   i_reference10         GL_INTERFACE.REFERENCE10%TYPE;
   i_group_id            GL_INTERFACE.GROUP_ID%TYPE;
BEGIN
   vfile := UTL_FILE.FOPEN('CAPEX_ENVIO', 'comas.csv', 'R');
   LOOP
      UTL_FILE.GET_LINE(vfile, cadena, 32767);
      dato1 := substr(cadena, instr(cadena, identificador, 1, 1) - 1, instr(cadena, identificador, 1, 1) - 1);
      dato2 := substr(cadena, instr(cadena, identificador, 1, 1) + 1, instr(cadena, identificador, 3, 1) - 3);
      dbms_output.put_line(dato1);
      dbms_output.put_line(dato2);
      -- Tests
      -- dbms_output.put_line(cadena);
      -- dbms_output.put_line(substr(dato, 3, instr(dato, identificador, 1, 1) - 1));
      -- dbms_output.put_line(substr(dato, instr(dato, identificador, 1, 2) + 1, instr(dato, identificador, 1, 1) - 1));
      -- dbms_output.put_line(substr(cadena, 1, length(cadena) - 1));
   END LOOP;
   UTL_FILE.FCLOSE(vfile);
-- Exceptions
EXCEPTION
   WHEN no_data_found THEN
      dbms_output.put_line('All correct');
   WHEN utl_file.invalid_path THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20060, 'FILE PATH IS NULL');
   WHEN UTL_FILE.INVALID_OPERATION THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20061, 'FILE COULD NOT BE OPENED');
   WHEN UTL_FILE.INVALID_FILEHANDLE THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20062, 'INVALID FILE HANDLE');
   WHEN UTL_FILE.WRITE_ERROR THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20063, 'WRITE ERROR');
   WHEN UTL_FILE.INVALID_MODE THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20064, 'INVALID MODE');
   WHEN UTL_FILE.INTERNAL_ERROR THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20065, 'INTERNAL ERROR');
   WHEN UTL_FILE.READ_ERROR THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20066, 'READ ERROR');
   WHEN UTL_FILE.FILE_OPEN THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20067, 'FILE IS ALREADY OPEN');
   WHEN UTL_FILE.ACCESS_DENIED THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20068, 'ACCESS DENIED');
   WHEN UTL_FILE.DELETE_FAILED THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20069, 'DELETE OPERATION FAILED');
   WHEN UTL_FILE.RENAME_FAILED THEN
      UTL_FILE.FCLOSE(v_manejadorfichero);
      RAISE_APPLICATION_ERROR(-20070, 'RENAME OPERATION FAILED');
END;
Hello
Try something like this:
pos1 := INSTR(cadena, identificador, 1, 1);
pos2 := INSTR(cadena, identificador, 1, 2);
id := SUBSTR(cadena, 1, pos1 - 1);
description1 := SUBSTR(cadena, pos1 + 1, (pos2 - pos1) - 1);
description2 := SUBSTR(cadena, pos2 + 1);
where pos1 and pos2 are numbers.
Rather than using UTL_FILE, consider creating an external table. You won't have to write any PL/SQL, which means you won't be tempted to write a bad EXCEPTION section.
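The pos1/pos2 arithmetic from the answer is easy to get wrong by one; mirroring it in Python makes the offsets simple to check. A sketch (the helper name is illustrative, not from the thread):

```python
def parse_line(cadena: str, identificador: str = ","):
    """Mirror of the INSTR/SUBSTR logic: locate the two delimiters,
    then cut the line into id, description1, description2."""
    pos1 = cadena.index(identificador)                 # first delimiter
    pos2 = cadena.index(identificador, pos1 + 1)       # second delimiter
    vid = cadena[:pos1]
    desc1 = cadena[pos1 + 1:pos2]
    desc2 = cadena[pos2 + 1:]
    return vid.strip(), desc1.strip(), desc2.strip()

print(parse_line("1, book, laptop"))  # ('1', 'book', 'laptop')
```

For real CSV input (quoted fields, embedded commas) the csv module, or an external table on the database side, remains the safer choice.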
-
Delete snapshots by using data from a csv file
I have a csv file that was exported in the form of:
VM
SERV1
serv2
SERV3
(The file name is snaps4.csv)
I want to delete the snapshots associated with the VMs in this CSV file; however, I can't get anything to work. The closest I've come is by manually removing the header in the CSV file (i.e. 'VM') and then using the Get-Content command.
$vms = Get-Content C:\scripts\Output\snaps.csv
Get-Snapshot -VM $vms | Remove-Snapshot -RemoveChildren -Confirm:$false
The above command works, but I have to remove the header first (which I am fine with doing myself, but I'm trying to automate this process for our operations people, and having a manual step is not ideal).
Can someone help me? I know I'm just missing something simple here, but I can't figure it out.
Have you tried this?
Import-Csv C:\scripts\Output\snaps.csv | %{
    Get-Snapshot -VM $_.VM | Remove-Snapshot -RemoveChildren -Confirm:$false
}
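The difference between the two approaches is header handling: Import-Csv consumes the header row automatically, while Get-Content returns every line including the header. The same distinction in a Python sketch (sample data mirrors the snaps file above):

```python
import csv
import io

# With a header row, DictReader consumes it automatically, so no manual
# deletion of the first line is needed.
raw = "VM\nSERV1\nserv2\nSERV3\n"
names = [row["VM"] for row in csv.DictReader(io.StringIO(raw))]
print(names)  # ['SERV1', 'serv2', 'SERV3']
```

A raw line read (the Get-Content analogue, `io.StringIO(raw).read().splitlines()`) would include "VM" as the first item, which is exactly the manual step the poster wants to avoid.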
-
Parsing 1 row into two rows from a CSV file
Hello
I'm trying to load CSV data from an external table into a target table.
The problem is that a few rows contain two first and last names, separated by two spaces (see ID2 and ID4).
The CSV data has this format:
Source CSV file
ID1|"Max Miller"|"Lyonerstr 99"|"1000"|"Berlin"|"Germany"
ID2|"Hans Meyer  Heidi Meyer"|"Lyonerstr 100"|"1000"|"Berlin"|"Germany"
ID3|"Stefan Tek"|"Lyonerstr 200"|"1000"|"Berlin"|"Germany"
ID4|"José Acero  Maria Acero"|"Abcstr 111"|"2000"|"Hamburg"|"Germany"
Target table
ID1 | Max Miller  | Lyonerstr 99  | 1000 | Berlin  | Germany
ID2 | Hans Meyer  | Lyonerstr 100 | 1000 | Berlin  | Germany
ID2 | Heidi Meyer | Lyonerstr 100 | 1000 | Berlin  | Germany
ID3 | Stefan Tek  | Lyonerstr 200 | 1000 | Berlin  | Germany
ID4 | José Acero  | Abcstr 111    | 2000 | Hamburg | Germany
ID4 | Maria Acero | Abcstr 111    | 2000 | Hamburg | Germany
Thank you very much.
with
external_table as
  (select 'ID1' u_id, 'Max Miller' f_l_name, 'Lyonerstr 99' address, '1000' zip, 'Berlin' city, 'Germany' country from dual union all
   select 'ID2', 'Hans Meyer  Heidi Meyer', 'Lyonerstr 100', '1000', 'Berlin', 'Germany' from dual union all
   select 'ID3', 'Stefan Tek', 'Lyonerstr 200', '1000', 'Berlin', 'Germany' from dual union all
   select 'ID4', 'José Acero  Maria Acero', 'Abcstr 111', '2000', 'Hamburg', 'Germany' from dual
  )
select u_id, f_l_name, address, zip, city, country
  from (select u_id,
               case when instr(f_l_name, '  ') > 0
                    then case when level = 1
                              then substr(f_l_name, 1, instr(f_l_name, '  ') - 1)
                              else substr(f_l_name, instr(f_l_name, '  ') + 2)
                         end
                    else case when level = 1
                              then f_l_name
                         end
               end f_l_name,
               address, zip, city, country
          from external_table
        connect by level <= 2
               and prior u_id = u_id
               and prior address = address
               and prior zip = zip
               and prior city = city
               and prior country = country
               and prior sys_guid() is not null
       )
 where f_l_name is not null
U_ID F_L_NAME    ADDRESS       ZIP  CITY    COUNTRY
ID1  Max Miller  Lyonerstr 99  1000 Berlin  Germany
ID2  Hans Meyer  Lyonerstr 100 1000 Berlin  Germany
ID2  Heidi Meyer Lyonerstr 100 1000 Berlin  Germany
ID3  Stefan Tek  Lyonerstr 200 1000 Berlin  Germany
ID4  José Acero  Abcstr 111    2000 Hamburg Germany
ID4  Maria Acero Abcstr 111    2000 Hamburg Germany
Regards,
Etbin
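The double-space split that the query performs can be sketched outside the database. A small Python sketch (the SEP constant and split_names helper are illustrative, not from the thread):

```python
SEP = "  "  # two spaces separate the two full names

def split_names(row):
    """Yield one copy of the row per full name in the f_l_name field."""
    u_id, f_l_name, *rest = row
    names = f_l_name.split(SEP) if SEP in f_l_name else [f_l_name]
    for name in names:
        yield (u_id, name, *rest)

row = ("ID2", "Hans Meyer  Heidi Meyer", "Lyonerstr 100", "1000", "Berlin", "Germany")
for r in split_names(row):
    print(r[1])  # prints Hans Meyer, then Heidi Meyer
```

Rows with a single name pass through unchanged, matching the `else case when level = 1` branch of the SQL.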
-
How to display records from the xml file
Hi all
I created a region with a search area containing 5 items and a results table.
I want to search for records based on the 5 items and display the output in the table.
I have a table named hr_api_transactions, which contains many columns,
including a column named TRANSACTION_DOCUMENT of type CLOB
that stores an XML file for each record.
I want to extract data from this XML file and display it.
View Instance - put the name of the VO with which you are extracting the data.
View Attribute and File View Attribute - the attribute through which you fetch the content of the file.
File name substitution - not mandatory; leave it.
File MIME type - do not set anything here; it is better to manage it programmatically. Put the code below in the processRequest() method:
OADataBoundValueViewObject contentBoundValue = new OADataBoundValueViewObject(downloadBean, "FileContentType"); // here "downloadBean" is the bean of the messageDownload item
downloadBean.setAttributeValue(FILE_CONTENT_TYPE, contentBoundValue);
-Anand
-
Script to remove commas from a csv file
Hi all
I have the following output to a csv file:
VM, VI-SVC-VM014, 0000ed, 0000ee,
VM, VI-SVC-VM103, 0000f3, 0000f2,
VM, VI-SVC-VM104, 0000f6, 0000f6,
LDEV, SVC PROD Cluster01, 0000e2, 0x04
LDEV, SVC PROD Cluster01, 0000de, 0x00
LDEV, SVC PROD Cluster01, 0000df, 0x01
LDEV, SVC PROD Cluster01, 0000e2, 0x04
I'm looking to remove the trailing commas so that the output looks like this:
VM, VI-SVC-VM014, 0000ed, 0000ee
VM, VI-SVC-VM103, 0000f3, 0000f2
VM, VI-SVC-VM104, 0000f6, 0000f6
LDEV, SVC PROD Cluster01, 0000e2, 0x04
LDEV, SVC PROD Cluster01, 0000de, 0x00
LDEV, SVC PROD Cluster01, 0000df, 0x01
LDEV, SVC PROD Cluster01, 0000e2, 0x04The column length for each line (given the number of objects), modular in this output. By this, I mean that some outputs will be like this:
VM, VI-SVC-VM014, 0000ed, 0000ee, 0000ef
VM, VI-SVC-VM104, 0000f6, 0000f5, 0000f7, 0000f4
LDEV, SVC PROD Cluster01, 0000e2, 0x04
LDEV, SVC PROD Cluster01, 0000de, 0x00
Is it possible to import the data (Get-Content) and delete all non-alphanumeric characters at the end of each line? Maybe there is an easier way.
Thank you
Rob.
Try something like
Get-Content file.csv | %{ $_.TrimEnd(',') } | Set-Content newfile.csv
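The same trim can be sketched in Python; rstrip removes any run of trailing commas (and stray spaces) regardless of how many columns a line has, which handles the variable-width output described above:

```python
lines = [
    "VM, VI-SVC-VM014, 0000ed, 0000ee,",
    "LDEV, SVC PROD Cluster01, 0000e2, 0x04",
]
# Strip any run of trailing commas and spaces from the end of each line;
# lines without trailing commas pass through unchanged.
cleaned = [line.rstrip(" ,") for line in lines]
print(cleaned[0])  # VM, VI-SVC-VM014, 0000ed, 0000ee
```

This mirrors the PowerShell TrimEnd(',') answer: both operate only on the end of the string, so commas between fields are untouched.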
-
Unable to display data for the date where there is no entry in the table
Hello
I urgently need help with the problem described below:
I have a table named 'sale', consisting of three columns: empno, sale_amt and sale_date.
(Please refer to the table script with data shown below.)
Now, if I run the query:
select trunc(sale_date) sale_date, sum(sale_amt) total_sale from sale group by trunc(sale_date) order by 1
It then displays data only for the dates that have an entry in the table, but no data for the dates that have no entry in the table.
If you run the table script with data in your schema, you will see that there is no entry for 28 November 2009 in the sale table. The above query displays data for all the other dates in the sale table, with the exception of 28 November 2009.
But I need that date present in the query result, with sale_date = '28 November 2009' and total_sale = 0.
Is there any way to get the result I need?
Please help as soon as possible.
Thanks in advance.
Create the table script that contains data:
------------------------------------------
CREATE TABLE SALE
(
  EMPNO     NUMBER,
  SALE_AMT  NUMBER,
  SALE_DATE DATE
);
SET DEFINE OFF;
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (100, 1000, TO_DATE('01/12/2009 10:20:10', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (100, 1000, TO_DATE('30/11/2009 10:21:04', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (100, 1000, TO_DATE('29/11/2009 10:21:05', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (100, 1000, TO_DATE('26/11/2009 10:21:06', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (100, 1000, TO_DATE('25/11/2009 10:21:07', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (200, 5000, TO_DATE('27/11/2009 10:23:06', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (200, 4000, TO_DATE('29/11/2009 10:23:08', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (200, 3000, TO_DATE('24/11/2009 10:23:09', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (200, 2000, TO_DATE('30/11/2009 10:23:10', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (300, 7000, TO_DATE('24/11/2009 10:24:19', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (300, 5000, TO_DATE('25/11/2009 10:24:20', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (300, 3000, TO_DATE('27/11/2009 10:24:21', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (300, 2000, TO_DATE('29/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
Insert into SALE (EMPNO, SALE_AMT, SALE_DATE)
Values (300, 1000, TO_DATE('30/11/2009 10:24:22', 'DD/MM/YYYY HH24:MI:SS'));
COMMIT;
Any help will be appreciated.
Kind regards
WITH tab AS (
  SELECT TRUNC(sale_date) sale_date, SUM(sale_amt) total_sale
    FROM sale
   GROUP BY TRUNC(sale_date)
   ORDER BY 1
)
SELECT sale_date, NVL(total_sale, 0) total_sale
  FROM tab
 MODEL
   REFERENCE refmodel ON (SELECT 1 indx,
                                 MAX(sale_date) - MIN(sale_date) AS daysdiff,
                                 MIN(sale_date) minsaledate
                            FROM tab)
     DIMENSION BY (indx)
     MEASURES (daysdiff, minsaledate)
   MAIN main_model
     DIMENSION BY (sale_date)
     MEASURES (total_sale)
     RULES UPSERT SEQUENTIAL ORDER
       ITERATE (1000) UNTIL (iteration_number > refmodel.daysdiff[1] - 1)
       ( total_sale[refmodel.minsaledate[1] + iteration_number] = total_sale[cv()] )
 ORDER BY sale_date
This uses the MODEL clause.
Ravi Kumar
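The gap-filling idea behind the MODEL clause (walk from the minimum to the maximum date, defaulting missing days to 0) can be sketched in Python; the dates and totals below are illustrative:

```python
from datetime import date, timedelta

# Daily totals with a gap on the 28th; fill missing days with 0.
totals = {date(2009, 11, 26): 1000, date(2009, 11, 29): 8000}

start, end = min(totals), max(totals)
filled = {}
d = start
while d <= end:
    # Every day in the range gets a row; absent days default to 0.
    filled[d] = totals.get(d, 0)
    d += timedelta(days=1)

print(filled[date(2009, 11, 28)])  # 0
```

An alternative on the database side is to outer-join the sale table against a generated calendar of dates; the MODEL answer above achieves the same without a join.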
-
Add-ADGroupMember: are null values possible when you import users from a CSV file?
I run the PowerShell command below against a CSV file containing a list of AD groups followed by AD users.
The script works very well as long as all fields are filled in. Is there a command or switch that will ignore a null value?
import-csv c:\admin\powershell\ADGroupMembers.csv | foreach {Add-ADGroupMember -Identity $_.ADGroup -Members $_.member1, $_.member2, $_.member3, $_.member4, $_.member5}
Contents of the CSV file:
ADGroup, member1, member2, member3, member4, member5
AD-Test1, Minnie, Mickey, Donald, Daisy, Goofy
AD-Test1, Minnie, Mickey, Donald
AD-Test1, Minnie, Mickey, Donald, Pete, Spike
Hello
Your question is more complex than what is generally answered in the Microsoft Answers forums. It is better suited for the Exchange Server forum on TechNet. Please post your question in the TechNet forums. You can follow the link to your question:
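Although the reply above just redirects the question, the usual fix is to drop empty member columns before calling the cmdlet. A Python sketch of that filtering step (sample data mirrors the CSV contents above):

```python
import csv
import io

raw = "ADGroup,member1,member2,member3\nAD-Test1,Minnie,Mickey,\n"
for row in csv.DictReader(io.StringIO(raw)):
    # Keep only the non-empty member columns before passing them on.
    members = [row[k] for k in ("member1", "member2", "member3") if row[k]]
    print(members)  # ['Minnie', 'Mickey']
```

In PowerShell the equivalent is building the member list with a Where-Object filter on empty strings before handing it to Add-ADGroupMember.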
-
Helps to replace a string in a txt file with a string from a csv file
Hi all
I have been working on the following script for a few days now, determined to exhaust my own knowledge before asking for help... Unfortunately, it didn't take very long to exhaust my knowledge :-(
Basically, I need to replace a value in one file (raw.txt) with the value from another file (userids.csv) when a variable matches. Then I output the results to a third file.
To do this, I split each line of the raw file into variables using ',' as the separator. The problem is that some variables are intentionally empty, hence the if ($variable -eq "") statements.
It currently does what I want, but only when the userids.csv file contains a single line. That is obviously because of the foreach ($user in $import)... What I need to figure out is how to loop through raw.txt, replacing text whenever a variable in the user-ID file matches the text in raw.txt... I hope that makes sense?
The user-ID file is in the following format - user, function, dept - and can contain dozens of lines.
I would appreciate any pointers :-)
See you soon
# Processing
$importraw = Get-Content i:\raw.txt
$import = Import-Csv i:\userids.csv -Header UserAccount, Functional, Dept

foreach ($user in $import) {
    $useraccount = $user.UserAccount
    $userfunction = $user.Functional
    $userdept = $user.Dept

    foreach ($line in $importraw) {
        $first, $second, $third, $fourth, $fifth, $sixth, $seventh, $eighth, $ninth = $line -split ","
        $linesproc = $linesproc + 1

        if ($sixth -eq "") {
            $temp6 = "6TEMP"
            Write-Host "Null field detected - assigning temporary value: $temp6"
            $sixth = $temp6  # assign a temporary value so that the -replace statement later works
        }
        if ($seventh -eq "") {
            $temp7 = "7TEMP"
            Write-Host "Null field detected - assigning temporary value: $temp7"
            $seventh = $temp7  # assign a temporary value so that the -replace statement later works
        }
        if ($fifth -eq $user.UserAccount) {
            $line -replace $seventh, $user.Dept | Add-Content i:\Output.txt
        }
        else {
            $line -replace $seventh, "//customer" | Add-Content i:\Output.txt
        }
    }
}
Try the attached version.
The problem, in my opinion, was in the nested ForEach loops.
Instead, I've implemented it with a lookup table.
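The lookup-table approach mentioned above can be sketched in Python: build a dictionary from the user-ID file once, then rewrite each raw line in a single pass. The field positions, account names, and departments below are illustrative, not from the attached script:

```python
# Lookup table: account -> department, built once from the user-ID file.
lookup = {"jsmith": "Finance", "akhan": "IT"}

def rewrite(line: str) -> str:
    """Replace field 7 with the department when field 5 matches a known user."""
    fields = line.split(",")
    account = fields[4]          # field 5 holds the user account
    if account in lookup:
        fields[6] = lookup[account]  # field 7 gets the department
    return ",".join(fields)

raw = "a,b,c,d,jsmith,f,OLDDEPT,h,i"
print(rewrite(raw))  # a,b,c,d,jsmith,f,Finance,h,i
```

The win over nested loops is complexity: one dictionary probe per line instead of scanning the whole user list for every line of raw.txt.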
-
Lack of space in the datastore and unable to consolidate, what can I do now?
Hi all:
Today a VM server was down because of the latest Veeam snapshot.
I went to the VM's Snapshot Manager and deleted the snapshot.
After the removal the server started, and everything seems OK except for a message that this server needs consolidation.
I tried to consolidate, but when I try it shows me this error:
An error occurred while consolidating disks: msg.disklib.NOSPACE.
Following the VMware knowledge base, I am cloning the virtual machine to another datastore that has a bit more space (temporarily), so I can test whether the cloned VM works like the original.
I have a few questions. The original VM is up and running correctly... but I don't know what to do next. Please can you help me?
1. Can I move the original VM to another datastore with more space and consolidate the machine there? Or is that very risky?
2. Can I keep using the actual virtual machine without consolidating until I buy a new datastore, then move it to the new datastore and consolidate? Or maybe clone the virtual machine to the new datastore and remove the old VM?
3. If the cloned version works like the original VM, I guess I can put the clone into production and delete the unconsolidated original virtual machine from this datastore?
Kind regards
There's nothing uncommon about migrating a virtual machine to another datastore. With Essentials it will unfortunately require downtime though (no live storage migration available).
In any case, the cloning should also work. The clone will however get a new UUID and MAC address (i.e. you may need to reconfigure the network), and the cloned VM will get a new virtual machine ID, which can be important if you use a VM-based backup application. However, cloning does not touch the source, so if something does not work as expected you still have the source available.
André
-
Unable to push data to the database using the A-Team Mobile Persistence Accelerator
Hello
We are using the A-Team Mobile Persistence Accelerator with a MAF application. We built REST services for CRUD operations on a table. We are unable to use the POST (create) service from the MAF application. We tried using the saveTask and addTask data control methods it generates automatically. Although the record is saved locally in the data control iterator, it does not push the data to the database. We tested the REST service independently and it works fine.
Can anyone tell us whether we are missing any steps?
Thank you
Vignesh
Vignesh,
Indeed, there are two issues here:
- the attributes of the payload are different
- the structure of the payload is different: the GET request has a "Task" attribute that returns an array, while the POST request submits a single object through the "taskDetails" attribute
By default, the persistence accelerator assumes that the POST request has the same payload structure and attributes as the GET request that you used to 'discover' the data objects.
However, you can easily change this as follows:
- To add the additional attributes required in the POST, restart the REST-JSON wizard, go directly to the data object attributes screen, and click the 'Add Attribute' button. Add each attribute you need in the POST request and set the attribute name to the value required in the POST request payload. It does not matter that this payload attribute does not exist in the GET payload; it will be ignored when fetching the tasks.
- To fix the second issue, you must change persistence-mapping.xml:
  - go to the create method of the task
  - remove the payloadElementName attribute
  - add the payloadRowElementName attribute and set its value to "taskDetails"
It's a little strange that you need the POST in order to get other attributes; is there another GET resource that returns all attributes for a task?
Also, can't you derive audit columns like created_by on the server side? It's a bit unusual to set these values on the client side.
Steven.
-
Links of images imported from a CSV file into a SQL table using phpMyAdmin
I have a .csv file with five fields, the first being image paths such as:
images150/9310VSony02.jpg
I imported this into a SQL table called rsIMAGESTEST using phpMyAdmin (first-time user).
Using Dreamweaver, I am applying Insert > Data Objects > Dynamic Table/Recordset to PHP pages.
When the table is viewed in the browser, the image URL is displayed rather than the JPEG image.
The rest of the data in the other fields comes through fine as straight text. Please can someone explain what I am doing wrong?
Following is the SQL; can I change it to get what we need?
@mysql_select_db($database_pauls, $pauls);
$query_Recordset3 = "SELECT * FROM rsIMAGESTEST";
$Recordset3 = mysql_query($query_Recordset3, $pauls) or die(mysql_error());
$row_Recordset3 = mysql_fetch_assoc($Recordset3);
$totalRows_Recordset3 = mysql_num_rows($Recordset3);
To display the image, you must pass the value of the recordset field to the src attribute of the img tag. Assuming that the field name is 'image':
You can also use Insert > Image. In the Select Image Source dialog box, set 'Select file name from' to 'Data Sources'. You can then select the correct field name from your recordset.