Deleting the same file from many virtual machines remotely

Hi all

I tried to delete the same file from several virtual machines remotely using the following PowerCLI script:

$a = Get-VM | %{
    foreach ($nic in $_.Guest.Nics) {
        Write-Output $nic.IPAddress
    }
}
foreach ($i in $a) {
    $b = Get-WmiObject -Query "SELECT * FROM CIM_DataFile WHERE Name='C:\\test\\deleteme.exe'" -ComputerName $i
    $b.Delete()
}

It works perfectly when I run the script against the virtual machines on the host that my PowerCLI session is connected to. When I Connect-VIServer to another host, however, and try to run the script, I get the dreaded "The RPC server is unavailable" error (HRESULT: 0x800706BA) from Get-WmiObject. The virtual machines have the Windows Firewall disabled and have all TCP/IP, RPC and WMI services started, which should eliminate those possibilities.

I am also able to run other scripts that use Get-WmiObject across all the virtual machines, such as scripts to change the DNS or NetBIOS name across many machines on different hosts.

Any comments would be greatly appreciated, and please feel free to suggest any other way to delete files on multiple virtual machines. Installing PowerCLI to run the script locally on each host would be a laborious process.

Thank you very much!

Best regards

To solve the problem with the quotes, you can escape the double quotes inside the string. Like this:

"del \"c:\documents and settings\text.exe\""

Or you can use single quotes for the string and double quotes inside:

'del "c:\documents and settings\text.exe"'

Tags: VMware

Similar Questions

  • Can I use the same windows on my virtual machine?

    I had Windows pre-installed on my Dell laptop (Latitude E6420). I did not have the CD key, but I extracted it using a program called Magical Jelly Bean Keyfinder. Now, I have the following information on my Windows installation:

    Windows 7 Professional Service Pack 1
    Product no.: XXXXXXXX
    Installed from "OEM" media
    Product ID: XXXXX-OEM-XXXXXXXX
    CD Key: XXXXXXXXXXXXXXXXXXXXXXXX
    Computer name: XXXXX
    Registered owner: XXXXX
    Registered organization: Microsoft
    Now, I want to install Linux on my machine and use the current Windows in a virtual machine inside the Linux host. Is it possible if I completely uninstall this Windows from my computer and reinstall it in the virtual machine with the same key?
    I'm not very familiar with this, so can someone also guide me through the steps I would have to take, if this is possible.

    Hi Andre,

    Thank you for your answer, but it does not completely answer my question. I have also read that an OEM license does not allow installing the same Windows on another machine (a virtual machine counts as a different machine, even if it runs on the same hardware), and my installation is the OEM type.

    More importantly, how do I install Windows in the virtual environment? If I format the computer, install Linux, then install VirtualBox on it and install Windows inside using the CD key, will it work? I doubt it, because this Windows is already activated on this machine.

    The extract of the license agreement (EULA) for Windows 7 applies to both the retail and OEM versions of Windows 7. So, yes, you can reinstall it in a virtual machine.

    Yes, you will need to format the machine and install Linux, then reinstall Windows 7 in a suitable virtualization program such as Oracle VirtualBox 4.2, VMware Player 5.0 or VMware Workstation. You can then use the product key located on the COA sticker to reactivate the license.

  • Windows 10 product key exactly the same for all my virtual machines?

    I have a dozen VMs that I thought I would upgrade to Win10, and then continue to use as Win7/Win8 until I have enough time to deal with possible issues. I have the latest version of everything (less than a week old, 12.x).

    The problem is, I get the same product key on machines that are totally different (all on the same host, however). The Windows Product ID differs, the MAC is different, the amount of memory is different, 32/64-bit is different... I even created a new virtual machine from scratch with a product key I have never actually used before... and its product key under Win10 is the same as for everything else. Huh?

    Yes, I thought that my method of getting the product key might be bad, but when a virtual machine is created from scratch, there is no way that VM could 'know' anything about a VM I just deleted.

    I thought it might be that I use NAT for all the virtual machines and that it could affect something, but since they have different MACs that would be foolish. So I would not be able to update any of my base machines.

    I have been trying a lot of things to understand this, like making a copy and selecting "I copied it", but once I went as far as creating a new virtual machine from scratch and still got the same product key, I have to say I am confused.

    Maybe I just do not understand how Microsoft does things these days. Maybe a different product ID and a different product key = another activation, and I would have no problem. But I do not want to test it with 10+ VMs, only to have the Microsoft Genuine checks start to fail, and all of a sudden I have to rebuild my entire network or sit in support queues for a week. Not really.

    If I understand correctly, for Windows 10 installations that are upgraded from Windows 7 or Windows 8.1, the activation is a digital entitlement which is linked to a unique hardware ID online in the Windows Store. The generic product key is only for compatibility with things that do not understand digital entitlements.

    What you see, with the same generic product key in each installation, is normal behavior for Windows 10 upgrades.

  • Location of the VMX file for a virtual machine

    My problem,

    I have a bunch of VMs where the actual folder and file names of the VM are NOT the same as the VM name shown in the VI Client, which makes things more complicated.

    What I want to know is whether there is a way to query a virtual machine that is in the inventory and get the path to its folder and the real VMX file path, which I guess would give me the name of the datastore and the folder in the datastore.

    For example:

    my-vma (in the VI Client)

    [myds0100] web34/web34.vmx (which is what I see if I browse the datastore after clicking on the 'my-vma' VM in the VI Client)

    You can do

    Get-VM | Select Name, @{N="VMX"; E={$_.ExtensionData.Config.Files.VmPathName}}

    You could also have a look at the VIProperty called VmxDatastoreFullPath.
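    Building on the one-liner above, the VmPathName value has the form "[datastore] folder/file.vmx", so the datastore name and folder can be split out with plain string handling. A minimal sketch (the bracket-splitting logic is my assumption about that format, not something from the thread):

```powershell
# Sketch: split "[myds0100] web34/web34.vmx" into datastore and folder parts.
Get-VM | Select-Object Name,
    @{N = 'Datastore'; E = { ($_.ExtensionData.Config.Files.VmPathName -split '[\[\]]')[1] } },
    @{N = 'Folder';    E = { (($_.ExtensionData.Config.Files.VmPathName -split '\]\s*')[1] -split '/')[0] } }
```

    For the example above this would report Datastore "myds0100" and Folder "web34", which is exactly the pair the question asks for.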

  • Need help updating the VMX file on several virtual machines

    All,

    Can anyone help with a problem I'm having?  Here is a brief explanation of what is happening and what we need to solve.

    Recently we found out that we need to add a line to the VMX file of several virtual machines.  In the past, I was able to do this, but I had to cold-boot the virtual machine for the changes to take effect.  Cold boots won't be a problem, because we can schedule them.

    However, does anyone know a way to add this line to the VMX file of virtual machines in a cluster?

    Here's the line I need added:

    devices.hotplug = "false"

    Here is a script that was used earlier (ESX 3.0 days) to keep the VMware Tools updated on reboot.  At that point, if the virtual machine was powered on, this change could not be made through the UI.  However, this script worked to update the .vmx file, and then we did cold reboots for the changes to take effect.

    Connect-VIServer -Server <server> -User <user> -Password <password>

    $viview = Get-Cluster -Name CLUSTERNAME | Get-VM | ForEach-Object { Get-View $_.Id }
    $viview | ForEach-Object {
        $vmConfigSpec = New-Object VMware.Vim.VirtualMachineConfigSpec
        $vmConfigSpec.Tools = New-Object VMware.Vim.ToolsConfigInfo
        $vmConfigSpec.Tools.AfterPowerOn = $true
        $vmConfigSpec.Tools.AfterResume = $true
        $vmConfigSpec.Tools.BeforeGuestStandby = $true
        $vmConfigSpec.Tools.BeforeGuestShutdown = $true
        $vmConfigSpec.Tools.ToolsUpgradePolicy = "upgradeAtPowerCycle"
        $_.ReconfigVM($vmConfigSpec)
    }

    The exact script can be found in Disable HotPlug
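    For the devices.hotplug question itself, a sketch that reuses the same Get-View/ReconfigVM pattern as the script above, but sets the extraConfig option instead of the Tools settings, might look like this (CLUSTERNAME is a placeholder, and a cold boot is still needed for the guest to pick the setting up):

```powershell
# Sketch: push devices.hotplug = "false" into the VMX of every VM in a cluster.
# Assumes an active Connect-VIServer session; cluster name is a placeholder.
$viview = Get-Cluster -Name CLUSTERNAME | Get-VM | ForEach-Object { Get-View $_.Id }
$viview | ForEach-Object {
    $spec   = New-Object VMware.Vim.VirtualMachineConfigSpec
    $option = New-Object VMware.Vim.OptionValue
    $option.Key   = 'devices.hotplug'
    $option.Value = 'false'
    $spec.ExtraConfig += $option
    $_.ReconfigVM($spec)
}
```

    Newer PowerCLI releases also offer New-AdvancedSetting on a VM object, which wraps this same extraConfig mechanism in a single cmdlet.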

  • Run a script file on a virtual machine

    Hello

    I have a script (c:\script.ps1) that I want to run on a virtual machine. Can someone tell me how to do that?

    When I want to run a single command on the VM I do this:

    > > $cred = Get-Credential

    > > Invoke-VMScript -VM win1 -GuestCredential $cred -ScriptText "dir"

    but I don't know how to put the script into a variable so that I can pass it to the -ScriptText parameter.

    P.S. The script is not on the virtual machine; it's on the computer where I have PowerCLI installed.

    LucD thanks for help

    I managed to do this way:

    $script = Get-Content -Path C:\script.ps1 | Out-String

    Invoke-VMScript -VM win1 -GuestCredential $cred -ScriptText $script

    Without Out-String I received the error "Failed to convert System.Object[] to the type System.String".
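    As a side note on why Out-String is needed: Get-Content returns an array of lines (hence the System.Object[] in the error), while -ScriptText expects a single string. On PowerShell 3.0 and later, the -Raw switch reads the file as one string directly, which may be a slightly cleaner alternative:

```powershell
# Sketch: read the whole script file as a single string (PowerShell 3.0+),
# then hand it to Invoke-VMScript as in the solution above.
$script = Get-Content -Path C:\script.ps1 -Raw
Invoke-VMScript -VM win1 -GuestCredential $cred -ScriptText $script
```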

  • GUI: how to run the same command on multiple virtual machines at the same time?

    Hello

    Suppose we have to perform several operations on multiple virtual machines (for example upgrading the virtual hardware, or installing VMware Tools, but it could be something else, of course...).

    I would like to avoid clicking on each virtual machine (especially when we have hundreds, in different groups). Is it possible to select several of them (they are not side by side) and perform a particular operation on all of the selected ones at once?

    Hello

    If you use vCenter 5.1+ you can also try using tagging and assign tags to the items in the inventory.

    vSphere 5.5 Documentation Center - Apply a tag to an object

    Then search for items by that tag.

    If you use PowerCLI 5.5, you can also search the inventory by tag.

    Get-VM -Tag <tag> lists all the virtual machines carrying that tag.

    Here are some links for getting started with vSphere PowerCLI

    Back to Basics: Part 1 - Installing PowerCLI | VMware PowerCLI Blog - VMware Blogs

    vSphere PowerCLI Documentation
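    Once tags are assigned, the PowerCLI side can chain the tag search straight into an action, so one line replaces hundreds of clicks. A hypothetical sketch (the tag name and the chosen operation are placeholders of mine, not from the thread):

```powershell
# Sketch: run one operation against every VM carrying a given tag.
# Assumes an active Connect-VIServer session and an existing tag assignment.
Get-VM -Tag 'NeedsToolsUpgrade' | Update-Tools -NoReboot
```

    Any other bulk cmdlet (Restart-VMGuest, Set-VM, and so on) can be swapped in after the pipe in the same way.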

  • Desktop does not show the same number of vCPUs as the parent virtual machine

    Hello

    I created a pool from a parent with 4 vCPUs (2 sockets x 2 cores).

    The created desktops use only 1 vCPU...

    any idea?

    If I understand the question, could it be that the snapshot you used happens to have 1 vCPU, while the current state of the virtual machine you used has 4 vCPUs?

    In other words, when you base the pool on a snapshot, I believe the hardware state at the time of the snapshot will be used, not the current hardware state.

  • Time Machine: delete the old MacBook Air files?

    We had a MacBook Air and a Mac Pro at home, and both were doing Time Machine backups to our 2 TB Time Capsule. I sold the MacBook Air and replaced it with a new MacBook. Can I delete the Time Machine backups of the old MacBook Air to make room on my 2 TB Time Capsule for the MacBook? I am getting a disk space message on the first backup of the MacBook.

    Can I delete the Time Machine backups of the old MacBook Air to make room on my 2 TB Time Capsule for the MacBook?

    Yes you can.  Your next question may be 'how' to do it.

    Use the Finder to open the Time Capsule disk, named "Data" unless you have renamed it.

    There you will see two sparsebundle files, each bearing the name of a Mac that has been backed up.

    Click on the name of the sparsebundle file with the old Mac's name to highlight it.

    Click the gear icon just above the window, and then click Move to Trash.

    It is not coming back, so make sure that you delete the correct file.

    Once the deletion is over... it should only take a few minutes... power the Time Capsule off and back on and you should be all set.

  • Unable to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the virtual machine creation date

    I am unable to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the virtual machine creation date. My question is whether this is the correct comparison for the variable $VmCreated7DaysAgo: $_.CreatedOn -lt $CDate7.

    ##SCRIPT_START

    $file = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\VM-Repo.csv"

    $Import = Import-Csv $file

    $VMCreatedLast7RDayRepoFile = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\Last7Days.csv"

    $start7 = (Get-Date).AddMonths(-1)

    $CDate7 = $start7.ToString('MM/dd/yyyy')

    $VmCreated7DaysAgo = $Import | Select-Object -Property Name, Powerstate, vCenter, VMHost, Cluster, File, Application, CreatedBy, CreatedOn, NumCpu, MemoryGB | Where-Object { $_.CreatedOn -lt $CDate7 } | Sort-Object CreatedOn

    $TotalVmCreated7DaysAgo = $VmCreated7DaysAgo.Count

    $VmCreated7DaysAgo | Export-Csv -Path $VMCreatedLast7RDayRepoFile -NoTypeInformation -UseCulture

    Write-Host "$TotalVmCreated7DaysAgo VMs created in 7 days" -BackgroundColor Magenta

    Invoke-Item $VMCreatedLast7RDayRepoFile

    ##SCRIPT_END

    You can use the New-TimeSpan cmdlet in the Where clause; it returns the time difference between 2 DateTime objects.

    An example of this cmdlet:

    New-TimeSpan -Start (Get-Date).AddDays(-7) -End (Get-Date) | Select-Object -ExpandProperty Days

    In your case, you could do

    Where-Object { (New-TimeSpan -Start ([DateTime]$_.CreatedOn) -End $start7).Days -gt 7 }

    But beware of negative numbers.
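    The underlying bug in the original script is that CreatedOn comes out of Import-Csv as a string, so $_.CreatedOn -lt $CDate7 is a string comparison, not a date comparison. A sketch of the 7-day filter with explicit date conversion (assuming CreatedOn parses cleanly with [DateTime]; note $start7 should also use AddDays(-7) rather than AddMonths(-1) for a 7-day window):

```powershell
# Sketch: keep only VMs created in the last 7 days, comparing real dates
# instead of strings. $Import is the Import-Csv result from the script above.
$cutoff = (Get-Date).AddDays(-7)
$VmCreated7DaysAgo = $Import |
    Where-Object { [DateTime]$_.CreatedOn -ge $cutoff } |
    Sort-Object { [DateTime]$_.CreatedOn }
```

    The 30-, 120- and 180-day lists follow by changing the AddDays argument.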

  • Cannot delete a system plist file

    My Time Machine is not automatically backing up my drive. It worked for some time (don't know why) and then stopped backing up automatically. I tried to follow the procedure laid out by Linc Davis, deleting the file com.apple.TimeMachine.plist, but got the message: "com.apple.TimeMachine.plist" cannot be changed or deleted because it is required by OS X.  I tried to change the permissions in the file's Info window to give Administrator Read and Write (only System has this permission), which produced the message: "The operation cannot be performed because you don't have the necessary permission."

    Using the terminal window and sudo rm command produces the same message.

    I can do a backup manually through the System Preferences window, but it then immediately shows that a backup disk has not been selected. So the next time, I select the backup disk again and perform a manual backup, and repeat this every time I run a manual backup.

    Running 10.11.1 on a new iMac, Retina 5K, with 16 GB of memory. I migrated to this from a previous Mac Pro running 10.7.5.

    Also, the Time Machine backup drive shows a generic drive icon instead of the Time Machine icon. Relaunching the Finder does not fix the issue.

    Question: How can I delete the plist file?

    Thanks in advance for any advice.

    A bug in OS X 10.11.1 prevents automatic backups when a desktop Mac (iMac, Mac mini or Mac Pro) is connected to an uninterruptible power supply (UPS) with the USB feature.

    Please, back up all data before proceeding.

    First, check the App Store for a later update to OS X. If one is available when you read this, install it and see if the problem is resolved. I am not speaking of beta versions.

    Otherwise, you can work around the bug as follows.

    These steps must be run as an administrator. If you have only one user account, you are the administrator.

    Please triple-click anywhere in the line below on this page to select it:

    sudo defaults write /Library/Preferences/com.apple.TimeMachine RequiresACPower 0

    Copy the selected text to the Clipboard by pressing the command-C key combination.

    Launch the built-in Terminal application in any of the following ways:

    ☞ Enter the first letters of its name in a Spotlight search. Select it from the results (it should be at the top).

    ☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the shift-command-U key combination. The application is in the folder that opens.

    ☞ Open Launchpad and start typing the name.

    Paste into the Terminal window by pressing the command-V key combination. I have tested these instructions only with the Safari browser. If you use another browser, you may need to press the Return key after pasting. You will be asked for your login password. Nothing is displayed when you type it. Type carefully, and then press Return. If you do not have a password, you will need to set one up before you can run the command. You may get a warning to be careful. Confirm it. You do not need to see the warning again.

    If you see a message that your user name "is not in the sudoers file", then you have not logged in as an administrator. Log in as one and start over.

    Wait for a new line ending with a dollar sign ($) to appear below what you entered. You can then quit Terminal.

    Restart the computer and test.

    (Credit for this solution goes to chrfr of logicielmac.com.)

  • TDMS error -2503 recovery - deleting the index file does not help

    My team ran a data acquisition on some test machines, and today I went to begin the analysis of the data.  It turns out that two of the TDMS files we generated cannot be opened in DIAdem or Excel, or with the TDMS Open file controls in LabVIEW.  I also tried the TDMS repair tool, with the same result.  The test machine has since been torn down, so there is no way to re-run the tests.  Is it possible to look into the files to see if any of the data can be recovered?

    The files were generated with LabVIEW 2013 SP1.  I attempted to open them in the same version of LabVIEW, in DIAdem 2015, and with the Excel importer (Excel 2010).

    The TDMS files are rather large (> 1 GB), but the tdms_index files generated with them were 0 kB.  Deleting the tdms_index files and re-opening the TDMS files had no effect.

    On a first look into the files (using the binary file primitives in LabVIEW), the initial section of the files appears correct (readable metadata that matches between a working file and a non-working file).

    There seems to be no notable connection between the two files that will not open, nor any difference in the situation of these files compared with the working files.

    Any help would be greatly appreciated.

    TDMS files are written in chunks.  At the beginning of each chunk is the semi-human-readable header, which contains data such as channel and group names, but no actual data.  Then follows the binary data.  The index file holds all the semi-human-readable headers of the file and none of the data.  This makes reading the file easier, because the file offsets and such are in the smaller index file.

    That being said, if you have some corrupted sections in your TDMS file, you can still get data out of it.  There is code floating around that pulls out each of the segments and finds out where the problem is, then lets you get all the data before the bad section.  This thread talks about it, and a short piece of code is posted around page 3.

    http://forums.NI.com/T5/LabVIEW/corrupted-TDMS-file/TD-p/2165954

  • CC synchronizes the same files over and over again?

    I have a 2.63 GB folder full of files I want to transfer.


    I tried Zip, but the Zip file was too big; CC apparently has the same size limit as the free services? Disappointing; I have single PSB files larger than 2 GB.


    So I copied the entire folder to the Creative Cloud Files folder, told my computer to wait 8 hours before shutting down (it said it would take 6 hours), and left for Thanksgiving.

    When I got home, it was not done yet; some files had been synchronized, but not all. It said "Syncing file # of 33", still claiming 1 to 8 hours to go. Every day I start my computer and let it sync all day, and at the end of the day it still says "Syncing file # of 33". I have checked it often in recent days; it has been as far as "file 22 of 33" with 1 hour left, but it always resets to "Syncing file 1 of 33" with 6+ hours remaining. And today it seems not to have synchronized a single file to the other computer.

    Bottom line, it has been running for 3 days, even leaving the machine on during the night. The thing is supposed to be "syncing", but it still has not transferred a measly 2.63 GB. I have paused/rebooted it several times, as I sometimes need the bandwidth it is using up. I could have couriered a USB key across the country by now if I had known it was this incapable. I feel worn down by this system; I need something that actually facilitates collaboration.

    Hello

    You can try deleting the two files that are causing the problem; once we can check which two files they are, we can go from there. Have you sent in your log files, per the details above?

    Thank you

    Warner

  • Recovering files on a virtual machine that is inaccessible


    Hi all

    My VM is dead and I need help to recover it, or at least the files it contains. I think I might have run out of space, because I tried to delete some files and accidentally copied them instead, causing it to freeze and get that gray tint on the window. I then tried to shut it down and it would not, so I went into the Task Manager and killed the process, and now when I try to open it, it gives me "generic error". Is there a way I can recover the files on this virtual machine?

    eodnhoj wrote:

    Is this a problem?

    It is of course!

    You need to have extra space for other things, like the .vmem file, which is the size of the RAM assigned to the virtual machine, and that isn't the only thing you need extra space for.

    Anyway, you should make a copy of the virtual machine on a different drive that has more adequate space, and then try to run it or attempt to recover the user data.

  • Making my MXF project into MXF format spews out more than 25 copies of the same file: different sizes?

    When rendering my MXF project to MXF format, it spews out a ton of renders of the same MXF file: different sizes?

    The reason I want to render another MXF in the first place is that when I export the project directly, with all the various effects, Magic Bullet Looks, etc., to another format, there are always problems.

    Therefore, I have no choice but to go down a generation, gather the footage, and render an exact copy of what is in the project to the MXF format before I make an MPEG or an H.264.

    The size of my project is more than 100 GB...

    In any case, my most likely choice is to go with the file with the most gigabytes. As I say... it spits out different sizes.

    Is there a way to stop the render from spitting out a ton of MXF files, and streamline it all into one MXF?

    I have searched and searched the Internet for the answer and have checked the forums without success.

    The other thing that concerns me is that a project of more than 100 gigabytes becomes a 3.4 GB file for a 97-minute film. (It also renders an MXF of 1.5 GB in size; everything on that render is exactly the same as the 3.4 GB render as far as I know.) And it renders 25 of the 3.4 GB MXF files. (They also take up a ton of space on my hard drive every time I render.)

    It doesn't seem right, because when I rendered about 8 minutes from the beginning of the film, it made a 2.9 GB file, in contrast with the 3.4 GB file for the entire film.

    Somehow, I question whether I am getting the same quality when rendering the entire production.

    In addition, it takes more than 5 hours to render an MXF, and I feel that if it were just 1 file, instead of more than 25 of the exact same rendered MXF files, rendering would be faster and more efficient.

    Any help or explanations would be appreciated...

    Thank you

    James

    P.S. I deleted a whole bunch of rendered MXF files, believing that they were all duplicates. Of course, they were not, because the one I saved and imported into Premiere to check displays only one segment of the film. Therefore, I think all of the MXF files must be connected, and when you import one, the full MXF is displayed; however, if you delete everything except one MXF, thinking they are all duplicates, you have only that one segment of the production. Seems a crazy way to work, doesn't it? lol It doesn't look too sensible as far as the gigabytes are concerned either: the properties shown are for the 1 file you are importing, even though it is connected to all the other files, so the project is likely the total of the MXF gigabytes. Because I deleted all the files except one, I am re-rendering everything. I'll leave this here in case someone else runs into the same situation; I am sure someone will... lol

    the effects were interfering with a good render.

    That is probably not the right explanation.  Many people export (which is different from rendering, take note of that) their projects directly from the sequence, effects and all, and get perfectly workable results.  In fact, I would say that is how most everybody does it.  I have used MB effects for years, and yes, playback is hard.  But exports always come out just fine, even on my $800 system.

    If your exports come out wobbly, that is the problem to be solved.  But I would probably start looking somewhere other than the effects.  Those alone should not create the issue you described.
