Getting the data center of a VirtualMachine with Get-View

Hello

Is there a faster way to get the datacenter name of a VM using Get-View and the ViewType VirtualMachine?

I found the following:

Get-View -ViewType VirtualMachine -Filter @{ "Name" = "mtl1fsit02" } | Select-Object -Property Name,
@{ Label = "GuestOSName"; Expression = { $_.Summary.Guest.GuestFullName } },
@{ Label = "Datacenter"; Expression = { (Get-View (Get-View (Get-View $_.Parent).Parent).Parent).Name } }

Thank you guys

I think the calculated property for the datacenter will not work in all situations.

It assumes that your virtual machines are 3 levels down from the data center, which is not always the case.

I personally use a loop, walking up through the parents until it finds a Datacenter object.

Something like this:

@{N = 'Datacenter'; E = {
    $parentObj = Get-View $_.Parent
    while ($parentObj -isnot [VMware.Vim.Datacenter]) {
        $parentObj = Get-View $parentObj.Parent
    }
    $parentObj.Name
}}
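Put together, the whole call might look like this - a sketch only, reusing the example VM name from the question; it assumes VMware PowerCLI and an active Connect-VIServer session:

```powershell
# Sketch: datacenter lookup that works at any folder depth.
# Assumes an active Connect-VIServer session; "mtl1fsit02" is the
# example VM name from the question above.
Get-View -ViewType VirtualMachine -Filter @{"Name" = "mtl1fsit02"} |
    Select-Object -Property Name,
    @{N = 'GuestOSName'; E = { $_.Summary.Guest.GuestFullName } },
    @{N = 'Datacenter'; E = {
        # Walk up the inventory tree until a Datacenter object is reached,
        # regardless of how many folders deep the VM sits.
        $parentObj = Get-View $_.Parent
        while ($parentObj -isnot [VMware.Vim.Datacenter]) {
            $parentObj = Get-View $parentObj.Parent
        }
        $parentObj.Name
    }}
```

The loop stops as soon as it hits the Datacenter, so it also handles VMs nested in subfolders.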

Tags: VMware

Similar Questions

  • How to get the data center moref from a VM moref in C#

    Hi all

    Could someone tell me how to get the ManagedObjectReference of the data center using the VM moref, or is there another easy way? If anyone has sample code in C#, please share it with me.

    Thank you

    Vijaya

    You can get the data center in which the virtual machine resides by the following traversal:

    VM moref -> Parent (the vmFolder)

    vmFolder moref -> Parent (the data center)

    The parent of the virtual machine object will always be the vmFolder, and the parent of the vmFolder object will always be a data center.

    For more details on the vSphere inventory model, take a look at this blog post - http://www.doublecloud.org/2010/03/vsphere-inventory-structure-deep-dive/
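    The same two-step walk can be sketched in PowerCLI (not C#, but the moref traversal is identical in the C# SDK; `$vmView` is assumed to be a VirtualMachine view obtained earlier):

    ```powershell
    # Assumes an active Connect-VIServer session and a VirtualMachine view in $vmView.
    $vmFolder   = Get-View $vmView.Parent      # VM moref -> parent (the vmFolder)
    $datacenter = Get-View $vmFolder.Parent    # vmFolder moref -> parent (the Datacenter)
    $datacenter.Name
    ```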

  • Datastore 'datastore1' conflicts with a datastore that exists in the data center which has the same URL (.), but is backed by different physical storage

    Hello

    I am new to vCenter Server, so you can assume that I'm missing something obvious.

    I installed vCenter Server 5.5 and connected the first of two re-used ESXi 5.0.0 hosts correctly. When I try to connect the second ESXi 5.0.0 host, I get the error message:

    Datastore 'datastore1' conflicts with a datastore that exists in the data center which has the same URL (.), but is backed by different physical storage

    I Googled it and found what I think are the most promising answers, but they all seem fairly specific to their situation (they have a cluster, I do not have one), etc. Some solutions involve disconnecting the datastore and rebuilding things. I would rather not make things worse, and I can no doubt live with using the vSphere client (rather than the web client) until I can start over with a 5.5 installation on a new host, once I have backed up everything from the unconnectable host. I have shut down all the VMs on the second host, put it in maintenance mode, and renamed the datastore, all to no avail.

    Thanks in advance

    The problem was that the whole host could not be connected to vCenter in the first place, so the solutions involving disconnecting the host from vCenter did not apply.

    Here is how I solved the problem:

    1. Using the thick vSphere client, connect to the host that cannot be connected to vCenter Server.

    2. Click on the host, then click the Configuration tab.

    3. Click on "Storage".

    4. Find the offending datastore on the right.

    5. Right-click on the offending datastore and click "Remove".

    6. Click on "Add storage" in the top right.

    7. Follow the steps to find the now-unmounted datastore. I gave it a new name just to be sure, even though that is probably not necessary.

  • Total amount of storage in the data center

    I'm trying to get a script that will give me the total amount of space used in a data center. I want to run it against the data center from Virtual Center. There is a total of approximately 1900 virtual machines, distributed among several LUNs. I need the overall figure for a migration.

    PS: I want the numbers in GB instead of megabytes, and I also want it to spit out a .csv report on my local machine (c:\scripts\xxxx.csv)



    Thank you guys.

    I'm afraid it won't work by just changing Get-Datacenter to Get-Cluster.

    The reason is that the Get-Datastore cmdlet later in the script only accepts Datacenter, Folder and DatastoreCluster objects on the Location parameter.

    But with a small change, we can get it to work for clusters as well.

    Get-Cluster | Select-Object Name,
    @{N = "Storage Used (GB)"; E = {
        $sum = Get-Datastore -RelatedObject (Get-VMHost -Location $_) |
            Where-Object { $_.ExtensionData.Summary.MultipleHostAccess -and $_.Type -eq "VMFS" } |
            Measure-Object -Property CapacityGB, FreeSpaceGB -Sum
        [Math]::Round($sum[0].Sum - $sum[1].Sum, 2)
    }} | Export-Csv c:\scripts\xxxxx.csv -NoTypeInformation -UseCulture
    
  • Orchestrator 5.1 - report all the virtual machines in the data center and create a CSV file

    Hello

    What I basically want to do is create a CSV report of all virtual machines in the data center with various information (VM name, FQDN, IP address, status, datastores, tools, etc.).  The export-list feature in the client is insufficient (especially for any KPI report).

    Problem: (the workflow is still under construction, so the actual email send does not work yet and the code needs cleaning)

    I am stuck at the part about building an array that can be parsed correctly into the CSV file.  The 2 ways I have tried either produce a single-object report or combine the whole array into a single string (where I am now).  I think the main problem is that I need to build a 2D array inside my loop to push toward the final array that is written out in CSV format.  Basically, I do not understand how to push my variables into an array from within a loop.

    Any help or assistance?

    Thank you

    B

    BOOM!

    Added some comments, removed the hardcoded temp file path for the CSV, removed the hardcoded port 25 for SMTP - mail settings should come from the MAIL plug-in configuration. And fixed / confirmed that the workflow now includes the e-mail attachment and ends with success!

  • I have a problem with my account. I updated my subscription on 27/08/2015 and my Photoshop is blocked, because the renewal date shows 22/09/2015. I need it urgently. What do I do? Thank you

    I have a problem with my account. I updated my subscription on 27/08/2015 and my Photoshop is blocked, because the renewal date shows 22/09/2015. I need it urgently. What do I do? Thank you

    Hi Camila,

    You will need to contact support by calling/chat for this request.

    Contact customer service

    * Be sure to stay signed in with your Adobe ID before accessing the link above. *

    Kind regards

    Sheena

  • How can I reset my "authentication required" username and password? The fields are always filled with my old information.

    How can I reset my "authentication required" username and password? The fields are always filled with my old information.

    Follow these steps to delete saved (form) data from a drop-down list:

    1. Click on the (empty) input field on the web page to open the drop-down list
    2. Select an entry in the drop-down list
    3. Press the DELETE key (on a Mac: shift + delete) to remove it.
    • Tools > Options > Security: passwords: "saved passwords" > "show passwords".

    You may also need to clear this site's cookies if you had checked a box to be remembered.

  • Datastore info based on location within the data center

    Hello

    I wrote a PowerShell script to get the amount of free space available in the datastores, but it is too high-level.

    I need the information broken down according to the 'Datastores' view in Virtual Center.

    E.g. my Datastores view shows the data center as:

    Tier 1

    Datastores x-x-x-x-x-x-x

    Tier 2

    Datastores x-x-x-x-x-x-x-x

    Tier 3

    Datastores x-x-x-x-x-x-x-x

    Can someone please help?

    Quite so.  The problem has to do with how the pipeline works and the way you are trying to reference the output.  There are several ways to handle this, and I think this one does what you want.

    Get-Datacenter | ForEach-Object {
        $dc = $_
        $dc | Get-Datastore |
            Select-Object @{name = "DC Name"; expression = { $dc.name }}, Name, FreeSpaceMB, CapacityMB
    }
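    If the breakdown should instead follow the tier folders shown in the Datastores view, a variant along these lines may work - a sketch only, assuming the tiers are datastore folders and using the Location parameter of Get-Datastore (which accepts Datacenter, Folder and DatastoreCluster objects):

    ```powershell
    # Sketch: free space per datastore folder (e.g. "Tier 1", "Tier 2", ...).
    # Assumes an active Connect-VIServer session.
    Get-Folder -Type Datastore | ForEach-Object {
        $folder = $_
        Get-Datastore -Location $folder |
            Select-Object @{N = "Folder"; E = { $folder.Name }}, Name, FreeSpaceMB, CapacityMB
    }
    ```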
    
  • Shortcut to achieve VCDX - Data Center Virtualization

    Friends,

    I would like to know whether, to achieve VCDX, I can choose the path below to avoid having to submit to the VCDX design defense.

    VCP5-DCV > VCAP5-DCA > VCAP-CID > VCDX - Data Center Virtualization.

    I currently hold VCP5-DCV & VCP-Cloud. VCAP5-DCA is in progress... I hope to pass it soon.

    Any information will be greatly appreciated.

    Please advise.

    Thank you

    Knockaert

    Yes - actually if you want the VCDX-Cloud you have to earn the VCAP-CIA and VCAP-CID certifications before you sit for the VCDX defense, according to the chart here - VMware Certified Design Expert - Cloud (VCDX-Cloud).

    The other option is to pursue and get the VCDX-DCV, and when you pass the VCAP-CID cert, you will get the VCDX-Cloud certification.

    I will also move this to the VCDX forum.

  • Evolution of the data center: N7k or C6500

    Hello

    I am currently studying how our data centers should evolve.

    For the physical architecture:

    - We have 2 data centers, L2-interconnected at 10 Gbps.

    - In each data center, 2 x C6500 serve as core and distribution at the same time.

    - The access layer is C3750G, connected to the core chassis with 3x1 Gbps EtherChannels.

    - No blade server racks for the moment.

    For the logical aspect:

    - L2: we use STP spanning the 2 DCs.

    - L3: the cores are connected directly to the MPLS cloud. We use VRFs.

    The next server evolution would be to replace the stand-alone servers with blades, connected at 10 Gbps.

    I am hesitant about how to improve the current design, and I am not sure of the pros and cons of the 2 options (except that Nexus needs investment):

    (1) keep the C6500 and:

    - add 10 Gbps cards to the C6500 (16-port cards)

    - use VSS on both sites to simplify the STP design (even though we have had no problems with it so far)

    (2) replace the C6500 with Nexus 7K for greater flexibility with 10 Gbps

    (3) as an option, add an N5K distribution layer, which would simplify the cabling and allow a possible FCoE evolution (in this solution, I would keep the C6500 as core, or evolve to N7K).

    What do you think?

    Thank you

    P.

    This is going to be a very controversial topic.

    Let's not talk about budget here because it could become very dark.

    The 6500 with Sup720 or Sup2T (I still prefer the latter) is very good when you aggregate a significant amount of 1 Gbps, some 10 Gbps and a handful of 40 Gbps.

    The Nexus family is very good when you want to consider a few 100 Gbps and a significant amount of 10 Gbps.  Nexus is also the way to go if you want to connect your DC Fibre Channel switches, as Nexus will support 1, 2, 4 and 8 Gbps FC.

    The Nexus 7K can support 100 Gbps (two per line card), 40 Gbps (six per line card) and of course 10 Gbps (up to 48 per line card).

    Very soon, some service modules will be introduced for the Nexus 7K.  They are WAAS, ASA and a NAM.

    Will there be life left in the 6500?  In some cases, yes.  It is still not clear whether ALL the 6500 service modules will evolve to the Nexus.  In my view, the three modules I have just mentioned are to "test the waters".  Once momentum has been achieved, the others could follow.

    Rumours abound that there could be a Nexus 2K which will support PoE.

    Attached is something for your perusal.

  • Messages received during the date change, now stuck with them

    Hello, this is a stupid thing.

    I changed my date a few months into the future and received messages from a friend. Now I have these stuck as the latest messages, and after changing the date back to normal, I have to scroll up to actually see the genuinely recent posts. How can I fix this without deleting the history? It is a little messed up, and waiting two months for them to disappear is a big hassle, because right now it is very awkward to use.

    The only alternative is to try to delete the misplaced messages, or to correct their timestamps with an SQLite editor.

    http://community.Skype.com/T5/the-Skype-lounge/delete-a-conversation/m-p/743928#M56527

    http://community.Skype.com/T5/Windows-archive/accidentally-changed-computer-date-chat-messages-are-n...

  • Rename the data center?

    Hello

    I would like to rename my datacenter. Is this possible without impact on the underlying cluster, hosts, and virtual machines? I tried it in my test environment without any problem, but before I do it in production I want to be sure there are no known issues.

    The environment is:

    VSphere 5.1.0

    Data Center

    A cluster with about 22 ESXi 5.1.0 - 7997 hosts

    approximately 120 virtual machines.

    Thanks for your help!

    It will not affect your underlying infrastructure. You can go ahead and do it.

    It will create problems if you use View.

  • How to identify data that is not associated with any book in CRM On Demand?

    We are experimenting with books, but I have yet to find an easy way to identify the data that is not associated with any book.

    Do you have a suggestion?

    I discussed this with Oracle at recent CABs, but we have no indication whether it will be included; it could be part of the 'User' subject area they are studying.

  • Unable to update the start/end dates with the updateUser API

    Hi all

    I'm trying to update the start and end dates using the updateUser API, but the dates are not being set. I know the updateUser call works, because I tested it by changing the user's first name and that worked fine (verified via the OIM web app). Below is the code I use for the update, along with the sample values I use. No exception is thrown either. I have no idea what the problem can be. Also, one of my customer's requirements is to also record the time of day when these values are updated (via a custom web application I am building). Is this possible? Previously I took the time at which the userUpdate call was made and added it to the values I put in the hashtable, but I started using 0 for the time, since that is what I read on a thread on this forum (Re: (OIM) Timestamp format must be yyyy-mm-dd... )

    /*
    Examples of values
    startDate = 2010-11-08
    endDate = 2010-12-09
    */


    String startDateStr = startDate.toString() + " 00:00:00.000";
    String endDateStr = endDate.toString() + " 00:00:00.000";

    /*
    printout of the strings above
    startDateStr = 2010-11-08 00:00:00.000
    endDateStr = 2010-12-09 00:00:00.000
    */

    Hashtable<String, String> attrValues = new Hashtable<String, String>();
    attrValues.put("Users.Start", startDateStr);
    attrValues.put("Users.End", endDateStr);
    usrOps.updateUser(rsUser, attrValues);


    Please help me out; I have been stuck on this problem for too long.

    Thank you!
    -J' I

    Published by: cri_cri_99 on June 23, 2009 12:46

    It works fine for me.

    Map strDate = new HashMap();
    strDate.put("Users.Start Date", "2009-06-24 12:01:56.000000000");
    moUserUtility.updateUser(userResultSet, strDate);

    It will show up in the DB as 24-Jun-09. I do not understand what more you are asking for in your thread.

    But the updateUser API works with this format. If you get the time from some application, convert it to this format and run it.

    If you need help, let me know.

    Published by: Arnaud

  • Is anyone using the Melissa Data cloud connector with Eloqua?

    We are in the middle of our integration and are trying to make calls to Melissa Data from our Eloqua forms.  Has anyone out there done this, or can anyone provide an overview?

    I understand that someone would use this service in real time - if a person completes a form and enters an invalid postal code, in real time we ping Melissa Data to learn that it is not a valid zip code, and the person filling out the form receives a message that the data is not valid.  Our lead territory routing depends on ensuring we have correct geographic information, so we want the data validated in real time.  Thank you!

    Post edited by: Maia Tihista

    Maia,

    I could be wrong, but I am not aware of a Melissa Data cloud connector that has been built to date.  From what I know, I think that David York (on this thread) may be closest on that.  There are other people interested too; I can't wait to see it built.

    Steve
