Get Details About VM Task Reconfigure Events – PowerCLI

When troubleshooting an issue, I'll often see a "Reconfigure virtual machine" task in the VM's event log in vCenter. However, there is very little information about what was actually changed.


Luckily, you can get some of this information from PowerCLI by querying the event manager. Make sure you have PowerCLI loaded and are connected to vCenter to start.
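If you're not connected yet, it's just one cmdlet (the server name here is a placeholder for your own vCenter):

```powershell
# Connect to vCenter first; the server name is a placeholder for your environment
Connect-VIServer -Server vcenter.example.com
```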

I'm going to use a test VM here which I've named "test". I'm also going to specify that I want to look back 30 days into the event manager.

$vmObj = Get-VM -Name "test"
$daysBack = 30

Now I just need to set up the filter to look only for events that have to do with the VM I'm specifying, within the time frame I'm specifying. I'm using VMware.Vim.EventFilterSpecByEntity to target the VM and VMware.Vim.EventFilterSpecByTime to target my time frame.

$dateCurrent = Get-Date
$si = get-view ServiceInstance
$em = get-view $si.Content.EventManager
$EventFilterSpec = New-Object VMware.Vim.EventFilterSpec

#only target reconfiguration events
$EventFilterSpec.Type = "VmReconfiguredEvent"

#only target events on the specified VM
$EventFilterSpec.Entity = New-Object VMware.Vim.EventFilterSpecByEntity
$EventFilterSpec.Entity.Entity = ($vmObj | get-view).MoRef

#only query back to the specified number of days
$EventFilterSpec.Time = New-Object VMware.Vim.EventFilterSpecByTime
$EventFilterSpec.Time.BeginTime = $dateCurrent.adddays(-$daysBack)
$EventFilterSpec.Time.EndTime = $dateCurrent

Now to run the query

#run query
$evts = $em.QueryEvents($EventFilterSpec)

You can see that I've gotten back the same 5 events shown in my vCenter screenshot above.
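To get a quick summary of what came back, you can select a few of the standard vSphere event properties (CreatedTime, UserName, and FullFormattedMessage all exist on every event object):

```powershell
# Summarize the returned events in a readable table
$evts | select CreatedTime, UserName, FullFormattedMessage | ft -AutoSize
```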


Since I'm troubleshooting, I don't really care about every little event. Things like updating the notes or annotations on a VM will show up as events here. I'd rather just look at events where devices on the VM have been changed, which you can see is only 4 events.

$deviceChangeEvts = $evts | ?{$_.ConfigSpec.DeviceChange}


Now I just want to look at a couple of details about each change operation. So I'm going to select the device, the change operation, and the file operation in case it was a virtual disk event.

$deviceChangeEvts | %{$_.ConfigSpec.DeviceChange} | select Operation,FileOperation,Device | ft -AutoSize


You may notice in the output above that there are 5 operations even though there are only 4 events I'm looking at. This is because a single event may have multiple operations. In this example, the first time I edited the VM settings I added a hard drive AND a vNIC at the same time, which you can see if I look only at that first event's details.
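Drilling into a single event is just a matter of indexing into the array (index 0 is the first event returned by the query):

```powershell
# Show only the first event's device change operations
$deviceChangeEvts[0].ConfigSpec.DeviceChange | select Operation, FileOperation, Device | ft -AutoSize
```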


Now maybe I want some details on the hard drive that was added, like how big it is. You can drill into this info as well. Here I'm looping through each device changed and outputting additional information about the devices. You can see the capacity of that added drive was 10485760 KB, or 10 GB.
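A sketch of that loop might look like this. DeviceInfo.Label and CapacityInKB are standard properties on the vSphere VirtualDevice/VirtualDisk objects; CapacityInKB only exists on virtual disks, so it will be blank for other device types like vNICs:

```powershell
# Loop through each changed device and pull some extra detail
$deviceChangeEvts | %{$_.ConfigSpec.DeviceChange} | %{
    New-Object PSObject -Property @{
        Operation    = $_.Operation
        DeviceLabel  = $_.Device.DeviceInfo.Label
        CapacityInKB = $_.Device.CapacityInKB   # populated only for virtual disks
    }
} | ft -AutoSize
```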


A couple of things I've run into here. Changing the number of vCPUs or the amount of vRAM on a VM doesn't show up as a device change. So if you care about that and filter out anything that's not a device change, you'll miss it.

Also, when looking at a change event, I wasn't able to find a way to see what the value was before the change. For example, I grew the size of a virtual hard disk from 40GB to 100GB. I can see the change event and the size of the drive as 100GB, but I don't think the old value of 40GB is retained anywhere. So this isn't going to help you get back to the original settings; you'll need to investigate elsewhere for that.

Scheduled Task – Audit AD Group Membership with PowerShell

There are a number of AD groups which I must provide membership reports on. Let’s say for the sake of this article that it has to be a weekly report. I can easily set this up using the PowerShell module for AD (provided in the RSAT for desktop OS) and my email function Email Array of Objects Using PowerShell. Once that’s working, I just need to add the full script as a scheduled task on a Windows server.

First off, I'm going to set up the information needed to send out the email.

$dateSimple = get-date -UFormat "%m/%d/%Y"
$groupName = "Admin Group"
$to = ""
$from = ""
$subject = "Group Membership Report for $groupName on $dateSimple"
$smtp = ""

Now let's grab the group membership information we need and send the email using my function. A couple of notes here. First, you may need to include a line to manually import the AD PS module (Import-Module ActiveDirectory) at the beginning of the script. Newer versions of PS do this for you automatically. Also, you need to include the code for my email function in the script or import it as a module.

$groupMembers = Get-ADGroupMember $groupName -Recursive | select Name,SamAccountName,DistinguishedName | sort samaccountname
Send-EmailHTML -To $to -From $from -Subject $subject -SMTPServer $smtp -BodyAsArray $groupMembers

Here is an example of the full code: AuditGroupMembership

The resultant email will look something like this:


Now I just need to schedule a task to run this script every week on a Windows server that has PowerShell and the AD PS module installed. I'll copy the full script out to C:\AuditScript.ps1 on the server. Now I just need to set up the task.

I’m going to create a new task on one of my Windows Server 2008 R2 boxes. I’m going to name the task and select the option to run it whether the user is logged in or not. You may want to specify a different user ID here as well. Ideally you would use an account where the password doesn’t change. Otherwise you will have to update the stored password periodically.


I'm going to set my trigger here for weekly on Mondays at 5 AM.


Last, I need to set up the action to run my script with PowerShell. It's probably best to use the full path to the PowerShell executable, but for simplicity I'm not going to do that here. Then I'm going to pass the full path to my script in as an argument.
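The action fields end up looking something like this (the -ExecutionPolicy Bypass flag is optional, but it avoids the task failing if the server's execution policy is restrictive; the script path matches where I copied it above):

```
Program/script: powershell.exe
Add arguments:  -ExecutionPolicy Bypass -File "C:\AuditScript.ps1"
```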


When I go to save the task it will ask me to enter the password for the user ID that will run the script and I’m done. Now I don’t have to manually gather this information for the report, nor do I have to email it off to someone manually. PowerShell and the task scheduler will do all that for me each week!

Cleanup Stale Files on VMware Datastores – PowerCLI

In large VMware environments, it's quite common to have random files left over on VMFS datastores. Sometimes this is explainable: maybe an admin removes a VM from inventory instead of deleting it from disk and then forgets about it. Other times I've seen Storage vMotion leave entire VMDK files behind after a migration, or when a migration fails midway through. Regardless of how it happens, it can be impractical to clean this up manually in a large environment. To better identify these leftover disk hogs, I use a PowerCLI function to look for unused directories, which I run each month with the results emailed to myself for review. The function scans the top-level directories on a datastore and will consider a directory stale if none of the files within have been modified in the past 30 days (the default value).

Since this can take a very long time to run, it outputs what datastore/directory it is currently looking at so you know it isn't hung up. The function will return an array of custom PS objects, each containing the directory name and the datastore that directory lives on, for every stale directory identified. It will NOT attempt to delete anything; it is simply for reporting. It works by creating a VimDatastore PSDrive to each datastore and then using the basic Get-ChildItem cmdlet to browse around the files/directories.

You can run it with no parameters and it will use 30 days as the default for StaleAfterDays and run against all datastores returned by Get-Datastore. Or you can ignore local disks by passing -SkipNonSharedDatastores, and if you are connected to a vCenter instance you can target a specific datacenter by passing the -TargetDatacenter argument.


function Get-StaleDatastoreDirs {
    <#
    .SYNOPSIS
    Get a list of datastore directories where all files are inactive for more than $StaleAfterDays days
    .DESCRIPTION
    Get-StaleDatastoreDirs traverses top level datastore directories looking for any directory where all
    the files within haven't been written to for over $StaleAfterDays days.  The default is >30 days
    .EXAMPLE
    Get-StaleDatastoreDirs

    This command will run against all datastores and use a default value of 30 for $StaleAfterDays
    .EXAMPLE
    Get-StaleDatastoreDirs -StaleAfterDays 60 -TargetDatacenter "My Datacenter" -SkipNonSharedDatastores

    This command will run against datastores in the specified datacenter and return any that have over 60
    days of inactivity.  It will also skip any non-shared datastores
    #>
    [CmdletBinding()] param (
        [Parameter(Mandatory=$false)] [Int]$StaleAfterDays = 30,
        [Parameter(Mandatory=$false)] [Switch]$SkipNonSharedDatastores,
        [Parameter(Mandatory=$false)] [String]$TargetDatacenter
    )
    process {

        $current = Get-Date
        $listOfStaleDirs = @()
        try {
            if( $SkipNonSharedDatastores ) {
                if( $TargetDatacenter ) {
                    $dsList = Get-Datastore -Location (Get-Datacenter $TargetDatacenter -ErrorAction Stop) -ErrorAction Stop | ?{$_.ExtensionData.Summary.MultipleHostAccess}
                }
                else {
                    $dsList = Get-Datastore -ErrorAction Stop | ?{$_.ExtensionData.Summary.MultipleHostAccess}
                }
            }
            else {
                if( $TargetDatacenter ) {
                    $dsList = Get-Datastore -Location (Get-Datacenter $TargetDatacenter -ErrorAction Stop) -ErrorAction Stop
                }
                else {
                    $dsList = Get-Datastore -ErrorAction Stop
                }
            }
            $dsList = $dsList | sort Name
        }
        catch {
            Write-Host $_ -ForegroundColor Magenta
            return
        }
        foreach ($datastore in $dsList) {

            Write-Host "Scanning datastore:" $datastore.Name
            New-PSDrive -Location $datastore -Name ds -PSProvider VimDatastore -Root '\' | Out-Null
            [array]$dirs = Get-ChildItem -Path 'ds:\' | ?{$_.Name -ne ".vSphere-HA" -and $_.ItemType -eq "Folder"} | sort Name
            foreach ($dir in $dirs) {
                Write-Host "`tScanning directory:" $dir.Name
                $numOfStaleFiles = 0
                [array]$subfiles = Get-ChildItem -Path $dir.FullName -Recurse | ?{$_.ItemType -ne "Folder"}
                foreach ($file in $subfiles) {
                    #check to see if current file has been written to in $StaleAfterDays num of days
                    if ( ($current.Subtract($file.LastWriteTime)).TotalDays -gt $StaleAfterDays ) {
                        $numOfStaleFiles++
                    }
                }
                #if all files in directory haven't been written to in $StaleAfterDays num of days
                if($subfiles.length -eq $numOfStaleFiles) {
                    Write-Host "`tFound directory stale for over $StaleAfterDays days:" $dir.Name -ForegroundColor Cyan
                    $custDirObj = New-Object PSObject -Property @{
                        DatastoreName = $datastore.Name
                        DirectoryName = $dir.Name
                    }
                    $listOfStaleDirs += $custDirObj
                }
            }
            Remove-PSDrive ds
        }
        return $listOfStaleDirs
    }
}