I occasionally find myself needing to determine which folders or files a user or group does or does not have access to.  With nested security groups things can get even more tricky.

I ended up writing Get-ntfsAccess, a quick function that does this for me.  I feed in one or more paths to test and one or more users/groups, and I get back either all the folders they have access to, or all the folders they don’t have access to.  The function does the following:

  1. If specified, define and run Get-NestedGroups on the provided entities to provide all nested security groups
  2. Check ACL of each path for any ACE matching the users/groups or their nested security groups

Define and run Get-NestedGroups

This function is a bit longer than the Get-ntfsAccess function it is defined in.

$ADObjectDetails = Get-ADObject -filter "samAccountName -eq '$ADObject'" -Properties memberOf

Get the memberOf property for the object – we need this to determine if it’s a group or another object.

$ADObjectDetails | select -expand memberOf | foreach {
     $subGroups += Get-ADGroup $_ | select -ExpandProperty samAccountName
}

If it isn’t a group, we want to know which security groups provide access.  Loop through memberOf and get the samAccountName for each of these groups.

if($group){  $subGroups = ( Get-ADGroupMember -Identity $ADObject | where {$_.objectClass -like "group"} ).samAccountName  }

If it is a group, we only care which groups are inside the security group.

#If there are sub groups, recurse through them
if($subGroups){

    #add results of current query
    $allNestedGroups = [string[]]$subGroups

    #initialize subSubGroups collection
    $subSubGroupsCollection = [string[]]""

    #look for sub-sub-groups in each sub group
    foreach($subgroup in $subgroups){

        #Run query, return query results for verbose output, then add them to a collection
        $subSubGroups = Get-NestedGroups -ADObject $subgroup -nestLevel $nestLevel
        if($subSubGroups){ Write-Verbose "$tabs Level $nestLevel $subgroup returned the following groups:`n$( $subSubGroups | Out-String )" }
        $subSubGroupsCollection += $subSubGroups
    }

    #add results from subgroups only if they aren't blank or already included
    $allNestedGroups += $subSubGroupsCollection | Where-Object { $_ -and $allNestedGroups -notcontains $_ }
}

#If we hit an empty group, return with nothing

#Once we've recursed through all groups, return results
Write-Verbose "$tabs Level $nestLevel Returning from $ADObject with the following nested groups:`n$( $allNestedGroups | Out-String )"
$allNestedGroups

After this, we add each group in subGroups to allNestedGroups, then recursively call the function on each of those groups.
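The recursion itself can be sketched in isolation. In the sketch below, a hashtable stands in for the Get-ADGroupMember lookups so the logic can be followed without a domain; the table contents, the function name, and the seen-list guard against circular nesting are all invented for illustration:

```powershell
#A lookup table standing in for Get-ADGroupMember (invented data, including a circular nesting)
$membership = @{
    'GroupA' = 'GroupB','GroupC'
    'GroupB' = 'GroupD'
    'GroupC' = @()
    'GroupD' = 'GroupB'   #circular: D nests B, which nests D
}

function Get-NestedGroupSketch {
    param([string]$Group, [string[]]$Seen = @())

    #Remember this group so circular nesting doesn't recurse forever
    $Seen += $Group

    foreach($subGroup in $membership[$Group]){
        #Skip groups already on this path
        if($Seen -contains $subGroup){ continue }

        #Emit the sub group, then recurse into it
        $subGroup
        Get-NestedGroupSketch -Group $subGroup -Seen $Seen
    }
}

Get-NestedGroupSketch -Group 'GroupA' | Sort-Object -Unique   # GroupB, GroupC, GroupD
```

The seen-list serves the same purpose as the "already included" check in the function above: without it, the circular GroupB/GroupD pair would recurse indefinitely.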

Check out the innards of Get-ntfsAccess on Script Center for the full definition of the function.

Check ACL of each path for any matching ACE

#Loop through each folder
foreach($folder in $folderlist){

    #Get ACL for the folder
    Write-Verbose "Checking access for $folder"
    $accessList = (Get-Acl $folder).Access

    #Set access to null - used for determining noAccess
    $access = $null

    #For each access item in the ACL
    foreach($accessItem in $accessList){

        #Identify the group and track overall access for the -noAccess parameter
        $accessItemGroup = $accessItem.IdentityReference.Value

        #loop through groups we are searching for
        foreach($group in $entity){

            #if we match a group...
            if($accessItemGroup -like $group){

                #add it to a list, if listOnly or noAccess
                if($listOnly -or $noAccess){

                    #add the result unless $noAccess is specified
                    if(-not $noAccess){ $folderResults += $folder | Where-Object { $folderResults -notcontains $_ } }
                    $access = 1
                    Write-Verbose "Access for $folder"
                }

                #otherwise, add an object with access information to results
                else{

                    #add the result unless $noAccess is specified
                    if(-not $noAccess){
                        $folderResults += [pscustomobject] @{
                            Path              = $folder
                            Group             = $accessItemGroup
                            FileSystemRights  = $accessItem.FileSystemRights
                            AccessControlType = $accessItem.AccessControlType
                            IsInherited       = $accessItem.IsInherited
                            InheritanceFlags  = $accessItem.InheritanceFlags
                            PropagationFlags  = $accessItem.PropagationFlags
                        }
                    }
                    $access = 1
                    Write-Verbose "Access for $folder"
                }
            }
        }
    }

    #if we didn't find a group matching input, $access is still $null
    if(-not $access -and $noAccess){
        $folderResults += $folder
        Write-Verbose "No access for $folder"
    }
}

This is the final piece of the code.  I tried to comment things thoroughly to show what’s going on.

We have all the users/groups and folders (or files) to test.  The first stage takes a single path and gets its ACL.  The next stage checks each ACE in the ACL against the users/groups we are interested in.  When one matches, we add it to the results (if checking for access) or note that access exists (if checking for no access).  Once we’ve checked every ACE in an ACL, if nothing matched we note that there is no access (if checking for no access).
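The heart of the ACE check is a wildcard comparison between each ACE's IdentityReference and the entities passed in. Here is a minimal sketch with mocked-up ACEs; the objects and account names below are invented, and a real ACL would come from (Get-Acl $folder).Access, where the identity is reached via .IdentityReference.Value:

```powershell
#Entity we are checking for - wildcards allowed, matching the -like comparison in the function
$entity = 'CONTOSO\Finance*'

#Mocked-up ACEs standing in for (Get-Acl $folder).Access
$accessList = @(
    [pscustomobject]@{ IdentityReference = 'CONTOSO\FinanceAdmins'; FileSystemRights = 'FullControl' }
    [pscustomobject]@{ IdentityReference = 'BUILTIN\Users';         FileSystemRights = 'ReadAndExecute' }
)

#Keep only ACEs whose identity matches the entity we care about
$matched = $accessList | Where-Object { $_.IdentityReference -like $entity }
$matched.IdentityReference   # CONTOSO\FinanceAdmins
```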

The function in action

Pick up the script here and make sure you are running PowerShell 3.

I start out with C:\temp:

Everything has default inherited ACEs, apart from one I added for a group nested under VSR VDI PFS:

Perhaps I only want a list of folders the provided entities have access to:

Lastly, the folders that VSR VDI PFS does not have access to (all but one):

Bonus:  You can run this on network shares

My next step is to add a parameter that controls the depth of path recursion.  As it is, I’m using the built-in -Recurse parameter of Get-ChildItem, but I can see situations where you only care about the first few levels of directories.

On an aside, Raimund Andrée wrote a fantastic module that covers managing permissions beyond get and set-acl – check it out here.

Parallel PowerShell: Part II

I posted on parallelization in PowerShell a short while back.  Check out that post for a number of references that I won’t be including here.

I ran into situations where a few threads would freeze, preventing tasks from running after the parallel code completes.  There is likely a more official way to do this, but here is how I implemented a timeout for each thread.

Using Boe Prox’s code, I hacked together Run-Parallel.  It does the following:

  1. Define Get-RunspaceData, a function that loops through the runspaces and cleans up any that are done or past their max runtime
  2. Take in the code we will run against various computers
  3. Create a runspace pool
  4. For each computer to run against
    • Create a PowerShell instance with the scriptblock, add the computer name as an argument
    • Add details (Computer, PowerShell instance, start time, etc.) to an array
    • Run Get-RunspaceData
  5. After all computers are queued up, run Get-RunspaceData until everything has completed

Let’s step through the code for each of these starting at (2):

Take in the code we will run against various computers

        #If scriptblock is not specified, convert the script file to a script block.
        if(-not $scriptblock){
            [scriptblock]$scriptblock = [scriptblock]::Create( (Get-Content $scriptfile | Out-String) )
        }
        #If scriptblock is specified, add a parameter definition to the first line
        else{
            $ScriptBlock = $ExecutionContext.InvokeCommand.NewScriptBlock("param(`$_)`r`n" + $Scriptblock.ToString())
        }

In this block, we convert $scriptfile into a scriptblock; otherwise, we take in $scriptblock and add param($_) to the first line.  This lets you use the Run-Parallel function like a foreach(){} – you just reference the computer as $_.
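The injection is easy to see in isolation. Below I use [scriptblock]::Create(), which behaves the same as $ExecutionContext.InvokeCommand.NewScriptBlock() for this purpose; the scriptblock body and the computer name are just examples:

```powershell
#A plain scriptblock that references $_ as if it were in a foreach loop
$ScriptBlock = { "Pinging $_" }

#Prepend param($_) so the first argument lands in $_
$ScriptBlock = [scriptblock]::Create("param(`$_)`r`n" + $ScriptBlock.ToString())

#Invoke with an argument, as Run-Parallel does per computer
& $ScriptBlock 'server01'   # Pinging server01
```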

Create a runspace pool

        $sessionstate = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
        $runspacepool = [runspacefactory]::CreateRunspacePool(1, $Throttle, $sessionstate, $Host)

In this block, we create a default session state, and create a runspace pool with this state and the throttle parameter (how many threads to run at once).

On a side note, if you want certain variables or modules to be available to all sessions, this is where you do it. More details here.  For example (this isn’t included in my function):

$sessionstate = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()

#Make a module available to every runspace created from this pool
$sessionstate.ImportPSModule("ActiveDirectory")

For each computer

        ForEach ($Computer in $Computers) {
           #Create the powershell instance and supply the scriptblock with the other parameters
           $powershell = [powershell]::Create().AddScript($ScriptBlock).AddArgument($computer)

           #Add the runspace into the powershell instance
           $powershell.RunspacePool = $runspacepool

           #Create a temporary collection for each runspace
           $temp = "" | Select-Object PowerShell,Runspace,Computer,StartTime
           $temp.Computer = $Computer
           $temp.PowerShell = $powershell
           $temp.StartTime = get-date

           #Save the handle output when calling BeginInvoke() that will be used later to end the runspace
           $temp.Runspace = $powershell.BeginInvoke()
           Write-Verbose ("Adding {0} collection" -f $temp.Computer)
           $runspaces.Add($temp) | Out-Null

           Write-Verbose ("Checking status of runspace jobs")
           Get-RunspaceData @runspacehash

In this block, we loop through each computer.

We create the powershell instance (if needed, you can supply more arguments by tacking on additional .AddArgument() calls), and add it to the pool.
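The .AddArgument() chaining works like positional parameters. A tiny self-contained example, using a synchronous Invoke() rather than a pool for clarity, with invented parameter names and values:

```powershell
#Scriptblock expecting two positional arguments
$sb = [scriptblock]::Create('param($name, $count) "$name x $count"')

#Each AddArgument() call feeds the next positional parameter
$ps = [powershell]::Create().AddScript($sb).AddArgument('server01').AddArgument(3)
$result = $ps.Invoke()
$ps.Dispose()

$result   # server01 x 3
```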

We create $temp, which contains the computer, powershell instance, start time, and handle output from beginInvoke.  We add this object to the $runspaces array that will be used for tracking each runspace.

Finally, we run Get-RunspaceData.  We do this for each computer because we may start getting results before we can build up all the runspaces.


        Function Get-RunspaceData {

            Do {
                #set more to false
                $more = $false

                Write-Progress  -Activity "Running Query"`
                    -Status "Starting threads"`
                    -CurrentOperation "$count threads created - $($runspaces.count) threads open"`
                    -PercentComplete (($totalcount - $runspaces.count) / $totalcount * 100)

                #run through each runspace.
                Foreach($runspace in $runspaces) {

                    $runtime = (get-date) - $runspace.startTime

                    #If runspace completed, end invoke, dispose, recycle
                    If ($runspace.Runspace.isCompleted) {
                        $runspace.powershell.EndInvoke($runspace.Runspace)
                        $runspace.powershell.Dispose()
                        $runspace.Runspace = $null
                        $runspace.powershell = $null
                    }

                    #If runtime exceeds max, dispose the runspace
                    ElseIf ($runtime.totalMinutes -gt $maxRunTime) {
                        $runspace.powershell.Dispose()
                        $runspace.Runspace = $null
                        $runspace.powershell = $null
                    }

                    #If runspace isn't null, set more to true
                    ElseIf ($runspace.Runspace -ne $null) {
                        $more = $true
                    }
                }

                #After looping through runspaces, if more and wait, sleep
                If ($more -AND $PSBoundParameters['Wait']) {
                    Start-Sleep -Milliseconds $SleepTimer
                }

                #Clean out unused runspace jobs
                $temphash = $runspaces.clone()
                $temphash | Where {
                    $_.runspace -eq $Null
                } | ForEach {
                    Write-Verbose ("Removing {0}" -f $_.Computer)
                    $runspaces.remove($_)
                }

            #Stop this loop only when $more is false, if -Wait was specified
            } while ($more -AND $PSBoundParameters['Wait'])

        #End of runspace function
        }

So, now we have $runspaces, an array holding information on all the parallel threads we will be running. When we run Get-RunspaceData, the function loops through $runspaces to check whether each runspace has completed or has overrun the maxRunTime we specified, cleaning up as necessary.
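The timeout test itself is just date arithmetic against the recorded start time. A minimal sketch, where the object below is a stand-in for one entry in $runspaces:

```powershell
#Maximum minutes a thread may run before being cleaned up
$maxRunTime = 5

#Stand-in for a $runspaces entry whose thread started six minutes ago
$runspace = [pscustomobject]@{ StartTime = (Get-Date).AddMinutes(-6) }

#The same comparison Get-RunspaceData performs on each pass
$expired = ((Get-Date) - $runspace.StartTime).TotalMinutes -gt $maxRunTime
$expired   # True
```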

Run Get-RunspaceData until everything has completed

        $runspacehash.Wait = $true
        Get-RunspaceData @runspacehash

We’re at the end! $runspaces contains all (remaining) threads that still need to run. Here, we simply set the -Wait parameter for Get-RunspaceData. When this is specified, the function keeps looping until $runspaces is empty.


That’s about it!  The full script is available here in the Script Repository.  I also added ForEach-Parallel – this adds the same runtime tracking to Tome Tanasovski’s version of Foreach-Parallel.

If anyone has any suggestions on revising this, please let me know!  I assume there is some computational overhead, as Get-Date runs for every single runspace when looping through $runspaces.  When you’re querying thousands of computers, this might not be the most efficient way to implement timeouts on a runspace.


The method I use to track start time is not accurate.  Runspaces are all defined up front and queued in the runspace pool.  This means we record a start time, the runspace may then wait in the queue for a long time if the pool is full, and the timeout can expire the runspace immediately once it finally runs, since the start time was recorded at queue time.

I changed the default timeout to 0.  If you do want to use this parameter, set it to roughly the duration you expect all threads to take to complete, and it will provide a simple fail-safe against hung threads.

Boe Prox suggested using nested runspaces: create an intermediary runspace that tracks the runtime of the real runspace.  This sets the timeout accurately, but falls short on performance.  Given that the motivation behind these commands is performance, I won’t be using this method.

I added a ‘maxQueue’ control that will ensure the runspacepool is not filled up immediately.  It is defined as ($throttle * 2) + 1, to help ensure the runspacepool always has at least ($throttle + 1) runspaces queued up and ready to run.  This can be changed depending on whether you prefer an accurate timeout or performance.
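The arithmetic above is worth a concrete example. With a throttle of 10 (values invented for illustration):

```powershell
$Throttle = 10

#maxQueue as defined above
$maxQueue = ($Throttle * 2) + 1   # 21

#runspaces guaranteed queued and ready while $Throttle threads are running
$maxQueue - $Throttle             # 11
```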

Get-VolatileInfo – Quick Troubleshooting Info

When I run into an unknown issue with one or more workstations or servers, I often run through the same set of steps.  Eventually, I find a tidbit of information that ideally identifies the root cause of the issue, or at worst points me in a direction to find more information.

I decided to script out this basic first step.  Get-VolatileInfo will run a number of queries against one or more remote computers.  It returns the info it collects in global variables you can then manipulate to find what you need; alternatively, it provides searchable, sortable HTM files thanks to a method I borrowed from Douglas Finke.

Head over to the Script Center repository for the function code or ps1 file.  This link includes details on installation and dependencies.  Once you have the function loaded, Get-Help Get-VolatileInfo -Full will provide help with parameters and examples.

How does it work?

At a high level, I define dependency functions, run through each computer listed to collect certain info, and read that information.  Here’s what it might look like if I heard there was trouble with c-is-hyperv-1:


I now have a number of variables populated with troubleshooting information for c-is-hyperv-1.  The text below tells me the command I can run to list these again, or I could pipe the command to Remove-Variable to get rid of the variables.  An Explorer window also popped up with the HTM files created by the -View parameter.

Perhaps I want to see what server lsass.exe is talking with.  I pull up the c-is-hyperv-1netstat.htm file and search for lsass:


Alternatively, I can do the same via PowerShell (Note:  I’m enclosing the variable name in curly brackets as it has dashes in it – tab completion will do this for you!):
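The curly-bracket syntax is easy to demonstrate on its own; the variable name and value below are invented:

```powershell
#Dashes are not valid in a bare variable name, so wrap the name in ${}
${c-is-demo-1} = 'some collected data'
${c-is-demo-1}   # some collected data
```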


What information does this provide?

Here’s what I’m collecting thus far, all of which is in object form (or in the handy HTM files):

  • AutoRuns: This is a full list of everything from AutoRunsC.exe.  You can sort or filter by category to weed out anything you don’t need.  It shows a comprehensive list of items that auto start or affect auto start.
  • ComputerInfo: A bunch of info from WMI – hostname, OS, SP, CPU, RAM, Free Space for C:\, last reboot time, system time, and the difference between system time and the system time from the computer you ran the query on.
  • CurrentUsers:  Users from win32_loggedonuser
  • InstalledSoftware:  Software listed in the registry
  • LogApp: -eventCount events from Application event log (Default is 250)
  • LogSec: -eventCount events from Security event log (Default is 250)
  • LogSys: -eventCount events from System event log (Default is 250)
  • NetStat: From Get-NetworkStatistics – Netstat –ano results with hostnames and process names resolved, in object form
  • NetworkAdapter#:  A variety of configuration info for each IP-enabled network adapter, starting with NetworkAdapter0, NetworkAdapter1 …
  • OpenFiles:  Info on open files
  • Processes:  Info from Get-Process
  • Profiles:  List of profiles and the date ntuser.dat was last written to.
  • Reboots:  Any instances of eventID 6009 from event logs, indicative of system starting up
  • Services: Info from Get-Service
  • ServicesNotStarted:  Services set to autostart, but not running
  • Shares:  Info on shares

If you have any suggestions on information to collect or not to collect, or any other insight, it would be greatly appreciated!

PowerShell Resources

Update:  This list has been moved to this page rather than a post.  I will leave the content below unaltered, but be sure to refer to the new page for the rare update!


I’m starting to build a list of PowerShell resources for co-workers and me.  I plan to keep this up to date and add resources on a regular basis, so you may want to check back occasionally.

Cheat sheets and quick references

Blogs and other resources


Videos and Podcasts

Books and eBooks

Tools and Add-ons

This is what I have so far. Some of it is a bit dated; for example, PowerGUI isn’t quite as active as it once was.  I’m likely overlooking a number of resources that I simply find using Google, or that are out of the scope of what I do (e.g. I don’t work with SharePoint).

If you have any suggestions on resources to include, how to better organize these, or any other insight, it would be greatly appreciated!