
How to improve performance of 3 successive GCI -Recurse calls?

PowerShell noob here. In order to create a list of potential duplicate dirs, I have a loop that runs the following three GCI commands on every directory to get the total size, the number of files, and the number of directories below the directory currently being examined:

$folderSize = Get-ChildItem -Path $fullPath -Recurse -Force -ErrorAction SilentlyContinue | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
$folderDirs = Get-ChildItem -Path $fullPath -Recurse -Force -ErrorAction SilentlyContinue -Directory | Measure-Object -ErrorAction SilentlyContinue
$folderFiles = Get-ChildItem -Path $fullPath -Recurse -Force -ErrorAction SilentlyContinue -File | Measure-Object -ErrorAction SilentlyContinue

The code works fine, but it seems really dumb to run GCI with -Recurse three times on the same path. What would be a more efficient way to get those three pieces of information for a given directory?

Store the results of the first query in a variable, then use the .Where({}) extension method to split them into two categories based on the PSIsContainer property - at which point you can reference the automagical Count property of each (rather than invoking Measure-Object for the simple act of counting the items):

$allFileSystemItems = Get-ChildItem -Path $fullPath -Recurse -Force -ErrorAction SilentlyContinue
$size = $allFileSystemItems | Measure-Object Length -Sum

# split collections into "directories" and "files"
$dirs, $files = $allFileSystemItems.Where({ $_.PSIsContainer }, 'Split')

$dirCount = $dirs.Count
$fileCount = $files.Count
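
One caveat: the approach above holds every enumerated item in memory before splitting. If the trees involved are very large, the same three figures can be gathered in a single streaming pass instead. A minimal sketch, assuming the $fullPath variable from the question; the function name Get-DirStats and the output property names are placeholders of my own:

function Get-DirStats {
    param([string]$Path)

    $size = 0; $dirCount = 0; $fileCount = 0

    # One recursive enumeration; classify and accumulate as items stream by,
    # so the full item collection is never held in memory.
    Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
        ForEach-Object {
            if ($_.PSIsContainer) { $dirCount++ }
            else { $fileCount++; $size += $_.Length }
        }

    # Package the metrics per directory - handy for building the duplicate list.
    [PSCustomObject]@{
        Path      = $Path
        TotalSize = $size
        DirCount  = $dirCount
        FileCount = $fileCount
    }
}

$stats = Get-DirStats -Path $fullPath

This trades the ready-made $dirs/$files collections for flat memory use; for moderately sized trees, the .Where('Split') version is simpler and plenty fast.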
