
A PowerShell script to find the file size and file count of a folder with millions of files?

The purpose of the script is the following:

  1. Print the number of files recursively found within a directory (omitting folders themselves)
  2. Print the total sum file size of the directory
  3. Not crash the computer because of massive memory use.

So far (3) is the tough part.

Here is what I have written and tested so far. This works perfectly well on folders with a hundred or even a thousand files:

$hostname=hostname
$directory = "foo"
$dteCurrentDate = Get-Date -f "yyyy/MM/dd"

$FolderItems = Get-ChildItem $directory -recurse
$Measurement = $FolderItems | Measure-Object -property length -sum
$colitems = $FolderItems | measure-Object -property length -sum
"$hostname;{0:N2}" -f ($colitems.sum / 1MB) + "MB;" + $Measurement.count + " files;" + "$dteCurrentDate"

On folders with millions of files, however, the $colitems variable becomes so massive from collecting information about millions of files that it makes the system unstable. Is there a more efficient way to gather and store this information?

If you use streaming and pipelining, you should reduce the problem with (3) a lot, because when you stream, each object is passed along the pipeline as and when it is available and does not take up much memory, so you should be able to process millions of files (though it will take time).

Get-ChildItem $directory -recurse | Measure-Object -property length -sum
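
For completeness, here is a sketch (mine, not part of the original answer) of the question's whole report produced from one streamed pipeline; the variable names and output format follow the question, and the Where-Object stage drops folders so that Measure-Object only ever sees files:

$hostname = hostname
$directory = "foo"
$dteCurrentDate = Get-Date -f "yyyy/MM/dd"

# Only the one-object summary from Measure-Object stays in memory;
# the file objects themselves are streamed and discarded as they pass through.
$m = Get-ChildItem $directory -Recurse |
        Where-Object { -not $_.PSIsContainer } |
        Measure-Object -Property Length -Sum

"{0};{1:N2}MB;{2} files;{3}" -f $hostname, ($m.Sum / 1MB), $m.Count, $dteCurrentDate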

I don't believe @Stej's statement, "Get-ChildItem probably reads all entries in the directory and then begins pushing them to the pipeline", is true. Pipelining is a fundamental concept of PowerShell (provided the cmdlets, scripts, etc. support it). It ensures that processed objects are passed along the pipeline one by one as and when they are available, and also only when they are needed. Get-ChildItem is not going to behave differently.

A great example of this is given in Understanding the Windows PowerShell Pipeline.

Quoting from it:

The Out-Host -Paging command is a useful pipeline element whenever you have lengthy output that you would like to display slowly. It is especially useful if the operation is very CPU-intensive. Because processing is transferred to the Out-Host cmdlet when it has a complete page ready to display, cmdlets that precede it in the pipeline halt operation until the next page of output is available. You can see this if you use the Windows Task Manager to monitor CPU and memory use by Windows PowerShell.

Run the following command: Get-ChildItem C:\Windows -Recurse. Compare the CPU and memory usage to this command: Get-ChildItem C:\Windows -Recurse | Out-Host -Paging.

Benchmark on using Get-ChildItem on c:\ (about 179,516 files, not millions, but good enough):

Memory usage after running $a = gci c:\ -recurse (and then doing $a.count) was 527,332K.

Memory usage after running gci c:\ -recurse | measure-object was 59,452K and never went above around 80,000K.

(Memory figures are the Private Working Set of the powershell.exe process, taken from Task Manager. Initially, it was about 22,000K.)

I also tried with two million files (it took me a while to create them!)

Similar experiment:

Memory usage after running $a = gci c:\ -recurse (and then doing $a.count) was 2,808,508K.

Memory usage while running gci c:\ -recurse | measure-object was 308,060K and never went above around 400,000K. After it finished, a [GC]::Collect() was needed for it to return to the 22,000K level.
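
(An aside, not part of the original measurements: if you would rather not keep Task Manager open, a rough equivalent of that figure can be read from inside the session; PrivateMemorySize64 only approximates the Private Working Set column.)

# Force a collection, then report the private memory of this powershell.exe in KB.
[GC]::Collect()
"{0:N0}K" -f ((Get-Process -Id $PID).PrivateMemorySize64 / 1KB)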

I am still convinced that Get-ChildItem and pipelining can get you great memory improvements even for millions of files.

Get-ChildItem probably reads all entries in the directory and then begins pushing them to the pipeline. If Get-ChildItem doesn't work well, try switching to .NET 4.0 and using EnumerateFiles and EnumerateDirectories:

function Get-HugeDirStats($directory) {
    function go($dir, $stats)
    {
        # EnumerateFiles yields one path at a time instead of building an array.
        foreach ($f in [system.io.Directory]::EnumerateFiles($dir))
        {
            $stats.Count++
            $stats.Size += (New-Object io.FileInfo $f).Length
        }
        # Recurse into subdirectories, which are also enumerated lazily.
        foreach ($d in [system.io.directory]::EnumerateDirectories($dir))
        {
            go $d $stats
        }
    }
    $statistics = New-Object PsObject -Property @{Count = 0; Size = [long]0 }
    go $directory $statistics

    $statistics
}

#example
$stats = Get-HugeDirStats c:\windows

Here the most expensive part is the one with New-Object io.FileInfo $f, because EnumerateFiles returns just the file names. So if the count of files is enough, you can comment out that line.
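
As a further illustration (my sketch, not from the original answer): if only the file count matters, .NET 4.0 can also walk the whole tree lazily in a single call, which removes the recursive helper entirely. Unlike the hand-rolled recursion, though, one access-denied subfolder will abort the whole enumeration. The function name Get-HugeDirFileCount is hypothetical.

function Get-HugeDirFileCount($directory) {
    $count = 0
    # EnumerateFiles with AllDirectories yields every file path lazily; nothing is buffered.
    foreach ($f in [System.IO.Directory]::EnumerateFiles($directory, '*', [System.IO.SearchOption]::AllDirectories))
    {
        $count++
    }
    $count
}

#example
$fileCount = Get-HugeDirFileCount c:\windows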

See Stack Overflow question How can I run PowerShell with the .NET 4 runtime? to learn how to use .NET 4.0.
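
(Once you have followed that question, a quick way to confirm which CLR your session actually ended up on; not part of the original answer:)

# Should report a 4.0.30319.* version once PowerShell is hosted on the .NET 4 CLR.
$PSVersionTable.CLRVersion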


You may also use the plain old methods, which are also fast but read all the files in each directory at once. So it depends on your needs; just try it. A comparison of all the methods follows later.

function Get-HugeDirStats2($directory) {
    function go($dir, $stats)
    {
        # GetFiles() returns the whole FileInfo[] for each directory at once (eager).
        foreach ($f in $dir.GetFiles())
        {
            $stats.Count++
            $stats.Size += $f.Length
        }
        foreach ($d in $dir.GetDirectories())
        {
            go $d $stats
        }
    }
    $statistics = New-Object PsObject -Property @{Count = 0; Size = [long]0 }
    go (new-object IO.DirectoryInfo $directory) $statistics

    $statistics
}

Comparison:

Measure-Command { $stats = Get-HugeDirStats c:\windows }
TotalSeconds      : 64,2217378
...

Measure-Command { $stats = Get-HugeDirStats2 c:\windows }
TotalSeconds      : 12,5851008
...

Measure-Command { Get-ChildItem c:\windows -recurse | Measure-Object -property length -sum }
TotalSeconds      : 20,4329362
...

@manojlds: Pipelining is a fundamental concept. But as a concept it has nothing to do with the providers. The file system provider relies on the .NET implementation (.NET 2.0), which has no lazy evaluation capabilities (i.e., enumerators). Check that yourself.
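
One way to see the eager-versus-lazy difference at the .NET level for yourself (an illustration of mine, not from the comment; C:\SomeBigTree is a placeholder for any large folder you can read in full, since both calls will throw if they hit an access-denied subfolder):

$big = 'C:\SomeBigTree'   # placeholder: any large folder you have full read access to
$all = [System.IO.SearchOption]::AllDirectories

# Eager: GetFiles() does not return until the entire string[] has been built.
Measure-Command { $null = [System.IO.Directory]::GetFiles($big, '*', $all) }

# Lazy: EnumerateFiles() yields names one at a time; taking the first item and
# breaking out touches only a small part of the tree.
Measure-Command {
    foreach ($f in [System.IO.Directory]::EnumerateFiles($big, '*', $all)) { break }
}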

The following function is quite cool and fast for calculating the size of a folder, but it doesn't always work (especially when there is a permission problem or the folder path is too long).

Function sizeFolder($path) # Return the size in MB.
{
    $objFSO = New-Object -com  Scripting.FileSystemObject
    ("{0:N2}" -f (($objFSO.GetFolder($path).Size) / 1MB))
}
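
(An example call, mine rather than the answer's, with the failure modes mentioned above surfaced as a warning instead of a terminating COM error:)

try {
    sizeFolder 'C:\Windows'
} catch {
    Write-Warning "FileSystemObject could not size the folder: $_"
}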
