
Improve the efficiency of my PowerShell script

The code below searches for 400+ numbers from a list.txt file to see whether each one exists within any of the files under the specified folder path.

The script is very slow; it still had not finished after 25 minutes of running. The folder we are searching is 507 MB (532,369,408 bytes) and contains 1,119 files and 480 folders. Any help improving the speed and efficiency of the search is greatly appreciated.

$searchWords = (gc 'C:\temp\list.txt') -split ','
$results = @()
Foreach ($sw in $searchWords)
{
    $files = gci -path 'C:\Users\david.craven\Dropbox\Asset Tagging\_SJC Warehouse_\_Project Completed_\2018\A*' -filter "*$sw*" -recurse

    foreach ($file in $files)
    {
        $object = New-Object System.Object
    $object | Add-Member -Type NoteProperty -Name SearchWord -Value $sw
    $object | Add-Member -Type NoteProperty -Name FoundFile -Value $file.FullName
        $results += $object
    }

}

$results | Export-Csv C:\temp\output.csv -NoTypeInformation

The following should speed up your task substantially:

If the intent is truly to look for the search words in the file names:

$searchWords = (Get-Content 'C:\temp\list.txt') -split ','
$path = 'C:\Users\david.craven\Dropbox\Facebook Asset Tagging\_SJC Warehouse_\_Project Completed_\2018\A*'

Get-ChildItem -File -Path $path -Recurse -PipelineVariable file |
  Select-Object -ExpandProperty Name |
    Select-String -List -SimpleMatch -Pattern $searchWords |
      Select-Object @{n='SearchWord'; e={$_.Pattern}},
                    @{n='FoundFile'; e={$file.FullName}} |
        Export-Csv C:\temp\output.csv -NoTypeInformation

If the intent is to look for the search words in the files' contents:

$searchWords = (Get-Content 'C:\temp\list.txt') -split ','
$path = 'C:\Users\david.craven\Dropbox\Facebook Asset Tagging\_SJC Warehouse_\_Project Completed_\2018\A*'

Get-ChildItem -File -Path $path -Recurse |
  Select-String -SimpleMatch -Pattern $searchWords |
    Select-Object @{n='SearchWord'; e={$_.Pattern}},
                  @{n='FoundFile'; e={$_.Path}} |
      Export-Csv C:\temp\output.csv -NoTypeInformation

The keys to performance improvement:

  • Perform the search with a single command, by passing all search words to Select-String.

  • Instead of constructing custom objects in a script block with New-Object and Add-Member, let Select-Object construct the objects for you directly in the pipeline, using calculated properties.

  • Instead of building an intermediate array iteratively with += (which recreates the entire array behind the scenes on every addition), use a single pipeline to pipe the result objects directly to Export-Csv.
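To see the cost of += concretely, here is a small, illustrative micro-benchmark (my sketch, not part of the original answer) comparing array += against capturing the loop output in a single assignment:

```powershell
# Illustrative micro-benchmark: array += vs. capturing loop output directly.
$n = 10000

# Slow: += recreates the array on every addition (O(n^2) copying overall).
$slow = Measure-Command {
    $results = @()
    foreach ($i in 1..$n) { $results += [PSCustomObject]@{ Value = $i } }
}

# Fast: let PowerShell collect the loop's output into one array.
$fast = Measure-Command {
    $results = foreach ($i in 1..$n) { [PSCustomObject]@{ Value = $i } }
}

'+= loop      : {0:N0} ms' -f $slow.TotalMilliseconds
'direct capture: {0:N0} ms' -f $fast.TotalMilliseconds
```

On typical machines the += version is dramatically slower as $n grows, because each addition copies the whole array.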

So there are definitely some basic things in the PowerShell code you posted that can be improved, but it may still not be super fast. Based on the sample you gave us I'll assume you're looking to match the file names against a list of words. You're looping through the list of words (400 iterations) and in each loop you're looping through all 1,119 files. That's a total of 447,600 iterations!

Assuming you can't reduce the number of iterations in the loop, let's start by making each iteration faster. The Add-Member cmdlet is going to be really slow, so switch that approach up by casting a hashtable to the [PSCustomObject] type accelerator:

[PSCustomObject]@{
    SearchWord = $Word
    File       = $File.FullName
}

Also, there is no reason to pre-create an array object and then add each file to it. You can simply capture the output of the foreach loop in a variable:

$Results = Foreach ($Word in $Words)
{
...

So a faster loop might look like this:

$Words = Get-Content -Path $WordList
$Files = Get-ChildItem -Path $Path -Recurse -File

$Results = Foreach ($Word in $Words)
{    
    foreach ($File in $Files)
    {
        if ($File.BaseName -match $Word)
        {
            [PSCustomObject]@{
                SearchWord = $Word
                File       = $File.FullName
            }
        }
    }
}

A simpler approach might be to use Where-Object on the files array:

$Results = Foreach ($Word in $Words)
{
    $Files | Where-Object BaseName -match $Word
}

Try both and test out the performance.
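One way to compare them is to wrap each variant in Measure-Command. The harness below is my own sketch, not from the answer; it uses synthetic file objects so it runs stand-alone (swap in your real $Words and $Files):

```powershell
# Hypothetical timing harness; synthetic objects stand in for real files.
$Words = 'asset1', 'asset5', 'asset9'
$Files = 1..1000 | ForEach-Object {
    [PSCustomObject]@{ BaseName = "asset$_"; FullName = "C:\demo\asset$_.txt" }
}

# Variant 1: nested foreach with [PSCustomObject] output.
$loopTime = Measure-Command {
    $Results = foreach ($Word in $Words) {
        foreach ($File in $Files) {
            if ($File.BaseName -match $Word) {
                [PSCustomObject]@{ SearchWord = $Word; File = $File.FullName }
            }
        }
    }
}

# Variant 2: Where-Object over the pre-fetched file list.
$whereTime = Measure-Command {
    $Results = foreach ($Word in $Words) {
        $Files | Where-Object BaseName -match $Word
    }
}

'nested loop : {0:N1} ms' -f $loopTime.TotalMilliseconds
'Where-Object: {0:N1} ms' -f $whereTime.TotalMilliseconds
```

Note that the Where-Object variant returns the file objects themselves rather than SearchWord/File pairs, so compare like for like if you need the search word in the output.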

So if speeding up the loop doesn't meet your needs, try removing the loop entirely. You could use regex and join all the words together:

$Words = Get-Content -Path $WordList
$Files = Get-ChildItem -Path $Path -Recurse -File
$WordRegex = $Words -join '|'
$Files | Where-Object BaseName -match $WordRegex
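One caveat worth adding (my note, not from the answer above): joining the words with '|' treats each word as a regular expression, so entries containing regex metacharacters (such as +, ., or parentheses) can mis-match or error out. Escaping each word first keeps the matching literal:

```powershell
# Escape each search word so regex metacharacters are matched literally.
# Sample words chosen to show the effect; the real list comes from $WordList.
$Words = 'A100', 'B+200', 'C(3)'
$WordRegex = ($Words | ForEach-Object { [regex]::Escape($_) }) -join '|'
$WordRegex   # A100|B\+200|C\(3\)
```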
