How to speed up Get-ChildItem in Powershell

Just wondering how I could speed up Get-ChildItem in Powershell.

I have the following script to search for files created today and copy them over to another folder.

$fromDirectory = "test_file_*.txt"
$fromDirectory = "c:\sour\"
$toDirectory = "c:\test\"

Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} | Copy-Item -Destination $toDirectory

Because the folder I search contains 124,553 history files, the search takes ages. Does anyone know how I could improve my script to speed up the search and copy?

Here are some things to try:

First, use Measure-Command {} to get the actual performance:

Measure-Command { Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} | Copy-Item -Destination $toDirectory }

Then, consider removing the -Recurse flag, because it makes the search descend into every directory, every child, and every child of a child; a non-recursive sketch follows below. If your target log files really are that scattered, see the robocopy suggestion after the sketch.
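For example, if the matching files all sit directly in the source folder, a non-recursive call with -Filter is worth measuring; unlike -Include, -Filter is applied by the filesystem provider, so non-matching files never become PowerShell objects at all. A minimal sketch, reusing the question's variables:

# Non-recursive search with a provider-side filter.
Get-ChildItem -Path $fromDirectory -Filter $fileName |
    Where-Object { $_.LastWriteTime -gt (Get-Date).Date } |
    Copy-Item -Destination $toDirectory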

Try using robocopy to match a pattern in the filename and LastWriteTime, then use powershell to copy over. You could even use robocopy to do the copying.
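A hedged sketch of the robocopy variant, using the question's paths. Note that /MAXAGE filters on the last-modified date, not the creation date, so it only approximates "created today":

# Let robocopy do the matching and the copying in one native pass.
# /S recurses into subdirectories; /MAXAGE:1 excludes files older than 1 day.
robocopy c:\sour c:\test test_file_*.txt /S /MAXAGE:1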

It's possible that you just have a huge, slow problem to solve, but try these to see if you can break it down.

This is a well-known feature of NTFS. Microsoft's docs say performance starts to degrade at around 50,000 files in a single directory.

If the file names are very similar, creation of legacy 8dot3 names starts to slow things down at about 300,000 files. Though you have "only" 120k files, it's the same order of magnitude.

Some previous questions discuss this issue. Sadly, there is no single good solution other than a better directory hierarchy. The usual tricks are to disable 8dot3 names with fsutil and last-access updates via the registry, but those will only help so much.
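For reference, the commonly cited commands look like this (run from an elevated prompt; disabling 8.3 name generation does not remove short names that already exist, and the last-access switch writes the NtfsDisableLastAccessUpdate registry value):

# Stop generating 8.3 short names on the C: volume.
fsutil 8dot3name set C: 1

# Stop NTFS from updating the last-access timestamp on every read.
fsutil behavior set disablelastaccess 1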

Can you redesign the directory structure? Moving old files into, say, year-quarter subdirectories might keep the main directory clean enough. To find a file's year-quarter, a quick way is like so:

gci | % {
  # Quarter = ceiling(month / 3); prints e.g. "file.txt => 2020\Q03".
  $("{2} => {1}\Q{0:00}" -f [Math]::Ceiling( $_.LastAccessTime.Month / 3 ),
    $_.LastAccessTime.ToString('yyyy'),
    $_.Name
  )
}
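Extending that one-liner into an actual move is straightforward; a minimal sketch, assuming the question's $fromDirectory with the quarter folders created under it on demand:

# Snapshot the listing first so moving files does not disturb the enumeration.
$files = @( Get-ChildItem $fromDirectory -File )
foreach ($f in $files) {
    # Quarter = ceiling(month / 3), e.g. July => Q3.
    $quarter = [Math]::Ceiling($f.LastAccessTime.Month / 3)
    $subDir  = Join-Path $fromDirectory ("{0}\Q{1:00}" -f $f.LastAccessTime.Year, $quarter)
    if (-not (Test-Path $subDir)) {
        New-Item -ItemType Directory -Path $subDir | Out-Null
    }
    Move-Item -LiteralPath $f.FullName -Destination $subDir
}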

I would try putting the results of Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} in an array and then copying the results from the array:

# Collect the matches once, then copy from the array.
$GrabFiles = @( Get-ChildItem $fromDirectory -Include $fileName -Recurse | Where {$_.LastWriteTime -gt (Get-Date).Date} )
Copy-Item -Path $GrabFiles -Destination $toDirectory
