
Python Find Most Recent File Slow

Am I doing something wrong, or is finding the most recent file in a directory supposed to be fairly slow?

The code below takes upwards of 3 minutes. Is this expected when parsing through a list of ~850 files?

I am using a glob pattern to match only .txt files, so after searching through my file share location it returns a list of ~850 files. This is the list it parses through to get max(files) with key=os.path.getctime. I tried sort instead of max and just grabbing the top file, but that wasn't any faster.

import os
import glob

def get_latest_file(path, fileRegex):
    fullpath = os.path.join(path, fileRegex)
    # glob.glob returns a list, so the empty check below works;
    # an iglob iterator is always truthy even when it yields nothing
    list_of_files = glob.glob(fullpath, recursive=True)

    if not list_of_files:
        return ''

    return max(list_of_files, key=os.path.getctime)

path = r'C:\Desktop\Test'  # raw string so the backslashes aren't treated as escapes
fileRegex = '*.txt'
latestFile = get_latest_file(path, fileRegex)

Try using os.scandir() , this sped up my file searching massively.
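A minimal sketch of what the scandir-based approach might look like (the function name and the .txt filter are assumptions, not from the answer). The likely reason it is faster: on Windows, DirEntry.stat() can reuse metadata already fetched during the directory listing, so you avoid one extra stat round trip per file over the network share, whereas os.path.getctime issues a fresh stat call for every path.

```python
import os

def get_latest_file_scandir(path):
    # Track the newest .txt file seen so far while walking one directory.
    latest_path, latest_ctime = '', -1.0
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file() and entry.name.lower().endswith('.txt'):
                # entry.stat() may be served from cached listing data
                # (notably on Windows), avoiding a per-file network stat.
                ctime = entry.stat().st_ctime
                if ctime > latest_ctime:
                    latest_path, latest_ctime = entry.path, ctime
    return latest_path
```

Note this only scans a single directory; for a recursive search you would combine it with os.walk(), which itself uses scandir under the hood in modern Python.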
