
Will using the command os.lstat("some_file.txt").st_size block the file so other programs cannot write to it?

I am having a hard time finding a way to asynchronously read/write a text file, because Python locks the file when it opens it.

I need to read from the file as another program writes to it. Once the other program writes to the txt file, Python will read it, look for some key data, and then erase the text file so it is ready for the next batch of data (the data comes in batches because they are test reports from a RAM tester).

So I'm wondering whether the os.lstat("some_file.txt").st_size command will block the file while it is collecting the size of the file. If it doesn't block it, I will run that command in a loop and use the size of the file to trigger Python to open, read, delete and close the text file, since the data is written to the text file 404 characters at a time every 3-52 seconds.
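A minimal sketch of the polling loop described above, assuming the hypothetical path "some_file.txt" and the 404-character batch size mentioned in the question (os.lstat only reads the file's metadata and does not lock the file):

```python
import os
import time

filename = "some_file.txt"  # hypothetical path; the RAM tester writes here

def wait_for_data(path, min_size=404, poll=0.5, max_polls=120):
    """Poll the file size until at least min_size bytes are present.

    Returns True once the file has grown to min_size bytes, or False
    if max_polls polls pass without that happening.
    """
    for _ in range(max_polls):
        try:
            if os.lstat(path).st_size >= min_size:
                return True
        except FileNotFoundError:
            pass  # file not created yet; keep polling
        time.sleep(poll)
    return False
```

Once wait_for_data returns True, the script could open the file, read it, and truncate it, keeping the file open for as short a time as possible.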

I'm trying to avoid the situation where Python has the text file open to read from or delete it while the other program goes to write to it.

I am using Windows as my platform.

You need to post-process what a third-party program writes to a text file. The best solution, if it were possible, would be to simply pipe the output of the first program into yours. In that case, you would just process your standard input.
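The piped variant could look like the following sketch, where the "PASS" marker and the command line are assumptions (run it as, e.g., ram_tester.exe | python scan.py):

```python
import sys

def scan(stream):
    """Yield each line of an iterable of lines that contains the key marker."""
    for line in stream:
        if "PASS" in line:  # assumed key data to look for
            yield line

if __name__ == "__main__":
    # stdin is the first program's piped output; no file locking is involved
    for hit in scan(sys.stdin):
        print("*** FOUND PASS ***")
```

Because the data arrives on standard input, there is no shared file and therefore no locking problem at all.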

If the other program writes into a true file, you should only ever read it, and never try to write there. Even at EOF, you must simply keep waiting until more data is available or the first program terminates.

You cannot simply keep the file open: when you reach EOF, note the position, close the file, and reopen it later, seeking back to that position. That way you can see whether new lines have been added to the file. Just add a short sleep, because it will be an active loop and you want to be kind to the processor, and a global timeout to exit the program if nothing happens for too long.

Here is a possible implementation:

import re
import time

filename = 'Yup.txt'  # file to scan
sleep = 0.5           # time to sleep (0.5 second) after an end-of-file detection
timeout = 240         # delay in sleep units before timeout (240 * 0.5 s = 2 minutes)

rx = re.compile("PASS")
pos = 0
tim = timeout         # initialize so the timeout works even if no line is ever read
while True:
    with open(filename) as fd:
        fd.seek(pos)
        for line in fd:
            tim = timeout  # reset the timeout whenever a new line arrives
            # do optional processing for any line
            if rx.search(line):  # line contains PASS (anywhere in the line)
                print("*** FOUND PASS ***")
                # do processing for PASS line
        pos = fd.tell()  # remember where reading stopped; resume here next pass
    time.sleep(sleep)
    tim -= 1
    if tim == 0:
        print('*** TIMEOUT ***')
        break

A possible improvement would be to start the RAM tester from the script and test for the end of the process:

import subprocess
prog = subprocess.Popen("command line to start the ram tester", shell=True)

because you can then test whether the program has exited (before, or instead of, the timeout test):

    if prog.poll() is not None:
        print("Program exited")
        break
