
How to get all lines from a file to EOF?

I hope someone can help me solve this problem. I found a lot of topics about reading files with sed, but nothing fits my case. Maybe someone knows the solution to the following situation:

I have a logger that creates a file of a defined size (in my case 5 MB). The logger uses the file as a ring buffer and writes the log information into it. The logger starts, of course, at line 1 and marks the current end of the data with an EOF marker. In a hex editor, it looks like this:

0d 3c 3c 3c 45 4f 46 3e 3e 3e 0d 20 20 20 20 20  .<<<EOF>>>.

Now I have two situations, an easy one and a complex one:

  1. I need to print from start to the end identifier.

  2. The nicer solution would detect whether the bytes after the EOF marker are still the initial fill values (0x20); if so, print from line 1 up to the EOF marker. If there is real data after the marker, first print everything after the EOF marker up to the end of the file, and then from line 1 up to the EOF marker. That would print all lines of this "ring buffer" in chronological order. Is something like this possible?
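To make the layout concrete, here is a miniature mock of such a file (a sketch: the file name and contents are illustrative, and plain newlines stand in for the logger's 0x0d separators):

```shell
# Hypothetical miniature ring buffer: the newest entries come first,
# then the <<<EOF>>> marker, then older entries that wrapped around.
printf 'newest entry\n<<<EOF>>>\noldest entry\n' > test.log
od -c test.log   # inspect the raw bytes, as in the hex editor
```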

To solve (1) I tried some sed commands, e.g.:

sed -e '1,$p' test.log > result.txt 

-> Aim: print everything from line 1 to the EOF pattern, but both files end up the same size (5 MB in my case). It looks like the `$` address refers to the real end of the file, not to the EOF pattern.

sed -e '/EOF/,$d' test.log > result.txt 

-> Aim: print everything before the EOF pattern, but result.txt has size 0.

Can anybody offer any hints or solutions to solve this?

Using sed, I think you need two commands (and two passes over the file):

logfile="…some-name…"
eofmark="<<<EOF>>>"

sed -n "/$eofmark/,\$ { /$eofmark/d; p; }" "$logfile"  # Read the tail (oldest) material
sed -n "1,/$eofmark/  { /$eofmark/d; p; }" "$logfile"  # Read the head (newest) material
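Putting the two passes together, running them in order and concatenating their output rebuilds the log in chronological order (a runnable sketch; the sample file contents and the name test.log are assumptions, not from the original log):

```shell
logfile="test.log"
eofmark="<<<EOF>>>"

# Build a small sample ring buffer: two newest lines, the marker, two oldest lines.
printf 'new 1\nnew 2\n%s\nold 1\nold 2\n' "$eofmark" > "$logfile"

{
  sed -n "/$eofmark/,\$ { /$eofmark/d; p; }" "$logfile"   # oldest entries first
  sed -n "1,/$eofmark/  { /$eofmark/d; p; }" "$logfile"   # then the newest
} > result.txt
cat result.txt
```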

Using perl or awk, you could slurp the whole file into memory, and then print the tail part followed by the head part. For example, in awk:

logfile="…some-name…"
eofmark="<<<EOF>>>"

awk "/$eofmark/"' {eofline = NR}                             # remember where the marker is
     {line[NR] = $0}                                         # buffer every line
     END { for (i = eofline+1; i <= NR; i++) print line[i]   # oldest part first
           for (i = 1; i < eofline; i++) print line[i]       # then the newest part
     }' "$logfile"
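For example, with a small sample file (contents and file name are illustrative), the awk version prints the entries after the marker before the entries in front of it, all in a single pass:

```shell
logfile="sample.log"
eofmark="<<<EOF>>>"

# Sample ring buffer: newest lines, the marker, then the wrapped-around oldest lines.
printf 'head 1\nhead 2\n%s\ntail 1\ntail 2\n' "$eofmark" > "$logfile"

awk "/$eofmark/"' {eofline = NR}
     {line[NR] = $0}
     END { for (i = eofline+1; i <= NR; i++) print line[i]
           for (i = 1; i < eofline; i++) print line[i]
     }' "$logfile"
```

This prints tail 1 and tail 2 (the oldest entries) first, then head 1 and head 2.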

This is feasible since reading a 5 MiB file into memory won't stress a machine with gigabytes of main memory. If the file itself were gigabytes of data, you'd think twice about slurping it into memory, though scanning it twice would also be painful.
