Awk: how to print the field separator with your columns (field separator also a regular expression)
I have a file that looks like this:
3 5 t27s60
4 8 s30s40
2 2 t80t10
6 4 s80t10
And I want to produce a file like this:
3 5 t27 s60
4 8 s30 s40
2 2 t80 t10
6 4 s80 t10
So I would specify the field separator as s or t, but I want to keep these characters in the output.
AFAIK it is not possible to obtain the exact field delimiter when FS has been set to a regular expression.
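That said, GNU awk's four-argument split() does expose the text each separator matched, so where gawk is available the delimiters can be recovered after the fact. A minimal sketch, assuming gawk and that the third field starts with one of the separators, as in the sample data (variable names are mine):

```shell
# GNU awk only: split()'s 4th argument (seps) collects the actual
# text matched by the separator regex at each split point.
printf '3 5 t27s60\n4 8 s30s40\n' |
gawk '{
  n = split($3, parts, /[st]/, seps)   # seps[i] = the i-th matched s/t
  out = $1 " " $2
  for (i = 1; i < n; i++)
    out = out " " seps[i] parts[i + 1]
  print out
}'
# 3 5 t27 s60
# 4 8 s30 s40
```

Plain POSIX awk (and mawk) lacks the seps argument, which is why the answers below reach for sed or gsub instead.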
I would use sed for this use case:
sed 's/...$/ &/' file
The s command substitutes the last three characters of the line, matched by ... anchored at the end with $, with a space followed by the match itself (&).
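Run against the sample data from the question:

```shell
# Insert a space before the final three characters of each line.
printf '3 5 t27s60\n4 8 s30s40\n2 2 t80t10\n6 4 s80t10\n' |
sed 's/...$/ &/'
# 3 5 t27 s60
# 4 8 s30 s40
# 2 2 t80 t10
# 6 4 s80 t10
```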
If counting characters from the end does not work because the number of characters after the delimiter is not fixed, you can use the following sed command:
sed -r 's/(s|t)([^st]+)$/ \1\2/' file
I'm searching for s or t using (s|t), followed by one or more characters that are neither s nor t ([^st]+), anchored at the end of the line with $.
A quick awk one-liner:
awk '{gsub(/[st]/," &",$0)}1' input.txt
outputs:
3 5 t27 s60
4 8 s30 s40
2 2 t80 t10
6 4 s80 t10
Here, we use the special meaning of & in the gsub replacement: it stands for the matched text. Hence, gsub(/[st]/," &",$0) prepends a blank before each "s" or "t".
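A quick illustration of that behaviour (a minimal sketch; the sample string is mine): an unescaped & expands to the matched text, while an escaped \& inserts a literal ampersand.

```shell
# "&" in the replacement expands to the matched text...
echo 't27s60' | awk '{gsub(/[st]/, " &"); print}'
# -> " t27 s60"

# ...while "\\&" (a single backslash after string processing)
# inserts a literal "&" instead.
echo 't27s60' | awk '{gsub(/[st]/, " \\&"); print}'
# -> " &27 &60"
```

Note the leading blank in the first result: the line starts with "t", which is exactly the "repeated blanks" issue addressed below.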
If repeated blanks are a problem:
awk '{gsub(/[st]/," &",$0);gsub(/[ ]+/," ",$0)}1' input.txt
which gives:
3 5 t27 s60
4 8 s30 s40
2 2 t80 t10
6 4 s80 t10
Or perl: add a space before an "s" or "t" if the previous character is not whitespace:
perl -pe 's/(?<=\S)([st])/ $1/g' file
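Because of the lookbehind, this variant never introduces a leading or doubled space, so no cleanup pass is needed. Demonstrated on two of the sample rows:

```shell
# (?<=\S) only matches an s/t preceded by a non-space character,
# so separators at the start of a field are left untouched.
printf '3 5 t27s60\n6 4 s80t10\n' |
perl -pe 's/(?<=\S)([st])/ $1/g'
# 3 5 t27 s60
# 6 4 s80 t10
```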
The equivalent awk (gensub() is a GNU awk extension) is
awk '{print gensub(/([^[:blank:]])([st])/, "\\1 \\2", "g")}' file