
Send the output of a Unix command to a CSV file

I want to write the output of a Unix command to a CSV file. Suppose the output I am getting is:

A
B
C

I want to put this data into a .csv file as

A B C

i.e. in three different columns but on the same row.

Try this:

printf '%s\n' A B C | paste -sd ' ' >> file.csv

or, more classically for a CSV (with a comma as the delimiter):

printf '%s\n' A B C | paste -sd ',' >> file.csv

The printf '%s\n' A B C part is just there to reproduce your sample input. This solution also works when a line itself contains spaces.
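For example (the city names are made up for illustration), the join leaves embedded spaces alone and only replaces the newlines:

```shell
# Lines that themselves contain spaces stay intact; only the
# newlines between lines are replaced by the delimiter
printf '%s\n' 'New York' 'Los Angeles' 'Chicago' | paste -sd ','
# New York,Los Angeles,Chicago
```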

EDIT: from your comments, you seem to need to do this inside a for loop, so:

for i in {0..5}; do printf '%s\n' {A..C} | paste -sd " " >> file.csv; done

or, in pseudo code:

for ...:
    unix_command | paste -sd " " >> file.csv
endfor
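As a concrete instance of that pseudo code (the user names and fields below are made up for illustration), each loop iteration produces multi-line output and paste folds it into one CSV row:

```shell
# Each iteration emits three lines; paste joins them into one row
for user in alice bob; do
    printf '%s\n' "$user" "/home/$user" "/bin/sh" | paste -sd ',' >> users.csv
done
# users.csv now contains:
# alice,/home/alice,/bin/sh
# bob,/home/bob,/bin/sh
```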
unix_command | tr "\n" " " > file.csv

or

unix_command | awk 'ORS=FS' > file.csv
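To spell out that awk one-liner: ORS=FS assigns the output record separator to the (default) field separator, a single space; since the assigned value is a non-empty string, the pattern is true and awk's default action prints each input line:

```shell
# ORS=FS sets the output record separator to a space; the assignment's
# non-empty value acts as a true pattern, triggering the default print
printf '%s\n' A B C | awk 'ORS=FS'
# A B C (with a trailing space and no final newline)
```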

As I understand it, @Django needs to fold every three lines into one line.

paste -d ' ' - - - < infile

If you need the output in CSV format (comma-separated), you can use this:

paste -d ',' - - - < infile

Here is a test run:

$ cat infile
Manoj Mishra
Japan
Environment.
Michael Jackson
America
Environment.

$ paste -d ',' - - - < infile
Manoj Mishra,Japan,Environment.
Michael Jackson,America,Environment.

A more general answer

If your command's output is multi-line and you want the quoted output in CSV format, n items per line, the following script could be handy.


The groupby program reads from stdin and

  1. quotes each input line
  2. groups n quoted input lines in a csv record, using a comma as a separator

Optionally, with the -s argument, the program discards the last line of its output if that line doesn't contain exactly n items.

The -h option, as usual, prints a usage line and exits.

Specifying any other option makes the program print the usage line and exit with an error.

The code

% cat groupby
#!/bin/sh
usage () { echo Usage: $0 [-s] n --- -s is for \"strict\", outputs only records of n items. ; exit $1 ; }
s=0
while getopts :sh o ; do
    case "${o}" in
        s) s=1 ; shift ;;
        h) usage 0     ;;
        *) usage 1     ;;
    esac
done
awk -v n=$1 -v s=$s -v q='"' '
NR==1   {buf = q $0 q ; next}
NR%n==1 {print buf; buf = q $0 q ; next}
        {buf = buf "," q $0 q}
END     {if(!s||NR%n==0)print buf}'
%

An example of usage

% chmod +x groupby
% echo -e "1\n2\n3\n4\n5" | ./groupby 3
"1","2","3"
"4","5"
% echo -e "1\n2\n3\n4\n5\n6" | ./groupby 3
"1","2","3"
"4","5","6"
% echo -e "1\n2\n3\n4\n5\n6\n7" | ./groupby 3
"1","2","3"
"4","5","6"
"7"
% echo -e "1\n2\n3\n4\n5\n6\n7\n8" | ./groupby -s 4
"1","2","3","4"
"5","6","7","8"
% echo -e "1\n2\n3\n4\n5\n6\n7" | ./groupby -s 4
"1","2","3","4"
%

A different angle

I changed the defaults to best suit the OP's requirements, and introduced other options; see the usage string for details:

#!/bin/sh
usage () { echo 'Usage: '$0' [-s] [-q quote_char] [-c separator_char] n
  Reads lines from stdin and prints them grouped by n and separated by spaces.
  Optional arguments:
    -s is for "strict", outputs only records of n items;
    -q quote_char, forces quoting of each input line;
    -c separator_char, changes the field separator,
       interesting alternatives are tab, comma, semicolon etc;
    -h prints this help and exits.' ; exit $1 ; }

# Default options
s=0 ; q='' ; c=' '

# Treatment of optional arguments
while getopts :shc:q: o ; do
    case "${o}" in
        s) s=1           ;;
        c) c="${OPTARG}" ;;
        q) q="${OPTARG}" ;;
        h) usage 0       ;;
        *) usage 1       ;;
    esac
done
shift $(($OPTIND-1))

# awk code
awk -v n=$1 -v s=$s -v q="$q"  -v c="$c" '
    NR==1   {buf = q $0 q ; next}
    NR%n==1 {print buf; buf = q $0 q ; next}
            {buf = buf c q $0 q}
    END     {if(!s||NR%n==0)print buf}'
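Assuming the script above is saved as groupby2 (the file name is my choice) and made executable, a session might look like this:

```shell
# Save the script above as groupby2, then make it executable
chmod +x groupby2

# Defaults: no quoting, space separator
printf '%s\n' 1 2 3 4 5 | ./groupby2 3
# 1 2 3
# 4 5

# Quote each field and separate with commas, 2 items per record
printf '%s\n' 1 2 3 4 | ./groupby2 -q '"' -c ',' 2
# "1","2"
# "3","4"
```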

Just use xargs.


e.g.:

xargs < filename >> filename.csv
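xargs joins its input lines with spaces, and its -n option caps the number of items per output line, which maps neatly onto the n-items-per-row case (the sample words below are placeholders):

```shell
# Group every 3 input lines onto one line (xargs runs echo by default),
# then replace the separating spaces with commas
printf '%s\n' Alice Tokyo Env Bob Paris Env | xargs -n3 | tr ' ' ','
# Alice,Tokyo,Env
# Bob,Paris,Env
```

Note that the tr step also replaces spaces inside a field (e.g. "Manoj Mishra"), so the paste-based answers are safer when fields can contain spaces.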
