
Parsing URLs from a text file into a Bash shell array

I have a text file with many URLs:

www.example1.com
www.example2.com
www.example3.com
www.example4.com

How would I go about iterating through it and reading each URL into an array element, i.e.

array[0]
array[1]
array[2]
array[3]

Thanks, guys.

Pure bash way:

array=($(<inFile))

then:

# list all the elements of array
echo "${array[@]}"

# echo num of elements in array
echo ${#array[@]}

## OR loop through the above array
for i in "${array[@]}"
do
   echo "$i" # or do whatever with each individual element of the array
done

# 1st element of array
echo "${array[0]}"

# 2nd element of array
echo "${array[1]}"

# ...
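Putting the pieces above together, here is a minimal end-to-end sketch; the path /tmp/inFile is a hypothetical location for the URL list from the question.

```shell
#!/bin/bash
# Write the question's URL list to a temp file (demo setup only).
printf 'www.example1.com\nwww.example2.com\nwww.example3.com\nwww.example4.com\n' > /tmp/inFile

# Populate the array: the unquoted $(<file) expansion word-splits
# the file contents, giving one element per URL here.
array=( $(</tmp/inFile) )

echo "${#array[@]}"   # number of elements: 4
echo "${array[0]}"    # first element: www.example1.com
```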

One concern with ( $(< file.txt) ) is that a line in the file may contain whitespace. Consider the array produced from this file:

a b
c
d

The array would have 4 elements, not 3. URLs cannot contain whitespace, so it is not an issue here. For the general case, bash 4 provides a mapfile builtin which ensures that each line becomes exactly one array element (the -t option strips the trailing newline from each line).

mapfile -t array < file.txt

is a one-step replacement for the following loop:

while IFS= read -r line; do
    array+=( "$line" )
done < file.txt
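The difference between the two behaviors can be sketched as follows, using the 3-line example above written to a hypothetical temp file /tmp/lines.txt:

```shell
#!/bin/bash
# Demo file: 3 lines, the first containing a space.
printf 'a b\nc\nd\n' > /tmp/lines.txt

# Unquoted expansion word-splits on whitespace: 4 elements.
arr1=( $(</tmp/lines.txt) )
echo "${#arr1[@]}"   # 4

# mapfile -t reads one element per line: 3 elements.
mapfile -t arr2 < /tmp/lines.txt
echo "${#arr2[@]}"   # 3

# The read loop builds the same array as mapfile -t.
arr3=()
while IFS= read -r line; do
    arr3+=( "$line" )
done < /tmp/lines.txt
[ "${arr2[*]}" = "${arr3[*]}" ] && echo "same"
```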

If you want the array to live in your script instead of being read from a file, you can use this approach:

#!/bin/bash

array=(
"www.example1.com"
"www.example2.com"
"www.example3.com"
"www.example4.com"
)

for URL in "${array[@]}"; do
    echo "$URL"
done

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. For any questions, please contact: yoyou2525@163.com.
