Essentially, I have a huge file that contains nothing but words, several per line, separated by spaces. Something like this:
WORD WORD WORD WORD
ANOTHER
WORD SCRABBLE BLAH
YES NO
What I want to do is put all the words in the file into one huge list. I tried using split, but that didn't account for the newlines (\n).
Reading the file via for line in f splits on newlines and is memory-efficient (it reads one line at a time), but putting everything into one huge list is not. Anyway, if you insist:
huge_list = []
with open(huge_file, "r") as f:
    for line in f:
        huge_list.extend(line.split())
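If even the resulting list is too large for memory, a generator keeps the same line-by-line behavior without materializing all the words at once. A minimal sketch (the name iter_words is my own, not from the question):

```python
def iter_words(path):
    # Yield words one at a time instead of building a list,
    # so memory usage stays constant regardless of file size.
    with open(path, "r") as f:
        for line in f:
            # split() with no arguments splits on any whitespace run,
            # so the trailing newline is handled automatically.
            yield from line.split()
```

You can then loop over iter_words(huge_file) directly, or call list() on it if you do need the full list after all.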
To read the whole file into memory as a single string, use f.read() instead:
with open(huge_file, "r") as f:
    huge_list = f.read().split()
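Both examples work because str.split() with no arguments splits on any run of whitespace, newlines included, and discards the empty strings, which is exactly what the question needs. A quick demonstration on a literal string:

```python
# One string containing both spaces and newlines, like the input file.
text = "WORD WORD WORD WORD\nANOTHER\nWORD SCRABBLE BLAH\nYES NO\n"

# No separator argument: split on any whitespace, drop empty strings.
words = text.split()
print(words)
# ['WORD', 'WORD', 'WORD', 'WORD', 'ANOTHER', 'WORD', 'SCRABBLE', 'BLAH', 'YES', 'NO']
```

Note that text.split(" ") would behave differently: it splits only on single spaces, leaving pieces like 'WORD\nANOTHER' in the result, which is likely what tripped up the original attempt.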
Input file (words separated by spaces and newlines):
WORD WORD WORD WORD
ANOTHER
WORD SCRABBLE BLAH
YES NO
Output of both examples:
>>> huge_list
['WORD', 'WORD', 'WORD', 'WORD', 'ANOTHER', 'WORD', 'SCRABBLE', 'BLAH', 'YES', 'NO']
>>>