Store contents of several files in an array Python

I have a directory with many files, each structured as n rows × 2 columns. What I'd like to do is store the contents of these files so that the shape of the final array is (nfiles, nrows, 2). Something similar to

array = numpy.array([[[1,1], [1,1], [1,1]], [[1,1], [1,1], [1,1]]])

but how to do it for several files?

I've tried

import glob
import numpy

fnames = glob.glob("/path/to/directory/*.txt")
final_array = [numpy.genfromtxt(fname) for fname in fnames]

but the final shape is (nfiles,), and numpy.reshape didn't work. With

import pandas as pd
df_list = [pd.read_csv(filename, header=None, sep=" ") for filename in fnames]
comb = pd.concat(df_list, ignore_index=True, sort=False)

I can create a frame with all file contents (presumably in file order) in 2 columns. Is there any way to divide this into parts and reshape it (nrows is the same for every file)? Note that I don't want separate arrays for each file, but a single array.
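Since every file contributes the same nrows, the concatenated (nfiles*nrows, 2) frame can indeed be reshaped back into (nfiles, nrows, 2). A minimal sketch, using small made-up DataFrames standing in for the files' contents (the shapes and values are hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-ins for pd.read_csv results: 3 files, 4 rows, 2 columns each
df_list = [pd.DataFrame(np.full((4, 2), i)) for i in range(3)]

comb = pd.concat(df_list, ignore_index=True, sort=False)

# nrows is the same for every file, so the flat frame reshapes cleanly
nfiles, nrows = len(df_list), len(df_list[0])
arr = comb.to_numpy().reshape(nfiles, nrows, 2)
print(arr.shape)  # (3, 4, 2)
```

This relies on pd.concat preserving the order of df_list, which it does.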

Thank you all for the help

Use np.stack, or, if you're reading the files in a loop and know the number of files m and rows n in advance, preallocate and fill:

import numpy as np

a = np.zeros((m, n, 2))  # note: the shape is passed as a tuple

for i, fname in enumerate(fnames):
    a[i, :, :] = np.genfromtxt(fname)
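The np.stack route needs no preallocation: given the list comprehension from the question, it joins the per-file (nrows, 2) arrays along a new leading axis. A minimal sketch, with made-up arrays in place of the numpy.genfromtxt results:

```python
import numpy as np

# Hypothetical per-file arrays: 3 files, 5 rows, 2 columns each
file_arrays = [np.full((5, 2), i, dtype=float) for i in range(3)]

# np.stack adds a new axis 0, giving shape (nfiles, nrows, 2)
stacked = np.stack(file_arrays)
print(stacked.shape)  # (3, 5, 2)
```

np.stack requires all input arrays to have identical shapes, which holds here since nrows is the same for every file.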
