MXNET - Invalid type '<type 'numpy.ndarray'>' for data, should be NDArray, numpy.ndarray or h5py.Dataset
I am having trouble with basic IO in mxnet. I am attempting to use mxnet.io.NDArrayIter to read an in-memory dataset for training. I have the code below (condensed for brevity), which preprocesses the data and attempts to iterate through it (heavily based on the tutorial):
import csv

import mxnet as mx
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.pipeline import Pipeline

with open('data.csv', 'r') as data_file:
    data = list(csv.reader(data_file))

labels = np.array([row[1] for row in data])  # one-hot encoded classes
data = [row[0] for row in data]              # raw text in need of pre-processing

transformer = Pipeline(steps=(('count_vectorizer', CountVectorizer()),
                              ('tfidf_transformer', TfidfTransformer())))
preprocessed_data = np.array([np.array(row) for row in transformer.fit_transform(data)])

training_data = mx.io.NDArrayIter(data=preprocessed_data, label=labels, batch_size=50)
for i, batch in enumerate(training_data):
    print(batch)
When executing this code, I receive the following error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/mxnet/io.py", line 510, in _init_data
    data[k] = array(v)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/ndarray/utils.py", line 146, in array
    return _array(source_array, ctx=ctx, dtype=dtype)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/ndarray/ndarray.py", line 2245, in array
    arr[:] = source_array
  File "/usr/local/lib/python3.5/dist-packages/mxnet/ndarray/ndarray.py", line 437, in __setitem__
    self._set_nd_basic_indexing(key, value)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/ndarray/ndarray.py", line 698, in _set_nd_basic_indexing
    self._sync_copyfrom(value)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/ndarray/ndarray.py", line 856, in _sync_copyfrom
    source_array = np.ascontiguousarray(source_array, dtype=self.dtype)
  File "/usr/local/lib/python3.5/dist-packages/numpy/core/numeric.py", line 581, in ascontiguousarray
    return array(a, dtype, copy=False, order='C', ndmin=1)
TypeError: float() argument must be a string or a number, not 'csr_matrix'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "mxnet_test.py", line 20, in <module>
    training_data = mx.io.NDArrayIter(data=preprocessed_data, label=labels, batch_size=50)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/io.py", line 643, in __init__
    self.data = _init_data(data, allow_empty=False, default_name=data_name)
  File "/usr/local/lib/python3.5/dist-packages/mxnet/io.py", line 513, in _init_data
    "should be NDArray, numpy.ndarray or h5py.Dataset")
TypeError: Invalid type '<class 'numpy.ndarray'>' for data, should be NDArray, numpy.ndarray or h5py.Dataset
which I do not understand, as my data is being converted to a numpy.ndarray before the NDArrayIter instance is created. Would someone be willing to provide some insight on how to read data in mxnet?
The code above is currently using the following versions:
With the help of user2357112, this was resolved by using exception chaining in Python 3 to find the underlying exception (updated in the question above):
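As an aside, the chained traceback above is Python 3's implicit exception chaining at work. A minimal sketch (the `convert` helper is hypothetical, standing in for mxnet's `_init_data`) of how the original error stays reachable on `__context__`:

```python
# Sketch of Python 3 implicit exception chaining: raising inside an
# `except` block keeps the original exception on __context__ and prints
# the "During handling of the above exception, another exception
# occurred:" section seen in the traceback above.
def convert(value):
    try:
        return float(value)
    except TypeError:
        # Re-raising a new error here chains the original float() failure.
        raise TypeError("Invalid type '%s' for data" % type(value))

try:
    convert(object())
except TypeError as err:
    print(err.__context__)  # the original float() TypeError is preserved
```

Reading the *first* traceback in such a chain (here, the `float() argument must be a string or a number, not 'csr_matrix'` error) is what pointed at the sparse-matrix problem.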
The transformer pipeline was returning a numpy.array of scipy.sparse.csr_matrix objects instead of a 2-d numpy.array. By changing the following line to use the toarray method for the conversion instead, the script will run.
preprocessed_data = np.array([row.toarray() for row in transformer.fit_transform(data)])
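To see why the original line failed, here is a minimal sketch using a small hand-built CSR matrix (hypothetical data standing in for the real tf-idf output):

```python
import numpy as np
import scipy.sparse as sp

# A small CSR matrix standing in for transformer.fit_transform(data)
m = sp.csr_matrix(np.array([[1.0, 0.0, 2.0],
                            [0.0, 3.0, 0.0]]))

# Iterating a csr_matrix yields 1-row csr_matrix objects, not numpy rows,
# so np.array([np.array(row) for row in m]) wraps sparse objects rather
# than producing a numeric 2-d array -- hence the "not 'csr_matrix'" error.
rows = [row for row in m]
print(sp.issparse(rows[0]))   # True

# Densifying with .toarray() yields a plain 2-d numpy.ndarray that
# NDArrayIter accepts (densifying the whole matrix at once, m.toarray(),
# also works).
dense = np.array([row.toarray()[0] for row in m])
print(dense.shape)            # (2, 3)
```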
Optimal solution: toarray is inefficient in terms of memory consumption when used on a scipy.sparse.csr_matrix. In version 1.10 of mxnet, one can use mxnet.nd.sparse.array to store the data more efficiently:
...
preprocessed_data = mx.nd.sparse.array(transformer.fit_transform(data))

training_data = mx.io.NDArrayIter(data=preprocessed_data, label=preprocessed_labels, batch_size=5, last_batch_handle='discard')
for i, batch in enumerate(training_data):
    print(batch)
The only caveat is that one must use the last_batch_handle='discard' keyword argument in the NDArrayIter (the functionality of last_batch_handle is documented here).
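For intuition, a plain-Python sketch (no mxnet required; `batches_discard` is a hypothetical helper) of what last_batch_handle='discard' does when the sample count is not a multiple of the batch size:

```python
def batches_discard(samples, batch_size):
    # Keep only complete batches; the trailing partial batch is dropped,
    # mirroring NDArrayIter's last_batch_handle='discard' behaviour.
    n_full = len(samples) // batch_size
    return [samples[i * batch_size:(i + 1) * batch_size] for i in range(n_full)]

data = list(range(12))
print(batches_discard(data, 5))  # [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]] -- last 2 samples dropped
```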