
Apache Spark - ImportError: No module named _winreg

A script that had been working well for me stopped working about a week ago. The problem appears when I compile the lambda function that is later used to create the RDD.

Consider the following code:

from types import FunctionType


class RDDUtils(object):
    @staticmethod
    def map_builder(*fields):
        """
        Creates a compiled lambda function for use in spark keyBy using the specified field names
        :param fields: The names of the fields to create the function with
        :return: A compiled python function
        """
        func = FunctionType(
            compile("lambda x: {" + ',\n'.join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}",
                    "<string>",
                    "eval"), {})
        return func()

    @staticmethod
    def rdd_creator(context, fields, source_file, delim='\t'):
        """
        Method which creates an RDD
        :param context: spark context
        :param fields: fields / columns in our csv file
        :param source_file: location of csv file
        :return: RDD
        """
        build = RDDUtils.map_builder(*fields)
        rdd = context.textFile(source_file).map(lambda x: x.split(delim)).map(build)
        return rdd


from pyspark import SparkContext

rdd = RDDUtils()

sc = SparkContext('local', 'demo1')
fields = ['username', 'full_name', 'src_id']
source_file = '/home/aaron/dim_operator.csv'

create_rdd = rdd.rdd_creator(sc, fields, source_file)

print create_rdd.first()
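Outside of Spark, the map_builder technique can be exercised on its own. The sketch below is a minimal, Spark-free version of the same compile-a-lambda idea (the field names and sample row are invented for illustration):

```python
from types import FunctionType

def map_builder(*fields):
    # Build the source of a lambda that turns a list of values into a
    # dict keyed by the given field names, then compile it in "eval" mode.
    src = "lambda x: {" + ", ".join(
        '"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)
    ) + "}"
    # FunctionType wraps the eval-mode code object in a zero-argument
    # function; calling it evaluates the expression and returns the lambda.
    return FunctionType(compile(src, "<string>", "eval"), {})()

build = map_builder("username", "full_name", "src_id")
print(build(["dev", "Dev User", "1"]))
# {'username': 'dev', 'full_name': 'Dev User', 'src_id': '1'}
```

Because the returned lambda carries no module globals, it behaves like the `build` function passed to `.map()` in the question.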

The traceback:

File "/usr/lib/python2.7/dist-packages/six.py", line 116, in __getattr__
    _module = self._resolve()
  File "/usr/lib/python2.7/dist-packages/six.py", line 105, in _resolve
    return _import_module(self.mod)
  File "/usr/lib/python2.7/dist-packages/six.py", line 76, in _import_module
    __import__(name)
ImportError: No module named _winreg

What could cause this to suddenly stop working?

Running on Ubuntu 14.04.3.

I fixed this by dropping the compiled lambda and instead wrapping eval() around the string I build dynamically, with no lambda involved.

The updated code is below:

class RDDUtils(object):

    @staticmethod
    def map_builder(*fields):
        # Return only the source of a dict literal; 'x' is resolved by
        # eval() at call time inside the mapper lambda.
        lambda_dict = "{" + ','.join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}"
        return lambda_dict

    @staticmethod
    def rdd_creator(context, fields, source_file, delim='\t'):
        """
        Creates an RDD
        """
        build = RDDUtils.map_builder(*fields)
        rdd = context.textFile(source_file).map(lambda x: x.split(delim)).map(lambda x: eval(build))
        return rdd


if __name__ == "__main__":

    rdd = RDDUtils()

    sc = SparkContext('local', 'demo1')
    fields = ['username', 'full_name', 'src_id']
    source_file = '/home/aaron/dim_operator.csv'

    create_rdd = rdd.rdd_creator(sc, fields, source_file)

    print create_rdd.first()

The expected result:

{'username': u'dev', 'src_id': u'1', 'full_name': u'primary dev user'}

Edit: the full traceback is below...

Traceback (most recent call last):
  File "/home/aaron/apps/pycharm-3.0.2/helpers/pydev/pydevd.py", line 1532, in <module>
    debugger.run(setup['file'], None, None)
  File "/home/aaron/apps/pycharm-3.0.2/helpers/pydev/pydevd.py", line 1143, in run
    pydev_imports.execfile(file, globals, locals) #execute the script
  File "/home/aaron/PycharmProjects/fetl/dim_operator.py", line 127, in <module>
    print create_rdd.first()
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 1242, in first
    rs = self.take(1)
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 1194, in take
    totalParts = self._jrdd.partitions().size()
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 2288, in _jrdd
    pickled_cmd, bvars, env, includes = _prepare_for_python_RDD(self.ctx, command, self)
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 2206, in _prepare_for_python_RDD
    pickled_command = ser.dumps(command)
  File "/home/aaron/apps/spark/python/pyspark/serializers.py", line 411, in dumps
    return cloudpickle.dumps(obj, 2)
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 816, in dumps
    cp.dump(obj)
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 133, in dump
    return pickle.Pickler.dump(self, obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 562, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 254, in save_function
    self.save_function_tuple(obj, [themodule])
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 304, in save_function_tuple
    save((code, closure, base_globals))
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "/usr/lib/python2.7/pickle.py", line 633, in _batch_appends
    save(x)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 254, in save_function
    self.save_function_tuple(obj, [themodule])
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 304, in save_function_tuple
    save((code, closure, base_globals))
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "/usr/lib/python2.7/pickle.py", line 636, in _batch_appends
    save(tmp[0])
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 209, in save_function
    modname = pickle.whichmodule(obj, name)
  File "/usr/lib/python2.7/pickle.py", line 817, in whichmodule
    if name != '__main__' and getattr(module, funcname, None) is func:
  File "/usr/lib/python2.7/dist-packages/six.py", line 116, in __getattr__
    _module = self._resolve()
  File "/usr/lib/python2.7/dist-packages/six.py", line 105, in _resolve
    return _import_module(self.mod)
  File "/usr/lib/python2.7/dist-packages/six.py", line 76, in _import_module
    __import__(name)
ImportError: No module named _winreg

This error is caused by your operating system. _winreg does not work on Linux (Ubuntu); it is a Windows-only module.

https://docs.python.org/release/2.1.2/lib/module--winreg.html

Availability: Windows.

New in version 2.0.

These functions expose the Windows registry API to Python. Instead of using an integer as the registry handle, a handle object is used to ensure that the handles are closed correctly, even if the programmer neglects to explicitly close them.

This module exposes a very low-level interface to the Windows registry; it is expected that a new winreg module will be created in the future, offering a higher-level interface to the registry API.
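Code that needs registry access but must also run on Linux typically guards the import on the platform. A minimal sketch (the only assumption is the standard `_winreg`-on-Python-2 / `winreg`-on-Python-3 naming):

```python
import sys

# The registry module exists only on Windows: `_winreg` on Python 2,
# renamed `winreg` on Python 3. Guard the import so the same file
# runs on Linux / macOS.
if sys.platform == "win32":
    import winreg  # use `import _winreg as winreg` on Python 2
else:
    winreg = None  # registry functionality unavailable on this OS

print("registry module available:", winreg is not None)
```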

