
Getting TypeError: can't pickle _thread.lock objects

I am querying MongoDB to get a list of dictionaries, and for each dict in the list I am comparing some values. Based on the result of the comparison, I store the dictionary's values, the comparison result, and other computed values in a MongoDB collection. I am trying to do this with multiprocessing and am getting this error.

def save_for_doc(doc_id):
    # function to get the fields of the doc
    fields = get_fields(doc_id)
    no_of_process = 5
    doc_col_size = 30000
    chunk_size = round(doc_col_size / no_of_process)
    chunk_ranges = range(0, no_of_process * chunk_size, chunk_size)
    # client is a MongoClient created elsewhere and passed to each worker
    processes = [multiprocessing.Process(target=save_similar_docs,
                                         args=(doc_id, client, fields, chunk, chunk_size))
                 for chunk in chunk_ranges]
    for prc in processes:
        prc.start()

def save_similar_docs(doc_id, client, fields, chunk_start, chunk_size):
    # This function processes the args and saves the results to MongoDB.
    # It does not return anything, as the end result is stored directly.

Below is the error:

 File "H:/Desktop/Performance Improvement/With_Process_Pool.py", line 144, in save_for_doc
   prc.start()
 File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 105, in start
   self._popen = self._Popen(self)
 File "C:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
   return _default_context.get_context().Process._Popen(process_obj)
 File "C:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
   return Popen(process_obj)
 File "C:\ProgramData\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
   reduction.dump(process_obj, to_child)
 File "C:\ProgramData\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
   ForkingPickler(file, protocol).dump(obj)

 TypeError: can't pickle _thread.lock objects

What does this error mean? Please explain, and how can I get past it?

The documentation says that you can't copy a client from a main process to a child process; you have to create the connection after the child process starts. The client object cannot be copied, so create the connections after you fork (or, as in your traceback on Windows, spawn) the process.

On Unix systems the multiprocessing module spawns processes using fork(). Care must be taken when using instances of MongoClient with fork(). Specifically, instances of MongoClient must not be copied from a parent process to a child process. Instead, the parent process and each child process must create their own instances of MongoClient.
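You can reproduce the underlying problem without MongoDB at all. When `multiprocessing` starts a child with the spawn method (the default on Windows, which is why `popen_spawn_win32.py` appears in your traceback), it pickles everything in `args`; a `MongoClient` holds internal locks, and locks cannot be pickled. The sketch below uses a made-up `FakeClient` class as a stand-in for any driver object that carries a `threading.Lock`:

```python
import pickle
import threading

class FakeClient:
    """Made-up stand-in for MongoClient: real driver clients hold locks internally."""
    def __init__(self):
        self._lock = threading.Lock()

# Pickling an instance tries to pickle its __dict__, including the lock,
# which raises TypeError -- the same failure you see when multiprocessing
# serializes your Process args.
failed = False
try:
    pickle.dumps(FakeClient())
except TypeError as exc:
    failed = True
    print(exc)
```

The exact message varies slightly across Python versions ("can't pickle _thread.lock objects" vs. "cannot pickle '_thread.lock' object"), but it is always a `TypeError` from the pickling step.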

http://api.mongodb.com/python/current/faq.html#id21

