spacy with joblib library generates _pickle.PicklingError: Could not pickle the task to send it to the workers
Posted: 2019-11-14 23:08:30

Question:

I have a large list of sentences (~7 million) from which I want to extract the nouns.

I used the joblib library to parallelize the extraction process, as shown below:
import spacy
from tqdm import tqdm
from joblib import Parallel, delayed

nlp = spacy.load('en_core_web_sm')

class nouns:

    def get_nouns(self, text):
        doc = nlp(u"{}".format(text))
        return [token.text for token in doc if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]

    def parallelize(self, sentences):
        results = Parallel(n_jobs=1)(delayed(self.get_nouns)(sent) for sent in tqdm(sentences))
        return results

if __name__ == '__main__':
    sentences = ['we went to the school yesterday',
                 'The weather is really cold',
                 'Can we catch the dog?',
                 'How old are you John?',
                 'I like diving and swimming',
                 'Can the world become united?']
    obj = nouns()
    print(obj.parallelize(sentences))
When n_jobs in the parallelize function is greater than 1, I get this long error:
100%|██████████| 6/6 [00:00<00:00, 200.00it/s]
joblib.externals.loky.process_executor._RemoteTraceback:
"""
Traceback (most recent call last):
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\queues.py", line 150, in _feed
obj_ = dumps(obj, reducers=reducers)
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\reduction.py", line 243, in dumps
dump(obj, buf, reducers=reducers, protocol=protocol)
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\reduction.py", line 236, in dump
_LokyPickler(file, reducers=reducers, protocol=protocol).dump(obj)
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 267, in dump
return Pickler.dump(self, obj)
File "C:\Python35\lib\pickle.py", line 408, in dump
self.save(obj)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 841, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 770, in save_list
self._batch_appends(obj)
File "C:\Python35\lib\pickle.py", line 797, in _batch_appends
save(tmp[0])
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 725, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 718, in save_instancemethod
self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
File "C:\Python35\lib\pickle.py", line 599, in save_reduce
save(args)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 725, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 395, in save_function
self.save_function_tuple(obj)
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 594, in save_function_tuple
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 841, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 599, in save_reduce
save(args)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 740, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 740, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 495, in save
rv = reduce(self.proto)
File "stringsource", line 2, in preshed.maps.PreshMap.__reduce_cython__
TypeError: self.c_map cannot be converted to a Python object for pickling
"""Exception in thread QueueFeederThread:
Traceback (most recent call last):
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\queues.py", line 150, in _feed
obj_ = dumps(obj, reducers=reducers)
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\reduction.py", line 243, in dumps
dump(obj, buf, reducers=reducers, protocol=protocol)
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\reduction.py", line 236, in dump
_LokyPickler(file, reducers=reducers, protocol=protocol).dump(obj)
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 267, in dump
return Pickler.dump(self, obj)
File "C:\Python35\lib\pickle.py", line 408, in dump
self.save(obj)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 841, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 770, in save_list
self._batch_appends(obj)
File "C:\Python35\lib\pickle.py", line 797, in _batch_appends
save(tmp[0])
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 725, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 718, in save_instancemethod
self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
File "C:\Python35\lib\pickle.py", line 599, in save_reduce
save(args)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 725, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 395, in save_function
self.save_function_tuple(obj)
File "C:\Python35\lib\site-packages\joblib\externals\cloudpickle\cloudpickle.py", line 594, in save_function_tuple
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 841, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 810, in save_dict
self._batch_setitems(obj.items())
File "C:\Python35\lib\pickle.py", line 836, in _batch_setitems
save(v)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 599, in save_reduce
save(args)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 740, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 520, in save
self.save_reduce(obj=obj, *rv)
File "C:\Python35\lib\pickle.py", line 623, in save_reduce
save(state)
File "C:\Python35\lib\pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Python35\lib\pickle.py", line 740, in save_tuple
save(element)
File "C:\Python35\lib\pickle.py", line 495, in save
rv = reduce(self.proto)
File "stringsource", line 2, in preshed.maps.PreshMap.__reduce_cython__
TypeError: self.c_map cannot be converted to a Python object for pickling
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python35\lib\threading.py", line 914, in _bootstrap_inner
self.run()
File "C:\Python35\lib\threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
File "C:\Python35\lib\site-packages\joblib\externals\loky\backend\queues.py", line 175, in _feed
onerror(e, obj)
File "C:\Python35\lib\site-packages\joblib\externals\loky\process_executor.py", line 310, in _on_queue_feeder_error
self.thread_wakeup.wakeup()
File "C:\Python35\lib\site-packages\joblib\externals\loky\process_executor.py", line 155, in wakeup
self._writer.send_bytes(b"")
File "C:\Python35\lib\multiprocessing\connection.py", line 183, in send_bytes
self._check_closed()
File "C:\Python35\lib\multiprocessing\connection.py", line 136, in _check_closed
raise OSError("handle is closed")
OSError: handle is closed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File ".../playground.py", line 43, in <module>
print(obj.Paralize(sentences))
File ".../playground.py", line 32, in Paralize
results = Parallel(n_jobs=2)(delayed(self.get_nouns)(sent) for sent in tqdm(sentences))
File "C:\Python35\lib\site-packages\joblib\parallel.py", line 934, in __call__
self.retrieve()
File "C:\Python35\lib\site-packages\joblib\parallel.py", line 833, in retrieve
self._output.extend(job.get(timeout=self.timeout))
File "C:\Python35\lib\site-packages\joblib\_parallel_backends.py", line 521, in wrap_future_result
return future.result(timeout=timeout)
File "C:\Python35\lib\concurrent\futures\_base.py", line 405, in result
return self.__get_result()
File "C:\Python35\lib\concurrent\futures\_base.py", line 357, in __get_result
raise self._exception
_pickle.PicklingError: Could not pickle the task to send it to the workers.
What is wrong with my code?
Comments:
I have added an alternative solution below for parallelizing the spaCy processing in question.

If dill can pickle the code for parallelization, would multiprocess (i.e. multiprocessing built on dill) also be suitable?
Answer 1:
Q: What is wrong with my code?

Well, most likely the problem does not come from the code itself, but from the "hidden" processing that takes place once n_jobs directs (and joblib internally orchestrates) the preparation of that many exact copies of the main process, so that they can work independently of one another (thereby effectively escaping the GIL lock and mapping the multiple process flows onto physical hardware resources).

This step is responsible for replicating all Python objects, and it is known to use Pickle for doing that. The Pickle module has long been known for its principal limitations on what can and what cannot be pickled.

The error message confirms this:

TypeError: self.c_map cannot be converted to a Python object for pickling

One may try a trick: supply Mike McKerns' dill module instead of Pickle, and test whether your "problematic" Python objects can be pickled by this module without raising this error.

dill has the same API signature, so a plain import dill as pickle may help, leaving all the other code unchanged.

I had the same problem, with large models being distributed to and returned from multiple processes, and dill was the way to go. Performance also improved.
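As a quick sanity check, here is a minimal sketch (my illustration, not part of the original answer; it assumes dill is installed) that tests whether the object pickle rejects can round-trip through dill:

import dill
import spacy

nlp = spacy.load('en_core_web_sm')

payload = dill.dumps(nlp)          # raises an error if dill cannot pickle it
nlp_restored = dill.loads(payload)
print(type(nlp_restored))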
Bonus:

dill allows saving/restoring the full Python interpreter state!

This was a cool side effect of discovering dill: once import dill as pickle has been done, pickle.dump_session( <aFile> ) will save a complete, stateful copy of the Python interpreter session. This can be restored when needed (post-crash recovery, fully saved/restored states of trained and optimized ML models, incrementally learned ML model states fully saved and re-distributed for remote restores across a deployed user base, etc.)
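A minimal sketch of that session feature (again my illustration; 'session.pkl' is a hypothetical filename):

import dill

counter = 42                         # any state built up in the session
dill.dump_session('session.pkl')     # write the whole interpreter session

# later, e.g. after a crash, in a fresh interpreter:
dill.load_session('session.pkl')
print(counter)                       # prints 42: the state is back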
Comments:
Thanks for the brief explanation. I tried import dill as pickle and it did work, but the performance did not improve. I think I have to find another way to parallelize this process.

Where do you add the import dill as pickle statement? I tried adding it both before and after from joblib import Parallel, delayed, but I keep getting the same _pickle.PicklingError.

@user_007 Could you tell us where you added the import? Was it in one of the joblib files?

Answer 2:
Same problem here. I solved it by changing the backend in Parallel from loky to threading.
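For reference, a minimal sketch of that change (my adaptation of the question's code, not the answerer's exact snippet):

import spacy
from joblib import Parallel, delayed

nlp = spacy.load('en_core_web_sm')

def get_nouns(text):
    doc = nlp(text)
    return [token.text for token in doc if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]

sentences = ['The weather is really cold', 'Can we catch the dog?']

# backend='threading' keeps all workers in one process, so the spaCy
# pipeline is shared and never has to be pickled.
results = Parallel(n_jobs=4, backend='threading')(
    delayed(get_nouns)(sent) for sent in sentences)
print(results)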
Comments:

Thanks for your comment, it helped me :) Are there any downsides to using the threading backend? A performance penalty?

Good question. I'm not a joblib expert, but I found this in the sklearn docs (sklearn uses joblib under the hood): "'loky' is recommended to run functions that manipulate Python objects. 'threading' is a low-overhead alternative that is most efficient for functions that release the Global Interpreter Lock: e.g. I/O-bound code or CPU-bound code in a few calls to native code that explicitly releases the GIL."

Answer 3:
Another answer to my own question:

I did not find a solution for using joblib with spaCy. Instead, to parallelize the process, I found that spaCy provides something called a Pipeline, which can parse large numbers of documents using multithreading.

I applied it to the same example as above:
import time
import spacy

nlp = spacy.load('en_core_web_sm')

class nouns:

    def get_nouns(self, sentences):
        start = time.time()
        docs = nlp.pipe(sentences, n_threads=-1)
        result = [' '.join([token.text for token in doc if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]) for doc in docs]
        print('Time Elapsed {} ms'.format((time.time() - start) * 1000))
        print(result)

if __name__ == '__main__':
    sentences = ['we went to the school yesterday',
                 'The weather is really cold',
                 'Can we catch the dog?',
                 'How old are you John?',
                 'I like diving and swimming',
                 'Can the world become united?']
    obj = nouns()
    obj.get_nouns(sentences)
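A version note from me (not part of the original answer): in newer spaCy releases the n_threads argument is deprecated and ignored; the multiprocessing equivalent of the call above is the n_process argument:

docs = nlp.pipe(sentences, n_process=4)   # spaCy 2.2.2 and later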
Comments:

Answer 4:

I ran into a similar problem while parallelizing lemmatization, but with another library, pymystem3:
from pymystem3 import Mystem

mystem = Mystem()

def preprocess_text(text):
    ...
    tokens = mystem.lemmatize(text)
    ...
    text = " ".join(tokens)
    return text

data_set = Parallel(n_jobs=-1)(delayed(preprocess_text)(article) for article in tqdm(articles))
The solution was to move the initialization inside the function:
def preprocess_text(text):
    ...
    mystem = Mystem()
    tokens = mystem.lemmatize(text)
    ...
    text = " ".join(tokens)
    return text
I suspect you could try the same approach with nlp = spacy.load.
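A sketch of how that idea might look with spaCy (my illustration, untested against the original setup): since reloading the model on every call would be slow, the model is loaded lazily and cached once per worker, so it never has to be pickled by the parent process.

import spacy
from joblib import Parallel, delayed

nlp = None

def get_nouns(text):
    global nlp
    if nlp is None:                   # first call inside this worker
        nlp = spacy.load('en_core_web_sm')
    doc = nlp(text)
    return [token.text for token in doc if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]

sentences = ['The weather is really cold', 'Can we catch the dog?']
results = Parallel(n_jobs=2)(delayed(get_nouns)(sent) for sent in sentences)
print(results)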
Comments: