celery 5.3.6 error: ValueError: not enough values to unpack (expected 3, got 0)

Published 2023-12-06 14:12:14 · Author: 九尾cat


The celery worker starts, but tasks fail

Running python run_task.py fails; the worker log and the client-side error are shown below:

 

 # celery -A tasks worker --loglevel=INFO
 
 -------------- celery@DESKTOP-BQAR0JR v5.3.6 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.19045-SP0 2023-12-06 11:01:05
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x1a7532b6340
- ** ---------- .> transport:   redis://192.168.1.105:6379/0
- ** ---------- .> results:     redis://192.168.1.105/0
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.add

[2023-12-06 11:01:05,903: WARNING/MainProcess] c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-12-06 11:01:06,037: INFO/MainProcess] Connected to redis://192.168.1.105:6379/0
[2023-12-06 11:01:06,864: INFO/SpawnPoolWorker-7] child process 12484 calling self.run()       
[2023-12-06 11:01:06,867: INFO/SpawnPoolWorker-5] child process 21516 calling self.run()       
[2023-12-06 11:01:07,680: INFO/MainProcess] mingle: all alone
[2023-12-06 11:01:08,304: INFO/MainProcess] celery@DESKTOP-BQAR0JR ready.
[2023-12-06 11:01:44,627: INFO/MainProcess] Task tasks.add[2e88f4bd-aebb-4f80-a0dc-c34ea27a4a22] received
[2023-12-06 11:01:44,806: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
billiard.einfo.RemoteTraceback:
"""
Traceback (most recent call last):
  File "c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\billiard\pool.py", line 361, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\administrator\appdata\local\programs\python\python38\lib\site-packages\celery\app\trace.py", line 664, in fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
"""
worker: Hitting Ctrl+C again will terminate all running tasks!

The client script also fails with AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for' — this typically means the Celery app was created without a result backend, so result.ready() and result.get() have nowhere to read task state from.

 

The client script:

# coding=utf-8
# run_task.py -- submits the task to the broker and waits for the result

from tasks import add

# queue the task; delay() returns an AsyncResult immediately
result = add.delay(4, 4)
print('Is task ready: %s' % result.ready())

# block for up to 1 second waiting for the worker to finish
run_result = result.get(timeout=1)
print('Task Result: %s' % run_result)

Solution

Running Celery 4.x or 5.x on Windows 10 triggers this error because the default prefork execution pool does not work properly on Windows. Installing eventlet and starting the worker with the eventlet pool fixes the problem:

# install the dependency
pip install eventlet


# start the worker with the eventlet pool
celery -A <mymodule> worker -l info -P eventlet
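eventlet is not the only workaround: Celery ships several execution pools that avoid the Windows prefork limitation. These commands are untested alternatives, assuming the same tasks module as above:

```shell
# solo pool: single process, no concurrency; simplest option for Windows development
celery -A tasks worker -l info -P solo

# threads pool: uses a thread pool instead of forked processes
celery -A tasks worker -l info -P threads

# gevent pool: like eventlet, requires an extra dependency (pip install gevent)
celery -A tasks worker -l info -P gevent
```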