Celery Django deployment to Elastic Beanstalk fails with ImportError: cannot import name 'Celery' (ElasticBeanstalk::ExternalInvocationError)


Posted: 2019-07-21 06:01:03

Problem description:

I get an error when trying to deploy my Django application after configuring Celery. It works fine in the local environment. It looks like neither celery beat nor the worker is starting; the error appears when supervisord tries to run them:

[i-063a3b57f40eb2ffa] [2019-02-27T13:04:39.139Z] INFO  [22820] - [Application update app-8bc8-190227_130333@187/AppDeployStage0/EbExtensionPostBuild/Infra-EmbeddedPostBuild/postbuild_0_django_brain_dev/Command 04_start_celery_beat] : Completed activity. Result:
  celeryd-beat: ERROR (not running)
  celeryd-beat: ERROR (abnormal termination)

[i-063a3b57f40eb2ffa] [2019-02-27T13:04:40.021Z] INFO  [22820] - [Application update app-8bc8-190227_130333@187/AppDeployStage0/EbExtensionPostBuild/Infra-EmbeddedPostBuild/postbuild_0_django_brain_dev/Command 05_start_celery_worker] : Starting activity...
[i-063a3b57f40eb2ffa] [2019-02-27T13:04:42.397Z] INFO  [22820] - [Application update app-8bc8-190227_130333@187/AppDeployStage0/EbExtensionPostBuild/Infra-EmbeddedPostBuild/postbuild_0_django_brain_dev/Command 05_start_celery_worker] : Completed activity. Result:
  celeryd-worker: ERROR (not running)
  celeryd-worker: ERROR (abnormal termination)


  File "/opt/python/current/app/django_app/celery.py", line 3, in <module>
    from celery import Celery
  ImportError: cannot import name 'Celery'
   (ElasticBeanstalk::ExternalInvocationError)
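For reference, this particular ImportError is most often a circular import: under Python 2's implicit relative imports, the project's own django_app/celery.py can shadow the installed celery package, so `from celery import Celery` resolves to the file itself. The layout recommended in the Celery documentation guards against this with absolute imports. Below is a minimal sketch of that layout; the django_app package name is taken from the traceback, while the settings module and everything else are assumptions, not the actual project code.

# django_app/celery.py -- minimal sketch; only the package name comes from the traceback
from __future__ import absolute_import, unicode_literals  # keeps "celery" resolving to the library, not this file

import os

from celery import Celery

# Tell Celery which Django settings module to use before the app is created
# ("django_app.settings" is an assumption).
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_app.settings')

app = Celery('django_app')

# Read all CELERY_* options from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Find tasks.py modules in every installed Django app.
app.autodiscover_tasks()


# django_app/__init__.py -- loads the app when Django starts so @shared_task tasks can use it
from __future__ import absolute_import, unicode_literals

from .celery import app as celery_app

__all__ = ('celery_app',)

With Python 3, absolute imports are the default, so the same error usually points instead at a stray celery.py or stale celery.pyc sitting earlier on sys.path.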

Container commands:

  02_celery_tasks_config:
    command: "cat .ebextensions/files/celery_configuration.txt > /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh && chmod 744 /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true

  03_celery_tasks_run:
    command: "sed -i 's/\r$//' /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true

  04_start_celery_beat:
    command: "/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat"
    leader_only: true

  05_start_celery_worker:
    command: "/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker"

celery_configuration.txt contains:

#!/usr/bin/env bash

# Get django environment variables
celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
celeryenv=${celeryenv%?}  # strip the trailing comma

# Create celery worker configuration script
celeryworkerconf="[program:celeryd-worker]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery worker -A djangobrain --loglevel=INFO

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-worker.log
stderr_logfile=/var/log/celery-worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv"

celerybeatconf="[program:celeryd-beat]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery beat -A djangobrain --loglevel=INFO --workdir=/tmp -S django --pidfile /tmp/celerybeat.pid

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-beat.log
stderr_logfile=/var/log/celery-beat.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv"

# Create the celery supervisord conf script
echo "$celeryworkerconf" | tee /opt/python/etc/celeryworker.conf
echo "$celerybeatconf" | tee /opt/python/etc/celerybeat.conf

# Add configuration script to supervisord conf (if not there already)
if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
  then
  echo "[include]" | tee -a /opt/python/etc/supervisord.conf
  echo "files: celerybeat.conf celeryworker.conf" | tee -a /opt/python/etc/supervisord.conf
fi

# reread the supervisord config
/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf reread
# update supervisord in cache without restarting all services
/usr/local/bin/supervisorctl -c /opt/python/etc/supervisord.conf update

# Start/Restart celeryd through supervisord
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker

Comments:

Did you install celery with pip install celery?

Yes, it is installed via requirements.txt.
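One quick way to see which celery is actually being imported inside the Elastic Beanstalk virtualenv is a two-line check; the interpreter path below is taken from the supervisord config above, and the script name is only illustrative:

# check_celery_import.py -- hypothetical diagnostic, run with
#   /opt/python/run/venv/bin/python check_celery_import.py
import celery

# A healthy install prints a path under site-packages; a path inside
# /opt/python/current/app means a local module is shadowing the library.
print(celery.__file__)
print(getattr(celery, '__version__', 'no __version__ -> shadowed by a local module?'))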

Answer 1:

This isn't really an answer, but I gave up on the Celery route and went with a serverless deployment as the solution instead.

Comments:
