Run celery with django on AWS Elastic Beanstalk using environment variables
Posted: 2017-05-05 00:43:22
Question: I want to run celery with my Django app on AWS Elastic Beanstalk. I followed the excellent answer by @yellowcap (How do you run a worker with AWS Elastic Beanstalk?), so my supervisord.conf looks like this:
files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash

      # Get django environment variables
      celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
      celeryenv=${celeryenv%?}

      # Create celery configuration script
      celeryconf="[program:celeryd]
      ; Set full path to celery program if using virtualenv
      command=/opt/python/run/venv/bin/celery worker -A myappname --loglevel=INFO
      directory=/opt/python/current/app
      user=nobody
      numprocs=1
      stdout_logfile=/var/log/celery-worker.log
      stderr_logfile=/var/log/celery-worker.log
      autostart=true
      autorestart=true
      startsecs=10

      ; Need to wait for currently executing tasks to finish at shutdown.
      ; Increase this if you have very long running tasks.
      stopwaitsecs = 600

      ; When resorting to send SIGKILL to the program to terminate it
      ; send SIGKILL to its whole process group instead,
      ; taking care of its children as well.
      killasgroup=true

      ; if rabbitmq is supervised, set its priority higher
      ; so it starts first
      priority=998

      environment=$celeryenv"

      # Create the celery supervisord conf script
      echo "$celeryconf" | tee /opt/python/etc/celery.conf

      # Add configuration script to supervisord conf (if not there already)
      if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
      then
          echo "[include]" | tee -a /opt/python/etc/supervisord.conf
          echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
      fi

      # Reread the supervisord config
      supervisorctl -c /opt/python/etc/supervisord.conf reread

      # Update supervisord in cache without restarting all services
      supervisorctl -c /opt/python/etc/supervisord.conf update

      # Start/Restart celeryd through supervisord
      supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd
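For reference, here is roughly what the tr/sed pipeline above turns the exports in /opt/python/current/env into (the values below are made up, and the $PATH/$PYTHONPATH substitutions are left out for brevity):

# Illustration only -- the deploy hook above does this in shell.
# Hypothetical contents of /opt/python/current/env:
env_file = (
    'export DJANGO_SETTINGS_MODULE="myappname.settings"\n'
    'export SECRET_KEY="abc123"\n'
)
# Strip "export ", join the lines with commas, drop the trailing comma:
celeryenv = env_file.replace("export ", "").replace("\n", ",").rstrip(",")
print(celeryenv)
# DJANGO_SETTINGS_MODULE="myappname.settings",SECRET_KEY="abc123"
# This comma-separated string is what lands on the environment= line of the
# generated celery.conf.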
His code ran fine until I decided to migrate some of the variables in my settings.py to my Elastic Beanstalk environment properties.
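In other words, settings.py now reads those values from the process environment, so they appear in /opt/python/current/env and end up in $celeryenv. A rough sketch, with hypothetical property names:

# settings.py (hypothetical excerpt) -- the migrated values are now read
# from the environment that Elastic Beanstalk injects:
import os

SECRET_KEY = os.environ.get("SECRET_KEY", "")          # hypothetical property name
CELERY_BROKER_URL = os.environ.get("BROKER_URL", "")   # hypothetical property name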
Indeed, I now get the following error when the script is called:
for 'environment' is badly formatted'>: file: /usr/lib64/python2.7/xmlrpclib.py line: 800
celeryd: ERROR (no such process)
Thanks for your help.
Comments:
Your environment line has a trailing quote. Should that be there? environment=$celeryenv"
Yes, that is where the celeryconf declaration ends.
Can you paste the generated celery.conf?
Answer 1:
This is due to how Supervisor parses configuration files [1].
Your environment settings contain unescaped % characters, most likely coming from the Django SECRET_KEY.
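The mechanism, roughly: supervisord expands values such as environment= with Python %-style string interpolation, so a bare % that is not part of a %(...)s placeholder makes the parse fail. A small illustration (not supervisord's actual code):

# A hypothetical environment value containing a bare %:
value = 'SECRET_KEY="a1%bc"'

try:
    value % {}                          # roughly what the expansion does
except ValueError as exc:
    print("badly formatted:", exc)      # unsupported format character 'b' ...

# Doubling the percent sign, as the extra sed 's/%/%%/g' below does, makes it safe:
print(value.replace("%", "%%") % {})    # -> SECRET_KEY="a1%bc"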
The following worked for me: try appending | sed 's/%/%%/g' to the pipe chain here:
celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
The resulting line:
celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
[1] https://github.com/Supervisor/supervisor/issues/291