Django AWS S3 using Boto with Compressor fails to compress: UncompressableFileError
Posted: 2016-05-26 20:14:28

Question: Following this guide and these [1] [2] posts, I tried to set up static file storage on AWS S3 using django-storages with Boto.
When I run collectstatic, the command successfully collects the files into STATIC_ROOT. However, the files are not uploaded to S3 compressed, and the server cannot serve them; it returns a 500 error. The logs show this error message:
UncompressableFileError: 'https://<myapp>.s3.amazonaws.com/static/oscar/css/styles.css' could not be found in the COMPRESS_ROOT '/var/www/<myappname>/static' or with staticfiles.
Edit: I also changed STATIC_URL to "http://%s/" % AWS_S3_CUSTOM_DOMAIN and got the same error, except that it is still looking for https while COMPRESS_URL is http:
UncompressableFileError: 'https://<myappname>.s3.amazonaws.com/static/oscar/css/styles.css' isn't accessible via COMPRESS_URL ('http://<myappname>.s3.amazonaws.com/') and can't be compressed
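The wording of this second error suggests the check is a literal URL prefix comparison, so an https:// file URL can never match an http:// COMPRESS_URL. A minimal sketch of that idea (is_compressible is a hypothetical helper for illustration, not compressor's real API):

```python
# Hypothetical simplification: a file is only considered compressible
# when its URL starts with COMPRESS_URL, compared as a plain string.
def is_compressible(file_url, compress_url):
    return file_url.startswith(compress_url)

file_url = "https://myappname.s3.amazonaws.com/static/oscar/css/styles.css"

# Scheme mismatch (https file vs http COMPRESS_URL) fails the check:
print(is_compressible(file_url, "http://myappname.s3.amazonaws.com/"))   # False
# Identical scheme and host passes it:
print(is_compressible(file_url, "https://myappname.s3.amazonaws.com/"))  # True
```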
Is this an incompatibility between Compressor and Boto?
Relevant code:
# settings/prod.py
AWS_ACCESS_KEY_ID = <Key_ID>
AWS_SECRET_ACCESS_KEY = <Secret_Key>
AWS_STORAGE_BUCKET_NAME = "<my_bucket_name>"
AWS_S3_CUSTOM_DOMAIN = "%s.s3.amazonaws.com" % AWS_STORAGE_BUCKET_NAME
STATIC_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN
AWS_LOCATION = 'static'
DEFAULT_FILE_STORAGE = "storages.backends.s3boto.S3BotoStorage"
STATICFILES_STORAGE = "myapp.storage.s3utils.CachedS3BotoStorage"
COMPRESS_STORAGE = "myapp.storage.s3utils.CachedS3BotoStorage"
AWS_IS_GZIPPED = True
COMPRESS_URL = STATIC_URL
STATIC_ROOT = "/var/www/<myappname>/static/"
COMPRESS_ROOT = STATIC_ROOT
storage/s3utils.py is from this documentation:
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    """
    S3 storage backend that saves the files locally, too.
    """
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
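The pattern above is a dual write: every saved file goes to the remote backend and is mirrored into a local one, so compressor can read files from COMPRESS_ROOT while S3 serves them. A toy model of that idea, with made-up DictStorage/MirroredStorage stand-ins rather than real django-storages classes:

```python
# Toy stand-in for a storage backend: keeps files in a dict.
class DictStorage:
    def __init__(self):
        self.files = {}

    def save(self, name, content):
        self.files[name] = content
        return name

# Mirrors every save into a second, "local" store, like
# CachedS3BotoStorage does with CompressorFileStorage.
class MirroredStorage(DictStorage):
    def __init__(self):
        super().__init__()
        self.local_storage = DictStorage()

    def save(self, name, content):
        name = super().save(name, content)       # "upload" to the remote store
        self.local_storage.save(name, content)   # keep a local copy too
        return name

store = MirroredStorage()
store.save("oscar/css/styles.css", b"body{}")
print("oscar/css/styles.css" in store.files)                # True
print("oscar/css/styles.css" in store.local_storage.files)  # True
```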
Comments:

Could you run the collectstatic command with --traceback? That may explain the problem in more detail. I remember Compressor having a few issues. If you plan to use CloudFront, be warned that getting the CORS settings right in AWS is very, very painful.

Answer 1: Solved with the following settings:
AWS_ACCESS_KEY_ID = '<KEY_ID>'
AWS_SECRET_ACCESS_KEY = '<SECRET_KEY>'
AWS_STORAGE_BUCKET_NAME = "<app_name>"
AWS_S3_CUSTOM_DOMAIN = "s3.amazonaws.com/%s" % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = "https://%s/media/" % AWS_S3_CUSTOM_DOMAIN
STATIC_URL = "https://%s/static/" % AWS_S3_CUSTOM_DOMAIN
COMPRESS_URL = STATIC_URL
DEFAULT_FILE_STORAGE = '<app_name>.storage.s3utils.MediaS3BotoStorage'
STATICFILES_STORAGE = '<app_name>.storage.s3utils.CachedS3BotoStorage'
COMPRESS_STORAGE = '<app_name>.storage.s3utils.CachedS3BotoStorage'
MEDIA_ROOT = '<app_name>/media/'
STATIC_ROOT = '<app_name>/static/'
COMPRESS_ROOT = STATIC_ROOT
COMPRESS_ENABLED = True
COMPRESS_CSS_FILTERS = [
    'compressor.filters.css_default.CssAbsoluteFilter',
    'compressor.filters.cssmin.CSSMinFilter',
]
COMPRESS_PARSER = 'compressor.parser.HtmlParser'
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'compressor.finders.CompressorFinder',
)
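One plausible reading of why these settings work where the original ones failed: the bucket is now addressed path-style ("s3.amazonaws.com/&lt;bucket&gt;") rather than virtual-hosted-style ("&lt;bucket&gt;.s3.amazonaws.com"), and STATIC_URL and COMPRESS_URL are built from the same https prefix that the collected file URLs start with. A sketch (the bucket name is hypothetical):

```python
bucket = "myappname"  # hypothetical bucket name

# The question used virtual-hosted-style; the answer uses path-style.
virtual_hosted = "%s.s3.amazonaws.com" % bucket
path_style = "s3.amazonaws.com/%s" % bucket

static_url = "https://%s/static/" % path_style
print(static_url)  # https://s3.amazonaws.com/myappname/static/

# With COMPRESS_URL = STATIC_URL, every collected file URL shares the
# exact prefix compressor checks for:
file_url = static_url + "oscar/css/styles.css"
assert file_url.startswith(static_url)
```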
And my s3utils.py:
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    """
    S3 storage backend that saves files locally too.
    """
    location = 'static'

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name

class MediaS3BotoStorage(S3BotoStorage):
    """S3 storage backend that saves to the 'media' subdirectory."""
    location = 'media'
Comments:

Great answer. My setup was only slightly different and I was tearing my hair out; what helped me was hardcoding https: in my code instead of //, and it worked. Thanks for the question here, in case it helps someone: ***.com/questions/40825990/…

Answer 2: It looks like someone ran into the same problem here: https://github.com/django-compressor/django-compressor/issues/368#issuecomment-182817810
Try this:
import copy

def save(self, name, content):
    content2 = copy.copy(content)
    name = super(CachedS3BotoStorage, self).save(name, content)
    self.local_storage._save(name, content2)
    return name
Note: I use django-storages' S3BotoStorage together with django-compressor without problems. I think it is the gzipping that causes the issue.
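The copy before the first save matters because a file-like object can only be read through once: the first backend drains the stream, leaving nothing for the second. A quick sketch using plain in-memory streams rather than Django File objects (the exact mechanism inside the storage backends may differ):

```python
import io

# The first read drains the stream, so a second backend saving the
# same object would write an empty file:
content = io.BytesIO(b"body { color: red }")
first = content.read()
second = content.read()   # already at EOF
print(len(first), len(second))  # 19 0

# Taking an independent copy before the first save leaves both
# backends a full stream to read:
content = io.BytesIO(b"body { color: red }")
content2 = io.BytesIO(content.getvalue())
print(content.read() == content2.read())  # True
```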