Problem description
I have both public and private files which I serve from Amazon CloudFront. The public files work fine, but now I'd like to secure some of them as private with an authenticated read.
The private files have their own uploader, DocumentUploader. Do the files need to be stored in separate buckets? As it is now they are all in the one bucket.
I've done something similar with Paperclip a while back, but can't seem to find a good resource for doing it with Carrierwave and a timed authenticated_url.
I see they have something like it here:
But I'm not sure how to implement it.
Any tips would be greatly appreciated.
Solution
It depends on how secure you need it to be, but you can set file permissions on the particular uploader class itself, overriding the defaults like so:
class SomeUploader < CarrierWave::Uploader::Base
  def fog_public
    false
  end

  def fog_authenticated_url_expiration
    5.minutes # in seconds from now (the default is 10.minutes)
  end

  # ...
end
That will automatically cause URLs for files from this uploader to have a temporary AWS expiration timestamp and access key appended, and future uploads will be set to private, i.e. not publicly accessible.
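With fog_public set to false, calling url on the mounted uploader (e.g. model.document.url) returns a signed, expiring URL instead of a plain public one. As a minimal sketch of what such a URL looks like under the hood, here is S3's legacy query-string authentication (V2 signing) built with only the Ruby standard library; the bucket, key, and credentials below are made up for illustration, and in a real app fog generates this for you:

```ruby
require "openssl"
require "base64"
require "cgi"

# Hypothetical credentials -- illustration only, never hard-code real keys.
ACCESS_KEY = "AKIAEXAMPLE"
SECRET_KEY = "example-secret"

# Build an S3 query-string-authenticated URL (legacy V2 signing),
# roughly the shape fog produces for an authenticated_url.
def authenticated_url(bucket, key, expires_in: 300)
  expires = Time.now.to_i + expires_in # Unix timestamp when the link dies
  string_to_sign = "GET\n\n\n#{expires}\n/#{bucket}/#{key}"
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest(OpenSSL::Digest.new("SHA1"), SECRET_KEY, string_to_sign)
  )
  "https://#{bucket}.s3.amazonaws.com/#{key}" \
    "?AWSAccessKeyId=#{ACCESS_KEY}" \
    "&Expires=#{expires}" \
    "&Signature=#{CGI.escape(signature)}"
end

puts authenticated_url("my-bucket", "private/report.pdf", expires_in: 5 * 60)
```

Anyone holding the URL can fetch the object until Expires passes, after which S3 returns 403, which is why a short fog_authenticated_url_expiration is a reasonable default for private documents.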