When a user registers for a new account on my website, an image is generated and is supposed to be uploaded to the S3 bucket. The image is successfully created (verified by running ls in the media directory on the server), but it is not uploaded to S3. However, when I upload an image for a user account from the admin panel, the change is correctly reflected in S3 (the newly uploaded image appears in the bucket's directory). That route is not feasible, since users cannot be given admin panel access. My goal is to auto-upload the generated image to the S3 bucket when a new account is created.
Here's some related code.
views.py
def signup(request):
    if request.method == "POST":
        base_form = UserForm(data=request.POST)
        addnl_form = AddnlForm(data=request.POST)
        if base_form.is_valid() and addnl_form.is_valid():
            usrnm = base_form.cleaned_data['username']
            if UserModel.objects.filter(user__username=usrnm).count() == 0:
                user = base_form.save()
                user.set_password(user.password)
                user.save()
                # print(img)
                addnl = addnl_form.save(commit=False)
                addnl.user = user
                img = qr.make_image()  # create a QR code image; full code not included
                img.save('media/qrcodes/%s.png' % usrnm)
                addnl.qr_gen = 'qrcodes/%s.png' % usrnm
                addnl.save()
        else:
            messages.error(request, base_form.errors, addnl_form.errors)
    else:
        base_form = UserForm()
        addnl_form = AddnlForm()
    return render(request, 'app/signup.html', {'base_form': base_form, 'addnl_form': addnl_form})
models.py
class UserModel(models.Model):
    ...
    qr_gen = models.ImageField(upload_to='qrcodes', default=None, null=True, blank=True)
settings.py
DEFAULT_FILE_STORAGE = 'project.storage_backend.MediaStorage'
storage_backend.py
from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    location = 'media'
    default_acl = 'public-read'
    file_overwrite = False
UPDATE
Instead of auto-generating an image and uploading it to S3: if I upload an image through the registration form, even that is successfully uploaded to S3. The only case that fails is when I need to auto-upload without user intervention.
Please help me solve this problem. Thank you.
I would recommend taking a look at django-storages, which automates all of this so that you only have to worry about the form and the view. Its documentation covers how to deal with images easily.
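For reference, a minimal django-storages S3 configuration might look like the following sketch (the key values and bucket name are placeholders):

# settings.py -- minimal django-storages S3 setup (sketch; values are placeholders)
INSTALLED_APPS = [
    # ...
    'storages',
]

AWS_ACCESS_KEY_ID = '...'        # better sourced from the environment
AWS_SECRET_ACCESS_KEY = '...'
AWS_STORAGE_BUCKET_NAME = 'mybucket'
AWS_S3_REGION_NAME = 'us-east-1'

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'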
Instead of setting the media storage globally, set it on the field:

from project.storage_backend import MediaStorage

class UserModel(models.Model):
    ...
    qr_gen = models.ImageField(upload_to='qrcodes', storage=MediaStorage())
Auto-uploading an image directly to S3 requires communicating with S3's backend API yourself. Merely using django-storages or tweaking the DEFAULT_FILE_STORAGE path is not enough, since that only routes user-uploaded files to the specified S3 bucket/path. This problem can be tackled with the boto3 library's upload_file method.
Usage example:
import boto3
s3 = boto3.resource('s3')
s3.Bucket('mybucket').upload_file('/tmp/hello.txt', 'hello.txt')
Params:
Filename (str) -- The path to the file to upload.
Key (str) -- The name of the key to upload to.
ExtraArgs (dict) -- Extra arguments that may be passed to the client operation.
Callback (function) -- A method which takes a number of bytes transferred to be periodically called during the upload.
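Applied to the question's signup view, a hedged sketch (the bucket name and key prefix are assumptions; adjust to your bucket layout) would upload the freshly generated QR image right after img.save():

import boto3

# Sketch only: 'mybucket' and the 'media/qrcodes/' prefix are placeholders.
def push_qr_to_s3(usrnm):
    local_path = 'media/qrcodes/%s.png' % usrnm
    s3 = boto3.resource('s3')
    s3.Bucket('mybucket').upload_file(
        local_path,                      # Filename: the locally saved QR image
        'media/qrcodes/%s.png' % usrnm,  # Key: destination path in the bucket
        ExtraArgs={'ACL': 'public-read', 'ContentType': 'image/png'},
    )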
Related
I am running an app built with django-tenants. The app asks the user (tenant) to upload some data. I want the data to be segregated into subdirectories for each tenant.
According to the docs (https://django-tenants.readthedocs.io/en/latest/files.html), here is how the media root is configured:
settings.py
MEDIA_ROOT = "/Users/murcielago/desktop/simulation_application/data"
MULTITENANT_RELATIVE_MEDIA_ROOT = "%s"
On the upload everything is great.
Now, I can't find a way to retrieve the uploaded file within the app. Basically, I need the app to serve the file that corresponds to the tenant requesting it.
Here is how I thought this would work:
import pandas as pd
from django.conf import settings

media_file_dir = settings.MULTITENANT_RELATIVE_MEDIA_ROOT
df = pd.read_csv(media_file_dir + '/uploads/sample_orders_data.csv')
but this does not work.
So far I have made it work by grabbing the tenant name from the URL and passing it to the app using pickle, but this is not right in terms of security and won't scale.
Would someone have a clue about the best way to handle reading tenant-specific files?
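One hedged sketch, assuming django-tenants fills the "%s" in MULTITENANT_RELATIVE_MEDIA_ROOT with the active tenant's schema_name, is to derive the path from the tenant on the database connection rather than from the URL:

import os
import pandas as pd
from django.conf import settings
from django.db import connection  # django-tenants sets the current tenant here

def tenant_media_path(*parts):
    # Assumption: MULTITENANT_RELATIVE_MEDIA_ROOT = "%s" (filled with the schema name)
    relative = settings.MULTITENANT_RELATIVE_MEDIA_ROOT % connection.tenant.schema_name
    return os.path.join(settings.MEDIA_ROOT, relative, *parts)

df = pd.read_csv(tenant_media_path('uploads', 'sample_orders_data.csv'))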
My problem is that images stored in the media folder are not transferring to the S3 bucket. I tested with another file from the request and that file did transfer, so I assume settings.py is OK.
From views.py:

This works:

if request.method == 'POST':
    imageFile = request.FILES['images']
    upload = Upload(file=imageFile)
    upload.save()
    image_url = upload.file.url
    print(image_url)
This does not work:

for i in os.listdir(folder):
    f = os.path.join(conf_settings.MEDIA_ROOT, company, i)
    upload = Upload(file=f)
    upload.save()
No error but it just does not work.
This also does not work:

for i in os.listdir(folder):
    with open(os.path.join(folder, i)) as f:
        upload = Upload(file=f)
        upload.save()
The error I am getting, at upload.save(), is:

Exception Value: '_io.TextIOWrapper' object has no attribute '_committed'
This is my storage_backend.py:

from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    location = 'media'
    default_acl = 'public-read'
    file_overwrite = True
This is my model.py:

class Upload(models.Model):
    uploaded_at = models.DateTimeField(auto_now_add=True)
    file = models.FileField()
I am uploading a .zip file of images, unzipping it and saving the images to the media folder, and then trying to upload them from the media folder to the S3 bucket. This last step fails.
The file in request.FILES is the zip file, which I am using to confirm that the AWS settings in settings.py are correct, because it does transfer correctly.
I believe my issue has to do with the way I am reading the file and passing it.
So after many hours... this actually worked. Although the transfer is a bit slow, I'm sure there must be a better way.
https://stackoverflow.com/a/53260957/11116189
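For reference, a sketch of that kind of fix, reusing the question's folder variable and Upload model: wrap each local file in django.core.files.File, opened in binary mode, so the FileField routes it through the configured S3 storage (the text-mode handle above is what triggers the _committed error):

import os
from django.core.files import File

for name in os.listdir(folder):
    with open(os.path.join(folder, name), 'rb') as fh:  # binary mode matters
        upload = Upload()
        # Saving via the field pushes the content through
        # DEFAULT_FILE_STORAGE (S3) and persists the model instance.
        upload.file.save(name, File(fh), save=True)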
I'm trying to get files uploaded through the Django admin site to be placed on a network drive. Let's say the path to this drive is '\\FILESERVER\Django'.
My initial thought was to just set my media root to the same path I'd use to access the drive via File Explorer:
#settings.py
MEDIA_ROOT = r'\\FILESERVER\Django'
An example model:

#models.py
class Article(models.Model):
    title = models.CharField(max_length=128)
    pdf = models.FileField(upload_to='articles', blank=True, null=True)

    def __str__(self):
        return self.title
But when I upload a file, it just creates the folder on my local C: drive (e.g. C:\FILESERVER\Django\articles). Is there a way I can tell Django that this is supposed to be a path to a network drive?
Note: This is a Django 2.0.4 app running on a Windows machine.
I believe what you're looking for is in pathlib. Have a look here:
https://docs.python.org/3/library/pathlib.html#methods-and-properties
Something like PureWindowsPath('//FILESERVER/Django').drive should do the trick for MEDIA_ROOT, IIRC.
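For illustration, .drive on a UNC path resolves to the \\server\share anchor:

from pathlib import PureWindowsPath

# .drive of a UNC path is the \\server\share prefix
print(PureWindowsPath('//FILESERVER/Django/articles').drive)  # \\FILESERVER\Django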
If you use a network file server, you may need a customized Django file storage class.
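A minimal sketch of such a storage class, assuming the share is reachable by the account running Django (the class name is hypothetical):

from django.core.files.storage import FileSystemStorage

class NetworkDriveStorage(FileSystemStorage):  # hypothetical name
    # FileSystemStorage pinned to the UNC share instead of MEDIA_ROOT.
    def __init__(self, **kwargs):
        kwargs.setdefault('location', r'\\FILESERVER\Django')
        super().__init__(**kwargs)

# Usage on the model field:
# pdf = models.FileField(upload_to='articles', storage=NetworkDriveStorage())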
I have configured my Django app's default file storage to use boto:

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
I also have a model that stores uploaded images in S3:
...
profile_pic = models.ImageField(upload_to=get_upload_path, null=True)
...
However, when I reference this field, it shows up with an S3 URL.
How do I configure this to return a CloudFront address?
Use AWS_S3_CUSTOM_DOMAIN; django-storages provides this option to specify a CloudFront URL.
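For example (the distribution domain below is a placeholder):

# settings.py -- serve generated URLs from CloudFront instead of S3
AWS_S3_CUSTOM_DOMAIN = 'dxxxxxxxxxxxx.cloudfront.net'  # your distribution's domain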
I am using django-storages and Amazon S3 for file storage. In my model I have:
avatar = models.ImageField(_('Avatar'), upload_to='avatars/profiles/', blank=True, null=True)
The image is uploaded successfully on save, but the full URL with credentials is saved. In my retrieve requests (when I read the URL from the DB via the console) I get something like:
https://subdomain.amazonaws.com/avatars/profiles/filename.jpg?X-Amz-Algorithm=XXX&X-Amz-Expires=XXX&X-Amz-SignedHeaders=XXXX&X-Amz-Signature=XXXX&X-Amz-Date=XXXXXX&X-Amz-Credential=XXXX
How can I prevent this? I could strip the URL before responding, but I neither need nor want to save the URLs in this format, because all files can be accessed publicly and no credentials are required.
P.S. I thought of using the post_save hook, but it seemed like a hack to me.
To remove the authentication credentials in the query string, set AWS_QUERYSTRING_AUTH = False in your settings.py. From django-storages documentation at https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html:
AWS_QUERYSTRING_AUTH (optional; default is True)
Set AWS_QUERYSTRING_AUTH to False to remove query parameter authentication from generated URLs. This can be useful if your S3 buckets are public.
What you see in X-Amz-Credential is your access key ID. In the AWS context it is not considered sensitive information, so it can be stored in plain text.
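In settings.py that is a one-liner:

# settings.py -- stop signing generated URLs (suitable for public buckets)
AWS_QUERYSTRING_AUTH = False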
If you set AWS_S3_CUSTOM_DOMAIN in settings.py, django-storages will return the custom domain without the query string. You can reference the following piece of code from the S3BotoStorage class:
def url(self, name, headers=None, response_headers=None, expire=None):
    # Preserve the trailing slash after normalizing the path.
    name = self._normalize_name(self._clean_name(name))
    if self.custom_domain:
        return "%s//%s/%s" % (self.url_protocol,
                              self.custom_domain, filepath_to_uri(name))
    if expire is None:
        expire = self.querystring_expire
    return self.connection.generate_url(
        expire,
        method='GET',
        bucket=self.bucket.name,
        key=self._encode_name(name),
        headers=headers,
        query_auth=self.querystring_auth,
        force_http=not self.secure_urls,
        response_headers=response_headers,
    )