Hello, I'm having a problem deploying a Django application to AWS. The app runs perfectly locally, but when I try to deploy to AWS this error shows up.
[screenshot of the error]
I tried to install that manually, but it doesn't work that way.
If the Docker version on your deployment runner is 20.x, it's probably this: https://github.com/docker/compose/issues/8105
Try pinning the runner's Docker version to 19.03.13ce-1.
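On a yum-based runner (e.g. Amazon Linux 2, which that version string suggests), the pin might look like the following. This is a sketch only; the exact package name is an assumption and may differ on your distro:

```shell
# Illustrative only: replace the 20.x docker package with the pinned 19.03 build.
sudo yum remove -y docker
sudo yum install -y docker-19.03.13ce-1

# Confirm the runner now reports a 19.03.x version
docker --version
```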
I am creating Docker containers to deploy my apps via an API: one container for app_manager, which exposes the API, and one container for the application itself. Both app_manager and the deployed applications are written in Python. I am testing deployment locally before deploying to the server. After successfully installing app_manager, I sent a request from a web browser to deploy a specific version of my application via the app_manager API (which uses subprocess.run). However, the response is null, and in the Docker logs I found the following error:
docker: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by docker)
I do not use Jenkins, and adding the libc6 library to the Dockerfile does not change anything. I can still install my app in the Docker container manually with a couple of bash commands, and I can even deploy my app from the VS Code editor via subprocess.run() directly. How can I fix this so that I can deploy by sending an API request?
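For reference, the API handler invokes Docker roughly like this (a minimal sketch; run_deploy and the docker command are illustrative, not my exact code):

```python
import subprocess

def run_deploy(cmd):
    """Run a deployment command, capturing output so the API can
    surface the real stderr instead of silently returning null."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{cmd[0]} failed: {result.stderr.strip()}")
    return result.stdout

# e.g. run_deploy(["docker", "run", "--rm", "myapp:1.2.3"])
```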
I am using Ubuntu 22.04.1 LTS, Python 3.8, Docker version 20.10.12, build 20.10.12-0ubuntu4
Thank you
I have deployed a FastAPI app and a Selenium Python script to Heroku. I added the required buildpacks and config variables before deploying and tried everything, but I still get the error. The script runs perfectly on localhost, but I can't seem to run it on the Heroku server. Below are an image of the build logs and my code on GitHub. I really appreciate the help.
[screenshot of build logs]
https://github.com/sus0001/kalimati_market-fastAPI
I am currently building a project that uses the Django Web Framework (hosted on AWSEB) but also needs to execute some R scripts. I have tried using subprocess.call(["Rscript", "R_django_test.R"]), but I get the following error: "No such file or directory: 'Rscript': 'Rscript'". The code above works locally, but not in the project hosted on AWS. Any help would be greatly appreciated.
It would be easier to use Docker deployment if you aren't already. AWSEB uses Amazon Linux 2, a CentOS-based Linux distribution, so follow the guide How to install R on CentOS and place the commands in the Dockerfile; the Rscript command will then be available in the environment.
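A minimal Dockerfile sketch along those lines (the base image and the EPEL step are assumptions; adjust them to whatever image your deployment actually uses):

```dockerfile
FROM amazonlinux:2

# Install R so that Rscript is on PATH inside the container.
# On Amazon Linux 2, R is available via the EPEL repository.
RUN amazon-linux-extras install -y epel && \
    yum install -y R

# ... install the Django app and its Python dependencies as usual ...

# Sanity check that Rscript is available
CMD ["Rscript", "--version"]
```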
I have created a Django application that is deployed to AWS with Zappa in a CI/CD pipeline, using the following commands:
- zappa update $ENVIRONMENT
- zappa manage $ENVIRONMENT "collectstatic --noinput"
- zappa manage $ENVIRONMENT "migrate"
However, I want to add a test step to the CI/CD pipeline that tests the application much the same way I would test it locally with the Django command:
python manage.py test
Is there a way to issue a Zappa command that runs "python manage.py test"? Do you have any suggestions?
Here are the docs with a Docker setup you can use for local testing (Docs here).
Also, this post from Ian Whitestone (a core Zappa contributor) uses the official AWS Docker image for the same job, and he gives an example for local testing as well (Blog post).
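One way to wire that in (a sketch, not from the Zappa docs; the image tag and paths are assumptions from the blog post's approach) is to run the test suite in a Lambda-like container as a pipeline step before `zappa update`:

```shell
# Illustrative CI step: run the Django test suite inside an
# AWS-Lambda-like build image before deploying with Zappa.
docker run --rm -v "$PWD":/var/task -w /var/task \
  lambci/lambda:build-python3.8 \
  bash -c "pip install -r requirements.txt -q && python manage.py test"
```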
I deployed a project to Scrapinghub, but my spider isn't working because Scrapinghub uses an old version of the Twisted library. The project works fine on my local machine. Is there any way I could make an egg of the updated Twisted version and deploy it to Scrapinghub?
Would this help: http://doc.scrapinghub.com/shub.html#deploying-dependencies ?
It shows how to deploy specific dependencies for your project to the Scrapinghub cloud service.
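Besides building an egg, newer shub releases can also pin dependencies from a requirements file declared in scrapinghub.yml. A sketch (the project ID is a placeholder, and the pinned Twisted version is illustrative):

```yaml
# scrapinghub.yml
projects:
  default: 12345        # placeholder project ID
requirements:
  file: requirements.txt  # e.g. containing a line like: Twisted==20.3.0
```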