I want to start on an open source project.
Is there a way I could follow the flow of execution of Python code when it's run?
Using sys.settrace() I am able to follow the trace of code blocks in a script. I want to bump this up to bigger projects, let's say, for example, a Django app.
A simple example: given the command python3 manage.py runserver, when run with "trace" (an imaginary tool),
trace python3 manage.py runserver would listen for function and method calls from the project after the command and maybe print them out somewhere.
What would be the best way to go about implementing this?
My idea is creating a sandbox, like a Docker container, where I would listen for any Python calls.
Does anyone have a better way or input to help me on this?
I would highly appreciate it.
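For what it's worth, here is a minimal sketch of that idea using sys.settrace; PROJECT_DIR and the demo function are assumptions for illustration:

# trace_calls.py — print every function/method call made from files under PROJECT_DIR
import sys

PROJECT_DIR = "/path/to/project"  # hypothetical: only report calls defined under this directory

def tracer(frame, event, arg):
    if event == "call":
        code = frame.f_code
        if code.co_filename.startswith(PROJECT_DIR):
            print(f"call: {code.co_name} ({code.co_filename}:{frame.f_lineno})")
    return tracer  # returning the tracer keeps tracing nested calls

sys.settrace(tracer)

# demo: any call from a file under PROJECT_DIR is now printed
def greet(name):
    return f"hello {name}"

greet("world")

Note that the standard library already has something close to the imaginary command: python3 -m trace --trace manage.py runserver prints each line as it executes (with --ignore-dir available to filter out noise), and you will likely want runserver's --noreload so the autoreloader doesn't fork a child process outside the trace.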
I am creating a project where I am using an Ubuntu server for production.
I need the server to keep running in the background even after I log out. As a solution I am using screen, but if the machine restarts that won't survive, as we all know, and I have to retype the screen commands.
So I want to use a startup service, but I am lost, as I have no idea how to do that. Here is one way to write the script, and I love it, but how do I add the command
python manage.py runserver [IP Address]
in this service.
Sorry if I sound silly, but I need a quick and useful solution, so I thought I'd ask here.
Thanks if anyone can guide me on it.
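Not an authoritative recipe, but a minimal sketch of a systemd unit for this (the unit name mysite.service, the user, and all paths are assumptions; note that runserver is meant for development, and something like gunicorn is the usual production choice):

# /etc/systemd/system/mysite.service (hypothetical name and paths)
[Unit]
Description=Django server for mysite
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/mysite
ExecStart=/home/ubuntu/mysite/venv/bin/python manage.py runserver 0.0.0.0:8000
Restart=on-failure

[Install]
WantedBy=multi-user.target

After saving it, sudo systemctl daemon-reload followed by sudo systemctl enable --now mysite.service starts it at boot and restarts it on failure, with no screen session needed.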
I currently have a handful of small Python scripts on my laptop that are set to run every 1-15 minutes, depending on the script in question. They perform various tasks for me like checking for new data on a certain API, manipulating it, and then posting it to another service, etc.
I have a NAS/personal server (unRAID) and was thinking about moving the scripts to there via Docker, but since I'm relatively new to Docker I wasn't sure about the best approach.
Would it be correct to take something like the Phusion Baseimage, which includes cron, package my scripts and crontab as dependencies into the image, and write the Dockerfile to initialize all of this? Or would it be more canonical to modify the scripts so that they are threaded with recursive timers, and just run each script individually in its own official Python image?
No dude, just install Python in the Docker container/image, move your scripts over, and run them as normal.
You may have to expose some port or add a firewall exception, but your container can behave like a native Linux environment.
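If you'd rather avoid baking cron into the image, the "recursive timer" idea from the question can be a small wrapper per script; a minimal sketch, where the interval and the do_work body are assumptions:

# runner.py — run one script's logic on a fixed interval inside the container
import time

INTERVAL_SECONDS = 60  # hypothetical: run every minute

def do_work():
    # placeholder for one existing script: check the API, manipulate data, post it, etc.
    print("checking API, manipulating data, posting results...")

if __name__ == "__main__":
    while True:
        start = time.monotonic()
        do_work()
        # sleep out whatever is left of the interval
        time.sleep(max(0.0, INTERVAL_SECONDS - (time.monotonic() - start)))

One container per script with a loop like this keeps each image on the official Python base; the cron-in-one-container approach works too, so it's mostly a question of how you prefer to manage logs and restarts.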
While developing locally on a side Django project, my workflow is currently this:
Make changes.
Go to Terminal session with virtual environment. Stop foreman with ctrl+C.
Type python manage.py collectstatic to move all my static css/js/img files.
Restart foreman with foreman start.
In an attempt to be more efficient and do a better job of learning, I'm wondering how I can optimize the workflow so it's more like this:
Make changes.
Run a single command that moves static files and restarts foreman.
Would someone be able to point me in the right direction? Thanks.
You could create a bash script that does this bunch of commands for you.
While I have no experience with foreman, you could create a script with content something like:
#!/bin/bash
# stop any running foreman processes
sudo killall foreman
# collect static files without the confirmation prompt
python manage.py collectstatic --noinput
# start foreman again
foreman start
Then add execution rights to it:
chmod +x script.sh
And execute everything in one command:
./script.sh
I assume you absolutely can't get around using foreman for local development, because otherwise you would not even need to do a collectstatic or a manual restart.
Maybe writing a custom management command based on runserver is the way to go for you, as it would already have the check-for-change-and-restart logic in it.
https://docs.djangoproject.com/en/dev/howto/custom-management-commands/
https://github.com/django/django/blob/master/django/core/management/commands/runserver.py?source=cc
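A rough sketch of that idea, assuming an app named myapp and a command name devserver (both hypothetical), subclassing the stock runserver command:

# myapp/management/commands/devserver.py (hypothetical app and command name)
from django.core.management import call_command
from django.core.management.commands.runserver import Command as RunserverCommand

class Command(RunserverCommand):
    help = "Collect static files, then run the development server."

    def inner_run(self, *args, **options):
        # re-collect static files each time the autoreloader restarts the server
        call_command("collectstatic", interactive=False)
        super().inner_run(*args, **options)

Then python manage.py devserver replaces the stop/collectstatic/start cycle, and the check-for-change-and-restart logic comes along for free from runserver.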
I have a website on my local server, and I'd like to execute system commands on that server with a button press in the HTML file that is displayed. Is there a way to either execute system commands like gpio write 0 1 or to run Python scripts? And how can I get the output of a system command as a string, like /opt/vc/bin/vcgencmd measure_temp | egrep "[0-9.]{4,}" -o outputting e.g. 44.4?
Thanks
David
You will have to run a webserver with some kind of server-side script. There are lots of ways you can do this. If you know PHP, that may be easiest. If you want to use Python, check out uwsgi.
Here is a pretty simple project I wrote with uwsgi that might help you get started if you go that route. I found a lot of the examples out there didn't help much, so you might have some luck with that code.
Edit: Actually, uwsgi on the Pi is a pretty old version, and it does some weird things on ARM if you try to compile it.
I created a proof of concept for you here using gunicorn instead. Just follow the instructions under the Installing section.
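To sketch just the command-output part, here is a minimal plain-WSGI example (the command and the regex come from the question; the file name and everything else are assumptions), runnable under gunicorn with gunicorn app:application:

# app.py — return the Pi's temperature as a plain-text string
import re
import subprocess

def application(environ, start_response):
    # run the command and capture its stdout as a string
    out = subprocess.check_output(["/opt/vc/bin/vcgencmd", "measure_temp"]).decode()
    # equivalent of the egrep "[0-9.]{4,}" -o filter from the question
    match = re.search(r"[0-9.]{4,}", out)
    body = (match.group(0) if match else out).encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

A button on your HTML page can then hit this endpoint with an AJAX/fetch call and display the response.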
I have a Django site that needs to be rebuilt every night. I would like to check out the code from the Git repo and then do the stuff like setting up the virtual environment, downloading the packages, etc. This would need no manual intervention, as it would be run from cron.
I'm really confused as to what to use for this. Should I write a Python script or a shell script? Are there any tools that assist in this?
Thanks.
So what I'm looking for is CI, and from what I've seen I'll probably end up using Jenkins or Buildbot for it. I've found the docs to be rather cryptic for someone who's never attempted anything like this before.
Do all CI tools like Buildbot/Jenkins simply run tests and more tests and send you reports, or do they actually set up a working Django environment that you can access through your browser?
You'll need to create some sort of build script that does everything but the Git checkout. I've never used any Python build tools, but perhaps something like: http://www.scons.org/.
Once you've created a script you can use Jenkins to schedule a nightly build and report success/failure: http://jenkins-ci.org/. Jenkins will know how to checkout your code and then you can have it run your script.
There are literally hundreds of different tools to do this. You can write Python scripts to be run from cron, you can write shell scripts, or you can use one of the hundreds of different build tools.
Most Python/Django shops would likely recommend Fabric. This really is a matter of running through and making sure you understand everything that needs to be done and how to script it. Do you need to run a test suite before you deploy to ensure it doesn't break everything? Do you need to run South database migrations? You really need to think about what needs to be done, and then you just write a Fabric script to do those things (a rough sketch follows below).
None of this even touches the fact that, overall, what you're asking for is continuous integration, which itself has a whole slew of tools to help manage it.
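To make that concrete, a rough sketch of such a Fabric script (Fabric 1 style with local(); the paths, branch, and task name are all assumptions):

# fabfile.py — hypothetical nightly rebuild task; run with: fab nightly_build
from fabric.api import lcd, local

def nightly_build():
    with lcd("/srv/mysite"):  # hypothetical checkout directory
        local("git pull origin master")
        local("python -m venv venv")  # create or refresh the virtualenv
        local("venv/bin/pip install -r requirements.txt")
        local("venv/bin/python manage.py migrate")  # South's migrate on older Django
        local("venv/bin/python manage.py test")

Run from cron it covers the no-manual-intervention requirement, and Jenkins or Buildbot can invoke the same task and keep the success/failure reports.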
What you are asking for is Continuous Integration.
There are many CI tools out there, but in the end it boils down to your personal preferences (like always, hopefully) and which one just works for you.
The Django project itself uses buildbot.
If you were to ask me, I would recommend continuous.io, which works out of the box with Django applications.
You can set how often you would like to build your Django project, which is great.
You can, of course, write a shell script which rebuilds your Django project via cron, but you deserve better than that.
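For completeness, the cron route is a single crontab entry (the time, path, and log file here are assumptions):

# run the rebuild script every night at 3am (hypothetical path)
0 3 * * * /srv/mysite/rebuild.sh >> /var/log/mysite_rebuild.log 2>&1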