I'm trying to do some local processing of data entries from the GAE datastore using the remote_api. I just want to write some quick scripts that pull some data, but I am getting import errors saying that Python cannot import from google.
Am I supposed to run the script from within the development environment somehow? Or perhaps I need to include all of the Google stuff in my Python path? That seems excessive, though.
Why is including the paths that onerous?
Normally the remote_api shell is used interactively, but it is a good tool that you can use as the basis for achieving what you want.
The simplest way is to copy and modify the remote_api shell so that, rather than presenting an interactive shell, it runs a named script.
That way it will deal with all the path setup.
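For a sense of what that setup amounts to, here is a minimal sketch, assuming the SDK is unpacked at /usr/local/google_appengine and that your SDK version exposes ConfigureRemoteApi (the credentials and app id are placeholders):

import sys

# Assumed SDK location; adjust to wherever the App Engine SDK lives.
sys.path.insert(0, '/usr/local/google_appengine')
import dev_appserver
dev_appserver.fix_sys_path()  # puts the SDK's bundled libraries on sys.path

from google.appengine.ext.remote_api import remote_api_stub

def auth_func():
    # Placeholder credentials; the real shell prompts for these.
    return ('you@example.com', 'your-password')

remote_api_stub.ConfigureRemoteApi(
    None, '/_ah/remote_api', auth_func, 'your-app-id.appspot.com')

# From here on, datastore calls run against the remote app, so your
# processing script can import and query models as usual.

Once fix_sys_path has run, the "cannot import from google" errors go away because the SDK is on the import path.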
In the past I have integrated the remote_api inside a Zope server so that Plone could publish stuff to App Engine. All sorts of things are possible with remote_api; however, you need to deal with imports like anything else in Python, except that the App Engine libraries are not installed in site-packages.
Related
I have a client for whom I have created a program that utilizes a variety of data and machine learning packages. The client would like the program to be easily run without installing any kind of Python environment. Is this possible?
I am assuming the best bet would be to transform the .py file into a .exe file but am unsure of how to do this if I have packages that need to be installed before the program can be run.
Are there websites that exist that allow you to easily host complex .py files on them to be run by anyone that accesses the URL?
I think you are looking for "freezing", which packages everything, including the interpreter, libraries and packages, into a single executable file.
There are several tools for this purpose:
https://wiki.python.org/moin/Freeze
https://docs.python-guide.org/shipping/freezing/
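For example, a minimal sketch using PyInstaller, one of the freezing tools listed on those pages (assumes pip install pyinstaller; the entry-script name is a placeholder):

import PyInstaller.__main__

PyInstaller.__main__.run([
    '--onefile',   # bundle the interpreter, libs and packages into one file
    'main.py',     # placeholder for the client-facing entry script
])

The bundled executable ends up in the dist/ directory and can be handed to the client as-is.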
I think using Colaboratory, the cloud service provided by Google, might be better. Your client, who would have to sign up for a Google account, can not only run the Python program but also utilize any major Python package in the cloud (and it's possible to install the necessary packages into the client's cloud space), all without setting up a Python environment on the client's local PC. What's more, it's free!
Need advice on how to incorporate Python into an Azure ASP.NET web application environment. Please excuse this question, but I am new to Azure and I'm not clear on how to proceed. Every option that I look into looks promising, but they all seem to have their own issues. Below is a more thorough explanation, but the deal is that I have an Azure account with all kinds of goodies, a full-fledged ASP.NET (C#) web app running via App Service, I am new to Azure (but not Python), and I'm hoping to add Python functionality to this whole setup. In short:
I want to add Python to this setup mainly to run scheduled jobs and also to trigger Python code from ASP.NET web form submissions
ideally I want a solution that resembles a non-cloud setup. I know this sounds silly but I'm finding the cloud/Azure functionality to be nuanced and not straightforward. I want a place to put a bunch of Python scripts, run, edit, schedule and trigger them from ASP.NET
for example: I created a WebJob that runs manually, and from the documentation it wasn't clear how it should be called. I eventually figured out that you need to POST with Basic Auth (using the credentials provided); see the sketch after this list
Also, Azure CMD does NOT like files with underscores in their names! You cannot submit a WebJob with a .py file whose name contains an underscore, nor can you write output to a file with an underscore
Also, I don't see an option for this WebJob to run Python 3.6.4 (which I installed via an extension). Right now it is using 2.7.15...
Also, CRON expressions in Azure have six fields (the first is seconds), not the usual five plus a command. Again, more weird stuff to worry about
I tried these instructions, but the updates to the web page's Web.config file break the ASP.NET web pages
ideally the most cost effective option
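For anyone hitting the same wall, here is a hedged sketch of the kind of POST that triggers a manual WebJob (the Kudu endpoint is from the Azure docs; the app name and deployment credentials are placeholders):

import requests  # assumes the requests package is installed

resp = requests.post(
    'https://<app-name>.scm.azurewebsites.net/api/triggeredwebjobs/TestPython/run',
    auth=('$<app-name>', '<deployment-password>'),  # Basic Auth from the publish profile
)
resp.raise_for_status()

# Scheduled WebJobs take the six-field CRON expression (seconds first) in a
# settings.job file, e.g. {"schedule": "0 0 3 * * *"} for 3:00 AM daily.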
Any info is greatly appreciated
MORE DETAILED EXPLANATION
Currently I have an ASP.NET site running via Azure App Service, and I would like to add Python scripts and possibly Flask/REST functionality. Note that I am not expecting to serve any content via Python; I will largely be running Python scripts either on a scheduled basis or calling them from ASP.NET. As a matter of fact, and this is an important point, I'm hoping to have ASP.NET trigger/run a Python script when a web form is submitted. I realize that I could get a similar effect if I make a web call to a REST API that is running Python. In any event, I can't tell if I should:
add a Python extension to the current App Service running the web page (I tried this) OR
I did install Python 3.6.4 and some packages via pip
These instructions were useful; however, the updates to the web page's Web.config file break the ASP.NET web pages
set up a VM that will have all of the Python code (but how can I have the .NET web page(s) call the Python in the VM?) OR
use Azure Functions (I'm completely new to this and must admit that I prefer my old-school Python environment, although I see the benefit of using functions. But how do you deal with logging and debugging?) OR
what about a custom Windows container (Docker)?
This requires installing VS Code and that is OK but I'm looking for a solution that another user can get into with as few interruptions as possible
The idea is to ramp up the use of Python although, like I said, I don't expect Python to serve any of the web content; it will run in the background and handle scheduled jobs. What is the most robust and, hopefully, easiest way to add Python functionality to Azure (most importantly, in a way that lets me trigger/use Python from an App Service running .NET)? I've searched online and Stack Overflow so far with interesting finds but nothing to my liking.
For example, the following link discusses how to schedule WebJobs: How to schedule python web jobs on azure. I just created a manual one, and when I called the webhook I got the message: "No route registered for '/api/triggeredwebjobs/TestPython/run'"
The Docker method looks very promising; however, I'm looking for a simple solution, as there is another person who will be involved in all of this and he's busy with other projects
Thank you very much!
I found a solution, though I'm open to more info. Like I mentioned in my post, I used the 'add extension' tool to add Python 3.6.4 to my App Service (installed in D:\home\python364x64).
Then I installed a bunch of packages via pip; these went into D:\home\python364x64\Lib\site-packages.
I created a Python folder in webpages\Python where I put my scripts.
Finally, in ASP.NET I used the Diagnostics.Process call to run my code in ~\webpages\Python\somecode_2.py
The main issue is that Azure came with Python 2.7.15 installed, and for some reason, when my Python code got executed, it was using 3.4 (where that version came from beats me). So for each script I had to create a _2.py wrapper that simply does the following to call the original script via Python 3.6.4. It looks a little nasty, but it works. So, like I said, I would welcome more info on better ways to do this...
import os

args = "..."  # fill in whatever arguments the original script expects
os.system(r"D:\home\python364x64\python.exe SomePython.py {0}".format(args))
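One possibly cleaner variant, assuming the same extension install path, is subprocess, which avoids shell quoting issues and raises an error if the script fails:

import subprocess
import sys

# Forward this wrapper's own arguments to the original script under 3.6.4.
subprocess.check_call(
    [r"D:\home\python364x64\python.exe", "SomePython.py"] + sys.argv[1:])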
I have a server which executes Python scripts from a certain directory path. Incidentally, this path is a checkout of the SVN trunk version of the scripts. However, I get the feeling that this isn't the right way to provide and update scripts for a server.
Do you suggest other approaches (compile, copy, package, Ant, etc.)?
In the end a web server will execute some Python script with parameters. How do I do the update process?
Also, I have trouble deciding how best to handle updated versions that should only apply to new projects on the server. That is, if I update the Python scripts, only newly created web jobs should know how to handle that. Should I deliver to one of many directories that keep track of versions and have the server pick the right one?
EDIT: The web server is basically an interface that runs some data analysis. The analysis is done by the actual scripts, which take some parameters and munge data. I don't really change the web interface; I only need to update the data scripts stored on the web server. Indeed, in some more advanced version the web server should also pick the right version of my data scripts. However, at the moment I have no idea which would be the easiest way.
The canonical way of distributing Python code/functionality is by using a PyPI-compliant package index.
A list of available PyPI implementations on python.org:
http://wiki.python.org/moin/PyPiImplementations
Instructions on setting up and using EggBasket:
http://chrisarndt.de/projects/eggbasket/#installation
Instructions on installing ChiShop:
http://justcramer.com/2011/04/04/setting-up-your-own-pypi-server/
Note that for this to work you need to distribute your code as "Eggs"; you can find out how to do this here: http://peak.telecommunity.com/DevCenter/setuptools
A great blog post on the usage of eggs and the different parts in packaging: http://mxm-mad-science.blogspot.com/2008/02/python-eggs-simple-introduction.html
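A minimal setup.py sketch for the egg route (the package name is a placeholder; see the setuptools link above for details):

from setuptools import setup, find_packages

setup(
    name='data-scripts',       # placeholder name for your analysis scripts
    version='1.0.0',           # bump this for every delivery
    packages=find_packages(),  # pick up all packages under this directory
)

Building with python setup.py bdist_egg produces a versioned egg in dist/, which also gives you the per-version delivery you asked about: the server can pin each job to the egg version it was created with.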
I have a Django site that needs to be rebuilt every night. I would like to check out the code from the Git repo and then do the stuff like setting up the virtual environment, downloading the packages, etc. This would have no manual intervention, as it would be run from cron.
I'm really confused as to what to use for this. Should I write a Python script or a shell script? Are there any tools that assist in this?
Thanks.
So what I'm looking for is CI, and from what I've seen I'll probably end up using Jenkins or Buildbot for it. I've found the docs to be rather cryptic for someone who's never attempted anything like this before.
Do all CI tools like Buildbot/Jenkins simply run tests and more tests and send you reports, or do they actually set up a working Django environment that you can access through your browser?
You'll need to create some sort of build script that does everything but the Git checkout. I've never used any Python build tools, but perhaps something like SCons would work: http://www.scons.org/.
Once you've created a script you can use Jenkins to schedule a nightly build and report success/failure: http://jenkins-ci.org/. Jenkins will know how to checkout your code and then you can have it run your script.
There are literally hundreds of different tools to do this. You can write Python scripts to be run from cron, you can write shell scripts, or you can use one of the many build tools.
Most Python/Django shops would likely recommend Fabric. It really is a matter of running through everything that needs to be done, making sure you understand it, and scripting it. Do you need to run a test suite before you deploy to ensure it doesn't break everything? Do you need to run South database migrations? You need to think about what has to be done, and then you just write a Fabric script to do those things.
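A hedged fabfile.py sketch of that kind of script (Fabric 1.x API; the steps and the project path are illustrative, so adjust them to your project):

from fabric.api import lcd, local

def rebuild():
    with lcd('/srv/mysite'):                      # placeholder project path
        local('git pull')                         # refresh the checkout
        local('pip install -r requirements.txt')  # update dependencies
        local('python manage.py migrate')         # run South migrations
        local('python manage.py test')            # fail loudly if tests break

Cron can then run 'fab rebuild' nightly, and a CI server like Jenkins can call the same task and mail you the result.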
None of this even touches the fact that overall what you're asking for is continuous integration which itself has a whole slew of tools to help manage that.
What you are asking for is Continuous Integration.
There are many CI tools out there, but in the end it boils down to your personal preferences (like always, hopefully) and which one just works for you.
The Django project itself uses buildbot.
If you were to ask me, I would recommend continuous.io, which works out of the box with Django applications.
You can manually set how often you would like your Django project to be built, which is great.
You can, of course, write a shell script that rebuilds your Django project via cron, but you deserve better than that.
Is it possible to run Django without shell access? My host supports the following for 5€/month:
python (I assume via mod_python)
mysql
There is no shell or cron job support; that costs an additional 10€/month, which I'm trying to avoid.
I know that Google App Engine also works without shell access, but I assume that is possible because of its special configuration.
It's possible but not desirable. Having shell access makes it possible to centralise things properly using symlinks.
Getting a better host would be my first suggestion. WebFaction is the most recommended shared host for use with Django.
If that's out of your price range, there are plenty of hosts that give you a proper system account (vs. just an FTP account) and have mod_python or mod_wsgi (preferred now).
Google App Engine works without a shell because its system looks for a dispatcher script that you have to write to an exact specification.
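For illustration, a minimal dispatcher in the old webapp style looks roughly like this (the handler and message are placeholders; app.yaml maps URLs to this script):

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.out.write('Hello from the dispatcher')

application = webapp.WSGIApplication([('/', MainPage)], debug=True)
run_wsgi_app(application)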
It is possible.
Usually you will develop your application locally (where shell access is nice to have) and publish your work to your server. All you need for this is FTP access and some way to import a dump of your development database (hosts often provide an installation of phpMyAdmin for this).
python (I assume via mod_python)
From my experience, you are most certainly wrong with that assumption. Many low-cost providers claim to support Python but in fact provide only an outdated version that can be used with CGI scripts. That setup will give pretty poor performance for Django apps.