I need to get data (JSON) in my HTML page with the help of Ajax. I have a Node.js server serving requests.
I have to get the JSON from the server, but the processing that produces the JSON is Python code.
So should I save the JSON in a DB and access it from there? (Seems complicated for a single use.)
Should I run a Python server to serve the requests with JSON as the result (calling it directly from the HTML via Ajax)?
Should I serve the requests with Node.js alone, by calling the Python method from Node.js? If so, how do I call the Python method?
If calling Python requires running a server, which one is preferable (zerorpc, or some kind of web framework)?
Which is the best solution? Or which is preferred over the others, in what scenario, and based on what factors?
As I understand it:
You have a server running:
an app that acts as a web server, made with Node.js;
another app in Python that does not expose an HTTP API, but which you would like to interact with in some way.
You have control over the source code of both.
When running big systems with parts written in different languages, there are many ways to make them all work together.
The choice is yours, and you give us little information to help you decide which way is better. It all depends on how big your Python and Node.js apps are and what they do.
There is no magical way to call a Python method from Node.js, or vice versa; the usual options are:
Spawn persistent processes and make them communicate over sockets or pipes. Generally this is done with enabling libraries (like ZeroMQ plus a serialization/RPC format).
Spawn persistent processes and make them communicate through a message queue in between (RabbitMQ, ActiveMQ, ...).
Spawn your Node.js web server, and use the "child_process" module to spawn a Python process and interact with it through its standard input/output. There are helper libraries for this, e.g. https://www.npmjs.com/package/python-shell (a minimal sketch of the Python side follows this list).
Just spawn the Node.js web server and execute Python scripts with "child_process" when you need them (no persistent Python process).
Rewrite it all in Python.
Rewrite it all in Node.js.
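For the "child_process" / python-shell options above, the Python side can stay very small. A minimal sketch, assuming a line-based JSON protocol; the file name worker.py and the echo payload are made up:

    import sys
    import json

    # worker.py: read one JSON request per line on stdin and write one JSON
    # response per line on stdout -- the kind of line-based protocol that
    # Node's child_process (or python-shell's json mode) can talk to.
    for line in sys.stdin:
        request = json.loads(line)
        result = {"status": "ok", "echo": request}  # stand-in for real processing
        sys.stdout.write(json.dumps(result) + "\n")
        sys.stdout.flush()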
In your application, if you need to process the results of the Python server's responses inside your Node.js application, then call the Python server from the Node.js app with a request library and process the result there. Otherwise, simply call the Python server's resources directly from client-side Ajax requests.
Thanks
Related
I've made a small Python script to scrape the web. I would like to make a nice and simple web interface where the user can enter data to search for and have the results displayed as a list.
I understand that there are many different ways to do that, but I don't know which one would be best in my case.
I would like something:
Really simple and light
Running locally with as few dependencies as possible.
So far I've been thinking about:
A Node.js server displaying content and executing the script
A web framework in Python (web.py, Flask, Django...?)
A local web server (XAMPP) and CGI
Please note that I don't know much about web development in Python, but I'm somewhat used to Node.js.
What would you recommend?
Thanks, Victor
Personally I prefer gevent, Bottle or Flask, and some front-end framework like Bootstrap or Framework7.
Gevent easily makes it asynchronous and has WebSockets built right in, and Bottle is the easiest (and fastest) way to build a web app or API.
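As a rough illustration of how small this can stay, here is a Bottle sketch; run_scraper is a stand-in for whatever function your existing script exposes:

    from bottle import Bottle, request

    app = Bottle()

    @app.route("/search")
    def search():
        query = request.query.q              # e.g. /search?q=python
        results = run_scraper(query)         # placeholder for your scraping function
        return {"query": query, "results": results}  # Bottle serializes dicts to JSON

    if __name__ == "__main__":
        app.run(host="localhost", port=8080)

A plain HTML page with a form (or a bit of Bootstrap) can then call that one route and render the result list.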
Socket.io makes it very easy to send data between websites and scripts.
The website connects to the Socket.io server, and inside the server the Python script can be executed.
I've been trying websocket-client and socketio-client with no luck so far. The broad picture of what I want to accomplish is this:
Currently, I have a Flask REST API that has both a web front-end and a command-line interface, and it handles several different sets of file uploads/downloads. Both communicate with the server using HTTP requests: the web front-end through jQuery AJAX and the CLI through Python requests. I would like to switch to using sockets so that database changes made from one client appear on all of them. I have been able to get Flask-SocketIO working between my jQuery code and the Flask server, but I'm struggling to get any client library working from the CLI portion. Is there an easy-to-use Python socket library similar to requests that I should be using for this transition, or am I going in a totally wrong direction with making this switch?
Another option, though I'm unsure of its viability, would be to keep the REST API for the CLI and have sockets for the web interface. That sounds very messy, though.
After doing a lot of searching and messing around with various libraries, the one that was easiest to get up and running for connecting a command-line tool with a Flask-SocketIO web app was socketIO-client.
This repository came in handy for the issues where I was struggling to understand how to correctly use the waits to receive info on the client side.
Once I've finished the project in a few weeks, I will come back and add more details so people finding this in the future can have an easier time getting this set up.
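For reference, a bare-bones socketIO-client sketch of the kind of CLI-side code involved; the host, port and event names are made up and need to match what your Flask-SocketIO server actually uses:

    from socketIO_client import SocketIO, LoggingNamespace

    def on_db_update(*args):
        print('server reports a database change:', args)

    with SocketIO('localhost', 5000, LoggingNamespace) as socketIO:
        socketIO.on('db_update', on_db_update)
        socketIO.emit('cli_request', {'action': 'list_uploads'})
        socketIO.wait(seconds=5)   # block briefly to receive the server's events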
While I have quite a lot of Python experience, I've never used it for the web, and I have a vast amount of web experience with PHP.
Now I want to create a simple Python script (let's call it service.py) that runs on example.com. I installed mod_wsgi as suggested by the docs; my web server is Apache 2.2, and mod_wsgi is loaded successfully.
How do I configure my web server/mod_wsgi so that requests coming to example.com/service are processed by service.py?
Then how do I access the request params (like $_GET, $_POST, $_FILES) from the Python script?
With mod_wsgi, you configure which URLs are served by setting WSGIScriptAlias. Your script, though, needs to be an actual WSGI application, which exposes an application callable that the server invokes.
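For example, a minimal sketch of what service.py could look like as a raw WSGI application (the Apache directive, paths and parameter names are illustrative):

    # Apache config (illustrative): WSGIScriptAlias /service /var/www/example.com/service.py
    from urllib.parse import parse_qs   # on Python 2: from urlparse import parse_qs

    def application(environ, start_response):
        # environ['QUERY_STRING'] plays the role of PHP's $_GET
        params = parse_qs(environ.get('QUERY_STRING', ''))
        name = params.get('name', ['world'])[0]
        body = ('Hello, %s' % name).encode('utf-8')
        start_response('200 OK', [('Content-Type', 'text/plain'),
                                  ('Content-Length', str(len(body)))])
        return [body]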
I suspect it'd be easier to configure your script as simple CGI. You can then use the cgi module from the standard library to access your request params (note, though, that the examples you give are PHP-specific: they're accessed differently in Python, depending on the specific framework).
Another alternative would be to use a mini-framework like Flask, which would encapsulate all this and give you a simple interface to use in your service.py script.
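A rough Flask version of the same service.py, just to show how it hides the WSGI plumbing (the route and parameter names are illustrative):

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route('/service', methods=['GET', 'POST'])
    def service():
        # request.args ~ $_GET, request.form ~ $_POST, request.files ~ $_FILES
        name = request.values.get('name', 'world')
        return jsonify(message='Hello, %s' % name)

    if __name__ == '__main__':
        app.run()

Under mod_wsgi you would point WSGIScriptAlias at a small wrapper that imports this app object as the application callable.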
Hi, I want to deploy a Matlab application on the web using Python. Is there a way to do it? I have converted my application into jar files (Java classes) as per the documentation on the MathWorks site. Can someone point me in the right direction to go ahead?
The fact that your Matlab code is packaged up as Jars may not help that much here, at least not with pure Python.
There are a few ways you can take code written in Java and expose it to Python.
Jython
If you are willing to give Jython a shot, this may be a really easy way to provide a Django interface to your jars.
Basically you'll get to write a normal Django App and also use Jython to work natively with your Jars. This could be the best of both worlds assuming you aren't tied to CPython.
Django-Jython
Java Compatibility Interfaces
On CPython, either of the following projects will help you work with the code in your jar files:
JCC: Create a Python extension module that wraps your Jar file
JPype: Provides an API for running the JVM and calling into code running in that JVM from Python.
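As a very rough sketch of the JPype route (assuming a recent JPype; the jar and class names are made up, so substitute the ones MATLAB Builder JA generated for you):

    import jpype

    jpype.startJVM(classpath=['mymatlabapp.jar'])
    MyModel = jpype.JClass('com.mycompany.mymatlabapp.MyModel')
    model = MyModel()
    result = model.compute(42)   # call into the Java wrapper around your Matlab code
    print(result)
    jpype.shutdownJVM()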
Separate Process:
If you have a standalone program written in Matlab (really, in any language), you could execute it as a child process of your Django application. You'd build a simple web form in Django that allows the user to submit values as inputs to this process, and then in your view (after validating the form) you'd do something like:
    import subprocess

    command = "mymatlabprogram.exe %s" % (arg1,)
    # Capture the output streams so communicate() returns them
    process = subprocess.Popen(command.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = process.communicate()
Assuming that worked, you could pull answers out of stdout or error messages out of stderr. You could serve an image created by that process, etc. Once something like this is working you could look into celeryd to move the subprocess work out of your web app.
The advantage of working with a separate process is that you isolate bugs in your Matlab code from breaking your web application, and vice versa. The disadvantage is that you have to serialize everything and deal with multiple round trips: between the client's browser and your web app, between the web app and the executable, and back to the client.
To give a little background, I'm writing (or am going to write) a daemon in Python for scheduling tasks to run at user-specified dates. The scheduler daemon also needs to have a JSON-based HTTP web service interface (buzzword mania, I know) for adding tasks to the queue and monitoring the scheduler's status. The interface needs to receive requests while the daemon is running, so they either need to run in a separate thread or cooperatively multitask somehow. Ideally the web service interface should run in the same process as the daemon, too.
I could think of a few ways to do it, but I'm wondering if there's some obvious module out there that's specifically tailored for this kind of thing. Any suggestions about what to use, or about the project in general are quite welcome. Thanks! :)
Check out the BaseHTTPServer module -- a basic HTTP server bundled with Python.
http://docs.python.org/library/basehttpserver.html
You can spin up a second thread and have it serve your requests for you very easily (probably < 30 lines of code). And it all runs in the same process and Python interpreter space, so it can access all your objects, etc.
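A minimal sketch of that idea (Python 2 module names, as in the linked docs; the status dict and port are made up):

    import json
    import threading
    from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler  # http.server on Python 3

    scheduler_status = {'queued': 0, 'running': 0}   # illustrative shared state from your daemon

    class StatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(scheduler_status)
            self.send_response(200)
            self.send_header('Content-Type', 'application/json')
            self.end_headers()
            self.wfile.write(body)

    server = HTTPServer(('localhost', 8080), StatusHandler)
    thread = threading.Thread(target=server.serve_forever)
    thread.daemon = True
    thread.start()
    # ... the scheduler's own main loop keeps running in the main thread ...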
I'm not sure I understand your question properly, but take a look at Twisted
I believe all kinds of Python web frameworks would be useful here.
You can pick one like CherryPy, which is small enough to integrate into your system. CherryPy also includes a pure-Python WSGI server for production.
The performance may not be as good as Apache's, but it's already very stable.
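A minimal CherryPy sketch of such a status endpoint (the payload is illustrative):

    import cherrypy

    class SchedulerAPI(object):
        @cherrypy.expose
        @cherrypy.tools.json_out()
        def status(self):
            return {'queued': 0, 'running': 0}   # replace with your daemon's real state

    cherrypy.quickstart(SchedulerAPI(), '/')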
Don't reinvent the wheel!
Run jobs via a cron script, and create a separate web interface using, for example, Django or Tornado.
Connect them via a database. Even SQLite will do the job if you don't want to scale to more machines.
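A rough sketch of the job side of that setup (the path, table and values are made up; the Django/Tornado app would just read the same file):

    # run_job.py -- invoked from cron
    import sqlite3

    conn = sqlite3.connect('/var/lib/myapp/tasks.db')
    conn.execute('CREATE TABLE IF NOT EXISTS results (task TEXT, finished_at TEXT, output TEXT)')
    conn.execute("INSERT INTO results VALUES (?, datetime('now'), ?)",
                 ('nightly-report', 'done'))
    conn.commit()
    conn.close()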