I am a Python programmer, but new to web services.
Task:
I have a TYPO3 frontend and a PostgreSQL database. I want to write a backend in Python that sits between these two parts. Another developer gave me a WSDL file and an XSD file to work with, so we are using SOAP. The program I write should be bound to a TCP/IP port and act as a service. The data/payload will be encoded as JSON objects.
Webclient <---> Frontend <---> Backend(Me) <---> Database
My ideas:
I code all the functions by hand from the WSDL file, using the data types from the XSD.
I bind a service to a port that receives incoming JSON data.
I parse the incoming data, do some database operations, and do other stuff.
I return the result to the frontend.
Questions:
Do I have to code all the methods/functions described in the WSDL file by hand?
Do I have to define the complex data types by hand?
How should I implement the communication between the frontend and the backend?
Thanks in advance!
Steffen
I have successfully used the suds client to communicate with Microsoft Dynamics NAV (formerly Navision).
A typical session looks like this:
from suds.client import Client
url = 'http://localhost:7080/webservices/WebServiceTestBean?wsdl'
client = Client(url)
By issuing print client you get the list of types and operations supported by the service.
Suds - version: 0.3.3 build: (beta) R397-20081121

Service (WebServiceTestBeanService) tns="http://test.server.enterprise.rhq.org/"
   Prefixes (1):
      ns0 = "http://test.server.enterprise.rhq.org/"
   Ports (1):
      (Soap)
         Methods:
            addPerson(Person person, )
            echo(xs:string arg0, )
            getList(xs:string str, xs:int length, )
            getPercentBodyFat(xs:string name, xs:int height, xs:int weight)
            getPersonByName(Name name, )
            hello()
            testExceptions()
            testListArg(xs:string[] list, )
            testVoid()
            updatePerson(AnotherPerson person, name name, )
   Types (23):
      Person
      Name
      Phone
      AnotherPerson
WSDL operations are exposed as ordinary Python functions, and you can use ordinary dicts in place of WSDL types.
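For example, a minimal sketch against the service listed above (the fields on Person/Name are illustrative guesses, not taken from the real XSD):

person = client.factory.create('Person')    # build a typed object from the WSDL
person.name = client.factory.create('Name')
person.name.first = 'John'
person.name.last = 'Doe'
client.service.addPerson(person)

# A plain dict usually works in place of the WSDL type as well:
client.service.addPerson({'name': {'first': 'John', 'last': 'Doe'}})

print client.service.echo('hello')           # simple operation with a string argument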
I would go with Twisted since I am working with it anyway and enjoy the system.
Another asynchronous option may be Tornado.
Or a synchronous version with Flask.
I'm sure there are many other options. I would look for a higher level framework like those listed above so you don't have to spend too much time connecting the frontend to the backend.
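To give an idea of the synchronous Flask option, here is a minimal sketch; the endpoint name and payload fields are made up for illustration:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/api/person', methods=['POST'])
def add_person():
    payload = request.get_json()             # parse the incoming JSON body
    # ... validate the payload and run your database operations here ...
    return jsonify({'status': 'ok', 'received': payload})

if __name__ == '__main__':
    app.run(port=8000)                        # bind the service to a TCP port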
You can use Python libraries such as SOAPpy or PyWS.
I have this web app where I upload files and do some stuff with them.
There's been a request to connect this app with another site and upload files (upon request) from the app to the site. This site provides an API and, with it, Python code to interact with it. Since I am one year old in programming years I don't quite get the API concept. All I know so far is that it's something in the middle that handles requests.
Anyway,
The code provided by the site has a class with several methods on it, like so:
class ApiClient(object):
    def method1(self, param1):
        # ... implementation provided by the site ...
        pass
Since I haven't worked with external APIs before, I don't know how to handle it and have some questions.
1. Regarding the class ApiClient(object): I get that they provide Python code, so the class is an object, but how do I handle it? Do I make it a models.Model?
2. In addition to my first question, do I store the information for this in the database? So do I need fields?
Thanks in advance!
"API' means "Application Programming Interface", which actually means a whole lot of mostly unrelated stuffs... But when it comes to web, "API" usually means "something you can interact with by sending HTTP requests".
Sometimes you'll only have the description of the endpoint urls, what's expected in the request and what's supposed to be returned in the response, and you have to write all the client code by yourself (using urllib or - better - requests), but sometimes someone (eventually the author of the API) also provides a client library for your language of choice, so it's just a matter of using this client library.
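For the "write the client yourself" case, a minimal sketch with requests might look like this (the URL, endpoint and auth header are made up for illustration; your API's documentation defines the real ones):

import requests

response = requests.get('https://api.example.com/items/42',
                        headers={'Authorization': 'Token YOUR_API_KEY'})
response.raise_for_status()   # fail loudly on HTTP errors
item = response.json()        # parse the JSON body into a dict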
In your case it seems you already have the client, so it's just a matter of using it, ie:
from yourapi import ApiClient
client = ApiClient(<probably some API key needed here>)
result = client.fetch_something(some, args)
What methods are available and what they return is of course totally specific to this API and should be documented somewhere.
What you do with results is totally up to you and depends on what your project's requirements are so no one can answer this question.
from django.db import models
import requests

REMOTE_API_URL = "https://google.com/blabla"

class ApiClient(models.Model):
    file = models.CharField(max_length=255)

    def save(self, *args, **kwargs):
        # Push the file reference to the remote API before saving locally.
        post_data = {'remote_api_file_field': self.file}
        requests.post(REMOTE_API_URL, data=post_data)
        super(ApiClient, self).save(*args, **kwargs)
Also see: https://docs.djangoproject.com/en/1.10/ref/models/instances/#customizing-model-loading
I am fighting with Tornado and the official Python oauth2client, gcloud... modules.
These modules accept an alternate HTTP client passed with http=, as long as it has a request method that any of these libraries can call whenever an HTTP request must be sent to Google and/or the access tokens must be renewed using the refresh tokens.
I have created a simple class which has a self.client = AsyncHTTPClient().
Then, in its request method, it returns self.client.fetch(...).
My goal is to be able to yield any of these libraries' calls, so that Tornado will execute them asynchronously.
The thing is that they are highly dependent on what the default client (httplib2.Http()) returns: (response, content).
I am really stuck and cannot find a clean way of making this async.
If anyone already found a way, please help.
Thank you in advance
These libraries do not support asynchronous operation, and porting them is not always easy.
oauth2client
Depending on what you want to do maybe Tornado's GoogleOAuth2Mixin or tornado-alf will be enough.
gcloud
I am not aware of any Tornado/asyncio implementation of gcloud-python, so you could:
write it yourself. Again, it's not a simple transport change of Connection.http or request; all the surrounding code must be able to use/yield futures/coroutines.
wrap it in a ThreadPoolExecutor (as #Apero mentioned). This is a high-level API, so any nested API calls within that yield will be executed in the same thread (not using the pool). It could work well (see the sketch after this list).
run it as an external app (with ProcessPoolExecutor or Popen).
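A minimal sketch of the ThreadPoolExecutor option, assuming Tornado 4.x (which can yield concurrent.futures.Future objects directly); fetch_from_gcloud is a made-up helper standing in for whatever blocking gcloud/oauth2client work you need, and the storage calls inside it are just an example:

from concurrent.futures import ThreadPoolExecutor

from tornado import gen

executor = ThreadPoolExecutor(max_workers=4)

def fetch_from_gcloud():
    # Hypothetical blocking helper: put your gcloud / oauth2client calls here.
    # They run on a worker thread, so the httplib2-based transport is fine.
    from gcloud import storage
    client = storage.Client()
    return list(client.list_buckets())

@gen.coroutine
def fetch_async():
    # executor.submit() returns a concurrent.futures.Future, which Tornado's
    # coroutines can yield directly; the IOLoop stays free meanwhile.
    result = yield executor.submit(fetch_from_gcloud)
    raise gen.Return(result)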
When I had a similar problem with AWS a couple of years ago, I ended up executing a CLI asynchronously (Tornado + subprocess.Popen + some CLI (awscli, or boto based)) and handling simple cases (like S3 and basic EC2 operations) with a plain AsyncHTTPClient.
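For the external-process route, a rough sketch with tornado.process.Subprocess (assuming Tornado 4.2+ for wait_for_exit; the aws command in the usage line is just an example):

from tornado import gen
from tornado.process import Subprocess

@gen.coroutine
def run_cli(cmd):
    # Spawn the CLI without blocking the IOLoop and collect its output.
    proc = Subprocess(cmd, stdout=Subprocess.STREAM, stderr=Subprocess.STREAM)
    output = yield proc.stdout.read_until_close()
    yield proc.wait_for_exit()
    raise gen.Return(output)

# usage inside a coroutine: result = yield run_cli(['aws', 's3', 'ls'])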
I am trying to add authentication to an XML-RPC server (which will be running on nodes of a P2P network) without using user:password@host, as this would reveal the password to all attackers. The authentication is basically meant to create a private network, preventing unauthorised users from accessing it.
My solution to this was to create a challenge response system very similar to this but I have no clue how to add this to the xmlrpc server code.
I found a similar question (Where custom authentication was needed) here.
So I tried creating a module that would be called whenever a client connected to the server. This would connect to a challenge-response server running on the client and, if the client responded correctly, would return True. The only problem was that I could only call the module once, and then I got a "reactor cannot be restarted" error. So is there some way of having a class where, whenever its check() function is called, it will connect and do this?
Would the simplest thing to do be to connect using SSL? Would that protect the password? Although this solution would not be optimal as I am trying to avoid having to generate SSL certificates for all the nodes.
Don't invent your own authentication scheme. There are plenty of great schemes already, and you don't want to become responsible for doing the security research into what vulnerabilities exist in your invention.
There are two very widely supported authentication mechanisms for HTTP (over which XML-RPC runs, therefore they apply to XML-RPC). One is "Basic" and the other is "Digest". "Basic" is fine if you decide to run over SSL. Digest is more appropriate if you really can't use SSL.
Both are supported by Twisted Web via twisted.web.guard.HTTPAuthSessionWrapper, with copious documentation.
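As a rough sketch (following the pattern from the Twisted guard documentation; the XML-RPC resource, realm name and in-memory checker are purely illustrative), wrapping an XML-RPC resource behind Digest auth looks roughly like this:

from zope.interface import implementer

from twisted.cred.checkers import InMemoryUsernamePasswordDatabaseDontUse
from twisted.cred.portal import IRealm, Portal
from twisted.internet import reactor
from twisted.web import guard, resource, server
from twisted.web.xmlrpc import XMLRPC

class Api(XMLRPC):
    # Your real XML-RPC methods go here.
    def xmlrpc_echo(self, x):
        return x

@implementer(IRealm)
class ApiRealm(object):
    def requestAvatar(self, avatarId, mind, *interfaces):
        # Hand every authenticated user the same XML-RPC resource.
        if resource.IResource in interfaces:
            return resource.IResource, Api(), lambda: None
        raise NotImplementedError()

# Demo-only credentials checker; use a real ICredentialsChecker in production.
checker = InMemoryUsernamePasswordDatabaseDontUse(alice='secret')
portal = Portal(ApiRealm(), [checker])
credential_factory = guard.DigestCredentialFactory('md5', 'my realm')
root = guard.HTTPAuthSessionWrapper(portal, [credential_factory])

reactor.listenTCP(8080, server.Site(root))
reactor.run()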
Based on your problem description, it sounds like the Secure Remote Password Protocol might be what you're looking for. It's a password-based mechanism that provides strong, mutual authentication without the complexity of SSL certificate management. It may not be quite as flexible as SSL certificates but it's easy to use and understand (the full protocol description fits on a single page). I've often found it a useful tool for situations where a trusted third party (aka Kerberos/CA authorities) isn't appropriate.
For anyone who was looking for a full example, below is mine (thanks to Rakis for pointing me in the right direction). In this, the username and password are stored in a file called 'passwd' (see the first useful link for more details and how to change it).
Server:
#!/usr/bin/env python
import bjsonrpc
from SRPSocket import SRPSocket
import SocketServer
from bjsonrpc.handlers import BaseHandler
import time

class handler(BaseHandler):
    def time(self):
        return time.time()

class SecureServer(SRPSocket.SRPHost):
    def auth_socket(self, socket):
        server = bjsonrpc.server.Server(socket, handler_factory=handler)
        server.serve()

s = SocketServer.ForkingTCPServer(('', 1337), SecureServer)
s.serve_forever()
Client:
#!/usr/bin/env python
import bjsonrpc
from bjsonrpc.handlers import BaseHandler
from SRPSocket import SRPSocket
import time

class handler(BaseHandler):
    def time(self):
        return time.time()

socket, key = SRPSocket.SRPSocket('localhost', 1337, 'dht', 'testpass')
connection = bjsonrpc.connection.Connection(socket, handler_factory=handler)
test = connection.call.time()
print test
time.sleep(1)
Some useful links:
http://members.tripod.com/professor_tom/archives/srpsocket.html
http://packages.python.org/bjsonrpc/tutorial1/index.html
I'm building a Django app with an API built on Piston. For the sake of keeping everything as DRY as possible and the API complete, I'd like my internal applications to call the API rather than the models (kind of a proxy-view-controller a la https://github.com/raganwald/homoiconic/blob/master/2010/10/vc_without_m.md but all on one django install for now). So the basic setup is:
Model -> API -> Application -> User Client
I can overload some core Piston classes to create an internal client interface for the application, but I'm wondering if I could just use the Django Test Client to accomplish the same thing. So to create an article, rather than calling the model I would run:
from django.test.client import Client

c = Client()
article = c.post('/api/articles', {
    'title': 'My Title',
    'content': 'My Content'
})
Is there a reason I shouldn't use the test client to do this? (performance, for instance) Is there a better tool that's more tailored for this specific purpose?
After reviewing the code for TestClient, it doesn't appear to have any additional overhead related to testing. Rather, it just functions as a basic client for internal requests. I'll be using the test client as the internal client, and using Piston's DjangoEmitter to get model objects back from the API.
Only testing will tell whether the internal request mechanism is too much of a performance hit.
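For reference, consuming the internal response might look like this (assuming the handler emits JSON; the URL and fields mirror the example above, and the 201 status is what Piston's create handlers typically return):

import json
from django.test.client import Client

c = Client()
response = c.post('/api/articles', {'title': 'My Title', 'content': 'My Content'})
if response.status_code in (200, 201):
    article = json.loads(response.content)   # the API's JSON payload as a dict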
First of all, I will admit I am a novice to web services, although I'm familiar with HTML and basic web stuff. I created a quick-and-dirty web service using Python that calls a stored procedure in a MySQL database, that simply returns a BIGINT value. I want to return this value in the web service, and I want to generate a WSDL that I can give our web developers. I might add that the stored procedure only returns one value.
Here's some example code:
#!/usr/bin/python
import SOAPpy
import MySQLdb

def getNEXTVAL():
    cursor = db.cursor()
    cursor.execute("CALL my_stored_procedure()")  # Returns a number
    result = cursor.fetchall()
    for record in result:
        return record[0]

db = MySQLdb.connect(host="localhost", user="myuser", passwd="********", db="testing")
server = SOAPpy.SOAPServer(("10.1.22.29", 8080))
server.registerFunction(getNEXTVAL)
server.serve_forever()
I want to generate a WSDL that I can give to the web folks, and I'm wondering if it's possible to have SOAPpy just generate one for me. Is this possible?
When I tried to write a Python web service last year, I ended up using ZSI-2.0 (which is something like the heir of SOAPpy) and a paper available on its website.
Basically I wrote my WSDL file by hand and then used ZSI stuff to generate stubs for my client and server code. I wouldn't describe the experience as pleasant, but the application did work.
I want to generate a WSDL that I can give to the web folks, ....
You can try soaplib. It has on-demand WSDL generation.
Sorry for the question a few days ago. Now I can invoke the server successfully. A demo is provided:
from SOAPpy import SOAPProxy

def test_soappy():
    """Test for SOAPpy.SOAPServer."""
    # okay
    # it's good for SOAPpy.SOAPServer.
    # in a method, it can have more than 2 ws servers.
    server = SOAPProxy("http://localhost:8081/")
    print server.sum(1, 2)
    print server.div(10, 2)