pysimplesoap web service returns connection refused - python

I've created some web services using pysimplesoap, following this documentation:
https://code.google.com/p/pysimplesoap/wiki/SoapServer
When I tested them, I called them like this:
from SOAPpy import SOAPProxy
from SOAPpy import Types
namespace = "http://localhost:8008"
url = "http://localhost:8008"
proxy = SOAPProxy(url, namespace)
response = proxy.dummy(times=5, name="test")
print response
And it worked for all of my web services, but when I try to call them using a library that requires the WSDL to be specified, it returns "Could not connect to host".

To solve my problem, I used the object's ".wsdl()" method to generate the correct WSDL and saved it into a file. The WSDL generated by default wasn't correct: it was missing the variable types and the correct server address.
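Roughly, the idea was something like this (a minimal sketch, assuming dispatcher is the SoapDispatcher instance from the pysimplesoap SoapServer example and that its wsdl() method returns the WSDL document as a string; the file name is arbitrary):
# Sketch: dump the generated WSDL to a file so WSDL-based clients can point at it.
# Assumes 'dispatcher' is the SoapDispatcher from the pysimplesoap SoapServer example.
with open("service.wsdl", "w") as f:
    f.write(dispatcher.wsdl())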

The server name localhost is only meaningful on your own computer; once you step outside it, other machines can't reach your service by that name.
1) Find out your external IP, e.g. with http://www.whatismyip.com/ or another such service. Note that IPs change over time.
2) Plug that IP into http://www.soapclient.com/soaptest.html
If your local service answers requests addressed to the IP as well as to localhost, you're done!
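If it only answers on localhost, one option is to bind the server to all interfaces instead. A minimal sketch, assuming the SoapDispatcher/SOAPHandler setup from the linked wiki page (Python 2 BaseHTTPServer, as in that example):
from BaseHTTPServer import HTTPServer          # Python 2, as in the wiki example
from pysimplesoap.server import SOAPHandler

# Bind to all interfaces ("0.0.0.0") so the server also answers requests addressed
# to the machine's external IP, not just to localhost.
httpd = HTTPServer(("0.0.0.0", 8008), SOAPHandler)
httpd.dispatcher = dispatcher                  # the SoapDispatcher configured earlier
httpd.serve_forever()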

Related

Access Azure EventHub with WebSocket and proxy

I'm trying to access Azure EventHub, but my network requires a proxy and only allows connections over https (port 443).
Based on https://learn.microsoft.com/en-us/python/api/azure-eventhub/azure.eventhub.aio.eventhubproducerclient?view=azure-python
I added the proxy configuration and the TransportType.AmqpOverWebsocket parameter, and my producer looks like this:
from azure.eventhub import TransportType
from azure.eventhub.aio import EventHubProducerClient

# HTTP_PROXY is a dict with my proxy settings (defined elsewhere)
async def run():
    producer = EventHubProducerClient.from_connection_string(
        "Endpoint=sb://my_eh.servicebus.windows.net/;SharedAccessKeyName=eh-sender;SharedAccessKey=MFGf5MX6Mdummykey=",
        eventhub_name="my_eh",
        auth_timeout=180,
        http_proxy=HTTP_PROXY,
        transport_type=TransportType.AmqpOverWebsocket,
    )
and I get an error:
File "/usr/local/lib64/python3.9/site-packages/uamqp/authentication/cbs_auth_async.py", line 74, in create_authenticator_async
raise errors.AMQPConnectionError(
uamqp.errors.AMQPConnectionError: Unable to open authentication session on connection b'EHProducer-a1cc5f12-96a1-4c29-ae54-70aafacd3097'.
Please confirm target hostname exists: b'my_eh.servicebus.windows.net'
I don't know what the issue might be.
Could it be related to this one? https://github.com/Azure/azure-event-hubs-c/issues/50#issuecomment-501437753
You should be able to set up a proxy that the SDK uses to access EventHub. There is a sample that shows how to set the HTTP_PROXY dictionary with the proxy information; behind the scenes, when a proxy is passed in, the connection automatically goes over websockets.
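For reference, the HTTP_PROXY value is a plain dictionary; a minimal sketch of the shape it takes (hostname, port and credentials below are placeholders, not real values):
# Placeholder proxy settings: swap in your own proxy host, port and credentials.
HTTP_PROXY = {
    'proxy_hostname': 'proxy.mycorp.example',  # address of the proxy (placeholder)
    'proxy_port': 3128,                        # port the proxy listens on (placeholder)
    'username': 'proxy_user',                  # optional
    'password': 'proxy_password',              # optional
}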
As #BrunoLucasAzure suggested, checking the ports on the proxy itself would also be worth doing, because based on the error message it looks like the request made it past the proxy but the endpoint can't be resolved.

Python - Using Windows hosts file when using Python Requests / Use predefined IP Address without making a DNS request

I am trying to use Python requests to make an HTTP GET request to a domain, without urllib3/httplib.HTTPConnection performing a DNS lookup for the domain. I set the domain in the Windows hosts file, but Python requests appears to override this, so I need to define the DNS resolution for the domain in the script itself.
I want the script to bypass the DNS request so I can set the IP address myself. In the example below I've set it to 45.22.67.8, and I will change this to my public IP address later.
I tried the 'monkey patching' technique below, but it doesn't work. Requests doesn't generate a DNS request in Wireshark, but it also doesn't connect to the HTTP server.
import socket
import requests
from requests.packages.urllib3.connection import HTTPConnection
socket.getaddrinfo = '45.22.67.8'  # replaces the function with a string, hence the error below
url = "http://www.randomdomain.com"
requests.get(url, timeout=10)
Error
'str' object is not callable
Thanks!
Edit: I've just updated the code in my example. All I want to do is override future HTTP connections so the packets go to a different destination IP.
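For what it's worth, a variant of the monkey-patching idea that keeps getaddrinfo callable looks roughly like this (a sketch; the domain and IP are the placeholders from the question, and since requests resolves names through socket.getaddrinfo, the redirect applies to it too):
import socket
import requests

_real_getaddrinfo = socket.getaddrinfo

def _patched_getaddrinfo(host, port, *args, **kwargs):
    # Redirect the target domain to a fixed IP; everything else resolves normally.
    if host == "www.randomdomain.com":
        host = "45.22.67.8"
    return _real_getaddrinfo(host, port, *args, **kwargs)

socket.getaddrinfo = _patched_getaddrinfo

requests.get("http://www.randomdomain.com", timeout=10)  # connects to 45.22.67.8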

Using Python requests to GET not working - web client and browser works

I have my web app API running.
If I go to http://127.0.0.1:5000/ via any browser I get the right response.
If I use the Advanced REST Client Chrome app and send a GET request to my app at that address I get the right response.
However this gives me a 503:
import requests
response = requests.get('http://127.0.0.1:5000/')
I read that, for some reason, I should try this:
s = requests.Session()
response = s.get('http://127.0.0.1:5000/')
But I still get a 503 response.
Other things I've tried: not prefixing with http://, not using a port in the URL, running on a different port, trying a different API call like POST, etc.
Thanks.
Is http://127.0.0.1:5000/ your localhost? If so, try 'http://localhost:5000' instead
Just in case someone else is struggling with this as well: what finally worked was running the application on my local network IP.
I.e., I just opened up the web app and changed the app.run(debug=True) line to app.run(host="my.ip.address", debug=True).
I'm guessing the requests library was perhaps trying to protect me from a localhost attack? Or our corporate proxy or firewall was preventing communication from unknown apps to the 127 address. I had set NO_PROXY to include the 127.0.0.1 address, so I don't think that was the problem. In the end I'm not really sure why it is working now, but I'm glad that it is.
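For reference, a minimal Flask sketch of that change (the route is a stand-in for the real app; host="0.0.0.0" binds to all interfaces, or you can put your LAN IP there as in the line above):
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "ok"

if __name__ == "__main__":
    # Bind to a LAN IP (or 0.0.0.0 for all interfaces) instead of the default 127.0.0.1
    app.run(host="0.0.0.0", port=5000, debug=True)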

Set port in requests

I'm attempting to make use of cgminer's API using Python. I'm particularly interested in utilizing the requests library.
I understand how to do basic things in requests, but cgminer wants to be a little more specific. I'd like to shrink
import socket
import json
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('localhost', 4028))
sock.send(json.dumps({'command': 'summary'}).encode())  # encoded to bytes so it also runs on Python 3
using requests instead.
How does one specify the port using that library, and how does one send such a json request and await a response to be stored in a variable?
Requests is an HTTP library.
You can specify the port in the URL: http://example.com:4028/....
But from what I can read in a hurry here, cgminer provides an RPC API (or JSON-RPC?), not an HTTP interface.
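So rather than requests, one option is to talk to that API over a plain TCP socket and read the reply back. A minimal sketch (the helper name cgminer_command is mine; it assumes the default API port 4028 and that cgminer ends its reply with a NUL byte):
import json
import socket

def cgminer_command(command, host="localhost", port=4028):
    # Open a raw TCP connection to cgminer's API, send one JSON command,
    # read until the server closes the connection, and parse the JSON reply.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps({"command": command}).encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    raw = b"".join(chunks).rstrip(b"\x00")  # strip the trailing NUL terminator, if any
    return json.loads(raw.decode())

summary = cgminer_command("summary")  # response stored in a variable, as asked
print(summary)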
As someone who has learned some of the common pitfalls of Python networking the hard way, I'm adding this answer to emphasize an important but easy-to-mess-up point about the first argument of requests.get():
localhost is an alias that your computer resolves to 127.0.0.1, the IP address of its own loopback adapter. foo.com is also an alias, just one that gets resolved further away from the host.
requests.get('foo.com:4028') #<--fails
requests.get('http://foo.com:4028') #<--works usually
& for loopbacks:
requests.get('http://127.0.0.1:4028') #<--works
requests.get('http://localhost:4028') #<--works
This one requires import socket and gives you the local IP of your host (i.e., your address within your own LAN); it goes a little farther out from the host than plain localhost, but not all the way out to the open internet:
requests.get('http://{}:4028'.format(socket.gethostbyname(socket.gethostname()))) #<--works
You can specify the port for the request with a colon just as you would in a browser, such as
r = requests.get('http://localhost:4028'). This will block until a response is received, or until the request times out, so you don't need to worry about awaiting a response.
You can send JSON data as a POST request using the requests.post method with the data parameter, such as
import json, requests
payload = {'command': 'summary'}
r = requests.post('http://localhost:4028', data=json.dumps(payload))
Accessing the response is then possible with r.text or r.json().
Note that requests is an HTTP library - if it's not HTTP that you want then I don't believe it's possible to use requests.

Http proxy works with urllib.urlopen, but not with requests.get [duplicate]

I am trying to do a simple get request through a proxy server:
import requests
test=requests.get("http://google.com", proxies={"http": "112.5.254.30:80"})
print test.text
The address of the proxy server in the code is just from a freely available proxy list on the internet. The point is that this same proxy server works when I use it from the browser, but it doesn't work from this program. And I tried many different proxy servers and none of them works through the above code.
Here is what I get for this proxy server:
The requested URL could not be retrieved While trying to retrieve the URL: http:/// The following error was encountered:
Unable to determine IP address from host name for
The dnsserver returned: Invalid hostname
This means that: The cache was not able to resolve the
hostname presented in the URL. Check if the address is correct.
I know it's an old question, but it should be:
import requests
test = requests.get("http://google.com", proxies={"http": "http://112.5.254.30:80", "https": "http://112.5.254.30:80"})
print(test.text)
