I am trying to validate the SSL connection between a client and a server.
I have two Python scripts, send.py for the producer and receive.py for the consumer.
I am using the code below to make the connection:
import pika

# Client certificate and key used for mutual TLS authentication
ssl_option = {'certfile': '/home/rmqca/client1/cert.pem',
              'keyfile': '/home/rmqca/client1/key.pem'}
parameters = pika.ConnectionParameters(host='localhost', port=5671,
                                       ssl=True, ssl_options=ssl_option)
connection = pika.BlockingConnection(parameters)
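As a side note: ssl=True with a plain dict is the pre-1.0 pika API. On pika 1.0 or newer, a rough equivalent, sketched under the assumption of the same certificate paths, wraps an ssl.SSLContext in pika.SSLOptions:

import ssl
import pika

# TLS context that trusts the test CA and presents the client certificate
context = ssl.create_default_context(cafile='/home/rmqca/testca/cacert.pem')
context.load_cert_chain(certfile='/home/rmqca/client1/cert.pem',
                        keyfile='/home/rmqca/client1/key.pem')
parameters = pika.ConnectionParameters(host='localhost', port=5671,
                                       ssl_options=pika.SSLOptions(context))
connection = pika.BlockingConnection(parameters)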
Also, in my rabbitmq.config, I am using the below parameters:
{ssl_listeners, [5671]},
{ssl_options, [{cacertfile, "/home/rmqca/testca/cacert.pem"},
{certfile, "/home/rmqca/server/cert.pem"},
{keyfile, "/home/rmqca/server/key.pem"},
{verify, verify_peer},
{fail_if_no_peer_cert, true}]}
This works fine when I connect through SSL.
But as I wanted to cover the negative use case as well, I tried making a connection without SSL, using this code:
import pika
connection = pika.BlockingConnection()
Then, as per my understanding, my client should not be able to connect to the server. But currently it connects fine. I am not sure why this is happening. Am I doing anything wrong here?
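For context: pika.BlockingConnection() with no arguments defaults to localhost:5672, RabbitMQ's plain-TCP listener, which stays enabled by default even when ssl_listeners is configured. If the goal is to accept TLS connections only, the plain listener presumably has to be disabled explicitly, e.g. with {tcp_listeners, []} in rabbitmq.config. A minimal sketch of the negative test, assuming the plain listener has been disabled:

import pika
import pika.exceptions

# With the plain listener disabled, a non-TLS attempt should fail
try:
    connection = pika.BlockingConnection()  # defaults to localhost:5672
    print("Unexpectedly connected without TLS")
    connection.close()
except pika.exceptions.AMQPConnectionError:
    print("Plain connection refused, as expected")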
I am working with the Python API client for Elasticsearch and I am trying to connect to it.
I create the client like this:
import os
from elasticsearch import Elasticsearch

def setup_es():
    # ES_HOST is defined elsewhere in the module
    ES_USER = os.getenv("ES_USER")
    ES_PASS = os.getenv("ES_PASS")
    print(f"Setting up ES with HOST={ES_HOST}, USER={ES_USER}, PASS={ES_PASS}")
    return Elasticsearch([ES_HOST], basic_auth=(ES_USER, ES_PASS))
But whenever I try using the client, I always get a connection timed out error.
For example:
client.info()
or
client.options(ignore_status=[400,404]).indices.delete(index=MY_INDEX)
Always produce:
elastic_transport.ConnectionTimeout: Connection timed out
I know the host, user and password are right. Am I missing something else? Any ideas please?
I solved this issue by specifying port 443 in the Elasticsearch host variable.
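A minimal sketch of what that looks like, assuming the cluster is served over HTTPS; the host URL here is a placeholder:

from elasticsearch import Elasticsearch

# Explicit scheme and port in the host URL; 443 for an HTTPS-fronted cluster
ES_HOST = "https://my-cluster.example.com:443"  # hypothetical host
client = Elasticsearch([ES_HOST], basic_auth=("user", "pass"))
print(client.info())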
I have a very simple Python (Flask-SocketIO) application which works as a server, and another app written in AngularJS which is a client.
In order to handle connected and disconnected clients I use, respectively:
@socketio.on('connect')
def on_connect():
    print("Client connected")

@socketio.on('disconnect')
def on_disconnect():
    print("Client disconnected")
When a client connects to my app I get information about it, but when a client disconnects unexpectedly (for example because of network problems) I don't get any information.
What is the proper way to handle the situation in which a client disconnects unexpectedly?
There are two types of connections: long-polling and WebSocket.
When you use WebSocket, the client knows instantly that the server was disconnected.
In the case of long-polling, you need to set the ping_interval and ping_timeout parameters (I also found information about heartbeat_interval and heartbeat_timeout, but I don't know how they relate to the ping_* parameters).
From the server's perspective, it doesn't know that the client was disconnected; the only way for it to get that information is to set ping_interval and ping_timeout.
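A minimal sketch of how those parameters can be passed, assuming a standard Flask-SocketIO setup; the values are illustrative:

from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
# Ping every 10 seconds; declare the client gone if no reply within 5 seconds
socketio = SocketIO(app, ping_interval=10, ping_timeout=5)

@socketio.on('disconnect')
def on_disconnect():
    # Fires once the ping timeout expires for a silently dropped client
    print("Client disconnected")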
I am trying to connect to the MusicBrainz database using the psycopg2 Python module. I have followed the instructions presented on http://musicbrainz.org/doc/MusicBrainz_Server/Setup, but I cannot succeed in connecting. In particular, I am using the following little script:
import psycopg2

conn = psycopg2.connect(database='musicbrainz_db', user='musicbrainz',
                        password='musicbrainz', port=5000, host='10.16.65.250')
print("Connection established")
The problem is that when I launch it, it never reaches the print statement, and the console (I'm on Linux) blocks indefinitely. It does not even catch Ctrl-C, so I have to kill Python from another console. What can cause this?
You seem to be mistaking MusicBrainz Server for just the database.
What's running on port 5000 is the Web Server.
You can access http://10.16.65.250:5000 in the browser.
Postgres is also running, but listens on localhost:5432.
This works:
import psycopg2
conn = psycopg2.connect(database="musicbrainz_db",
                        user="musicbrainz", password="musicbrainz",
                        port="5432", host="localhost")
print("Connection established")
In order to make Postgres listen to more than localhost you need to change listen_addresses in /etc/postgresql/9.1/main/postgresql.conf and make an entry for your (client) host or network in /etc/postgresql/9.1/main/pg_hba.conf.
My VM is running in a 192.168.1.0/24 network, so I set listen_addresses = '*' in postgresql.conf and added this line to pg_hba.conf:
host all all 192.168.1.0/24 trust
I can now connect from my local network to the DB in the VM.
Depending on what you actually need, you might not want to connect to the MusicBrainz Server via postgres. There is a MusicBrainz web service you can access in the VM.
Example:
http://10.16.65.250:5000/ws/2/artist/c5c2ea1c-4bde-4f4d-bd0b-47b200bf99d6.
In that case you might be interested in a library to process the data:
python-musicbrainzngs.
EDIT:
You need to set musicbrainzngs.set_hostname("10.16.65.250:5000") for musicbrainzngs to connect to your local VM.
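A minimal usage sketch, assuming the VM web service from above; note that musicbrainzngs requires a user agent to be set before any query:

import musicbrainzngs

# musicbrainzngs refuses requests without a user agent string
musicbrainzngs.set_useragent("my-test-app", "0.1", "me@example.com")
# Point the library at the local VM instead of musicbrainz.org
musicbrainzngs.set_hostname("10.16.65.250:5000")

result = musicbrainzngs.get_artist_by_id("c5c2ea1c-4bde-4f4d-bd0b-47b200bf99d6")
print(result["artist"]["name"])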
My project uses Bottle and HBase; clients connect to HBase via the Python Thrift client. The code, simplified, looks like this:
#!/usr/bin/env python
from bottle import route, run, default_app, request

client = HBaseClient()  # Thrift-based HBase client, defined elsewhere

@route('/', method='POST')
def index():
    data = client.getdata()
    return data
Now the issue is that if the client connection drops, our requests fail. So I need to make sure the client connection is kept alive.
One solution is using a connection pool. Is there any connection pool I can refer to?
Is there any other solution for this issue?
It looks like HappyBase can deal with this issue.
HappyBase has a connection pool that tries to deal with broken connections to some extent: http://happybase.readthedocs.org/en/latest/user.html#using-the-connection-pool
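A minimal sketch of that pool applied to the Bottle handler above; the host and table name are placeholders:

import happybase
from bottle import route

# A small pool of Thrift connections; broken ones are replaced on reuse
pool = happybase.ConnectionPool(size=3, host='localhost')

@route('/', method='POST')
def index():
    # Each request borrows a connection and returns it to the pool afterwards
    with pool.connection() as conn:
        table = conn.table('mytable')  # hypothetical table name
        return str(table.row(b'row-key'))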
I am writing some code that uses poplib and imaplib to collect emails through a proxy server.
I use the following to set up a proxy connection:-
import socks
import socket

# Route every new socket through the SOCKS4 proxy (last arg enables remote DNS)
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS4, proxy_ip, port, True)
socket.socket = socks.socksocket
Which I got from the stackoverflow post:-
http://stackoverflow.com/questions/3386724/python-how-can-i-fetch-emails-via-pop-or-imap-through-a-proxy
Then I make my connection with the email server:-
server = poplib.POP3(self.host, self.port)
server.user(self.username)
server.pass_(self.password)
I am testing my code in a unittest and have encountered a problem that I believe relates to my connection with the proxy not closing down properly.
An example is:-
I have set up the proxy connection and am trying to establish a connection with the email server. As part of the unittest I intentionally use an incorrect email server password.
The poplib library throws an exception that it can't connect. I catch the exception in the unittest, then move on to the next unittest, trusting that the poplib library properly closed my previous connection.
My understanding is that this is not a good thing and that I should be ensuring the email and proxy server connections are properly closed.
I know how to close the pop3 connection:-
server.quit()
But do not know how to close the connection with the proxy server or if I have to do so.
Could someone please help me with this question or with my understanding if that's where the problem lies :)
No special action is required. When you close the POP connection, the proxy connection will close automatically, since it's only needed while you are connected to something through the proxy.
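For the unittest case above, a rough sketch of making sure the POP3 session is torn down even when a step fails, using try/finally (names match the original snippet):

import poplib

server = poplib.POP3(host, port)
try:
    server.user(username)
    server.pass_(password)  # may raise poplib.error_proto on a bad password
finally:
    # quit() closes the socket, which also ends the proxied connection
    server.quit()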