What happens if you don't quit the SMTP / server connection? - python

I was recently studying how SMTP works and saw that in code you are supposed to quit the connection, in Python via SMTP.quit().
I was wondering why it is recommended practice to quit the connection to an SMTP server, or I guess to servers in general?
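For context, what quit() actually does can be shown with a self-contained sketch against a toy server (the server below speaks just enough SMTP for the demo; "toy.example" is made up). quit() sends the SMTP QUIT command, waits for the server's 221 goodbye, and closes the socket; without it, the server is left holding a half-open session until its own timeout fires.

```python
import smtplib
import socketserver
import threading

received = []  # every command line the toy server sees

class ToySMTP(socketserver.StreamRequestHandler):
    def handle(self):
        # Minimal SMTP: greet, then answer 250 to everything except QUIT.
        self.wfile.write(b"220 toy.example ready\r\n")
        for raw in self.rfile:
            cmd = raw.decode("ascii").strip()
            received.append(cmd)
            if cmd.lower() == "quit":
                self.wfile.write(b"221 bye\r\n")
                break
            self.wfile.write(b"250 ok\r\n")

server = socketserver.TCPServer(("127.0.0.1", 0), ToySMTP)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = smtplib.SMTP(*server.server_address)
conn.noop()   # any command; the server answers 250
conn.quit()   # sends QUIT, reads the 221 goodbye, closes the socket
server.shutdown()
```

If you skip quit(), the TCP connection just drops (or lingers) and the server never gets the polite QUIT, so it must rely on its own idle timeout to reclaim the session.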

Related

websocket connection goes to unknown state

I have a web server running on ubuntu instance at Amazon EC2.
My web server is written in Python and supports WebSockets.
My Problem:
I am able to connect to my server through a websocket, and I am also able to send and receive data over it. However, after a few minutes, if I try to send data on the same connection from the client application, nothing arrives at the server on that established connection. If I close the client application, or connect through another client application, I am able to connect and communicate again, but again only for a few minutes, after which the new connection faces the same issue. Please help me solve this.
Of course, when I run the server in my local testing environment, the connection never dies and I can communicate over the same websocket connection even after hours.
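Symptoms like this (connections that go quiet after a few minutes on EC2 but never locally) are typical of an idle timeout at some intermediary such as a load balancer or NAT, which local connections don't pass through. A common mitigation is an application-level heartbeat. This is a library-agnostic sketch under that assumption; start_heartbeat and the "ping" payload are made up, and most WebSocket libraries also have built-in ping/pong you should prefer:

```python
import threading

def start_heartbeat(send, interval=30.0):
    """Call send('ping') every `interval` seconds until the returned
    stop function is called. `send` is whatever your WebSocket library
    uses to transmit a frame."""
    stop = threading.Event()

    def loop():
        while not stop.wait(interval):  # wait() returns False on timeout
            send("ping")                # any traffic resets idle timers

    threading.Thread(target=loop, daemon=True).start()
    return stop.set  # call this to stop the heartbeat
```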

Raspberry Pi python app and nodejs socketio communication

My requirement is to have a node.js server communicate via socket.io with a Raspberry Pi running a local Python app. Please help me. I can find ways to communicate with a web app on Google, but is there any way to communicate with a local Python app under the requirements above?
It's unclear exactly which part you need help with. To make a socket.io connection work, you do the following:
Run a socket.io server on one of your two computers. Make sure it is listening on a known port (it can share a port with a web server if desired).
On the other computer, get a socket.io client library and use that to make a socket.io connection to the other computer.
Register message handlers on both computers for whatever custom messages you intend to send each way and write the code to process those incoming messages.
Write the code to send messages to the other computer at the appropriate time.
Socket.io client and server libraries exist for both node.js and Python, so you can use either type of library on either type of system.
The important things to understand are that you must have a socket.io server up and running, and that the other endpoint must connect to that server. Once the connection is up and running, you can send messages from either end to the other.
For example, you could set up a socket.io server on node.js. Then use a socket.io client library for Python to make a socket.io connection to the node.js server. Once the connection is up and running, you are free to send messages from either end, and if you have message handlers listening for those specific messages, they will be received by the other end.
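The register-handlers-and-emit pattern described above, sketched with a plain dict so it runs without socket.io installed. The real python-socketio and Socket.IO client APIs follow the same shape (sio.on("event", handler) to register, sio.emit("event", data) to send); "read_temperature" and the reply value are made-up placeholders:

```python
# Local stand-in for Socket.IO's event dispatch. In the real thing,
# emit() sends the event over the network and the handler registered
# on the other endpoint runs there.
handlers = {}

def on(event, handler):
    handlers[event] = handler

def emit(event, data):
    if event in handlers:
        return handlers[event](data)

# e.g. the Raspberry Pi registering a handler the node.js side can trigger
on("read_temperature", lambda _ignored: 21.5)  # hypothetical event and value
reply = emit("read_temperature", None)
```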

Execute python code over SSH

I need to write a python script which connects to the host over SSH and then somehow connects to a service sitting on localhost and performs a little interactive session.
What first came to mind was to use Paramiko to do local port forwarding and then use Python's socket library to communicate with the service.
But working with Paramiko was quite a challenge and I haven't figured out how to fix some issues.
So I switched to pxssh and used just a simple scenario:
conn.sendline('telnet {} {}'.format('localhost', port))
conn.expect('PASSWORD:')
conn.sendline(password)
...
But that telnet thing really bothers me.
And I think it should be possible to establish the SSH connection in such a manner that, from the Python code's perspective, I just do data = open('somefile').read(), which actually opens somefile on the remote host, with all the traffic encrypted by SSH.

python: convert telnet application to ssh

I wrote a Python telnet client to communicate with a server through telnet. However, many people tell me that it's not secure. How can I convert it to SSH? Do I need to totally rewrite my program?
Just create a TCP tunnel via SSH and relay all the data through there; no need to reinvent the wheel. You can find examples in the SSH docs under TCP tunneling (man ssh, for instance).
But if you really want to rewrite it, then check out the paramiko project.
While Telnet is insecure, it's essentially just a serial console over a network, which makes it easy to code for. SSH is much, much more complex: there's encryption, authentication, negotiation, etc. to do, and it's very easy to get wrong in spectacular fashion.
There's nothing wrong with Telnet per se, but if you can change things over the network - and it's not a private network - you're opening yourself up for trouble.
Assuming this is running on a computer, why not restrict the server to localhost? Then ssh into the computer and telnet to localhost? All the security with minimal hassle.
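The "restrict the server to localhost" suggestion in concrete form: bind the listening socket to 127.0.0.1 instead of 0.0.0.0, so only local processes can reach it, and remote users must first ssh into the machine. (A stdlib sketch; the socket below only binds, it doesn't speak telnet.)

```python
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# 127.0.0.1 = loopback interface only; port 0 = let the OS pick a free port
srv.bind(("127.0.0.1", 0))
srv.listen()
addr = srv.getsockname()  # (host, port) actually bound
```

Connections from other hosts are refused at the TCP level, with no application-layer checks needed.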
Use one of these libraries in your Python application to talk to the SSHD daemon on the server:
http://wiki.python.org/moin/SecureShell
You'll want to look at something that will help you set up SSHD in a secure manner like this:
http://www.linuxjournal.com/article/8759
You will also need to ensure that you update the SSHD daemon regularly and make use of keys and strong, rotating passwords. All this should be standard for a telnet application too, but due to the lack of transport level encryption, you really shouldn't be using telnet for anything.

SMTP and XMPP deployment/workflow

I'm developing a website that incorporates an XMPP bot and a custom SMTP server (mainly these services process commands and reply). I'd like to set up a system where I can develop locally, push changes to a staging server, and finally to a production system. (Essentially I'm developing on the live server currently.)
I'm using python, and I'm reading a bit about fabric, but I'm running into a mental block.
I am using sqlalchemy-migrate to manage database versions and have the basic DNS stuff set up for the host. Additionally, I have a library that I'm currently working on that these two services both use (in my global site-packages directory). I deploy this egg after I change anything. This would ideally also be deployable, but only available to the correct version. Would I need two versions, stage-lib and live-lib? Is this possible with python eggs?
Would I need another host to act as a staging server for these services? Or is there a way to tell DNS that something@staging.myhost.com goes to a different port than 25?
I have a fabfile right now that has a bunch of methods like stage_smtp, stage_xmpp, live_smtp, live_xmpp.
Partial answer: DNS has no way to tell you to connect to a non-standard SMTP port, even with SRV records. (XMPP does.)
So, for sending email, you'll have to do something like:
import smtplib
# connect to the staging SMTP server on its non-standard port;
# smtplib also accepts the 'host:port' string form
server = smtplib.SMTP('localhost:2525')
server.sendmail(fromaddr, toaddrs, msg)
server.quit()
