SMTP and XMPP deployment/workflow - python

I'm developing a website that incorporates an XMPP bot and a custom SMTP server (mainly these services process commands and reply). I'd like to set up a system where I can develop locally, push changes to a staging server, and finally to a production system. (Essentially I'm developing on the live server currently.)
I'm using python, and I'm reading a bit about fabric, but I'm running into a mental block.
I am using sqlalchemy-migrate to manage database versions and have the basic DNS set up for the host. Additionally, I have a library that I'm currently working on that both of these services use (it lives in my global site-packages directory); I deploy this egg whenever I change anything. Ideally this library would also be deployable per environment, so that each environment only sees the version intended for it. Would I need two separate packages, stage-lib and live-lib? Is this possible with python eggs?
Would I need another host to act as a staging server for these services? Or is there a way to tell DNS that something like staging.myhost.com goes to a different port than 25?
I have a fabfile right now that has a bunch of methods like stage_smtp, stage_xmpp, live_smtp, live_xmpp.
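For context, a fabfile along those lines might be collapsed into per-environment tasks; the host names, the env.environment key, and the supervisorctl commands below are only illustrative (Fabric 1.x style):
from fabric.api import env, run, task

# Environment-selection tasks: run as "fab staging deploy_smtp" or "fab live deploy_xmpp".
@task
def staging():
    env.hosts = ['staging.myhost.com']   # placeholder host
    env.environment = 'staging'

@task
def live():
    env.hosts = ['myhost.com']           # placeholder host
    env.environment = 'live'

# Deployment tasks that behave differently depending on the selected environment.
@task
def deploy_smtp():
    run('supervisorctl restart smtp-%s' % env.environment)

@task
def deploy_xmpp():
    run('supervisorctl restart xmpp-%s' % env.environment)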

Partial answer: DNS has no way to tell you to connect to a non-standard SMTP port, even with SRV records. (XMPP does.)
So, for sending email, you'll have to do something like:
import smtplib
server = smtplib.SMTP('localhost', 2525)   # host and non-standard port, passed separately
fromaddr = 'bot@myhost.com'                # placeholder addresses
toaddrs = ['user@example.com']
msg = 'Subject: test\r\n\r\nHello from the staging service.'
server.sendmail(fromaddr, toaddrs, msg)
server.quit()
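XMPP clients, by contrast, do consult SRV records, so a staging domain can advertise a non-standard XMPP port. To check what a client would see, assuming the dnspython package is installed (2.x; older versions use dns.resolver.query) and using a placeholder domain:
import dns.resolver

# Client-to-server SRV lookup for XMPP; the domain is a placeholder.
for record in dns.resolver.resolve('_xmpp-client._tcp.staging.myhost.com', 'SRV'):
    print(record.target, record.port)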

Related

How can I post to an API running on a remote desktop?

I'm creating a Python Flask API on a remote desktop and running it on that machine's localhost.
Is there any way I can access this API from my local machine?
We are working in a team and I want to share this with my team members, but it is confidential and must not be deployed on an open server.
We want to be able to post to and get results from the API running on the remote desktop from every member's local machine.
Both our local machines and the remote desktop run Windows 10.
Sorry for being abstract but I'm searching for any way out. Thanks.
Well, you'll need to open a path to this API. You'll have to set up a VPN or an IP address filter on the server so you can reach it from your network while keeping it secured against the rest of the Internet. You could also set up a simpler proxy if you prefer. I won't cover the details of setting up a VPN or proxy since it can get pretty extensive, but a Google search will help you find the best alternative for you.
AFAIK, the Remote Desktop Protocol does not provide any kind of VPN. However, if you can switch to TeamViewer, it has an easy-to-set-up VPN feature that lets you into the network with little configuration. Once the VPN is configured, it works as if you were on the same network as the server, so you can reach your API from your own machine just by going to the server's IP address.
Do check the security policies of whoever owns the server, since you can get into trouble if you don't have permission to enable access from outside. Security always comes before convenience.
Short term solution:
First, download ngrok for your operating system.
For debugging and testing purposes you can expose your API through a secure tunnel by running this command in your command prompt / terminal:
ngrok http <PORT_NUMBER> -host-header="localhost:<PORT_NUMBER>"
where PORT_NUMBER is the port on which your Flask application is running.
For example, if your Flask application is running on port 5000, simply execute this command:
ngrok http 5000 -host-header="localhost:5000"
Running this will give you two public URLs, one HTTP and one HTTPS, tunneled to your local port, which you can call remotely. The tunnel lasts for 8 hours, after which the command needs to be run again.
Long term solution:
Deploy the Flask application using FastCGI
or
to a cloud infrastructure provider like Microsoft Azure, which offers ready-made templates for Flask applications.

PythonAnywhere - Are sockets allowed?

I have a beginner PythonAnywhere account, which, the account comparison page notes, has "Access to External Internet Sites: Specific Sites via HTTP(S) Only."
So I know only certain hosts can be accessed through HTTP protocols, but are there restrictions on use of the socket module? In particular, can I set up a Python server using socket?
PythonAnywhere dev here. Short answer: you can't run a socket server on PythonAnywhere, no.
Longer answer: the socket module is supported, and from paid accounts you can use it for outbound connections just like you could on your normal machine. On a free account, you could also create a socket connection to the proxy server that handles free accounts' Internet access, and then use the HTTP protocol to request a whitelisted site from it (though that would be hard work, and it would be easier to use requests or something like that).
What you can't do on PythonAnywhere is run a socket server that can be accessed from outside our system.
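For illustration, an outbound connection of the kind described above might look like this; the target host is just a placeholder:
import socket

# Plain outbound TCP connection with a manual HTTP request; example.com is a placeholder host.
conn = socket.create_connection(('example.com', 80), timeout=10)
conn.sendall(b'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n')
print(conn.recv(4096).decode('latin-1', errors='replace'))
conn.close()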
Nope. PythonAnywhere doesn't support the socket module.

Sending mail via python

I am using Debian 8 and I would like to be able to send mail via Python without installing a full-blown mail server such as Postfix and without using Gmail.
The only tutorials I can find send mail with Python either through a full mail server or via Gmail or another Internet mail service. Isn't it possible to just send an email and not care about receiving any?
Thanks.
You can run your own Python SMTPd server.
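For example, the standard-library smtpd module (available on Debian 8's Python versions and only removed in Python 3.12) can run a tiny local server that accepts messages and just prints them, which is useful for testing even though it doesn't deliver anything:
import asyncore
import smtpd

# Minimal local SMTP "sink": accepts mail on localhost:1025 and prints it
# instead of delivering it anywhere.
class PrintingServer(smtpd.SMTPServer):
    def process_message(self, peer, mailfrom, rcpttos, data, **kwargs):
        print('From %s to %s:' % (mailfrom, rcpttos))
        print(data)

server = PrintingServer(('localhost', 1025), None)
asyncore.loop()
To actually get mail delivered to remote recipients, though, the message still has to be handed to a real SMTP server, which is the point made below.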
Well, you need a mail server. Either locally, on your machine, or somewhere on the internet. This doesn't have to be gmail.
You need to understand two things:
"email" is a protocol. Read more about it here.
in order to be able to "participate" in this protocol exchange you generally need a server that can "speak" the same protocol with other servers.
So no, you cannot send an email without some kind of server, either local or remote. As a "client" (the entity sending the email) you generally need to connect to an SMTP server in order to send or receive email.
You can find more details about how to do this with Python in the standard library's smtplib documentation.
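As a minimal client-side sketch, assuming there is some SMTP server (local or remote) to hand the message to; the relay host and addresses below are placeholders:
import smtplib
from email.message import EmailMessage

# Build a simple message; the addresses and relay host are placeholders.
msg = EmailMessage()
msg['Subject'] = 'Test'
msg['From'] = 'me@example.org'
msg['To'] = 'you@example.org'
msg.set_content('Hello from Python.')

# Hand the message to an SMTP server for delivery.
with smtplib.SMTP('mail.example.org') as server:
    server.send_message(msg)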

Raspberry Pi python app and nodejs socketio communication

My requirement is to communicate over socket.io between a node.js server and a Raspberry Pi running a local Python app. Please help me. I can find ways of communicating with a web app on Google, but is there any way to communicate with a local Python app under the requirements mentioned above?
It's unclear exactly which part you need help with. To make a socket.io connection work, you do the following:
Run a socket.io server on one of your two computers. Make sure it is listening on a known port (it can share a port with a web server if desired).
On the other computer, get a socket.io client library and use that to make a socket.io connection to the other computer.
Register message handlers on both computers for whatever custom messages you intend to send each way and write the code to process those incoming messages.
Write the code to send messages to the other computer at the appropriate time.
Socket.io client and server libraries exist for both node.js and Python, so you can use either type of library on either type of system.
The important thing to understand is that you must have a socket.io server up and running. The other endpoint then connects to that server. Once the connection is up and running, you can send messages from either end to the other.
For example, you could set up a socket.io server on node.js. Then, use a socket.io client library for Python to make a socket.io connection to the node.js server. Once the connection is up and running, you are free to send messages from either end to the other, and if you have message handlers listening for those specific messages, they will be received by the other end.
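To make that last example concrete, the Python side could use the python-socketio client library; the server address and event names here are only placeholders:
import socketio

# socket.io client for the Raspberry Pi side; the node.js server address is a placeholder.
sio = socketio.Client()

@sio.event
def connect():
    print('connected to the node.js server')
    sio.emit('sensor_reading', {'temperature': 21.5})    # example custom message

@sio.on('command')
def on_command(data):
    print('received command:', data)                      # example incoming message handler

sio.connect('http://192.168.1.10:3000')
sio.wait()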

How can a python web app open a program on the client machine?

I'm going to be using Python to build a web-based asset management system to manage the production of a short CG film. The app will be intranet-based, running on a CentOS machine on the local network. I'm hoping you'll be able to browse through all the assets and shots and then open any of them in the appropriate program on the client machine (also running CentOS). I'm guessing that there will have to be some sort of setup on the client side to allow the app to run commands, which is fine because I have access to all of the clients that will be using it (although I don't have root access). Is this sort of thing possible?
As you already guessed, you will need to have a service running on the client PC listening on a predetermined port.
When the client requests to open an asset, your webapp will send the request to the running service to download the asset and run it. As long as your port no. is above 1024 and you are not running any application which requires root access, you can run this service without root.
But this is a very bad idea, as it exposes the clients to malicious attacks. You will have to ensure all requests to the client service are properly signed and that the client verifies each request as valid before executing it. There may be many other security factors to consider depending on your implementation of the client service. But in general, having a service that can run arbitrary requests from a remote machine is a very dangerous thing to have.
You may also not be allowed to run such a service on the client PCs, depending on your company's IT policies.
You are better off having the client download the resource normally and then having the user execute it manually.
PS: You can have the client service run on a port below 1024, but it will have to start as root and after binding to the port drop all root privileges and change the running user to a different user using setuid (or the equivalent in your language of choice)
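To make the mechanism concrete, a deliberately restricted client-side listener might look like the sketch below; the port, asset directory, and application whitelist are all hypothetical, and the warnings above about signing and verifying requests still apply:
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Hypothetical whitelist of applications the service is allowed to launch.
ALLOWED_APPS = {'blender': '/usr/bin/blender', 'gimp': '/usr/bin/gimp'}

class OpenAssetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        app = query.get('app', [''])[0]
        asset = query.get('asset', [''])[0]
        # Only launch whitelisted apps on assets under a known directory (not hardened).
        if app in ALLOWED_APPS and asset.startswith('/srv/assets/'):
            subprocess.Popen([ALLOWED_APPS[app], asset])
            self.send_response(200)
        else:
            self.send_response(403)
        self.end_headers()

# Listens on an unprivileged port on the client machine (port number is hypothetical).
HTTPServer(('0.0.0.0', 8123), OpenAssetHandler).serve_forever()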
Note this is not a standard thing to do. Imagine if websites out there had the ability to open Notepad or Minesweeper at will when you visited them or clicked something.
The way it is done is: you need a service running on the client machine that exposes certain APIs and trusts requests coming from the web app's calls. This needs to be running on the client machines all the time, and your web app can then send a request to this service to launch the application you want.
If you have a specific subset of applications that will be run on the client systems (aka you are distributing jobs), then you might want to consider python salt. It is a distributed RPC which uses a secure protocol and authentication to distribute jobs and deliver results:
http://docs.saltstack.org/en/latest/topics/index.html
If you are looking at automating content generation based on specific updates then you might want to consider Jenkins, which has plugins for various revision control systems and build systems:
https://wiki.jenkins-ci.org/display/JENKINS/Meet+Jenkins
It may not have integration with the particular tools you are using, but if it does, then it could be quicker to set up and administer than generic Salt automation.
--David
