How can I debug Errno 10061 from smtplib in Python?

At work, I have tools that send emails when they fail. I have a blacklist, which I put myself on, so errors I make don't send anything. I recently needed to update this code, and while testing it I found that I can't send emails anymore.
Everyone else is able to send emails, just not me. I am able to use my regular email client (Outlook) and I'm able to ping the mail server. The IT guys aren't familiar with Python, and as far as they can tell my profile settings look fine. Just the simple line server = smtplib.SMTP(smtpserver) fails with:
error: [Errno 10061] No connection could be made because the target machine actively refused it
For info's sake, I'm running Vista with Python 2.6, though nearly everyone else is running almost identical machines with no issues. We are all running the same security settings and the same anti-virus software, and disabling them doesn't help either.
Does anyone have any ideas on how to figure out why this is happening? Maybe there is some bizarre setting on my profile in the workgroup? I'm at a loss on how to figure this out. All my searching turns up people who never started their server or something like that, but that is not my case.

You might want to use Ethereal (now known as Wireshark) to see what's going on at the network level. It may give you more information.
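One thing worth trying before packet captures: Errno 10061 is a plain TCP "connection refused", so the failure happens below smtplib. A small sketch along those lines, assuming the server name below is a placeholder and the server listens on the standard port 25, first tests a bare socket and then turns on smtplib's protocol trace:

import socket
import smtplib

smtpserver = 'mail.example.com'   # placeholder - substitute your real mail server

# 10061 is raised at the TCP level, so test a raw socket to port 25 first.
try:
    sock = socket.create_connection((smtpserver, 25), timeout=10)
    print "Raw TCP connection to port 25 succeeded"
    sock.close()
except socket.error, e:
    print "Raw TCP connection failed too:", e

# If the raw socket works, let smtplib print the SMTP dialogue.
server = smtplib.SMTP()
server.set_debuglevel(1)
server.connect(smtpserver)
server.quit()

If the raw socket is refused as well, the problem is a local firewall, winsock filter, or proxy setting rather than anything in Python.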

Related

Paramiko get stdout from connection object (not exec_command)

I'm writing a script that uses paramiko to ssh onto several remote hosts and run a few checks. Some hosts are set up as fail-overs for others, and I can't determine which is in use until I try to connect. Upon connecting to one of these 'inactive' hosts, the host informs me that I need to connect to another 'active' IP and then closes the connection after n seconds. This appears to be written to the stdout of the SSH connection/session (i.e. it is not an SSH banner).
I've used paramiko quite a bit, but I'm at a loss as to how to get this output from the connection. exec_command will obviously give me stdout and stderr, but the host emits this message immediately upon connection, doesn't accept any other incoming requests/messages, and just closes after n seconds.
I don't want to have to wait for the timeout to move on to the next host, and I'd also like to verify that this is the reason I can't connect and run the checks; otherwise my script works as intended.
Any suggestions as to how I can capture this output, with or without paramiko, is greatly appreciated.
I figured out a way to get the data. It was pretty straightforward, to be honest, albeit a little hackish. This might not work in other cases, especially if there is latency, but I could also be misunderstanding what's happening:
When the connection opens, the server spits out two messages: one saying it can't chdir to a particular directory, then a few milliseconds later another stating that you need to connect to the other IP. If I send a command immediately after connecting (it doesn't matter what command), exec_command will interpret this second message as the response. So for now I have a solution to my problem, as I can check this string for a known message and change the flow of execution.
However, if what I describe is accurate, then this may not work in situations where there is too much latency and the 'test' command isn't sent before the server response has been received.
As far as I can tell (and I may be very wrong), there is currently no proper way to get the stdout stream immediately after opening the connection with paramiko. If someone knows a way, please let me know.
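For reference, the workaround described above looks roughly like the sketch below. The host, credentials, and the wording of the 'known message' are assumptions, and as noted it relies on the dummy command reaching the server before that second message has been flushed.

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('failover.example.com', username='monitor', password='secret')

# Send a throwaway command straight away; on an 'inactive' host the server's
# "connect to the other IP" message comes back as if it were this command's output.
stdin, stdout, stderr = client.exec_command('true')
output = stdout.read()
client.close()

if 'connect to' in output:      # assumed wording of the known message
    print "Inactive host - move on to the next one"
else:
    print "Active host - run the real checks"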

Secure Python chat with SSH - How?

I am trying to create a secure chat server with Python, and after many hours of hunting all I have discovered is that I should use SSH and that Paramiko seems to be the best Python module for it (I may be wrong). I cannot find out how to implement this, though, and being quite new to Python, the docs were a bit deep for me, especially as I didn't really know what to look for!
Any links to example code would be greatly appreciated, especially concerning the server (there seem to be hundreds of examples about connecting to an SSH server, but none about creating one - am I missing something vital here? I have heard that it is possible to create an SSH server in Python, but the apparent lack of code on the internet is worrying me).
Thanks
EDIT:
My ultimate goal is to create a secure chat client with Python, and I would like to keep it as simple as possible; however, security is the main goal. I have seen and made several chat clients in the recent past, but they required telnet to connect to them and so were not secure. I wish to correct this.
Try Twisted, an asynchronous networking engine written in Python. They have a very simple example called chatserver.py. Get that running and then make it listen on SSL using a self-signed SSL certificate.
Here is another example.
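As a rough illustration of the suggestion above, here is a condensed sketch along the lines of Twisted's chatserver.py example, switched from listenTCP to listenSSL. The port number and the server.key/server.crt file names are assumptions; generate them as a self-signed pair with openssl.

from twisted.internet import reactor, ssl
from twisted.internet.protocol import Factory
from twisted.protocols.basic import LineReceiver

class Chat(LineReceiver):
    def __init__(self, users):
        self.users = users          # shared dict of name -> protocol
        self.name = None

    def connectionMade(self):
        self.sendLine("What's your name?")

    def lineReceived(self, line):
        if self.name is None:
            self.name = line
            self.users[self.name] = self
            return
        # broadcast the line to every other connected user
        for name, protocol in self.users.items():
            if protocol is not self:
                protocol.sendLine("<%s> %s" % (self.name, line))

    def connectionLost(self, reason):
        if self.name in self.users:
            del self.users[self.name]

class ChatFactory(Factory):
    def __init__(self):
        self.users = {}

    def buildProtocol(self, addr):
        return Chat(self.users)

# server.key / server.crt are an assumed self-signed key/certificate pair
context = ssl.DefaultOpenSSLContextFactory('server.key', 'server.crt')
reactor.listenSSL(8123, ChatFactory(), context)
reactor.run()

Clients then connect over SSL (for example with openssl s_client -connect host:8123) instead of plain telnet.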

Handling https requests using a SOCK_STREAM proxy

I'm working on a project that allows a user to redirect his browsing through a proxy. The system works like this - a user runs this proxy on a remote PC and then also runs the proxy on his laptop. The user then changes his browser settings on the laptop to use localhost:8080 to make use of that local proxy, which in turn forwards all browser traffic to the proxy running on the remote PC.
This is where I ran into HTTPS. I was able to get normal HTTP requests working fine and dandy, but as soon as I clicked on google.com, Firefox skipped my proxy and connected to https://google.com directly.
My idea was to watch for browser requests that say CONNECT host:443 and then use the python ssl module to wrap that socket. This would give me a secure connection between the outer proxy and the target server. However, when I run wireshark to see what a browser request looks like before SSL kicks in, the encryption is already in place, meaning it looks like the browser connects to port 443 directly, which explains why it bypassed my local proxy.
I would like to be able to handle HTTPS as well, as that would make for a complete browsing experience.
I'd really appreciate any tips that could push me in the right direction.
Well, after doing a fair amount of reading on proxies, I found out that my understanding of the problem was insufficient.
For anyone else that might end up in the same spot as me, know that there's a pretty big difference between HTTP, HTTPS, and SOCKS proxies.
HTTP proxies usually take a quick look into the HTTP headers to determine where to forward the whole packet. These are quite easy to code on your own with some basic knowledge of sockets.
HTTPS proxies, on the other hand, have to work differently. They should either be able to do the whole SSL magic on behalf of the client, or they can pass the traffic through unchanged; if the latter route is taken, the user's IP will be known. This is a wee bit more demanding when it comes to coding.
SOCKS proxies are a whole different, albeit really cool, beast. They work on the 5th layer of the OSI model, and honestly I have no clue as to where I would even begin creating one. They achieve both security and anonymity. However, I do know that a person can use SSH to start a SOCKS proxy on their machine; just read this: http://www.revsys.com/writings/quicktips/ssh-tunnel.html. That link also gave me the idea that it should be possible to use SSH from a Python script to make it much more convenient.
Hope this helps anyone with the same question as I had. Good luck!
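To make the pass-through option above concrete, the usual way a plain proxy handles HTTPS is to honour the browser's CONNECT request and then blindly relay bytes in both directions, never seeing the decrypted traffic. A minimal sketch, with port 8080, a single-connection accept loop, and CONNECT-only handling as simplifying assumptions:

import select
import socket

def tunnel(client, request_line):
    # request_line looks like "CONNECT www.google.com:443 HTTP/1.1"
    host, _, port = request_line.split()[1].partition(':')
    remote = socket.create_connection((host, int(port or 443)))
    client.sendall("HTTP/1.1 200 Connection established\r\n\r\n")
    while True:
        readable, _, _ = select.select([client, remote], [], [])
        for s in readable:
            data = s.recv(4096)
            if not data:
                return
            (remote if s is client else client).sendall(data)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(('127.0.0.1', 8080))
listener.listen(5)
while True:
    client, _ = listener.accept()
    first_line = client.recv(4096).split('\r\n')[0]
    if first_line.startswith('CONNECT'):
        tunnel(client, first_line)
    client.close()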

Python urllib2.urlopen bug: timeout error brings down my Internet connection?

I don't know if I'm doing something wrong, but I'm 100% sure it's the Python script that brings down my Internet connection.
I wrote a Python script to scrape the header info of thousands of files, mainly the Content-Length, to get the exact size of each file, using HEAD requests.
Sample code:
import urllib2

class HeadRequest(urllib2.Request):
    def get_method(self):
        return "HEAD"

response = urllib2.urlopen(HeadRequest("http://www.google.com"))
print response.info()
The thing is, after several hours of running, the script starts to throw "urlopen error timed out", and my Internet connection is down from then on. It always comes back immediately after I close the script. At first I thought the connection might just be unstable, but after several runs it turned out to be the script's fault.
I don't know why; should this be considered a bug? Or has my ISP banned me for doing such things? (I already set the program to wait 10 seconds between requests.)
BTW, I'm using VPN network, does it have something to do with this?
I'd guess that either your ISP or VPN provider is limiting you because of high-volume suspicious traffic, or your router or VPN tunnel is getting clogged up with half-open connections. Consumer internet is REALLY not intended for spider-type activities.
"the script starts to throw out urlopen error timed out"
We can't even begin to guess.
You need to gather data on your computer and include that data in your question.
Get another computer. Run your script. Is the other computer's internet access blocked also? Or does it still work?
If both computers are blocked, it's not your software, it's your provider. Update Your Question with this information, and how you got it.
If only the computer running the script is stopped, it's not your provider; it's your OS resources being exhausted. This is harder to diagnose because it could be memory, sockets or file descriptors. Usually it's sockets.
You need to find some ifconfig/ipconfig diagnostic software for your operating system. You need to update your question to state exactly what operating system you're using. You need to use this diagnostic software to see how many open sockets are cluttering up your system.
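If it does turn out to be leaked sockets, a variant of the HEAD-request loop that explicitly closes every response is worth trying. This is only a sketch: the URL list, the 30-second timeout, and the 10-second delay are assumptions based on the question.

import time
import urllib2

class HeadRequest(urllib2.Request):
    def get_method(self):
        return "HEAD"

def content_length(url):
    response = urllib2.urlopen(HeadRequest(url), timeout=30)
    try:
        return response.info().getheader('Content-Length')
    finally:
        response.close()    # give the socket back even if reading the headers fails

for url in urls_to_check:   # urls_to_check is a hypothetical list of URLs
    try:
        print url, content_length(url)
    except urllib2.URLError, e:
        print url, "failed:", e
    time.sleep(10)          # the pause between requests the question mentions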

HTTP Negotiate windows vs. Unix server implementation using python-kerberos

I tried to implement a simple single-sign-on in my python web server. I have used the python-kerberos package which works nicely. I have tested it from my Linux box (authenticating against active directory) and it was without problem.
However, when I tried to authenticate using Firefox from a Windows machine (no special setup, just having the user logged into the domain and my server added to negotiate-auth.trusted-uris), it doesn't work. I have looked at what is sent, and it doesn't even resemble what the Linux machine sends. This Microsoft description of the process pretty much matches the way my interaction from Linux works, but the Windows machine generally sends a very short string which doesn't resemble anything the Microsoft documentation states; when base64-decoded, it is something like 12 zero bytes followed by 3 or 4 non-zero bytes (the GSS functions then return that they don't support such a scheme).
Either there is something wrong with the client Firefox settings, or there is some protocol I am supposed to follow for the Negotiate exchange but for which I cannot find any reference anywhere. Any ideas what's wrong? Do you have any idea what protocol I should be trying to find? It doesn't look like SPNEGO, at least from the MS documentation.
Update: now it seems to at least produce reasonable data (a clean installation of Firefox on something other than Windows 7...), however I am getting:
(('Unspecified GSS failure. Minor code may provide more information', 851968), ('', 100004))
The Negotiate data now seems to be OK, so this is some other error. Any idea where I can decode what 100004 means?
On Windows, you also need to make sure that network.auth.use-sspi = true, so that Firefox will use the native Windows Kerberos (SSPI) rather than look for a GSSAPI DLL.
If that's not it, and you want to provide a packet capture of the Kerberos and HTTP traffic, I'm happy to look at it.
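For anyone piecing this together, the server side of the Negotiate handshake with python-kerberos (the part the question says already works against the Linux client) looks roughly like the sketch below. The HTTP@myserver.example.com service principal is an assumption and the web-framework glue is omitted.

import kerberos

def check_negotiate(authorization_header):
    # Return (principal, response token), or None if the token is rejected.
    if not authorization_header.startswith("Negotiate "):
        return None
    client_token = authorization_header[len("Negotiate "):]
    result, context = kerberos.authGSSServerInit("HTTP@myserver.example.com")
    try:
        if kerberos.authGSSServerStep(context, client_token) != 1:
            return None
        # authGSSServerResponse() gives the token to send back to the browser
        # in a "WWW-Authenticate: Negotiate <token>" response header.
        server_token = kerberos.authGSSServerResponse(context)
        return kerberos.authGSSServerUserName(context), server_token
    finally:
        kerberos.authGSSServerClean(context)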
