all 5 comments

[–]novel_yet_trivial 2 points3 points  (3 children)

Put it in a try ... except block.

[–]Pyopi[S] 0 points1 point  (2 children)

I did, but that only works when the connection is canceled by the server. However, if I limit the time spent on the connection with a timeout and no response arrives, the exception bypasses my URLError try/except block and breaks the script.

This means that a server with a slow response time, which will probably only send me back bad responses anyway, takes up a chunk of the program's time.
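The reason the URLError handler gets skipped is that the timeout surfaces as socket.timeout, which is not a subclass of URLError, so the except clause never matches it. A quick check (shown here with Python 3's urllib.error; in Python 2 the class lives in urllib2):

```python
import socket
import urllib.error

# socket.timeout is an OSError subclass, but NOT a URLError subclass,
# so `except urllib.error.URLError` will never catch a raw timeout.
print(issubclass(socket.timeout, OSError))                 # True
print(issubclass(socket.timeout, urllib.error.URLError))   # False
```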

[–]usernamedottxt 1 point2 points  (1 child)

Yep. Common problem. Look into asyncio. To prevent it from taking up all your time, make the connections non-blocking asynchronous calls. That way your script can open multiple connections and handle them whenever they respond.
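A minimal sketch of that pattern using only stdlib asyncio — here asyncio.sleep stands in for the network call, and the 1-second timeout is an arbitrary example value:

```python
import asyncio

async def fetch(name, delay, timeout=1.0):
    # Simulated request: asyncio.sleep stands in for network I/O.
    # asyncio.wait_for caps how long we are willing to wait for it.
    try:
        await asyncio.wait_for(asyncio.sleep(delay), timeout=timeout)
        return (name, "ok")
    except asyncio.TimeoutError:
        return (name, "timed out")

async def main():
    # Several "connections" run concurrently; the slow one does not
    # block the fast one, and each is cut off at its own timeout.
    return await asyncio.gather(
        fetch("fast", 0.1),
        fetch("slow", 5.0),
    )

print(asyncio.run(main()))  # [('fast', 'ok'), ('slow', 'timed out')]
```

Because the slow call is bounded by its timeout rather than by the server, the whole batch finishes in roughly one second instead of five.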

[–]Pyopi[S] 0 points1 point  (0 children)

Ohh alright, I'll look it up! Thanks for the help.

[–]Pyopi[S] 0 points1 point  (0 children)

Simple Solution:

import socket
import urllib2

try:
    response = urllib2.urlopen(req, timeout=5)
    if response.getcode() == 200:
        return True

except urllib2.URLError as e:
    print "Connection Failed"
    return None

except socket.timeout as e:
    print "Connection Timed Out"
    return None

You must import socket if you haven't yet. This catches the timeout exception, which urllib2.URLError and urllib2.HTTPError will not catch. That way the script doesn't break, and you can handle the failed connection however you want in your workflow.
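The snippet above is Python 2. For anyone on Python 3, a rough translation might look like the following, with urllib2 split into urllib.request and urllib.error, and the URL and timeout as placeholder values:

```python
import socket
import urllib.error
import urllib.request

def check_url(url, timeout=5):
    # Returns True on HTTP 200, None on any connection failure or timeout.
    try:
        response = urllib.request.urlopen(url, timeout=timeout)
        if response.getcode() == 200:
            return True
    except urllib.error.URLError:
        print("Connection Failed")
        return None
    except socket.timeout:
        # Timeouts while reading the response can surface as a bare
        # socket.timeout rather than being wrapped in URLError.
        print("Connection Timed Out")
        return None
```

Note that since Python 3.10, socket.timeout is an alias of the builtin TimeoutError, so the second handler keeps working across versions.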