Tuesday, October 7, 2008

Dealing with HTTP Timeouts in Ruby

The standard Ruby open-uri library makes connecting to local and remote resources completely transparent, which is great most of the time. It lets me do things like this:

stats = Hpricot(open(interface_url))

and at unit test time I can easily replace the URL with a file.
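That swap can be sketched like this. `fetch_stats` and the fixture path are illustrative names, not from the post, and newer Rubies prefer `URI.open` for remote URLs:

```ruby
require 'open-uri'

# Sketch only: the same method reads a remote URL in production or a
# local fixture file in a unit test, because open() handles both.
# fetch_stats is a hypothetical name.
def fetch_stats(interface_url)
  open(interface_url) { |f| f.read }
end

# In production: fetch_stats('http://example.com/stats.xml')
# In a test:     fetch_stats('spec/fixtures/stats.xml')
```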

However, yesterday I was trying to consume some XML from a web service running on a machine that is...pegged, like pegged with a load of 30, because it's a data mining box in the middle of a huge run. Producing the XML also requires database queries, which exacerbates the load problem. So it took about 2 minutes to get back to me -- 1 minute more than the standard open-uri timeout.

Open-uri calls Net::HTTP to do its HTTP protocol based file opens, and Net::HTTP is a lot more like a standard HTTP library in another language -- i.e. it lets you set timeouts. Unfortunately with flexibility comes some complexity, but it's nothing worth crying about:

@site_url = URI.parse(site_url)

http = Net::HTTP.new(@site_url.host, @site_url.port)
http.read_timeout = 360 # timeout in seconds. Yeah, that's 6 minutes.
req = Net::HTTP::Get.new(@site_url.path)

res = http.start { |web| web.request(req) }

stats = Hpricot(res.body)

Of course, real production code would wrap this in a begin..rescue block and retry.
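A minimal sketch of that retry wrapper, under a couple of assumptions: `fetch_with_retries` and `max_attempts` are names I made up, and note that while old Rubies raised `Timeout::Error` on a slow read, newer ones raise `Net::ReadTimeout`, so it pays to rescue both:

```ruby
require 'net/http'
require 'timeout'

# Hedged sketch: run the block, retrying a couple of times on a timeout
# before giving up and re-raising. fetch_with_retries is an
# illustrative name, not from the original post.
def fetch_with_retries(max_attempts = 3)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue Timeout::Error, Net::ReadTimeout
    retry if attempts < max_attempts
    raise
  end
end

# Usage with the Net::HTTP code above:
# res = fetch_with_retries { http.start { |web| web.request(req) } }
```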

1 comment:

  1. ... Why are you using an instance variable for @site_url? :)

    Just saying! :D