Whither FTP?

I recently installed an update of a software package running on an Amazon EC2 host.

In the configure step I found an unsatisfied dependency: it wanted ossp-uuid, which was not available on the system.  Neither was yum able to find it: there was an alternative uuid package, but no hint of anything from OSSP.  I turned up some problems with yum too (a hung security-update process from weeks ago and a corrupted database), but that’s another story.  Checking my box at home, I saw why I hadn’t stumbled on the dependency before: ossp-uuid is installed as a standard package here.  A case of different distros carrying different packages in their standard repos.
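For anyone wanting to reproduce the check, it came down to roughly this (a sketch from memory; exact package names and output vary by distro):

    yum search uuid          # finds an alternative "uuid" package, but nothing from OSSP
    yum info ossp-uuid       # no matching package in this distro’s repos
    rpm -qa | grep -i uuid   # on the home box, ossp-uuid shows up as already installed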

In the absence of a package, installing from source seemed the obvious thing to do.  So I made my way to ossp.org, from which navigating to an ossp-uuid source download is easy.  Reassuringly, Ralf Engelschall is in charge (whois lists him too), but worryingly none of the packages are signed.  A summary look at the source suggests it’s fine, though I don’t have time for an exhaustive review.  In the unlikely event of a trojan package having found its way onto the site, I expect some reader of this blog will alert me to the story!
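The build itself, once the tarball is in hand, is the usual autoconf routine; something along these lines (version number and prefix illustrative):

    tar xzf uuid-1.6.2.tar.gz
    cd uuid-1.6.2
    ./configure --prefix=/usr/local
    make
    sudo make install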

Anyway, that’s getting ahead of myself.  The unexpected problem I faced was actually downloading the package, which is available only over FTP.  Firefox from home timed out; lynx and perl’s GET from the EC2 machine returned an unhelpful error.  It looks like a firewall in the way of FTP building its data connection.  Installing an old-fashioned command-line ftp, I found neither active nor passive mode would work, meaning neither the client nor the server could initiate the data connection.
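For anyone wanting to see both modes fail for themselves, curl makes the distinction explicit; the URL below is illustrative, and in the classic ftp client the passive command toggles the same behaviour:

    # passive mode: the client opens the data connection to a port the server announces
    curl --ftp-pasv -O ftp://ftp.ossp.org/pkg/lib/uuid/uuid-1.6.2.tar.gz

    # active mode: the server connects back to a port the client advertises with PORT
    curl -P - -O ftp://ftp.ossp.org/pkg/lib/uuid/uuid-1.6.2.tar.gz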

Before going into an exhaustive investigation of those firewall components over which I have control (my router being #1 suspect at home), I decided to try other routes.  The problem was resolved when I found I could access the FTP server from my own (webthing) web server, then make the package available over HTTP from there to the EC2 box.
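In outline the workaround was no more than this, with hostnames and paths as placeholders for my own setup:

    # on the web server, which can still reach the FTP site
    cd /var/www/html        # or wherever the document root lives
    wget ftp://ftp.ossp.org/pkg/lib/uuid/uuid-1.6.2.tar.gz

    # on the EC2 box, plain HTTP gets through where FTP did not
    wget http://my-server.example.org/uuid-1.6.2.tar.gz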

In the Good Old Days before the coming of web browsers and BitTorrent, FTP was THE protocol for transferring files.  In 1990s web browsers it shared equal status with HTTP and others, and even into this century it was widely seen as superior to HTTP for data transfer, particularly for bigger files.

Now, by contrast, the widespread use of blind firewalls requires me to jump through hoops just to use the protocol.  The everything-over-HTTP world I once published a rant about is coming to pass, and it is not a good thing.

Posted on May 1, 2012, in ftp, http, internet, security.

  1. I may have got a bit lost in your explanation there, but… is it your own firewall that’s causing the problem? Sounds like it, if you could FTP from another server.

    For what it’s worth, I use FTP for business purposes on a daily basis. Provided both sides are expecting it, it works just fine. And it’s by far the quickest way to copy a whole folder structure containing a thousand-odd files. I’m not sure why HTTP couldn’t work just as well in principle, except that the clients (at least, the ones I’m familiar with) aren’t designed with this kind of use-case in mind.

    Excuse my ignorance, but – when FTP times out, does it return any kind of diagnostic information that might help you to deduce where the problem is? Some kind of traceroute-like feedback would be handy, although I have no idea whether it would be feasible.

  2. My own firewall, possibly (I said my router was #1 suspect). But not necessarily: it could be something at my ISP or elsewhere. That’s kind-of my point: looking from a consumer perspective, I now have to jump through hoops to use FTP!

    In any case, I wouldn’t expect that to apply to boxes at Amazon/EC2, a major “cloud” provider!

    The timeout doesn’t tell me anything. That’s kind-of the point of a packet-filtering firewall: you set it to drop incoming packets on an unauthorised port, so an attacker gets no information back. A few years ago everyone implementing such a firewall knew about FTP (and other services like RPC) data connections on pseudo-random ports, and accepted those coming from existing control connections – meaning an established session. Now it seems that kind of provision is fading into history.
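    The kind of provision I mean is only a couple of lines on a netfilter box: load the FTP connection-tracking helper, then accept whatever data connections it marks as related. A minimal sketch, assuming you control the firewall and that rule order suits the rest of your setup:

        # teach conntrack to follow the ports negotiated on the FTP control channel
        modprobe nf_conntrack_ftp
        # allow the control connection, then anything belonging to an established session
        iptables -A INPUT -p tcp --dport 21 -j ACCEPT
        iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT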

  3. From a consumer perspective – you’re undoubtedly right that it’s becoming harder to do anything online that can’t be done with Internet Explorer. But arguably it was always hard to do these things (can you imagine your father doing them?) – the big shift is that such a wide range of services now are available through a web browser, and that is slowly eating away at the need for all the other protocols out there.

    For what it’s worth, I blame Google. I think they’ve taken over from Microsoft as the driving force behind dumbing down the Internet.

    Back in 2004, Joel Spolsky wrote of the endangered market for desktop apps: “The future belongs to the people who can make Javascript sing”. That’s a prophecy I’ve seen come closer to fulfilment every year since then. The whole “browser-is-the-computer” idea was trendy in the late 90s, but it’s only now that the effects are becoming apparent. And one of those effects is that – there’s no room, on a consumer-grade PC, for anything other than HTTP. (Not over TCP, anyway. There’s still room for a few UDP-based apps, such as Skype and online games.)
