Learn Physics or Get Bored Trying

Has buffer protection become second-nature?

I took a break from the physics recently to have a crack at the challenge that is now known to have been posted by GCHQ. And it was really rather fun, once I got into it. My day job as a developer these days is almost exclusively based around the sort of high-level languages that almost seem divorced from the machine code they ultimately represent, so “getting my hands dirty” with assembly language and stack pointers was refreshing, if something of a busman’s holiday. A couple of nights ago I reached the end, and while the jobs being advertised weren’t of interest (wrong country, wrong pay-grade), it was still rather satisfying, if lacking in cake.

[Mild spoiler alert; I’ll try to avoid giving away anything too major, but some hints might be dropped.]

But having completed the challenge, and knowing that I’d almost certainly done much of it the “hard” way (for which, read blindly fumbling while flailing maniacally), I thought it would be instructive to peruse some of the solutions and walkthroughs that others have been posting lately. Indeed, not having studied much cryptography at all, I hadn’t been aware that one of the stages was actually working through the RC4 algorithm to encrypt/decrypt that stage’s payload. As far as I could tell, it was a black box, turning gibberish into something slightly less gibberish. One of the stages required us to implement a simple virtual machine in order to make sense of a data dump, and it was interesting to compare my implementation (knocked together quite inexpertly in JavaScript: if anyone asks, I can see if I can still dig it out) with others written in C and Python, for example.
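For anyone curious about the black box, RC4 is short enough to sketch in a few lines. This is a generic textbook implementation in Python, not the challenge’s own code; because RC4 just XORs a keystream with the data, the same function encrypts and decrypts:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """RC4: key-scheduling (KSA) then keystream generation (PRGA),
    XORing the keystream with the input byte by byte."""
    # KSA: permute the state array S under the key
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # PRGA: emit one keystream byte per input byte and XOR
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

Applying the function twice with the same key round-trips the data, which is what makes it usable for both directions of the challenge stage.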

The final part of the puzzle involved downloading an .exe file, and reverse-engineering it to figure out how to feed it a suitable “license” file to produce a URL that leads to the passphrase for the challenge page. And here I noticed something I thought quite interesting from reading other people’s solutions. It turns out from a brief inspection of the executable that it depends on the Cygwin “crypt” library, and what is quite clearly a hashed password appears in the file’s data block. So cracking that hash will give us something we can use somehow within the license file; examining the assembly code will clarify the “somehow” part. The code for loading the license file into memory turns out to be pretty straightforward, so what I did was to grab a password cracker (I used John the Ripper) and let it spin for a few hours. Eventually the password dropped into the vending slot, and I was good to go (once a few more details were cleared up).
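The dictionary attack John the Ripper performs is conceptually very simple. Here is a toy sketch of the idea in Python: the target hash and wordlist are made up for illustration, and SHA-256 stands in for the DES-based crypt() the real binary used, but the loop is the same.

```python
import hashlib

def crack(target_hash, wordlist):
    """Hash each candidate word and compare against the target;
    return the first match, or None if the wordlist is exhausted.
    (SHA-256 is a stand-in here for crypt(); the principle is identical.)"""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Hypothetical target: in the challenge, this came from the .exe's data block
target = hashlib.sha256(b"letmein").hexdigest()
found = crack(target, ["password", "123456", "letmein", "qwerty"])
```

Real crackers add salting, mangling rules, and brute-force fallbacks on top of this, which is why my run took hours rather than milliseconds.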

And from what I’ve read, that seemed to have been most folks’ solution. A few noted in passing that one of the instructions in the code had a potential buffer-overflow problem, but no-one seemed to care. However, on one blog (containing videos explaining solutions for all parts), the author (who had commented on the buffer problem) noted that GCHQ subsequently contacted him to say that they were aware of the vulnerability, and that indeed it was deliberately placed there to allow people to skip the password hash check.

Well I’ll admit I felt kind of silly at this point. As soon as I read that, it not only became obvious how I could’ve avoided a few hours of password-grinding, but indeed it went some way to explaining the slightly quirky way in which the check was being performed by the code: it was laid out that way in order to make it easier to exploit the buffer overflow to get the desired result. But what I found really interesting was the fact that clearly a number of people spotted the dodgy code, but it didn’t occur to them to use it as part of a code-breaking and “hacking” challenge!

Of course this is all merely anecdotal, but it made me wonder whether, for some people at least, the security message might be sinking in—when unsafe code is so immediately recognisable to a developer that their inner alarm bells sound before they’ve even had time to process what’s wrong, perhaps we’re finally starting to outgrow the bad habits that pre-Internet innocence fostered. Buffer overflows aren’t a major issue for the kinds of development we do at my current job, but if I could install in any developer who comes to work for us a nervous tic at the merest hint of unsafe SQL parameter passing (there is no excuse), I’d like to think it would make the world just a little better.
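To make the “no excuse” point concrete, here is a minimal illustration of the difference, using Python’s sqlite3 module with a made-up users table. Splicing user input into the SQL text lets a crafted value rewrite the query; parameter binding makes the driver treat the value purely as data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # DANGEROUS: attacker-controlled input becomes part of the SQL text
    return conn.execute(
        f"SELECT secret FROM users WHERE name = '{name}'").fetchall()

def lookup_safe(name):
    # Parameter binding: the value can never change the query's structure
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

evil = "nobody' OR '1'='1"
# lookup_unsafe(evil) leaks every row; lookup_safe(evil) returns nothing
```

The unsafe version turns the injected `OR '1'='1'` into part of the WHERE clause, which is exactly the nervous-tic trigger I have in mind.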

At the same time, though, the message is only any use if people remember why it was written in the first place. Taking part in challenges such as this, or the various educational hacking sites, I think helps developers stay in touch with the reasons behind some of the do’s and don’ts of secure coding.


December 5, 2011 - Posted by | Tech
