Documents leaked by Edward Snowden last month reveal a $250M NSA program known as Operation BULLRUN, whose aim is to insert vulnerabilities into encryption systems and weaken cryptography standards. It now seems nearly certain that the NIST-certified random number generator Dual_EC_DRBG, adopted as the default in RSA Security’s BSAFE toolkit, contains a back door usable only by the NSA, which allows them to predict the entire future output of the generator given only 32 bytes.
So it’s not the easiest time for NIST to suggest making a cryptography standard weaker than originally proposed. Nevertheless, I support them in this, and I hope they go ahead with it.
Starting in 2004, major advances in the cryptanalysis of hash functions led to demonstrated breaks in the popular MD5, and breaks in SHA-1 which remain undemonstrated only because, it seems, no-one wants to rustle up the necessary computing power. While no attacks are known that reduce the theoretical security of the successor SHA-2 hash functions, they are similar enough to the broken designs to make people a little nervous. So, given the success of the earlier AES contest, in 2007 NIST announced a public competition to design a successor: SHA-3. They asked that each submission achieve the maximum possible security for a hash function with an n-bit output: 2^(n/2) security against collision attacks, and 2^n security against second preimage attacks.
Sixty-four submissions to the competition were reduced to fifty-one, then fourteen, then five, and finally in October 2012 a winner was announced: Keccak, from a team of four including AES co-designer Joan Daemen. Then, on August 22, at the CHES workshop in Santa Barbara, NIST cryptography expert John Kelsey announced that NIST proposed to make a few tweaks to the Keccak proposal in the final published standard. Three of the tweaks were uncontroversial, but the fourth caused a stir: a proposal to reduce the security of Keccak in order to improve performance. Fifteen days later, BULLRUN and NIST’s part in it were revealed.
To understand why this might be a good idea, it’s important to know a little about how Keccak works. At the core is the Keccak-f function, an unkeyed 24-round block cipher with a 1600-bit block. From this a “sponge construction” is used to build hash functions with various output sizes and security properties.
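To make the sponge idea concrete, here is a toy sketch in Python. The permutation below is a made-up stand-in, not Keccak-f (the real thing runs 24 rounds over a 1600-bit state), and the 8-byte rate and capacity, the simplified pad10*1 padding, and all the names are illustrative assumptions of mine:

```python
# Toy sponge construction. NOT real Keccak: the permutation, rate,
# capacity and padding below are simplified stand-ins for illustration.
RATE = 8       # bytes absorbed/squeezed per permutation call (toy value)
CAPACITY = 8   # bytes of hidden state (toy value; real Keccak uses c bits of 1600)
STATE_SIZE = RATE + CAPACITY

def toy_permutation(state: bytes) -> bytes:
    # Stand-in for Keccak-f: a fixed, public, invertible mixing of the state.
    s = bytearray(state)
    for r in range(4):
        s = s[1:] + s[:1]  # rotate bytes
        for i in range(len(s)):
            # 167 is odd, so this step is invertible mod 256
            s[i] = (s[i] * 167 + s[(i + 3) % len(s)] + r) % 256
    return bytes(s)

def sponge_hash(message: bytes, out_len: int) -> bytes:
    state = bytes(STATE_SIZE)  # all-zero initial state
    # Simplified pad10*1: append 0x01, zero-fill to a block, set top bit of last byte
    msg = message + b"\x01"
    msg += b"\x00" * (-len(msg) % RATE)
    msg = msg[:-1] + bytes([msg[-1] | 0x80])
    # Absorb: XOR each block into the rate portion only, then permute
    for i in range(0, len(msg), RATE):
        block = msg[i:i + RATE]
        state = bytes(a ^ b for a, b in zip(state[:RATE], block)) + state[RATE:]
        state = toy_permutation(state)
    # Squeeze: read output from the rate portion, permuting between reads
    out = b""
    while len(out) < out_len:
        out += state[:RATE]
        state = toy_permutation(state)
    return out[:out_len]
```

Note that the capacity portion of the state is never directly touched by input or output; that hidden part of the state is what the security argument rests on.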
Our faith in the Keccak-f function is purely heuristic: it has roughly twice as many rounds as it appears to need for full security; we have proofs that particular techniques such as linear and differential cryptanalysis will not work against it; and the best attack that the world’s best cryptanalysts have been able to publish requires an astonishing 2^1575 queries to break it. Even so, we are unable to prove that some novel and devastating attack will not, perhaps tomorrow, show it to be critically insecure. Every hash function will necessarily contain such a component, at least until a proof that P ≠ NP is found, and possibly long after that.
However, if we believe the Keccak-f function to be secure, the sponge construction allows us to prove very precise assertions about the security of the resulting hash function. It has a parameter called the “capacity”, measured in bits; a larger capacity means a slower but more secure hash function. With a c-bit capacity and a sufficiently large output, the resulting hash function will achieve 2^(c/2) security against collision attacks, and 2^(c/2) security against second preimage attacks. To meet the NIST requirement that SHA-3-512 have 2^512 security against second preimage attacks, therefore, the team proposed a very slow hash function with a gigantic 1024-bit capacity.
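These bounds amount to a few lines of arithmetic. Below is a rough sketch of the security levels claimed for a sponge with capacity c and n-bit output, assuming the underlying permutation is ideal; the function name is mine, and the collision figure also reflects the generic 2^(n/2) birthday bound on an n-bit output:

```python
def sponge_security_bits(capacity: int, output_bits: int) -> dict:
    """Rough sketch of sponge security levels (exponents of 2), assuming
    an ideal permutation: attacks below the generic bounds cost ~2^(c/2)."""
    return {
        # collisions: capped by both 2^(c/2) and the 2^(n/2) birthday bound
        "collision": min(capacity // 2, output_bits // 2),
        # second preimages: capped by both 2^(c/2) and the generic 2^n bound
        "second_preimage": min(capacity // 2, output_bits),
    }

# The SHA-3-512 proposal as submitted: c = 1024 to reach 2^512
# second-preimage security for a 512-bit output
print(sponge_security_bits(1024, 512))  # collision 256, second_preimage 512
```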
When it came to crafting the final version of the standard, NIST noticed something they should have noticed in 2007: no-one needs 2^512 security against second preimage attacks. It’s likely that you couldn’t perform 2^512 computing operations given all of the material and energy in the observable Universe, working from now until it undergoes heat death. It’s basically inconceivable that anyone will ever need more than 2^256 security against any attack for any purpose, and it is in any case silly to pretend that we know enough about cryptography to promise such insane heights of security. Making gigantic sacrifices in the performance of SHA-3-512 in order to pretend to reach such heights is therefore pointless, and will serve only to dissuade people from using the new standard when the far more practical 2^256 level of security is all they require. Kelsey proposed a simple change – for the most important output lengths of 256 and 512 bits, make the capacity of the hash function equal to the output size, so that the second preimage security equals the collision security, at 2^128 and 2^256 respectively.
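The performance side of the trade-off is easy to quantify: each call to Keccak-f processes r = 1600 − c bits of message, so long-message throughput is roughly proportional to the rate r. A back-of-envelope sketch (the variable and function names are mine):

```python
# Keccak absorbs r = 1600 - c bits of message per Keccak-f call, so
# long-message speed is roughly proportional to the rate r.
STATE_BITS = 1600

def rate_bits(capacity: int) -> int:
    return STATE_BITS - capacity

original = rate_bits(1024)  # SHA-3-512 as submitted: c = 1024 -> r = 576
proposed = rate_bits(512)   # Kelsey's tweak: c = 512 -> r = 1088
print(original, proposed)   # the rate, and hence throughput, nearly doubles
```

Halving the capacity from 1024 to 512 bits thus buys close to a 2x speedup on long messages, at a cost in second-preimage security that, as argued above, no-one will ever notice.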
As I hope I’ve made clear, I understand why people are made uncomfortable by this. Here’s what I want people to take away: