One rationale seems to be the standardization of PQ cryptography, and thus the ability to go directly from weaker cryptography to PQ rather than in two steps (112 -> 128 -> PQ).
On the chopping block:
* ECB (\o/)
* Triple DES (TDEA)
* Finite field DSA (for new signatures)
* ECDSA at strengths lower than 112 bits
* RSA below 2048 bits
* RNGs, HMACs, HKDF, PBKDF and hashes based on SHA1 and the truncated 224-bit SHA-2/3 modes
No big surprises. The 224s are interesting because, folklorically, they have value in hash constructions where resistance to length extension is useful. In practice, everyone just uses HMAC anyway.
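(Illustrative, not from the thread: a minimal Python sketch of the HMAC point, using only the standard library; the key and message are made-up values.)

```python
import hashlib
import hmac

key = b"an example secret key"       # made-up value for illustration
msg = b"message to authenticate"

# Plain SHA-256 of key||msg is vulnerable to length extension:
# anyone who sees this digest can append data without knowing the key.
naive_tag = hashlib.sha256(key + msg).hexdigest()

# HMAC sidesteps length extension regardless of the underlying hash,
# which is why the 224-bit modes' resistance rarely matters in practice.
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

print(naive_tag)
print(tag)
```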
On the one hand I am glad that ECB officially dies as a mode; on the other hand I wonder what NIST officially recommends when you want to encrypt data that's shorter than one block. xD
Regarding finally transitioning away from SHA1: about fucking time :D
For instance, CTR mode can be used to encrypt any number of bits, down to a single bit.
The problem with the other modes, compared to ECB, is that they require generating and transmitting an "initialization vector", i.e. either a counter value or a random number, depending on the mode, so besides the short ciphertext a whole extra block must be transmitted. This can be avoided only when a set of small messages is treated as part of one long encrypted sequence, so that the mode is not reinitialized for each new message but the last state is remembered.
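(A minimal sketch of the CTR point, assuming the pyca/cryptography package; any AES-CTR implementation works the same way. The ciphertext is exactly as long as the plaintext, but the nonce still has to travel with it.)

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)      # AES-256 key
nonce = os.urandom(16)    # the "initialization vector" the parent mentions

# CTR is a stream mode: one byte of plaintext -> one byte of ciphertext.
enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = enc.update(b"\x01") + enc.finalize()

assert len(ciphertext) == 1
# ...but the receiver still needs the 16-byte nonce, so the
# transmitted message grows from 1 byte to 17 bytes.
```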
ECB is a valid encryption mode only when it is used to encrypt random values having the length of one block (or other data for which there is a strong guarantee that no values will repeat). It is secure for challenge-response authentication, if the challenges are unpredictable random numbers. ECB would be a perfectly secure method for encrypting other encryption keys, which must be random, except that one might want to encrypt, together with the key values, other data such as identifiers or error-detection codes, in which case ECB could not be used to encrypt the additional non-random data.
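(A sketch of the one use the parent calls valid, again assuming pyca/cryptography: ECB-encrypting a value that is itself a fresh random block.)

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

kek = os.urandom(32)          # key-encryption key
data_key = os.urandom(16)     # exactly one AES block, uniformly random

# ECB leaks equality between blocks; a single uniformly random block
# has no repeats to leak, so this narrow case is sound.
enc = Cipher(algorithms.AES(kek), modes.ECB()).encryptor()
wrapped = enc.update(data_key) + enc.finalize()
```

In practice, a dedicated key-wrap mode such as AES-KW (SP 800-38F) covers this case, including the identifiers and integrity check the parent mentions.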
"Triple AES" sounds as something insecure if it is similar to triple DES. (Insecure in the sense of providing a small additional strength obtained with a big increase in the time and energy needed for encryption/decryption.)
Only amateurs would choose to implement a "Triple AES", so it is very likely that they would also write a buggy implementation. Triple DES was used not because it was a good strengthening method, but only because it could run on unmodified hardware modules designed for single DES. When a cipher is strengthened in software, there are much better methods.
The best way to strengthen AES above the standard AES-256 is to double the block length from 128 bits to 256 bits. Increasing the key length over 256 bits is much less useful, because the key length is not the weakest point of AES-256. A 256-bit key is strong enough even against quantum computers, but short 128-bit blocks can be a vulnerability in certain applications. The key schedule algorithm of AES, which converts the cipher key into a set of round keys, is mediocre, so the length of the cipher key is the least important concern about the strength of AES.
The original Rijndael proposal had a stronger variant with 256-bit blocks, which was not retained in the standard. Nevertheless, it is easy to implement with the Intel/AMD AES instructions or with the Arm AArch64 AES instructions. Intel even published an application note describing how to do this, back when the AES instructions were introduced in the Westmere CPUs.
After increasing the block length, increasing the number of rounds can provide additional strengthening. Another choice would be to replace the standard key schedule with a stronger non-standard algorithm (i.e. one producing more random round keys). Increasing the key beyond 256 bits provides much less useful strengthening in comparison with the cost of the additional operations required.
NIST is basically the publishing arm of the NSA, so it really depends on whether the NSA is taking the "protect national information assets" or the "attack foreign information assets" part of its mandate more seriously from year to year.
NIST does a lot of really neat work outside of crypto standards. Judah Levine and all the other metrology folks are awesome. It's unfortunate that they get grouped together by comments like this.
Sorry, yes I only meant in the context of cryptography of course. NIST is a great organization and it's really a historical accident that they do anything with crypto.
I'm surprised to see symmetric algorithms in this list. It's been a while since I worked adjacent to the field (I'm not a cryptographer but spent a lot of time working with them in a past life), but my understanding is that PQ refers to replacing those algorithms that are vulnerable to advances in quantum computing, e.g., public-key algorithms such as RSA, whose security rests on the difficulty of factoring and which are therefore subject to attack by efficient implementations of Shor's algorithm.
AIUI, symmetric algorithms such as 3DES are not subject to these attacks (Grover's algorithm at worst halves their effective key strength), but my understanding could be wrong.
Both ECB and TDEA are dangerously outmoded even if quantum cryptanalysis is never realized; ECB because you can see penguins through it, and TDEA because of the 8-byte block size.
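(The penguin effect in a few lines, same assumed package: identical plaintext blocks encrypt to identical ciphertext blocks under ECB, so structure shows straight through.)

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)
pt = b"SIXTEEN BYTE BLK" * 2   # two identical 16-byte blocks

ct = Cipher(algorithms.AES(key), modes.ECB()).encryptor().update(pt)
assert ct[:16] == ct[16:]      # the repetition is visible in the ciphertext
```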
Makes sense to skip the 128-bit step and go straight to post-quantum (PQ) cryptography, especially if it’s inevitable. And yeah, good riddance to ECB, that should’ve been axed ages ago.
Rumors suggest a toy 22-bit RSA factorization was recently demonstrated in China on D-Wave quantum annealing platforms, and several details of the paper on the scaling potential were censored.
I.e., the NIST advice to incorporate quantum-resistant algorithms shouldn't be taken lightly. For some, transitioning means wrapping a well-tested RSA system in something newer like FIPS 203, 204, or 205.
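(A rough sketch of that "wrap RSA in something newer" idea: derive the session key from both an RSA-transported secret and an ML-KEM (FIPS 203) shared secret, so breaking either scheme alone isn't enough. The two secrets are placeholders here since there's no stdlib ML-KEM API to assume; the HKDF combiner uses pyca/cryptography.)

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Placeholders: in a real system these would come from RSA key
# transport (e.g. RSA-OAEP) and an ML-KEM (FIPS 203) encapsulation.
ss_rsa = os.urandom(32)    # hypothetical RSA-transported secret
ss_mlkem = os.urandom(32)  # hypothetical ML-KEM shared secret

# Hybrid combiner: the session key depends on both inputs, so an
# attacker must break both the classical and the PQ scheme.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"example-hybrid-kem-v1",  # made-up context label
).derive(ss_rsa + ss_mlkem)
```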
We live in interesting times for certain, as GnuPG with Kyber support has static build failures on some platforms (libassuan 3.0.1 bug). =3
I don't know of anyone working in the space that takes that demonstration seriously, but I didn't go digging much; let me know if you find someone. For a lot of cryptography engineers, the mention of "D-Wave" is enough to shut down the inquiry.
I'm referring to the specific D-Wave China RSA demonstration you're talking about, which I've been reading cryptographers dunking on.
Cards on the table, my position on quantum cryptanalysis remains: "Rodents of unusual size? I don't think they exist." It's a very big deal because it's a full employment program for people working on novel asymmetric schemes.