On the Subversion of NIST by the NSA

Nadiya Kostyuk and Susan Landau wrote an interesting paper: “Dueling Over DUAL_EC_DRBG: The Consequences of Corrupting a Cryptographic Standardization Process”:

Abstract: In recent decades, the U.S. National Institute of Standards and Technology (NIST), which develops cryptographic standards for non-national security agencies of the U.S. government, has emerged as the de facto international source for cryptographic standards. But in 2013, Edward Snowden disclosed that the National Security Agency had subverted the integrity of a NIST cryptographic standard, the Dual_EC_DRBG, enabling easy decryption of supposedly secured communications. This discovery reinforced the desire of some public and private entities to develop their own cryptographic standards instead of relying on a U.S. government process. Yet, a decade later, no credible alternative to NIST has emerged. NIST remains the only viable candidate for effectively developing internationally trusted cryptography standards.

Cryptographic algorithms are essential to security yet are hard to understand and evaluate. These technologies provide crucial security for communications protocols. Yet the protocols transit international borders; they are used by countries that do not necessarily trust each other. In particular, these nations do not necessarily trust the developer of the cryptographic standard.

Seeking to understand how NIST, a U.S. government agency, was able to remain a purveyor of cryptographic algorithms despite the Dual_EC_DRBG problem, we examine the Dual_EC_DRBG situation, NIST’s response, and why a non-regulatory, non-national security U.S. agency remains a successful international supplier of strong cryptographic solutions.

Posted on June 23, 2022 at 6:05 AM

Comments

Clive Robinson June 23, 2022 7:21 AM

@ ALL,

Remember that I’m of the firm opinion that the NSA quite deliberately manipulated NIST and the AES contest.

The result was weak implementations of the AES algorithm with time-based side channels that could be exploited at considerable distance across a network, and beyond the ability of end-point users to see.

Consequently most AES implementations, and most of the original AES libraries, were “Not secure for On-Line use”.
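To make that concrete, here is a minimal C sketch, not real AES, of the secret-dependent table lookup at the heart of the classic “T-table” implementations; this is the leak D. J. Bernstein demonstrated remotely against OpenSSL’s AES in 2005:

#include <stdint.h>

/* Illustrative only: T0 stands in for one of the 1 KB lookup
   tables used by classic table-driven AES implementations. */
static uint32_t T0[256];

/* The index is derived from secret key/state bytes, so which
   cache lines of T0 get loaded, and hence how long the lookup
   takes, varies with the secret. Summed over many rounds and
   blocks, that timing difference becomes measurable, even
   across a network. */
uint32_t leaky_round_step(uint8_t secret_byte)
{
    return T0[secret_byte];
}

A constant-time implementation has to avoid secret-dependent memory addresses (and branches) entirely, which is exactly what most of the early AES libraries did not do.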

I can also see earlier NSA “finessing” in the development of mechanical cipher machines. It also became clear later that the Swiss-based Crypto AG was very much under the influence of, and helped by, the NSA and its predecessor agencies to harm and weaken many nations’ cryptographic systems.

So the question is not “are these points true?” but “can the NSA disprove them?”

My money is not on the NSA, as I happen to think the “points” are more probable than not, based on my own study of the behaviours of the NSA et al.

Lucy June 23, 2022 7:56 AM

I am in favor of abandoning the end-to-end architecture for a more controlled, network-centric model (no, I don’t mean certificate authorities). Unfortunately, this approach is associated with Russian efforts.

AES was selected years before anyone had ever heard of the “cloud.”

Still, this is where the “experts” work: singular algorithms that must be bulletproof (though no one ever really knows), and which are extremely sensitive to implementation.

j June 23, 2022 7:58 AM

hmm….

Seems to me that some problems are made to appear hard enough that few have the knowledge and resources to solve and evaluate them.

For me, if it is simple and easy to understand, I can be more sure of having something that really works.

The whole shared library discussion comes to mind. Why don’t compilers put out ‘linear C’? Turn your multi-file kludge into a single linear C program that can easily be examined, understood, and verified.

Now we have 10BASE2 coming back as single-pair Ethernet, as if that were new!

John

JonKnowsNothing June 23, 2022 8:01 AM

@Clive

re: can the NSA disprove them?

The current mantra of governments globally is the catch all phrase

  • We neither confirm nor deny ….

Someday I hope to see MSM reports doing a reality check on that statement. If it is not denied, then it’s confirmed.

  • Denial is asserting that a true statement is not true.

===

Search terms

Denial

TimH June 23, 2022 9:28 AM

It’s not just NIST/NSA deliberately weakening stuff. MS BitLocker is AES-128 by default, and you have to go into Group Policy to change it to 256-bit. It uses AES in CBC mode, with no choice of XTS mode. The Elephant diffuser was removed, the maximum password length is 20, the recovery PIN is AES-128 only, and the internet recovery option is obviously available to authorities.

A mess.

Wolfgang Rupprecht June 23, 2022 10:18 AM

The timing attacks on crypto algorithms seem much harder to solve in software than in hardware. I wonder how hard it would be to add a CPU instruction pair to do a time-based lock. At the start of the crypto operation the CPU would execute the start-timed-lock instruction; at the end it would execute the wait-for-timed-lock instruction. The downside is that this simplistic approach would require the lock to be set to the maximum possible execution time, and depending on the algorithm, this could lead to a huge slowdown. Just tossing this out there in case some bright CPU designer wants to run with it.
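In software today the same idea can only be approximated. Here is a minimal user-space sketch in C, assuming a POSIX monotonic clock; the function names are hypothetical, mirroring the proposed instructions:

#include <stdint.h>
#include <time.h>

/* Hypothetical emulation of the proposed instruction pair.
   budget_ns must cover the worst-case run time of the protected
   code, which is exactly the slowdown noted above. */
static struct timespec deadline;

static void start_timed_lock(uint64_t budget_ns)
{
    clock_gettime(CLOCK_MONOTONIC, &deadline);
    deadline.tv_sec  += (time_t)(budget_ns / 1000000000u);
    deadline.tv_nsec += (long)(budget_ns % 1000000000u);
    if (deadline.tv_nsec >= 1000000000L) {
        deadline.tv_sec  += 1;
        deadline.tv_nsec -= 1000000000L;
    }
}

static void wait_for_timed_lock(void)
{
    /* Busy-wait to the fixed deadline so the total elapsed time
       is (roughly) independent of what the crypto code did. */
    struct timespec now;
    do {
        clock_gettime(CLOCK_MONOTONIC, &now);
    } while (now.tv_sec < deadline.tv_sec ||
             (now.tv_sec == deadline.tv_sec &&
              now.tv_nsec < deadline.tv_nsec));
}

Note this only hides the coarse elapsed-time channel; cache and power side effects would still leak, which is why a real CPU feature would need to do more.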

Quantry June 23, 2022 12:26 PM

Yowza. The article* reads like pure NIST propaganda.

“NIST standards were—and continue to be [trusted].”

Why would it take over six years (March 2007 [1] through September 2013 [2]) to initiate a rethink of this DRBG? Just because that’s when Snowden embarrassed them a second time?

“Trusted” is not a term I would use.

Why anyone with enough influence would NOT “roll their own” is beyond me.

[1] https://www.wired.com/2007/11/securitymatters-1115/

[2] https://www.cryptrec.go.jp/en/topics/cryptrec-er-0001-2013.html

[*] Regarding https://harvardnsj.org/wp-content/uploads/sites/13/2022/06/Vol13Iss2_Kostyuk-Landau_Dual-EC-DRGB.pdf

Ted June 23, 2022 2:28 PM

@Quantry

Yowza. The article* reads like pure NIST propaganda.

Lol. I’m going to take a slightly different tack here and say I think the article is excellent. 🙂

Many of you are probably familiar with various parts of this history. However, it’s a significant and exceptionally informative piece for people like me. It covers much more than just the Dual_EC_DRBG debacle.

I agree with you that it’s definitely not anti-NIST. However, I think it highlights the material reasons that a reclaimed NIST plays such a substantial role in the cryptography standardization process and has so much buy-in from a wide range of well-informed groups.

I honestly don’t think I’m drinking the kool-aid. 😉

SpaceLifeForm June 23, 2022 6:40 PM

@ John, Ted

re: code review

Seems to me that some problems are made to appear hard enough that few have the knowledge and resources to solve and evaluate them.

This is True. But there are problems that are actually hard.

Your rabbit holes on this can start with Bison, Flex, and BNF. You will learn about this if you build your own toolchain from source code.

And about building make with no make.

I will not mention cross-compiling.

Whoops, I just did. Ignore cross-compiling at first. It is a CF.

Did I mention configure? Wait until you understand how that works.

For me, if it is simple and easy to understand, I can be more sure of having something that really works.

This is True. On the surface.

But I may tell you some stuff about a simple C “Hello World” program.

Jon June 23, 2022 8:14 PM

OT, but perhaps not entirely: Anyone have any sense about Richard Clarke’s current preoccupations? Perhaps insights on what he conceptualizes as his career arc?

fanny June 23, 2022 8:47 PM

@ Quantry

Yowza. The article* reads like pure NIST propaganda. […] “Trusted” is not a term I would use.

Look up the information-theoretical definition of “trust” and reconsider that. It’s not something we should actually want, but arguably fits the reality of NIST and the NSA. (And even an ordinary dictionary reveals “trust” to potentially be more about predictability than helpfulness.)

The article makes the statement that “NIST remains the only viable candidate for effectively developing internationally trusted cryptography standards.” In practice, D. J. Bernstein seems to have been fulfilling this role since Snowden’s leaks, to an extent that worries some.

Joe in TX June 23, 2022 9:37 PM

I think people are looking too hard.

Relying on an external entity with reasonable assurance grants you near immunity from screw-ups. Why would you create or promote your own standard, when that just makes you a target to be called dumb within 18 months? ENISA can’t be blamed because they weren’t involved! 😀

It’s risk avoidance.

SpaceLifeForm June 24, 2022 2:23 AM

@ John, Ted

re: Code Review

Hello World!

Why don’t compilers put out ‘linear C’? Turn your multi-file kludge into a single linear C program that can easily be examined, understood, and verified.

Did I mention m4 and macro processing?

So, I typed in the minimal hello.c program (via vim, natch).

It is 6 lines of text. I am going to try to show you the source code, using a combination of markdown and html. Yeah, I know, but I am a glutton for punishment, and have now spent almost an hour on this.

If it does not show, you can find it on the web.

Anyway, doing this live, no preview. I hope my eyeball parser is working.

cat hello.c
#include <stdio.h>
int main()
{
printf("Hello World!");
return 0;
}

But, then I run it thru the preprocessor and output that to a text file.

gcc -E hello.c > pre.c

The file pre.c is still readable C source code.

But now, in my test, it is 748 lines of source code instead of 6. Granted, there are many empty lines, but there is also a huge number of external library declarations that do not even apply functionally to the behaviour of the hello world program.

Basically, you avoid this exercise unless you really, really suspect a compiler bug. Did I mention that there are problems that are actually hard?

Point being, a simple 6 line C program really grows in size because of the include directive. The remaining 5 lines are intact.

Can you imagine doing this with a larger codebase?

Also, note that all comments in the source code are lost.

You do comment your source code, right?

SpaceLifeForm June 24, 2022 3:20 AM

@ John, Ted, Clive

re: code review

Eyeball parser worked. Combination of markdown and html macro processing worked. I think I understand the filter order problem, if you avoid preview.

Another way to look at this: I spent an hour on code review, and the output is as expected and readable. It compiled cleanly on the first try!

Clive Robinson June 24, 2022 4:57 AM

@ SpaceLifeForm, John, Ted

re: Code Review

But it’s not just the pre-processor that adds to the load of code you do not see…

Most C compilers work in an OS environment process space, which requires a whole load of (often assembler) code.

Often from the crt0 file… That sets up the environment interface, going through to a second file that sets up the process space and eventually “jumps to __main”.

Oh and remember, with few exceptions nearly every programming language goes back to C in some way.

Such is the way of the unseen Dantean code world.

wiredog June 24, 2022 5:18 AM

The reason we trust NIST is that crypto is hard. In my sinful youth I bought a copy of Applied Cryptography and played around with implementing my own system. Ever try to build a good random number generator? Oof. And can I trust the Borland C compiler? And, well, I know my limits.
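The classic trap from that era, sketched in C: seed the C library’s small linear PRNG with the clock and call it a key generator. An attacker who knows roughly when the key was made has only a handful of seeds to search:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* NOT cryptographic: rand() is a small deterministic PRNG,
       and time(NULL) is guessable to within seconds. */
    srand((unsigned)time(NULL));

    unsigned char key[16];
    for (int i = 0; i < 16; i++)
        key[i] = (unsigned char)(rand() & 0xff);   /* 128 "key" bits, far less entropy */

    for (int i = 0; i < 16; i++)
        printf("%02x", key[i]);
    putchar('\n');
    return 0;
}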

Denton Scratch June 24, 2022 6:14 AM

The fall of the Soviet Union […] this was also the time just before the development of the public Internet

That would be “just before the development of the worldwide web”.

When the [Computer Security] Act was passed in 1987, the public Internet was not a reality

It doesn’t surprise me when journalists conflate the internet with the worldwide web. It does surprise me when it’s done by well-informed experts.

The agency sought to strengthen its ties to cryptographic standards organizations

The authors do this quite a lot: referring to “the agency” in a context where they probably mean NIST, not “The Agency” (but it’s not clear).

The paper describes why, despite the DUAL_EC_DRBG debacle, NIST is still trusted to set cryptography standards. The reasons seem to be:

  • NIST now has built-in technical expertise, which it previously lacked
  • NIST standard-setting processes are much more transparent
  • The relationship between NIST and the NSA has been reformed.

But the way I read the paper, the main reason is that no other standards agency can do the job, because NIST has exclusive power to set FIPS standards. That, it seems to me, is an awful reason to trust someone – “because there’s nobody else”. We didn’t know about the NSA’s ability to corrupt NIST processes back in the 80s; how do we know there aren’t new channels for corruption now? This is the same organisation that approved Clipper as a FIPS. They may have new processes, but they don’t have a new boss.

Clive Robinson June 24, 2022 6:40 AM

@ Wiredog, All,

Re : It’s tough to do it right.

Ever try to build a good random number generator?

Yes… I mentioned some of the issues here a few years back, including the importance of screening.

As this involves not just the E field but also the H field, I mentioned Mu-metal (pronounced “mew-metal”); another poster took considerable exception to what I was saying and basically called me a charlatan…

Then, when

https://www.lightbluetouchpaper.org/2009/09/08/tuning-in-to-random-numbers/

appeared, they did not apologise…

Anyway, as for Borland and its C compilers, I rather liked the early pre-Windows ones, and still have one running on a Linux box under a DOS emulator, to support some “old software” I wrote back in the past…

Worse, I still have my own “hand crafted” C compiler tool chain, based on a predecessor of Small-C…

Yes, back when “Software Developers” on 8-bit and 16-bit computers were enjoying the “cave-man” bear-skin and sabre-tooth-tiger existence of the Z80 and 6502, and were slowly scratching our heads over Intel’s weird 8086/8, an obvious “non-starter” when there was Motorola’s so much superior 68K, which you could get BSD Unix running on…

Anyway, some people reading along might not have been a twinkle in their Dad’s eye, and conceivably even their Dad may not have been a twinkle in their Grandfather’s eye ={

Gunslinger June 27, 2022 1:52 AM

I agree with TimH: it’s not just the NSA weakening crypto. Another example is IPsec. There are a lot of options, and the government does provide guidance on what it considers most secure, called IPMEIR (IPSec Minimal Essential Interoperability Requirements), but no manufacturers provide the ability to choose the recommended options for your VPN. Only if you set up your own IPsec system do you get that chance, and most users aren’t doing that.

Phillip Coffman July 3, 2022 10:06 PM

I thought provable security was the contemporary focus. That any algorithm might rest on authority is silly, if you are really being honest. A one-way function in a finite space? While quantum computing is something of a contemporary topic, for now we continue arguing over authority. It does not really matter who has governance, were one to insist on provability. Sorry, I just reflexively reject conspiracy theories for any silly reason. Snowden v. NSA, whatever… You are not keeping your eye on the ball. And yes, cryptography becomes pathologically confusing for far too many, a fact one should never use as a basis; mercy me, too much. Get over President Nixon.

SpaceLifeForm July 4, 2022 5:30 AM

@ Phillip Coffman, Ted, Clive, ALL

No Algorithm is dependent upon any Authority.

You can use whatever cryptography you want. The problem will be the momentum of the choice, which all kinds of software will end up using, especially browsers.

It does not have to be a conspiracy theory at all, it could just end up being poor decisions.

Razor fight ensues. See Occam v Hanlon.

And, if it turns out that the decisions were poor, and that the algorithm is easily attackable, well then, how fast can the problems be fixed?

It will not be fast.

Remember, this theoretical quantum attack has not really been demonstrated in reality (except at small bit lengths); to our knowledge there is no hardware that is effective, even though Shor showed that it is possible.
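For reference, the heart of Shor’s result fits in a line. A quantum computer finds the period $r$ of $a^x \bmod N$ in polynomial time; if $r$ is even and $a^{r/2} \not\equiv -1 \pmod{N}$, then

$$\gcd(a^{r/2} - 1,\, N) \quad\text{and}\quad \gcd(a^{r/2} + 1,\, N)$$

are nontrivial factors of $N$. Everything except finding $r$ is easy classically, which is why the whole threat hangs on hardware that does not yet exist at scale.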

But, if a bad choice is made, under some alleged time constraint (because the quantum cryptography ghost is approaching – be scared!), and the code gets rolled out to the world, but ends up easily attackable, then even if the ghost never shows up, Houston, we have a problem!

The cryptography choice may end up being attackable with current hardware.

This is a classic situation where you do not want to bleed.

Clive Robinson July 4, 2022 7:55 AM

@ SpaceLifeForm,

Re : No Algorithm is dependent upon any Authority.

Actually its “use” is, by what they try to call “convention” and we sometimes call force majeure.

For instance, a US Government authority can insist on the “use” of FIPS-approved systems only (and they do).

Whilst FIPS or similar can give the illusion of freedom of choice, it can be rigged to force a single choice. In essence that was what the NSA tried to do with NIST and RSA with the Dual EC algorithm…
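For those who have not seen it, the publicly documented shape of that rigging (Shumow and Ferguson, 2007) fits in a couple of lines. With the standard’s fixed curve points $P$ and $Q$ and internal state $s_i$, each round computes

$$s_{i+1} = x(s_i P), \qquad r_i = x(s_{i+1} Q),$$

and outputs all but the top 16 bits of $r_i$. Anyone who chose the constants knowing a $d$ with $P = dQ$ can recover a point $R$ with $x(R) = r_i$ (a small search over the truncated bits) and compute

$$dR = s_{i+1}(dQ) = s_{i+1} P, \qquad x(dR) = s_{i+2},$$

that is, one block of output reveals the next internal state, and hence all future output.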

There are other examples that can be given but that alone should be sufficient.

So “Choice” and “Freedom” are like that “grass on the other side of the fence”: green, lush and desirable; but the man who controls the fence keeps you penned in to his choice, denying you the freedom of yours.

If you want to see such “fence control” in action outside of government, look at any major web browser and how they effectively force you to do what Google and similar want…
