|
| trasz wrote:
| RSA, being an American company, cannot refuse the NSA's
| backdoors. Discovery of the backdoor hurt RSA's business, so
| it's understandable that RSA has beef with them.
| er4hn wrote:
| This wasn't added via a secret order, however. RSA had a
| business agreement with the NSA to add the backdoor. RSA was
| paid $10 million for this.
| rdtsc wrote:
| > RSA, being American company, cannot refuse NSA's backdoors.
|
| The key is selling to the American government, and any entity
| related to it. But no, they can't mandate that RSA build
| anything. Of course, if RSA refuses, they'll find another
| company that will, pay it lots of money, and then issue a
| certification requirement under which only this particular
| backdoored algorithm is "approved", and then wait for RSA to
| go out of business.
| elmo2you wrote:
| Since when can they not refuse an NSA backdoor? Where does the
| mandate come from, with which the NSA supposedly can instruct
| commercial/private entities to integrate technological
| backdoors? Does it even have such a legal mandate? I'm sure
| the NSA will argue that it does, but that doesn't mean it
| actually does.
| bdamm wrote:
| Government buyers that are less important (e.g. state level
| tollways) would be mandated to buy the backdoored algorithm
| by having the federal government cook it into a specification
| of how to buy tollway equipment, for example. Once the
| backdoored algorithm is in the product suite, it can be put
| to work on a more tactical level.
| trasz wrote:
| Some of the ways are already known: your company can be
| denied lucrative government contracts if you refuse. Or you
| might learn you can't export your products due to export
| restrictions. Other ways are known to exist, but the details
| are not available yet - go read about National Security
| Letters, or kangaroo "secret courts".
| HappySweeney wrote:
| Not that I don't agree, but how do you know the secret
| courts are kangaroo?
| Spooky23 wrote:
| Courts that aren't adversarial are just interpreting law.
| Secrecy makes it worse by eliminating accountability for
| both the petitioner and the judge.
|
| For a non-secret example, look at the Social Security
| "fair hearings", where an administrative law judge
| basically listens to a petition and makes a decision. The
| standards vary significantly by locale.
| kook_throwaway wrote:
| The fact that it's secret.
|
| Also that you aren't even allowed to show up to defend
| _yourself_. [1]
|
| Also that they denied 11 out of 34,000 requests over a
| 35-year period.
|
| Also that the judges are appointed by _one person_ and
| don't even need congressional approval.
|
| How could it possibly _not_ be a kangaroo court?
|
| [1] https://en.m.wikipedia.org/wiki/Ex_parte
| bdamm wrote:
| We keep having this come up with some of the EC curves like NIST
| P-256 for example. There's no evidence that it is actually
| backdoored, but the consensus seems to be that the construction
| is suspicious, unlike the construction for SHA-2.
|
| What do we do with it? Few people on a product development
| team that interacts with other companies or organizations can
| meaningfully defend not using a NIST curve just because it
| looks suspicious.
| fmajid wrote:
| It is difficult to get a man to understand something when his
| salary depends upon his not understanding it.
|
| -- Upton Sinclair
| johnklos wrote:
| From the wonderful fortune(6) database: Anyone
| who is capable of getting themselves made President should
| on no account be allowed to do the job. -- Douglas Adams,
| "The Hitchhiker's Guide to the Galaxy"
|
| I think the RSA chief can be trusted to do what's in the best
| financial interest of RSA, even when that contradicts doing
| the right thing, so long as there's plausible deniability.
|
| I'm glad this is being brought up and not forgotten.
| [deleted]
| CamperBob2 wrote:
| nullc's flagged comment may not have been the best way to get the
| point across, but it's an important point nevertheless.
| Conversations about the US intelligence community's repeated
| attempts to suppress and subvert modern encryption standards
| never seem to mention Crypto AG, perhaps the most egregious
| example we know about. A great article just came out that
| highlights some of the shenanigans:
|
| https://spectrum.ieee.org/the-scandalous-history-of-the-last...
| ... In 1966, the relationship among CAG, the NSA, and the
| CIA went to the next level. That year, the NSA delivered
| to its Swiss partner an electronic enciphering system
| that became the basis of a CAG machine called
| the H-460. Introduced in 1970, the machine was a
| failure. However, there were bigger changes afoot at CAG:
| That same year, the CIA and the German Federal
| Intelligence Service secretly acquired CAG for
| US $5.75 million.
|
| I'm surprised no one has submitted this one, actually.
| DaftDank wrote:
| Reading about this saga in Ben Buchanan's book "The Hacker and
| the State" made me realize how every government agency (NIST in
| this case) seems to be always second fiddle to the "needs" of the
| NSA/national security apparatus. It seems clear from the book
| that there was a point in time when they essentially just left it
| in the NSA's hands to develop, knowing it was probably not
| secure. Not exactly some huge revelation that the national
| security apparatus can exert power and leverage over other
| government groups, or even private companies, but the extent to
| which it happens was surprising.
| nullc wrote:
| Budiansky's Code Warriors emphasized the point that the NSA
| and its precursors have actively withheld information from the
| civilian government, including the president. Unfortunately,
| the very secrecy of it prevents us from knowing the full
| extent; we only know of the specific cases where it's been
| documented.
| er4hn wrote:
| Another problem is how NIST should come up with standards. NIST
| is in charge of standards, but that means that they need to
| turn to subject matter experts for each separate field. They
| need to define the standards for everything from measuring
| weights, to chemicals in wastewater, to cryptography.
|
| So for each standard you end up with the government
| equivalent of an open process where there are requests for
| comments, maybe a meeting or two to discuss, and trusted folks
| end up defining the bulk of the document with oversight from
| editors.
|
| Where this breaks down is when the subject matter expert on
| crypto in government, the NSA, is interested in undermining
| the standards for its own specialty to serve its internal
| agenda.
| tptacek wrote:
| Two things real quick:
|
| Art Coviello is a salesman who headed the company that _bought_
| RSA and took the name. It would be a little weird to expect him
| to meaningfully know what a cryptographer even is. The idea that
| Coviello would himself be weighing NIST against crypto eprints is
| pretty silly.
|
| And, more importantly, the only important cite here is Shumow and
| Ferguson. Schneier didn't analyze Dual EC (he never did work in
| elliptic curves at all, and claimed not to trust their math);
| here, he's simply reporting on Shumow and Ferguson's paper, and
| he doesn't even say Dual EC was backdoored. Nor, for that matter,
| do the cites before Shumow and Ferguson.
|
| (Before anyone jumps on my back about this: I basically shared
| Schneier's take on this, that Dual EC was too conspicuous to
| really be a backdoor, and that the right response was to ignore
| and never use it. I was wildly wrong about how prevalent Dual EC
| was --- I couldn't imagine any sane engineer adopting it, because
| it's slow and gross. If I'd known before the BULLRUN revelations
| that, for instance, every Juniper VPN box was using Dual EC, I'd
| have been a lot more alarmed and a lot less charitable about it.
| Oh well, live and learn.)
| jldugger wrote:
| > Art Coviello is a salesman who headed the company that bought
| RSA and took the name. It would be a little weird to expect him
| to meaningfully know what a cryptographer even is.
|
| I don't expect any random person to know, but why would anyone
| spend that much money to buy that company without doing enough
| due diligence to know what a cryptographer does? I don't
| imagine they'd be experts in cryptanalysis, but you'd likely
| listen to your own cryptographers on the RSA staff, right?
| S_A_P wrote:
| Not disagreeing with your take, but I think it's important to
| note that I just don't see it being possible that Art came up
| with his take without any input from folks in the company. I
| would imagine there were meetings where these talking points
| were constructed. Right?
| tptacek wrote:
| I think it seems crazy now, but that's because we know a lot
| more about the practical applications of malicious RNGs; they
| aren't an abstract concern now. But they kind of were when
| the big debate was alleged to have happened at RSA.
|
| Also: I'm naturally going to sound like I'm defending RSA
| here, and I am not. I feel like --- I'll probably be proven
| wrong by this in time because we live in a fallen world ---
| no major company in the world would in 2021 swap out a
| crucial cryptographic component for one DOD was demanding
| while cryptographers were making noise about how janky it is.
| That should have been the standard in 2007 or whatever, too.
| wahern wrote:
| > I think it seems crazy now, but that's because we know a
| lot more about the practical applications of malicious
| RNGs;
|
| RNGs were understood to be the lynchpin of secure systems
| for decades, including long before 2007; and it was also
| widely assumed, then as now, that they were one of the most
| common vectors for attack by the NSA.
|
| Why RSA added Dual_EC_DRBG is easy to explain in dollars &
| cents: 1) RSA was literally paid to add it, and 2) most of
| RSA's revenue comes, directly or indirectly, through
| government contracts (e.g. FIPS compliance, etc).
|
| As for why RSA insiders didn't speak up: there are
| mountains of scholarship explaining why people just keep
| their heads down. Even if you were absolutely convinced
| beyond a shadow of a doubt that Dual_EC_DRBG was a
| backdoor, intelligent people are very good at rationalizing
| things. Anybody who has worked at a large company,
| including RSA, understands that your day-to-day work and
| the company's business is as a practical matter <10%
| technical and >90% everything else (sales, profit seeking,
| integration, etc, etc). More importantly, if you're a
| company doing business in a space dominated by U.S.
| government requirements and processes, or even just
| patriotic, the NSA having a backdoor is hardly the worst
| thing in the world. There are amazing cryptographers in
| China. Even the ones who fancy themselves world citizens
| and above the fray of nationalism, how many do you think
| would stick their head out were they in a position to
| identify possible formal government attempts to manipulate
| technology?
|
| Moreover, a backdoor doesn't necessarily mean insecure;
| it's not a categorical truth that any backdoor means broken
| security. That's just an engineering rule of thumb, built on
| the experience that securely maintaining the keys to
| backdoors is supremely difficult, often more difficult than
| any other aspect. Nobody has yet come close to _breaking_
| Dual_EC_DRBG, AFAIU. From a purely technical perspective,
| Dual_EC_DRBG is still secure. The keys haven't leaked, and
| the algorithm remains as impenetrable as ever. At the end
| of the day, that's all the rationalization most people
| would ever need to keep their heads down. The "security" of
| Dual_EC_DRBG is a socio-political debate, not a technical
| one.
| tptacek wrote:
| I disagree with basically all of this.
|
| I disagree that cryptography engineers understood
| viscerally how good a target RNGs were or how viable a
| PKRNG would be (further evidence for that would be the
| contortions attackers have to go through to extract
| enough wire state from Dual EC to mount the most
| straightforward attacks). I think you can formulate an
| argument that any major cryptographic primitive is the
| "lynchpin", and indeed you see people doing that, for
| instance with the SIMON/SPECK block cipher designs ---
| block ciphers, after all, are the lynchpin of secure
| systems.
|
| I agree, obviously, that RSA added Dual EC because DOD
| demanded it. But most of RSA's revenue didn't come from
| BSAFE, or even things that relied on BSAFE. They were a
| crappy token company that bought RSA, then built a bunch
| of multi-factor authentication stuff that had more to do
| with IP reputation than with cryptography.
|
| I don't really buy that anybody working inside RSA was
| absolutely convinced that Dual EC was a backdoor. I sort
| of don't buy that anyone was really even seriously paying
| attention. I think people think of RSA as a cryptography
| company, but that is not at all what RSA was at the time
| this happened.
|
| None of this matters, really. We arrive at the same place
| about RSA's culpability. But if you came to HN hoping to
| find someone to stick up for RSA's decision here, you
| haven't been paying attention to the tenor of this place.
| All you're going to get here is hair splitting; that's
| the interesting conversation we can actually have.
| There's no viable debate about whether adopting Dual EC
| was defensible. Even when I was saying I doubted Dual EC
| was a backdoor, I still didn't think _using it_ was
| defensible.
| wahern wrote:
| > for instance with the SIMON/SPECK block cipher designs
| --- block ciphers, after all, are the lynchpin of secure
| systems.
|
| The lynchpin of a cipher is the key. That's the very
| definition--proof of security reduces to the question of
| whether you know the key or not.
|
| Unless you exchange a database of one-time pads, you
| invariably need an RNG to generate keys for your ciphers.
| _That's_ your lynchpin right there. The _key_ is the
| lynchpin, and RNGs generate your keys. You don't need to
| feel it; it's cryptography 101. Granted, it's such a
| basic and fundamental aspect of secure systems that it
| usually gets lost in all the bike-shedding.
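The point above can be seen in a minimal sketch: if keys come from a guessable RNG seed, an attacker never has to touch the cipher at all. This is hypothetical illustration code, using Python's (deliberately non-cryptographic) stdlib `random` module as a stand-in for a weak generator, and a made-up `keygen` helper:

```python
import random

def keygen(seed):
    """Derive a 128-bit key from an RNG; stand-in for a weak, seedable generator."""
    rng = random.Random(seed)            # NOT cryptographically secure
    return bytes(rng.randrange(256) for _ in range(16))

# Victim seeds from a low-entropy source (say, a coarse timestamp).
victim_key = keygen(1234)

# The attacker ignores the cipher entirely and enumerates candidate seeds:
recovered = next(keygen(s) for s in range(10_000) if keygen(s) == victim_key)
assert recovered == victim_key           # key recovered, cipher never attacked
```

However strong the cipher, its security here reduces to the roughly 13 bits of entropy in the seed.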
| healsjnr1 wrote:
| I've got some direct personal experience in this one. A
| few key points from how I saw it play out inside:
|
| - there was a lot of noise made about this by the BSAFE
| crypto team when it was first implemented (anecdotal, but
| I trust the people who were there, and the context below
| helps reinforce this). From what I heard there was clear
| communication that adding EC DRBG to the toolkits the way
| the NSA wanted was insecure.
|
| - that happened before my time, but by the time I got
| there it was kind of an inside joke that EC DRBG was an
| NSA backdoor (I think this was around 2010)
|
| - the above was tempered by the fact that it was so
| horrendously slow, no one could imagine it being used
|
| - even though RSA demanded it be the default RNG for the
| toolkit, the first part of the documentation strongly
| suggested changing this default
|
| - my memory is that this work on EC DRBG funded
| development of the BSAFE SSL toolkits. So while the money
| may have been relatively small, it opened up a new product
| line for BSAFE
|
| The smoking gun and the bit that made it really obvious
| that something was off about this came in its use as part
| of the TLS toolkits.
|
| There was an explicit, but unexplained, requirement that
| the _first 20 bytes_ of random generated during the
| handshake be sent unencrypted as part of the handshake.
|
| EAY led that crypto team; they knew their stuff, and they
| knew that this was off and that there was no legitimate
| reason for doing it.
|
| My take: this team knew what was happening and they made
| it clear to management. In reality, the people who made
| the decision to take NSA money knew what it was and the
| implications, and went ahead anyway.
|
| As a footnote, when we did the cleanup on this we found
| that in some of the toolkits the way the 20 bytes were
| sent was flawed, and would have meant that an attempted
| backdoor using this would have failed. Whether this was
| intentional or not, _shrug_.
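The trapdoor described above can be illustrated with a toy model of Dual EC. Every constant below (the tiny curve mod 97, the points Q and P, the scalar d) is made up for illustration; the real generator uses NIST P-256 and truncates its output, which is why the attacker wants raw handshake bytes. The core identity: whoever knows d with P = d*Q can turn one raw output point into the generator's next state.

```python
# Toy Dual_EC_DRBG trapdoor on the tiny curve y^2 = x^3 + 2x + 3 over GF(97).
# All constants are illustrative; the real DRBG uses NIST P-256.
p, a = 97, 2

def inv(x):
    return pow(x, p - 2, p)            # modular inverse via Fermat's little theorem

def add(A, B):                          # elliptic-curve point addition; None = infinity
    if A is None: return B
    if B is None: return A
    (x1, y1), (x2, y2) = A, B
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if A == B:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv((x2 - x1) % p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, A):                          # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = add(R, A)
        A = add(A, A)
        k >>= 1
    return R

Q = (0, 10)            # the "standard" point
d = 2                  # the trapdoor scalar: only its creator knows it
P = mul(d, Q)          # P and Q are claimed to be unrelated; here P = d*Q

def dual_ec_step(state):
    output = mul(state, Q)             # x-coordinate becomes the "random" bytes
    next_state = mul(state, P)[0]      # internal state update
    return output, next_state

s0 = 3
out_point, s1 = dual_ec_step(s0)

# Attacker observes the raw output point (cf. the unencrypted 20 bytes in the
# handshake) and applies the trapdoor: d*(s0*Q) = s0*(d*Q) = s0*P.
recovered_state = mul(d, out_point)[0]
assert recovered_state == s1           # attacker now predicts all future output
```

With the next state in hand, the attacker can run the generator forward and predict every "random" value the victim produces, which is exactly why seeing raw generator output on the wire was the smoking gun.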
| tptacek wrote:
| This is great.
|
| Just to be clear: the TLS integration and 20 bytes of
| random stuff was definitely a smoking gun; nobody thinks
| anything but that Dual EC is a backdoor after learning
| about it.
|
| EAY is Eric A. Young? I didn't realize he'd worked on
| BSafe.
| AlexCoventry wrote:
| > From a purely technical perspective, Dual_EC_DRBG is
| still secure.
|
| I think that depends on the techniques you're thinking
| of. The usual way of proving such a system is secure is
| to reduce a break to a solution of a bedrock problem like
| discrete log, and according to the second reference in
| the OP, "Cryptanalysis of the Dual Elliptic Curve
| Pseudorandom Generator", no such proof was provided in
| this case. I would say that without such a proof, it's
| not "technically" secure.
| skinkestek wrote:
| Just a quick thought:
|
| I think it wasn't that long before that the NSA had warned
| against some other crypto that was widely thought to be
| safe, and everyone later realized that it had been a good
| thing.
|
| Can it be that some people thought the NSA was doing them a
| favour again?
| tptacek wrote:
| You're probably thinking of DES, which happened long
| before Dual EC (and long before many of the people
| working at RSA started their careers). But you can see
| that effect even today, for instance with NSA's
| "deprecation" of Suite B cryptography and the shade that
| cast over conventional elliptic curve cryptosystems.
|
| I don't think one can reasonably defend adoption of Dual
| EC as somehow hedging a bet that NSA had found
| vulnerabilities in trivial block-based CSPRNGs, though. I
| think that decision was essentially indefensible, even at
| the time it was made; it's just more clearly batshit now
| than it was then.
| skinkestek wrote:
| Ok, thanks. I appreciate your opinion on it and guess you
| are right.
| adyavanapalli wrote:
| The link to the keynote wasn't resolving in the article, so
| here's the YouTube link: https://youtu.be/aB2gG-cRj10
| sneak wrote:
| It's important to remember that RSA received cash payments from
| the USG to backdoor this. It wasn't just an "oops, we were
| insufficiently vigilant". They actively participated.
| elmo2you wrote:
| > They actively participated.
|
| And that, in my opinion, makes them a criminal enterprise.
| Maybe not within a US context, for arguably the US government
| gave them a mandate for this deception. But within an
| international context they should probably be held accountable
| and barred from doing business abroad (as they are essentially
| an agent/extension of a US intel agency).
|
| Never going to happen, of course. Not with how that whole
| industry operates. But that only shows how little the whole
| lot of them and their industry should be trusted in the first
| place.
| some_furry wrote:
| I feel like this detail isn't emphasized enough in the coverage
| of RSA's participation with Dual EC. Wasn't it like $10
| million?
| sneak wrote:
| This is important to remember when you see industry
| professionals paying money to RSA to attend their events, or,
| worse yet, speaking at them.
|
| Supporting those who make us less safe is a clear signal
| about where your priorities lie.
| nullc wrote:
| WHAT DID YOU SAY? I CAN'T HEAR YOU OVER THE SOUND OF THE $10
| MILLION DOLLARS THAT JUST SPONTANEOUSLY LANDED IN MY LAP.
|
| CONCERNS? YES I AM CONCERNED THAT IF I AM NOT HELPFUL THE CIA
| WILL NOT COLLUDE WITH THE GERMANS TO PURCHASE MY COMPANY OUTRIGHT
| AND USE IT FOR DECADES TO SHIP BACKDOORED CRYPTOGRAPHIC PRODUCTS
| LIKE THEY DID WITH CRYPTO AG. WHAT WAS THAT YOU WERE SAYING ABOUT
| COWS AND FREE MILK?
___________________________________________________________________
(page generated 2021-09-03 23:02 UTC) |