|
| throwaway654329 wrote:
| The history in this blog post is excellently researched on the
| topic of NSA and NIST cryptographic sabotage. It presents some
| hard won truths that many are uncomfortable to discuss, let alone
| to actively resist.
|
| The author of the blog post is also well known for designing and
| releasing many cryptographic systems as free software. There is a
| good chance that your TLS connections are secured by some of
| these designs.
|
| One of his previous lawsuits was critical to practically
| protecting free speech during the First Crypto War:
| https://en.m.wikipedia.org/wiki/Bernstein_v._United_States
|
| I hope he wins.
| nimbius wrote:
| The author was also part of the Linux kernel Speck cipher
| discussions, which broke down in 2018 due to the NSA's
| stonewalling and hand-waving when asked for technical data and
| explanations.
|
| NSA's Speck was never adopted.
|
| https://en.m.wikipedia.org/wiki/Speck_(cipher)
| ddingus wrote:
| Interesting read!
| aliqot wrote:
| Given his track record, and the actual meat of this suit, I
| think he has a good chance.
|
| - He is an expert in the domain
|
| - He made a lawful request
|
| - He believes he's experiencing an obstruction of his rights
|
| I don't see anything egregious here. Being critical of your
| government is a protected right in the USA. Everyone gets a moment
| to state their case if they'd like to make an accusation.
|
| Suing sounds offensive, but that is the official process for
| submitting an issue that a government can understand and
| address. I'm seeing some comments here that seem aghast at the
| audacity to accuse the government at your own peril, and it
| shows an ignorance of history.
| newsclues wrote:
| Trump Card: National Security
| CaliforniaKarl wrote:
| That's a valid reason (specifically, 1.4(g) listed at
| https://www.archives.gov/declassification/iscap/redaction-
| co...). And while NIST returning such a response is
| possible, it would go against its commitment to transparency.
|
| But still, that requires a response, and there hasn't been
| one.
| maerF0x0 wrote:
| I'd add
|
| * and it's been 20 years since the 9/11 attacks, which
| precipitated a lot of the more recent dragnets
| kevin_thibedeau wrote:
| The dragnets existed before 9/11. That just gave
| justification for even more funding.
| throwaway654329 wrote:
| Which programs do you mean specifically?
|
| We know the nature of the mass surveillance changed and
| expanded immensely after 9/11, especially
| domestically.
| KennyBlanken wrote:
| Every piece of mail that passes through a high-speed
| sorting machine is scanned, front and back, OCR'd, and
| stored - as far as we know, indefinitely. That's how they
| deliver the "what's coming in your mailbox" images you
| can sign up to receive via email.
|
| Those images very often show the contents of the envelope
| clearly enough to recognize and even read them,
| which I'm quite positive isn't an accident.
|
| The USPS is literally reading and storing at least part
| of nearly every letter mailed in the United States.
|
| The USPS inspectors have a long history of being used as
| a morality enforcement agency, so yes, this should be of
| concern.
| greyface- wrote:
| Some more details: https://en.wikipedia.org/wiki/Mail_Iso
| lation_Control_and_Tra...
| throwaway654329 wrote:
| Agreed. It's even worse: they also have the capability
| with the "mail covers" program to divert and tamper with
| mail. This happens to Americans on U.S. soil and I'm not
| just talking about suspects of terrorism.
| UpstandingUser wrote:
| I've heard rumors that this was going on for a long time
| before it was publicly acknowledged -- before OCR should
| have been able to handle that sort of variety of
| handwriting reliably, let alone at scale. Like a
| snail-mail version of the NSA metadata collection
| program.
| nuclearnice1 wrote:
| Apparently not a pre-9/11 program, if Wikipedia is
| correct.
|
| https://en.m.wikipedia.org/wiki/Mail_Isolation_Control_an
| d_T...
| fanf2 wrote:
| TFA says: _<>_ (July 2001,
| that is)
| throwaway654329 wrote:
| Yes, Duncan Campbell's report is legendary ( https://www.
| duncancampbell.org/menu/surveillance/echelon/IC2... ).
| This is the same guy who revealed the existence of GCHQ,
| and was arrested for this gift to the public.
|
| To clarify, I was asking them for their specific favorite
| programs as they didn't indicate they only meant the ones
| in the blog post.
| michaelt wrote:
| There was the Clipper Chip [2] and the super-weak 40-bit
| 'export strength' cryptography [3] and the investigation
| of PGP author Phil Zimmermann for 'munitions export
| without a license' [4].
|
| So there was a substantial effort to weaken cryptography,
| decades before 9/11.
|
| On the dragnet surveillance front, there have long been
| rumours of things like ECHELON [1] being used for mass
| surveillance and industrial espionage. And the simple
| fact that US spies were interested in weakening export SSL
| rather implied, to a lot of people, that they had easy
| access to the ciphertext.
|
| Of course, this was before so much stuff had moved
| online, so it was a different world.
|
| [1] https://en.wikipedia.org/wiki/ECHELON [2]
| https://en.wikipedia.org/wiki/Clipper_chip [3] https://en
| .wikipedia.org/wiki/Export_of_cryptography_from_th... [4]
| https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Crimina
| l_i...
| feet wrote:
| I'll also add that these programs have not prevented anything,
| and are instead used in parallel construction to go after
| Americans.
| gene91 wrote:
| I don't like the collateral damage of many policies. But
| it's not fair to say that the policies "have not
| prevented anything" because we simply don't know. The
| policies could have stopped in-progress evil acts (which
| were never revealed to the public for intel reasons)
| or prevented attempts at evil acts (well, nothing
| happened, nothing to report).
| Quekid5 wrote:
| One cannot prove a negative, but given how much public
| recording of _everything_ there is these days (and in the
| last decade+), I'd say it's safe to err on the side of
| them not having prevented much of consequence. ("Absence
| of evidence..." doesn't really apply when evidence
| _should_ be ample for the phenomenon to be explained.)
| colonwqbang wrote:
| The bar for public policy should be set quite a bit
| higher than "it could have done some good at some point,
| maybe".
|
| In comic books, we read fanciful stories about the good
| guys saving the world in secret. But the real world
| doesn't really work like that.
|
| When the police seize some illegal drugs, what is the
| first thing they do? They snap a picture and publish it
| for society to see:
|
| https://www.google.com/search?q=police+seize+drugs&tbm=is
| ch
|
| because citizens want to see that their tax money is
| being used successfully. The same would likely be done by
| the surveillance authorities if they saw significant
| success in their mission.
| feet wrote:
| I find it rather funny that we know about the parallel
| construction which they attempt to keep hidden, yet don't
| know about any successful preventions. I would assume
| they would at least want people to know if a program was
| a success. To me, the lack of information speaks volumes
|
| This is on top of all the entrapment that we also know
| about, performed by the FBI and associated informants on
| Islamic/Muslim communities
|
| The purpose of a system is what it does
| sweetbitter wrote:
| Considering that they do not obey the law, if they had
| actually stopped any terrorists we would be hearing all
| about it from "anonymous leakers" by now.
| [deleted]
| maerF0x0 wrote:
| It also could have stopped the Gods from smiting us all,
| but there's no evidence that it has.
|
| This article[1] is a good start at realizing the costs
| outweigh the benefits. There's little or no evidence of
| good caused, but plenty of evidence of harms caused.
|
| [1]: https://www.eff.org/deeplinks/2014/06/top-5-claims-
| defenders...
| daniel-cussen wrote:
| There is evidence of that, in fact. There were many
| serious terrorist attacks in Europe, like in Spain's
| subway (300 dead) and Frankfurt, in the aftermath of 9/11
| and other...uh, how am I gonna say this...other stuff; the
| Spanish terrorist attacks were done by Basque
| nationalists or such, not Muslims.
|
| So there's your control group, Europe.
| fossuser wrote:
| I remember reading about this in Steven Levy's Crypto and
| elsewhere, there was a lot of internal arguing about lots of
| this stuff at the time and people had different opinions. I
| remember that some of the suggested changes from NSA shared
| with IBM were actually stronger against a cryptanalysis attack
| on DES that was not yet publicly known (though at the time
| people suspected they were suggesting this because it was
| weaker, the attack only became publicly known later). I tried
| to find the specific info about this, but can't remember the
| details well enough. _Edit: I think it was this:_
| https://en.wikipedia.org/wiki/Differential_cryptanalysis
|
| They also did intentionally weaken a standard separately from
| that and all the arguing about 'munitions export' intentionally
| requiring weak keys etc. - all the 90s cryptowar stuff that
| mostly ended after the clipper chip failure. They also worked
| with IBM on DES, but some people internally at NSA were upset
| that they shared this after the fact. The history is a lot more
| mixed with a lot of people arguing about what the right thing
| to do is and no general consensus on a lot of this stuff.
| api wrote:
| > I remember that some of the suggested changes from NSA
| shared with IBM were actually stronger against a
| cryptanalysis attack on DES that was not yet publicly known
|
| So we have that and other examples of NSA apparently
| strengthening crypto, then we have the dual-EC debacle and
| some of the info in the Snowden leaks showing that they've
| tried to weaken it.
|
| I feel like any talk about NSA influence on NIST PQ or other
| current algorithm development is just speculation unless
| someone can turn up actual evidence one way or another. I can
| think of reasons the NSA would try to strengthen it and
| reasons they might try to weaken it, and they've done both in
| the past. You can drive yourself nuts constructing infinitely
| recursive what-if theories.
| kmeisthax wrote:
| The NSA wants "NOBUS" (NObody-But-US) backdoors. It is in
| their interest to make a good show of fixing easily-
| detected vulnerabilities while keeping their own
| intentional ones a secret. The fantasy they are trying to
| sell to politicians is that people can keep secrets from
| other people but not from the government; that they can
| make uncrackable safes that still open when presented with
| a court warrant.
|
| This isn't speculation either; Dual_EC_DRBG and its role as
| a NOBUS backdoor was part of the Snowden document dump.
| api wrote:
| Here's the counter-argument that I've seen in
| cryptography circles:
|
| Dual EC, a PRNG built on an asymmetric-crypto template,
| was kind of a ham-fisted and obvious NOBUS backdoor. The
| math behind it made such a backdoor entirely plausible.
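|
| (A sketch of the alleged mechanism, per the public 2007
| Shumow-Ferguson analysis; notation simplified and from memory,
| so treat the details as approximate. With curve points P and Q
| and internal state t_i, each step is roughly:
|
|     s_i     = x(t_i * P)
|     r_i     = x(s_i * Q)      <- output, top 16 bits dropped
|     t_(i+1) = x(s_i * P)      <- next state
|
| If someone knows a secret e with P = e*Q, then from a single
| output they can recover the point R = s_i * Q, compute
| x(e*R) = x(s_i * P) = t_(i+1), and predict every subsequent
| output. Nobody but whoever generated Q is supposed to know e.)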
|
| That's less obvious in other cases.
|
| Take the NIST ECC curves. If they're backdoored it means
| the NSA knows something about ECC we don't know and
| haven't discovered in the 20+ years since those curves
| were developed. It also means the NSA was able to search
| all ECC curves to find vulnerable curves using 1990s
| technology. Multiple cryptographers have argued that if
| this is true we should really consider leaving ECC
| altogether. It means a significant proportion of ECC
| curves may be problematic. It means for all we know
| Curve25519 is a vulnerable curve given the fact that this
| hypothetical vulnerability is based on math we don't
| understand.
|
| The same argument could apply to Speck:
|
| https://en.wikipedia.org/wiki/Speck_(cipher)
|
| Speck is incredibly simple with very few places a
| "mystery constant" or other back door could be hidden. If
| Speck is backdoored it means the NSA knows something
| about ARX constructions that we don't know, and we have
| no idea whether this mystery math also applies to ChaCha
| or BLAKE or any of the other popular ARX constructions
| gaining so much usage right now. That means if we
| (hypothetically) knew for a fact that Speck was
| backdoored _but not how it's backdoored_, it might make
| sense to move away from ARX ciphers entirely. It might
| mean many or all of them are not as secure as we think.
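|
| To illustrate how little room there is to hide anything, here
| is roughly what a Speck-style round looks like (a from-memory
| sketch in Python using the 64-bit-word rotation amounts from
| the published description; treat the parameters as
| illustrative, the point is the shape):
|
|     MASK = 2**64 - 1  # 64-bit words
|
|     def ror(x, r):
|         return ((x >> r) | (x << (64 - r))) & MASK
|
|     def rol(x, r):
|         return ((x << r) | (x >> (64 - r))) & MASK
|
|     def round_fn(x, y, k):
|         # add-rotate-xor: this, iterated, is the whole cipher
|         x = ((ror(x, 8) + y) & MASK) ^ k
|         y = rol(y, 3) ^ x
|         return x, y
|
| No S-boxes and no magic constants in the round function, so
| there is nowhere obvious for a trapdoor to live except in the
| math of ARX itself.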
| fossuser wrote:
| I think it's just both. It's a giant organization of people
| arguing in favor of different things at different times
| over its history; I'd guess there's disagreement
| internally. Some arguing it's critical to secure encryption
| (I agree with this camp), others wanting to be able to
| break it for offense reasons despite the problems that
| causes.
|
| Since we only see the occasional stuff that's unclassified
| we don't really know the details and those who do can't
| share them.
| throwaway654329 wrote:
| There are plenty of leaked classified documents from NSA
| (and others) that have been verified as legitimate. Many
| people working in public know stuff that hasn't been
| published in full.
|
| Here is one example with documents:
| https://www.spiegel.de/international/world/the-nsa-uses-
| powe...
|
| Here is another:
| https://www.spiegel.de/international/germany/inside-the-
| nsa-...
|
| Please read each and every classified document published
| alongside those two stories. I think you may revise your
| comments afterwards.
| throwaway654329 wrote:
| You are not accurately reflecting the history that is
| presented in the very blog post we are discussing.
|
| NSA made DES weaker for _everyone_ by reducing the key size.
| IBM happily went along. The history of IBM is dark. NSA-
| credited tweaks to DES can be understood as ensuring that _a
| weakened DES stayed deployed longer_ which was to their
| advantage. They clearly explain this in the history quoted by
| the author:
|
| "Narrowing the encryption problem to a single, influential
| algorithm might drive out competitors, and that would reduce
| the field that NSA had to be concerned about. Could a public
| encryption standard be made secure enough to protect against
| everything but a massive brute force attack, but weak enough
| to still permit an attack of some nature using very
| sophisticated (and expensive) techniques?"
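|
| (For scale: a 56-bit key gives 2^56, roughly 7.2 * 10^16,
| possibilities. That was out of reach for hobbyists but not for
| a well-funded agency, and by 1998 the EFF's ~$250k Deep Crack
| machine was recovering DES keys in days; IBM's earlier Lucifer
| design had used 128-bit keys.)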
|
| They're not internally conflicted. They're strategic
| saboteurs.
| fossuser wrote:
| > "NSA credited tweaks to DES can be understood as ensuring
| that a weakened DES stayed deployed longer which was to
| their advantage. They clearly explain this in the history
| quoted by the author"
|
| I'm not sure I buy that this follows; wouldn't the weakened
| key size also make people not want to deploy it, given that
| known weakness? To me it reads more that some people wanted
| a weak key so NSA could still break it, but other people
| wanted it to be stronger against differential cryptanalysis
| attacks and that they're not really related. It also came
| across that way in Levy's book where they were arguing
| about whether they should or should not engage with IBM at
| all.
| throwaway654329 wrote:
| It follows: entire industries were required to deploy DES
| and the goal was to create one thing that was "strong
| enough" to narrow the field.
|
| Read the blog post carefully about the role of NBS, IBM,
| and NSA in the development of DES.
|
| It's hard to accept because the implications are
| upsetting and profound. The evidence is clear and
| convincing. Lots of people try to muddy the waters; please
| don't help them.
| bragr wrote:
| >IBM happily went along. The history of IBM is dark.
|
| Then, as now, I'm confused why people expect these kinds
| of problems to be solved by corporations "doing the right
| thing" rather than demanding some kind of real legislative
| reform.
| throwaway654329 wrote:
| Agreed. It can be both but historically companies
| generally do the sabotage upon request, if not
| preemptively. This hasn't changed much at all in favor of
| protecting regular users, except maybe with the expansion
| of HTTPS, and a few other exceptions.
| thorwayham wrote:
| dig @1.1.1.1 blog.cr.yp.to is failing for me, but 8.8.8.8 works.
| Annoying!
| jcranmer wrote:
| If anyone is curious, the courtlistener link for the lawsuit is
| here: https://www.courtlistener.com/docket/64872195/bernstein-v-
| na...
|
| (And somebody has already kindly uploaded the documents to RECAP,
| so it costs you nothing to access.)
|
| Aside: I really wish people would link to court documents
| whenever they talk about an ongoing lawsuit.
| xenophonf wrote:
| Good god, this guy is a bad communicator. Bottom line up front:
|
| > _NIST has produced zero records in response to this [March
| 2022] FOIA request [to determine whether/how NSA may have
| influenced NIST's Post-Quantum Cryptography Standardization
| Project]. Civil-rights firm Loevy & Loevy has now filed suit on
| my behalf in federal court, the United States District Court for
| the District of Columbia, to force NIST to comply with the law._
|
| Edit: Yes, I know who DJB is.
| jcranmer wrote:
| That is truly burying the lede...
|
| I spent most of the post asking myself "okay, I'm guessing this
| is something about post-quantum crypto, but _what_ are you
| actually suing about? "
| [deleted]
| kube-system wrote:
| Well, he is an expert in cryptic communication
| lizardactivist wrote:
| An expert, prominent, and someone who the whole cryptography
| community listens to, and he calls out the lies, crimes, and
| blatant hypocrisy of his own government.
|
| I genuinely fear that he will be suicided one of these days.
| gred wrote:
| This guy is the best kind of curmudgeon. I love it.
| bumper_crop wrote:
| This definitely has the sting of bitterness in it. I doubt djb
| would have filed this suit if NTRU Prime had won the NIST PQC
| contest. It's hard to evaluate this objectively when there
| are strong emotions involved.
| [deleted]
| cosmiccatnap wrote:
| It's funny how often the bitterness of a post is used as an
| excuse to dismiss the long and well documented case being made.
| bumper_crop wrote:
| If NTRU Prime had been declared the winner, would this suit
| have been filed? It's the same contest, same people, same
| suspicious behavior from NIST. I don't think this suit would
| have come up. djb is filing this suit because of alleged bad
| behavior, but I have doubts that it's the real reason.
| throwaway654329 wrote:
| Yes, I think so. His former PhD students were among the
| winners in round three and he has other work that has also
| made it to round four. I believe he would have sued even if he
| had won every single area in every round. This is the Bernstein
| way.
|
| The behavior in question by NIST isn't just alleged - look
| at the FOIA ( https://www.muckrock.com/foi/united-states-
| of-america-10/nsa... ). They're not responding in a
| reasonable or timely manner.
|
| Does that seem like reasonable behavior by NIST to you?
|
| To my eyes, it is completely unacceptable behavior by NIST,
| especially given the timely nature of the standardization
| process. They don't even understand the fee structure
| correctly; it's a comedy of incompetence at NIST.
|
| His FOIA predates the round three announcement, and he filed
| suit fairly quickly; many requesters wait much longer before
| filing suit.
| pixl97 wrote:
| When it comes to the number of times DJB is right versus the
| number of times that DJB is wrong, I'll fully back DJB. Simply
| put, the NSA/NIST cannot and should not be trusted in this case.
| bumper_crop wrote:
| You misread. I'm saying his reasons for filing are in
| question. NIST probably was being dishonest. That's not the
| reason there is a lawsuit though.
| throwaway654329 wrote:
| They're not in question for many people carefully tracking
| this process. He filed his FOIA before the round three
| results were announced.
|
| The lawsuit is because they refused to answer his
| reasonable and important FOIA in a timely manner. This is
| not unlike how they also delayed the round three
| announcement.
| lawrenceyan wrote:
| Here's an interesting question. Even if post-quantum cryptography
| is securely implemented, doesn't the advent of neurotechnology
| (BCIs, etc.) make that method of security obsolete?
|
| With read and write capability to the brain, assuming this comes
| to fruition at some point, encryption as we know it won't work
| anymore. But I don't know, maybe this isn't something we have to
| worry about just quite yet.
| Banana699 wrote:
| The thing you're missing is that BCIs and friends are,
| themselves, computers, and thus securable with post-quantum
| cryptography, or any cryptography for that matter, or any means
| of securing a computer. And thus, for somebody to read-write to
| your computers, they need to read-write to your brain(s), but
| to read-write to your brain(s), they need to read-write to the
| computers implanted in your brain(s). It's a security cycle
| whose overall power is determined by the least-secure element
| in the chain.
|
| Any sane person will also not touch BCIs and similar technology
| with a 100 lightyear pole unless the designing company reveals
| every single fucking silicon atom in the hardware design and
| every single fucking bit in the software stack at every level
| of abstraction, and ships the device with several redundant
| watchdogs and deadmen timers around it that can safely kill or
| faraday-cage the implant on user-defined events or manually.
|
| Alas, humans are very rarely sane, and I come to the era of bio
| hacking (in all senses of the word) with low expectations.
| yjftsjthsd-h wrote:
| The encryption is fine, that's just a way to avoid it. Much
| like how tire-iron attacks don't _break_ passwords so much as
| bypass them.
| lawrenceyan wrote:
| Ok that's actually a great point. To make the comparison:
|
| Tire-irons require physical proximity. And torture generally
| doesn't work, at least in the case of getting a private key.
|
| Reading/writing to the brain, on the other hand, requires no
| physical proximity if wireless. And the person(s) won't even
| know it's happening.
|
| These seem like totally different paradigms to me.
| ziddoap wrote:
| I think we are a _long_ way away from being able to
| wirelessly read a few specific bytes of data from the brain
| of an unknowing person. Far enough away that I'm not sure
| it's productive to begin thinking of how to design
| encryption systems around it.
| lawrenceyan wrote:
| Memory and experience aren't encoded in the brain like
| traditional computers. There's no concept of a "byte"
| when thinking about the human computational model.
| aaaaaaaaata wrote:
| > And torture generally doesn't work, at least in the case
| of getting a private key.
|
| This seems incorrect.
| PaulDavisThe1st wrote:
| > torture generally doesn't work, at least in the case of
| getting a private key.
|
| Why not?
| [deleted]
| [deleted]
| lysergia wrote:
| Yeah I've even had very personal dreams where my Linux root
| password was spoken in the dream. I'm glad I don't talk in my
| sleep. There are also truth serums that can be weaponized in war
| scenarios to extract secrets from the enemy without resorting
| to torture.
| xenophonf wrote:
| Cryptographic secrets stored in human brains are already
| vulnerable to an attack mechanism that requires $5 worth of
| interface hardware that can be procured and operated with very
| little training. Physical security controls do a decent job of
| preventing malicious actors from connecting said hardware to
| vulnerable brains. I assume the same would be true with the
| invention of BCIs more sophisticated than a crescent wrench.
| politelemon wrote:
| So, a question then: isn't one of the differences between this
| selection, compared to previous ones, that some of
| the algorithms are open source with their code available?
|
| For example, Kyber, one of the finalists, is here:
| https://github.com/pq-crystals/kyber
|
| And even where it's not open source, I believe everyone included
| reference implementations in the first-round submissions.
|
| Does the code being available make it easy to verify whether
| there are shady shenanigans going on, even without NIST's
| cooperation?
| aaaaaaaaaaab wrote:
| What? :D
|
| Who cares about a particular piece of source code?
| Cryptanalysis is about the _mathematical_ structure of the
| ciphers. When we say the NSA backdoored an algorithm, we don't
| mean that they included hidden printf statements in "the source
| code". It means that mathematicians at the NSA have knowledge
| of weaknesses in the construction, that are not known publicly.
| [deleted]
| gnabgib wrote:
| Worth noting DJB (the article author) was on two teams competing
| against (and losing to) Kyber[0] in Round 3, and has an open
| submission in Round 4 (still in progress). That's going to
| slightly complicate any FOIA until after the fact, or it
| should. Not that there's no merit in the request.
|
| [0]: https://csrc.nist.gov/Projects/post-quantum-
| cryptography/pos...
| greyface- wrote:
| > the Supreme Court has observed that a FOIA requester's
| identity generally "has no bearing on the merits of his or
| her FOIA request."
|
| https://www.justice.gov/archives/oip/foia-
| guide-2004-edition...
| throwaway654329 wrote:
| It is wrong to imply he is unreasonable here. NIST has been
| dismissive and unprofessional towards him and others in this
| process. They look terrible because they're not doing their
| jobs.
|
| Several of his students' proposals won the most recent round.
| He still has work in the next round. NIST should have
| answered in a timely manner.
|
| On what basis do you think any of these matters can or may
| complicate the FOIA process?
| lostcolony wrote:
| Not really. For the same reason that "here's your github login"
| doesn't equate to you suddenly being able to be effective in a
| new company. You might be able to look things up in the code
| and understand how things are being done, but you don't know
| -why- things are being done that way.
|
| A lot of the instances in the post even show the NSA giving a
| why. It's not a particularly convincing why, but it was enough to
| sow doubt. The reason to make all discussions public is so that
| there isn't an after the fact "wait, why is that obviously odd
| choice being done?" but instead a before the fact "I think we
| should make a change". The burden of evidence is different for
| that. A "I think we should reduce the key length for
| performance" is a much harder sell when the spec already
| prescribes a longer key length, than an after the fact "the
| spec's key length seems too short" "Nah, it's good enough, and
| we need it that way for performance". The status quo always has
| inertia.
| ehzy wrote:
| Ironically, when I visit the site Chrome says my connection is
| not secured by TLS.
| kzrdude wrote:
| I was hoping for ChaCha20+Poly1305
| bsaul wrote:
| Side question:
|
| I've only recently started to dig a bit deeper into crypto
| algorithms (looking into various types of curves, etc.), and it
| gave me the uneasy feeling that the whole industry is relying
| on the expertise of only a handful of guys to actually ensure
| that the crypto schemes used today are really working.
|
| Am I wrong? Are there actually thousands and thousands of people
| with the expertise to actually prove that the algorithms used
| today are really safe?
| [deleted]
| jacooper wrote:
| Filippo Valsorda and Matthew Green aren't too happy.
|
| https://twitter.com/matthew_d_green/status/15556838562625208...
| dt3ft wrote:
| Perhaps the old advice ("never roll your own crypto") should be
| reevaluated? If you're creative enough, you could combine and
| apply existing algorithms in such a way that the result would be
| very difficult to decrypt. Think 500 programmatic combinations
| (steps) of encryption applying different algorithms. Content
| encrypted this way would require knowledge of the encryption
| sequence in order to execute the required steps in reverse. No
| amount of brute force could help here...
| TobTobXX wrote:
| > Would require knowledge of the encryption sequence...
|
| This is security by obscurity. Reputable encryption schemes work
| under the assumption that the attacker has full knowledge of the
| encryption/decryption process.
|
| You could, however, argue that the sequence then becomes part of
| the key. But this key [i.e. the sequence of encryptions]
| would then be at most as strong as the strongest encryption in
| the sequence, which kind of defeats the purpose.
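|
| If you do want layering, the sane version is a cascade of
| vetted AEADs with independent keys, where the construction is
| public and only the keys are secret. A rough sketch (assuming
| the pyca/cryptography package; illustrative, not audited):
|
|     import os
|     from cryptography.hazmat.primitives.ciphers.aead import (
|         AESGCM, ChaCha20Poly1305,
|     )
|
|     # Independent keys per layer; algorithm choice and order are
|     # public, only the keys are secret (Kerckhoffs's principle).
|     k1 = AESGCM.generate_key(bit_length=256)
|     k2 = ChaCha20Poly1305.generate_key()
|
|     def cascade_encrypt(plaintext: bytes) -> bytes:
|         n1, n2 = os.urandom(12), os.urandom(12)
|         inner = n1 + AESGCM(k1).encrypt(n1, plaintext, None)
|         return n2 + ChaCha20Poly1305(k2).encrypt(n2, inner, None)
|
|     def cascade_decrypt(blob: bytes) -> bytes:
|         inner = ChaCha20Poly1305(k2).decrypt(blob[:12], blob[12:], None)
|         return AESGCM(k1).decrypt(inner[:12], inner[12:], None)
|
| Even then, assume the attacker knows every step; the secrecy
| has to live entirely in the keys.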
| thrway3344444 wrote:
| Why is the link in the URL http: not https: ? Irony?
| cosmiccatnap wrote:
| If you spend all day making bagels do you go home and make
| bagels for dinner?
|
| It's a static text blog, not a bank
| theandrewbailey wrote:
| The NSA has recorded your receipt of this message.
| sam0x17 wrote:
| Well https uses the NIST standards so.... ;)
| theknocker wrote:
| eointierney wrote:
| Yippee! DJB for the win for the rest of us!
| sgt101 wrote:
| yeah, but where do all these big primes come from?
| pyuser583 wrote:
| Please include links with https://
| tptacek wrote:
| I may believe almost all of this is overblown and silly, as like
| a matter of cryptographic research, but I'll say that Matt Topic
| and Merrick Wayne are the real deal, legit the lawyers you want
| working on something like this, and if they're involved,
| presumably some good will come out of the whole thing.
|
| Matt Topic is probably best known as the FOIA attorney who got
| the Laquan McDonald videos released in Chicago; I've been
| peripherally involved in some work he and Merrick Wayne did for a
| friend, in a pretty technical case that got fierce resistance
| from CPD, and those two were on point. Whatever else you'd say
| about Bernstein here, he knows how to pick a FOIA lawyer.
|
| A maybe more useful way to say the same thing is: if Matt Topic
| and Merrick Wayne are filing this complaint, you should probably
| put your money on them having NIST dead-to-rights with the FOIA
| process stuff.
| api wrote:
| I don't think it's a bad thing to push back and demand
| transparency. At the very least the pressure helps keep NIST
| honest. Keep reminding them over and over and over again about
| dual-EC and they're less likely to try stupid stuff like that
| again.
| tptacek wrote:
| Transparency is good, and, as Bernstein's attorneys will ably
| establish, not optional.
| ddingus wrote:
| It's as optional as the people can be convinced to not
| worry about it.
| taliesinb wrote:
| Why is the submission URL using http instead of https? That just
| seems... bizarre.
| CharlesW wrote:
| https://blog.cr.yp.to/20220805-nsa.html works too.
| ForHackernews wrote:
| Maybe this is too much tinfoil hattery, but are we _sure_ DJB
| isn't a government asset? He'd be the perfect deep-cover agent.
| throwaway654329 wrote:
| Please don't do the JTRIG thing. Dan is a national treasure and
| we would be lucky to have more people like him fighting for all
| of us.
|
| Between the two, material evidence shows that NIST is the deep-
| cover agent sabotaging our cryptography.
| crabbygrabby wrote:
| Seems like a baaad idea lol.
| yieldcrv wrote:
| seems like they just need a judge to force the NSA to comply
| with a Freedom of Information Act request, it's just part of the
| process
|
| I'm stonewalled on an equivalent Public Record Act request w/ a
| state, and am kind of annoyed that I have to use the state's
| court system
|
| Doesn't feel super impartial, and a couple of law journals have
| written about how it's not impartial at all in this state and
| should be improved by the legislature
| throwaway654329 wrote:
| This is part of a class division where we cannot practically
| exercise our rights which are clearly enumerated in public
| law. Only people with money or connections can even attempt
| to get many kinds of records.
|
| It's wrong and government employees involved should be fired,
| and perhaps seriously punished. If people at NIST had faced
| real public scrutiny and sanction for their last round of
| sabotage, perhaps we wouldn't see delay and dismissal by
| NIST.
|
| Delay of responding to these requests is yet another kind of
| sabotage of the public NIST standardization processes. Delay
| in standardization is delay in deployment. Delay means mass
| surveillance adversaries have more ciphertext that they can
| attack with a quantum computer. This isn't a coincidence,
| though I am sure the coincidence theorists will come out in
| full force.
|
| NIST should be responsive in a timely manner and they should
| be trustworthy, we rely on their standards for all kinds of
| mandatory data processing. It's pathetic that Americans don't
| have _several IG investigations in parallel_ covering NIST
| and NSA behavior. Rather we have to rely on a professor to
| file lawsuits for the public (and cryptographers involved in
| the standardization process) to have even a glimpse of what
| is happening. Unbelievable but good that _someone_ is doing
| it. He deserves our support.
| PaulDavisThe1st wrote:
| Even though I broadly agree with what you've written here
| ... the situation in question isn't really about NIST/NSA
| response to FOIA requests at all.
|
| It's about whether the US government has deliberately acted
| to foist weak encryption on the public (US and otherwise),
| presumably out of desire/belief that it has the right/need
| to always decrypt.
|
| Whether and how those agencies respond to FOIA requests is
| a bit of a side-show, or maybe we could call it a prequel.
| throwaway654329 wrote:
| We are probably pretty much in agreement. It looks like
| they've got something to hide and they're hiding it with
| delay tactics, among others.
|
| They aren't alone in failing to uphold FOIA laws, but
| they're important in a key way: once the standard is
| forged, hardware will be built, certified, deployed, and
| _required_ for certain activities. Delay is an attack
| that is especially pernicious in this exact FOIA case
| given the NIST standardization process timeline.
|
| As a side note, the NIST FOIA people seem incompetent for
| reasons other than delay.
| yieldcrv wrote:
| > This is part of a class division where we cannot
| practically exercise our rights which are clearly
| enumerated in public law. Only people with money or
| connections can even attempt to get many kinds of records.
|
| As someone with those resources, I'm still kind of annoyed
| because I think this state agency is playing chess
| accurately too. My request was anonymous through my lawyer
| and nobody would know that I have these documents, while if
| I went through the court - even if it was anonymous with
| the ACLU being the filer - there would still be a public
| record in the court system that someone was looking for
| those specific documents, so that's annoying
| throwaway654329 wrote:
| That's a thoughtful and hard won insight, thank you.
| gruturo wrote:
| Yeah, terrible idea, except this is Daniel Bernstein, who
| already had an equally terrible idea years ago, and won. That
| victory was hugely important, it pretty much enabled much of
| what we use today (to be developed, exported, used without
| restrictions, etc etc etc)
| zitterbewegung wrote:
| He won a case against the government representing himself so I
| think he would be on good footing. He is a professor where I
| graduated, and even the faculty told me he was interesting to
| deal with. Post-quantum crypto is his main focus right now, and
| he also published Curve25519.
| matthewdgreen wrote:
| He was represented by the EFF during the first, successful
| case. They declined to represent him in the second case,
| which ended in a stalemate.
| throwaway654329 wrote:
| The full story is interesting and well documented:
| https://cr.yp.to/export.html
|
| Personally my favorite part of the history is on the
| "Dishonest behavior by government lawyers" page:
| https://cr.yp.to/export/dishonesty.html - the disclaimer at
| the top is hilarious: "This is, sad to say, not a complete
| list." Indeed!
|
| Are you implying that he didn't contribute to the first win
| before or during EFF involvement?
|
| Are you further implying that a stalemate against the U.S.
| government is somehow bad for self representation after the
| EFF wasn't involved?
|
| In my view it's a little disingenuous to call it a
| stalemate, implying everything was equal save EFF involvement,
| when _the government changed the rules_.
|
| He challenged the new rules alone because the EFF
| apparently decided one win was enough.
|
| When the judge dismissed the case, the judge said that
| he should come back when the government had made a
| "concrete threat" - his self representation wasn't the
| issue. Do you have reason to believe otherwise?
|
| To quote his press release at the time: ``If and when there
| is a concrete threat of enforcement against Bernstein for a
| specific activity, Bernstein may return for judicial
| resolution of that dispute,'' Patel wrote, after citing
| Coppolino's ``repeated assurances that Bernstein is not
| prohibited from engaging in his activities.'' -
| https://cr.yp.to/export/2003/10.15-bernstein.txt
| matthewdgreen wrote:
| I'm saying that the EFF are skilled lawyers who won a
| major case, and they should not be deprived of credit for
| that accomplishment.
| throwaway654329 wrote:
| Sure, EFF played a major role in that case as did
| Bernstein. It made several lawyers into superstars in
| legal circles and they all clearly acknowledge his
| contributions to the case.
|
| Still you imply that he shouldn't have credit for that
| first win and that somehow he failed in the second case.
|
| EFF shouldn't have stopped fighting for the users when
| the government changed the rules to something that was
| also unacceptable.
| matthewdgreen wrote:
| The original poster said "he won a case against the
| government representing himself" and I felt that
| statement was incomplete, if not inaccurate, and wanted to
| correct the record. I'm pretty sure Dan, if he were here,
| would do the same.
| zitterbewegung wrote:
| Sorry, I didn't know that part. I have only seen Professor
| Bernstein once (he had a post-quantum crypto t-shirt on, so
| that's the only way I knew who he was). I have never interacted
| with him really. He is also the only faculty member that is
| allowed to have a non-UIC domain. Thank you for
| correcting me.
| throwaway654329 wrote:
| You appear to be throwing shade on his contributions. Do
| I misunderstand you?
|
| A stalemate, if you already want to diminish his efforts,
| isn't a loss by definition - the classic example is in
| chess. He brought the government to heel even after EFF
| bailed. You're also minimizing his contributions to the
| first case.
|
| His web page clearly credits the right people at the EFF,
| and he holds back on criticism for their lack of
| continuing on the case.
|
| I won't presume to speak for Dan.
| mort96 wrote:
| Weirdly, any time I've suggested that maaaybe being too trusting
| of a known bad actor which has repeatedly published intentionally
| weak cryptography is a bad idea, I've received a whole lot of
| push-back and downvotes here on this site.
| throwaway654329 wrote:
| Indeed. Have my upvote stranger.
|
| The related "just ignore NIST" crowd is intentionally or
| unintentionally dismissing serious issues of governance. Anyone
| who deploys this argument is questionable in my mind,
| essentially a bad-faith actor, especially when the topic is
| the problems brought to the table by NIST and NSA.
|
| It is a telling sign that those people are actively ignoring the
| areas where you have no choice and you _must_ have your data
| processed by a party required to deploy FIPS certified software
| or hardware.
| [deleted]
| [deleted]
| 616c wrote:
| Another upvote from someone with many friends and colleagues in
| NIST. I hope transparency prevails and NISTers side with that
| urge as well (I suspect many do).
| throwaway654329 wrote:
| They could and should leak more documents if they have
| evidence of malfeasance.
|
| There are legally safe avenues via the IG process, and
| legally risky ones via the many journalists who are willing to
| work for major change. Sadly, legal doesn't mean safe in modern
| America, and some whistleblowers have suffered massive
| retribution even when they play by "the rules" laid out in
| public law.
|
| As Ellsberg said: Courage is contagious!
| glitchc wrote:
| Many government or government affiliated organizations are
| required to comply with NIST approved algorithms by regulation
| or for interoperability. If NIST cannot be trusted as a
| reputable source it leaves those organizations in limbo. They
| are not equipped to roll their own crypto and even if they did,
| it would be a disaster.
| icodestuff wrote:
| "Other people have no choice but to trust NIST" is not a good
| argument for trusting NIST. Somehow I don't imagine the NSA
| is concerned about -- and is probably actively in favor of --
| those organizations having backdoors.
| wmf wrote:
| It's an argument for fixing NIST so that it is trustworthy
| again.
| throwaway654329 wrote:
| This.
|
| One wonders if NIST can be fixed or if it should simply
| be abolished with all archives opened in the interest of
| restoring faith in the _government_. The damage done by
| NSA and NIST is much larger than either of those
| organizations.
| [deleted]
| zamadatix wrote:
| "Roll your own crypto" typically refers to making your own
| algorithm or implementation of an algorithm not choosing the
| algorithm.
| lazide wrote:
| Would you really want every random corporation having some
| random person pick from the list of open-source cipher
| packages? Which, last I checked, still included things like
| 3DES, MD5, etc.
|
| You might as well hand a drunk monkey a loaded sub machine
| gun.
| CodeSgt wrote:
| Surely I'm misunderstanding, are you really advocating
| that people should roll their own encryption algorithms
| from scratch? As in, they should invent novel and secure
| algorithms in isolation? And this should happen.... at
| every major enterprise or software company in the world?
| lazide wrote:
| You are completely misunderstanding yes.
|
| I'm saying some standards body is appropriate for
| validating/vetting algorithms, and having a standards
| body advocate for known reasonable ones is... reasonable
| and desirable.
|
| That NIST has a history of being compromised by the NSA
| (and other standards bodies would likely similarly be a
| target), is a problem. But having everyone 'figure it
| out' on their own is even worse. 'hand a drunk monkey a
| loaded submachine gun' worse.
| dataflow wrote:
| Tangential question: while some FOIA requests do get stonewalled,
| I continue to be fascinated that they're honored in other cases.
| What exactly prevents the government from stonewalling
| practically _every_ request that it doesn't like, until and
| unless it's ordered by a court to comply? Is there any sort of
| penalty for their noncompliance?
|
| Tangential to the tangent: is there any reason to believe FOIA
| won't be on the chopping block in a future Congress? Do the
| majority of voters even know (let alone care enough) about it to
| hold their representatives accountable if they try to repeal it?
| bsaul wrote:
| Holy crap, I wondered why the post didn't mention work by D. J.
| Bernstein outing flaws in curves submitted by the NSA...
|
| Well, didn't expect the post to actually be written by him.
| xiphias2 wrote:
| An interesting thing happening on the Bitcoin mailing list: even
| though it would be quite easy to add Lamport signatures as an
| extra safety feature for high-value transactions, they would be
| expensive and easy to misuse (they can be used only once, which
| is a problem if money is sent to the same address twice), so the
| current consensus among developers is to "just wait for NSA/NIST
| to be ready with the algorithm". I haven't seen any discussion of
| the possibility that they will never be ready on purpose, because
| of sabotage.
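|
| For anyone unfamiliar, a Lamport signature is simple enough to
| sketch in a few lines (a toy illustration assuming SHA-256, not
| production code). The one-time limitation is visible right in
| the signing step, which reveals half of the secret key:
|
|     import hashlib, os
|
|     H = lambda b: hashlib.sha256(b).digest()
|
|     def keygen():
|         # 256 pairs of random secrets; the public key is their hashes.
|         sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
|         pk = [(H(a), H(b)) for a, b in sk]
|         return sk, pk
|
|     def sign(msg, sk):
|         d = int.from_bytes(H(msg), "big")
|         # Reveal one secret per bit of the message hash; signing a
|         # second message leaks enough secrets to allow forgeries.
|         return [sk[i][(d >> i) & 1] for i in range(256)]
|
|     def verify(msg, sig, pk):
|         d = int.from_bytes(H(msg), "big")
|         return all(H(sig[i]) == pk[i][(d >> i) & 1] for i in range(256))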
| potatototoo99 wrote:
| Why not start that discussion yourself?