by pytrin on 4/12/14, 10:06 PM with 53 comments
by jleader on 4/12/14, 10:52 PM
It's important to discuss what changes we can (and should) make to make problems like heartbleed less likely in the future, but wildly waving competing generalizations in the air doesn't help anything.
by mehrdada on 4/12/14, 11:12 PM
This is absolutely BS, especially in security and cryptography. Most security-related code written by most so-called "professional" software developers is astonishingly terrible (e.g. ECB mode encryption, storing encryption keys in code, reusing encryption keys, relying on (unauthenticated) encryption for authenticity, reusing IVs, linear-time MAC verification, ...). Most cryptographers are academics. Also, anecdotally, the poisonous "demo an exploit or it didn't happen" attitude in response to hints at a flawed system design is much more prevalent among "professional software developers" than in academia.
If anything, we should encourage more security experts in academia to engage in implementation, verification, and improvement of security code, not the other way around.
(Not that most academics write good code either, but this is not an academia/industry issue. It is a security expert/non-expert issue.)
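One of the pitfalls listed above, linear-time MAC verification, is easy to sketch. This is a minimal illustrative example (the key and function names are hypothetical, not from any project discussed here): a naive `==` on MAC bytes can stop at the first mismatching byte, leaking the mismatch position through timing, while `hmac.compare_digest` takes time independent of where the inputs differ.

```python
# Hypothetical sketch of the linear-time MAC verification pitfall.
# Key and helpers are illustrative only.
import hmac
import hashlib

KEY = b"example-key"  # never hard-code real keys (another pitfall above)

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag for the message."""
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify_naive(message: bytes, tag: bytes) -> bool:
    # Vulnerable pattern: byte comparison may short-circuit at the
    # first mismatching byte, leaking timing information.
    return sign(message) == tag

def verify_constant_time(message: bytes, tag: bytes) -> bool:
    # Correct pattern: comparison time does not depend on where
    # (or whether) the inputs differ.
    return hmac.compare_digest(sign(message), tag)

msg = b"hello"
tag = sign(msg)
assert verify_constant_time(msg, tag)
assert not verify_constant_time(msg, b"\x00" * 32)
```

Both functions return the same booleans; the difference is only visible to an attacker measuring response times, which is exactly why this class of bug slips past "it works" testing.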
by linuxhansl on 4/13/14, 12:18 AM
An example: "anyone can contribute, regardless of background or proficiency". I'd encourage the author to research how open source projects are run before making claims like this.
Also.. How was this bug found again? Oh yeah. By analyzing the _open_ source code.
Professionalism is orthogonal to open source vs. closed source. There's a place for both, and there is good and bad open source and closed source software.
Moving right along, nothing to see here.
by whatts on 4/12/14, 11:29 PM
by owenversteeg on 4/13/14, 2:23 AM
I also love how the author puts a thinly-veiled plug of his slimy "open-source" code-selling website in the middle. As benatkin said in his excellent comment [0], all four of their featured products are closed-source. The OSI should sue them for violation of their trademark of the term "open source".
by jokoon on 4/12/14, 10:42 PM
patches and added features need to be reviewed by project owners.
open source mostly means "you can read the source and modify your own version, but that doesn't mean you can make a change that will go into the official release."
There are some very sensitive pieces of software that should be thoroughly examined by experts and criticized if they're not good enough. If there are no resources available to maintain a particular open source project, don't bother using it, ESPECIALLY if it's as sensitive as OpenSSL.
Open source allows software companies and other programmers to easily work together to solve a problem. Developers' time is precious, so it's often time-saving to use somebody else's work, but that doesn't mean you should use it blindly.
by zobzu on 4/12/14, 11:13 PM
i'd rather say "and you just don't know about the closed source ones because they're harder to find" ;-)
by upofadown on 4/12/14, 11:04 PM
This particular observation comes up any time something goes wrong in any context. The stuff about the shallowness of bugs really has nothing to do with the argument. This bug was in fact quite shallow: some random entity just found it by looking. If more people had been looking, it would likely have been found sooner. You can only find a bug once.
by benatkin on 4/13/14, 12:30 AM
So I don't think they are in a good position to be talking about the meaning of Open Source, as they're doing in this article.
by njharman on 4/13/14, 1:11 AM
This is almost a non sequitur. Almost none of those software engineers looked at the source (and the few that did got eye bleed).
I quit reading after that.
by stuhood on 4/12/14, 10:52 PM
by markbnj on 4/13/14, 2:16 AM
by Ologn on 4/13/14, 1:55 AM
There are fundamental differences between bugs and security holes. Bugs are something everyone has an interest in fixing. If a bug rarely manifests itself, then it is not much of a problem.
Security holes are things which some people scrupulously search for, and then sometimes keep secret, for their own ends. Sometimes people even try to create security holes where there are none ( http://lwn.net/Articles/57135 ).
by awalton on 4/13/14, 12:22 AM
Open Source gives you potential to build a rocket to the moon. But it requires money and time and people willing to mind the code, and people with humble attitudes willing to accept when they've made mistakes and patch the code.
Quality Assurance requires effort, and that's where the fallacy of "Free" software really comes from. If you're not paying for it, you're going to pay for it, either by being the QA team and fixing bugs yourself or by living with buggy software.
by quadrangle on 4/13/14, 2:44 AM
Thus, Binpress always looks to combine their one very good point (that better funding for Open Source is important) with a bunch of junk trying to say that buying their proprietary software is the answer.
by fidotron on 4/12/14, 11:24 PM
Only by moving crypto functions to a separate user maintainable black box will this tide ever be stemmed. Of course, verifying that black box then becomes problematic, but it would be easier than the current situation.
by cabinpark on 4/13/14, 12:28 AM
by arikrak on 4/13/14, 12:48 AM