by di on 9/23/22, 7:44 PM with 182 comments
by peteforde on 9/23/22, 10:33 PM
To me, it reads as a bald-faced attempt to discourage public sector entities from using OSS solutions, when in fact there are perfectly good and definitely >100% secure proprietary offerings that cost a reasonable amount when purchased from the sorts of vendors that pay lobbyists to "help" senators write OSS bills.
Do you honestly think Rob fucking Portman woke up one day with strong opinions about FOSS?
Make no mistake: this is a thinly veiled late-stage attempt to displace the growing dominance of OSS-based solutions to the sorts of problems that the government and military used to pay 8 and 9 figures a year to EDS to solve.
An actual, good-faith bill that seeks to address these issues would attempt to incentivize/punish orgs that use FOSS without making meaningful contributions to it.
by buscoquadnary on 9/23/22, 8:45 PM
> The Securing Open Source Software Act would direct CISA to develop a risk framework to evaluate how open source code is used by the federal government. CISA would also evaluate how the same framework could be voluntarily used by critical infrastructure owners and operators. This will identify ways to mitigate risks in systems that use open source software. The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability. Additionally, the legislation requires the Office of Management and Budget (OMB) to issue guidance to federal agencies on the secure usage of open source software and establishes a software security subcommittee on the CISA Cybersecurity Advisory Committee.
So basically just another framework to evaluate risk for use by the Federal Government. A nothing burger, as it were. On one hand I'm glad about that, because I don't like the government getting involved in Open Source, which is at its core "Here's some code I wrote or whatever"; on the other hand, it also isn't doing anything for security.
by stevenalowe on 9/24/22, 1:21 AM
I see nothing new or useful here; what am I missing?
"The Securing Open Source Software Act would direct CISA to develop a risk framework to evaluate how open source code is used by the federal government. CISA would also evaluate how the same framework could be voluntarily used by critical infrastructure owners and operators. This will identify ways to mitigate risks in systems that use open source software. The legislation also requires CISA to hire professionals with experience developing open source software to ensure that government and the community work hand-in-hand and are prepared to address incidents like the Log4j vulnerability. Additionally, the legislation requires the Office of Management and Budget (OMB) to issue guidance to federal agencies on the secure usage of open source software and establishes a software security subcommittee on the CISA Cybersecurity Advisory Committee."
[1] https://insights.sei.cmu.edu/blog/taking-up-the-challenge-of...
by danbmil99 on 9/24/22, 12:20 AM
It's going to be like electrical contracting. You get someone cheap to do the wiring and then a union guy comes in to sign the papers and take a pound of flesh.
by not2b on 9/23/22, 11:48 PM
Looks to me like it will wind up making more money available for developers, mainly outside government, to audit and improve important free software that the feds are currently using. Unfortunately because of the way that government contracts work, companies that are already experienced at doing government contracts might wind up with the bulk of the money. But it isn't going to make things worse and might actually make things better.
by yazzku on 9/23/22, 8:24 PM
by bob1029 on 9/24/22, 1:11 AM
We killed 100% of our Java usage over this. We simply don't have enough in-house talent to make sure things are safe in that bucket. Our customers thought this was a glorious plan as well.
I do think most of the pain should fall to the vendors of the end product, not their OSS suppliers. If your shop doesn't have enough resources to validate that all vendors are safe, maybe figure out how to do it with fewer vendors.
At a certain level, if you are selling deficient products to sensitive customers, you really need to be stopped. Anything impacting finance, PII, safety, infrastructure, defense, etc. Some extra regulations could go a long way in these areas.
by ananonymoususer on 9/23/22, 9:00 PM
by transpute on 9/24/22, 1:01 AM
> Generate a criticality score for every open source project. Create a list of critical projects that the open source community depends on. Use this data to proactively improve the security posture of these critical projects ... A project's criticality score defines the influence and importance of a project. It is a number between 0 (least-critical) and 1 (most-critical). It is based on the following algorithm by Rob Pike...
Top 20 projects based on the "criticality score" algorithm's output; you can run the script on your favorite OSS project:
> node, kubernetes, rust, spark, nixpkgs, cmsSW, tensorflow, symfony, DefinitelyTyped, git, azure-docs, magento2, rails, ansible, pytorch, PrestaShop, framework, ceph, php-src, linux
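A minimal sketch of that scoring formula in Python, for anyone curious how the number comes together. The signal names, weights, and thresholds below are assumptions that only approximate the defaults published by the criticality_score project, and the values are hypothetical; the real tool pulls its signals from the GitHub API:

    import math

    def criticality_score(signals):
        # signals maps a name to (value, weight, threshold). Each signal
        # contributes log(1 + value) / log(1 + max(value, threshold)), so its
        # contribution saturates once the value reaches the threshold; the
        # result is the weighted average across all signals.
        total_weight = sum(weight for _, weight, _ in signals.values())
        score = 0.0
        for value, weight, threshold in signals.values():
            score += weight * math.log(1 + value) / math.log(1 + max(value, threshold))
        return score / total_weight

    # Hypothetical values for one project; the weights and thresholds below
    # only approximate the published defaults.
    example = {
        "created_since_months": (60, 1, 120),
        "updated_since_months": (0, -1, 120),   # staleness carries a negative weight
        "contributor_count":    (400, 2, 5000),
        "org_count":            (10, 1, 10),
        "commit_frequency":     (30, 1, 1000),
        "recent_release_count": (5, 0.5, 26),
        "closed_issues_count":  (300, 0.5, 5000),
        "updated_issues_count": (400, 0.5, 5000),
        "comment_frequency":    (3, 1, 15),
        "dependents_count":     (100000, 2, 500000),
    }
    print(round(criticality_score(example), 3))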
by mkl95 on 9/23/22, 10:05 PM
by thayne on 9/24/22, 3:36 AM
The reason log4shell had such a big impact is how ubiquitous it was. Sure, being free gives OSS a bit of an advantage in becoming ubiquitous, especially as a library.
But plenty of proprietary software is ubiquitous as well, and proprietary software has plenty of bad security bugs too.
by eyelidlessness on 9/24/22, 4:35 AM
by transpute on 9/24/22, 12:39 AM
> The Technical Advisory Council Subcommittee was established to leverage the imagination, ingenuity, and talents of technical experts from diverse backgrounds and experiences for the good of the nation. The subcommittee was asked to evaluate and make recommendations, tactical and strategic in nature. These Cybersecurity Advisory Committee (CSAC) recommendations for the June Quarterly Meeting focus on vulnerability discovery and disclosure.
Mr. Jeff Moss, Subcommittee Chair, DEF CON Communications
Mr. Dino Dai Zovi, Security Researcher
Mr. Luiz Eduardo, Aruba Threat Labs
Mr. Isiah Jones, National Resilience Inc.
Mr. Kurt Opsahl, Electronic Frontier Foundation
Ms. Runa Sandvik, Security Researcher
Mr. Yan Shoshitaishvili, Arizona State University
Ms. Rachel Tobac, SocialProof Security
Mr. David Weston, Microsoft
Mr. Bill Woodcock, Packet Clearing House
Ms. Yan Zhu, Brave Software
by jrochkind1 on 9/24/22, 1:48 AM
by transpute on 9/24/22, 12:04 AM
> FWIW, while this specific act may not be enforcing significant regulation, software developers need to understand that there's a ticking clock.
There are several initiatives from the Linux Foundation's OpenSSF and the startup Chainguard.
Sept 2022, "Concise Guide for Evaluating Open-Source Software", https://github.com/ossf/wg-best-practices-os-developers/blob...
Sept 2022, "Show off your Security Score: Announcing Scorecards Badges", https://openssf.org/blog/2022/09/08/show-off-your-security-s...
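If you want to see where a dependency lands on those Scorecards checks without installing the CLI, here is a minimal sketch in Python against the public Scorecards read API. The endpoint path and the response fields ("score", "checks") are assumptions based on the project's public documentation, so double-check them at securityscorecards.dev before relying on this:

    import json
    import urllib.request

    def fetch_scorecard(owner, repo):
        # Read-only endpoint serving precomputed Scorecard results for
        # GitHub-hosted repositories (assumed path; see the API docs).
        url = f"https://api.securityscorecards.dev/projects/github.com/{owner}/{repo}"
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    result = fetch_scorecard("ossf", "scorecard")
    print("aggregate score:", result["score"])
    for check in result["checks"]:
        print(check["name"], check["score"])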
by jrsj on 9/24/22, 12:42 AM
by smm11 on 9/23/22, 9:23 PM
Apparently they never changed one character in a query string in the late-90s.
by Meph504 on 9/26/22, 5:46 AM
>“This important legislation will, for the first time ever, codify open source software as public infrastructure,” said Trey Herr
Open source software is no more "public infrastructure" than the efforts of volunteer organizations. The government should have no say over this matter IMO.
by lmeyerov on 9/24/22, 3:50 AM
The gov has a large culture of tapping integrators who do not give back to OSS, just use it: basically middlemen who leave behind fragile one-offs. Such abandonware should overwhelm receiving depts within 6-12 months, and bringing back integrators for the treadmill of patching superfluous npm CVEs would break their budgets.
So that means pressure to, well, not do that. Either the integrators get more involved, or part of the budgets finally goes to people who are.
by swiley-gin on 9/24/22, 1:42 AM
TFA has no bill number, so let's see if we can find it. Actually no, I'm not seeing it. Someone send me an H.R. number? I'll update my comment if you do.
by torstenvl on 9/23/22, 11:45 PM
by mffap on 9/24/22, 9:01 AM
I believe that most of the assessment stuff is covered by many NIST recommendations anyway.
by yawnxyz on 9/24/22, 12:42 AM
by pGuitar on 9/24/22, 3:09 AM
by staticassertion on 9/23/22, 9:12 PM
If you want to avoid having to pass tests, having to maintain insurance, having to do a bunch of bullshit, all just to be a software engineer, get started on fixing things now.
It is absurd that anyone can anonymously provide open source code, with no assurances whatsoever, and that it can end up in critical software. And you might be saying "well, it's up to people to audit their dependencies" - and maybe you're right. But I would challenge the idea that everyone has the right to publish code for distribution purposes with zero responsibility.
Publishing code to Github? Sure, go for it, anyone can do it. Publishing packages to package distributors? No, that crosses a line. I don't want legal requirements or identification requirements just to publish and distribute code.
If we want to avoid that, we're going to need to step it up - that means, yeah, basic measures like strong 2FA to distribute packages should be a requirement. Signing packages should be a requirement. Acknowledging and triaging vulnerabilities should be a requirement. If you aren't willing to do the above, which is frankly trivial, you shouldn't be allowed to publish software for distribution purposes.
I think we need to start taking a bit more responsibility for the work we do. "NO WARRANTY" doesn't mean "no obligations"; it just means no one has a legal right to pursue damages due to your software. You should still do some things.
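To make one of those basic measures concrete on the consuming side, here is a minimal sketch in Python of digest pinning. The URL and digest are hypothetical, and in practice the expected digest would come from a lockfile or from release metadata signed by the maintainer:

    import hashlib
    import urllib.request

    def fetch_verified(url, expected_sha256):
        # Download an artifact and refuse to return it unless it matches the
        # SHA-256 digest pinned ahead of time.
        data = urllib.request.urlopen(url).read()
        actual = hashlib.sha256(data).hexdigest()
        if actual != expected_sha256:
            raise RuntimeError("digest mismatch for %s: got %s" % (url, actual))
        return data

    # Hypothetical artifact and digest, for illustration only:
    # pkg = fetch_verified("https://example.com/pkg-1.0.tar.gz",
    #                      "<sha256 hex digest from your lockfile>")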
edit: K I'm rate limited so I can't have this conversation with all of you, thanks again Dang