by spacey on 3/31/22, 4:38 PM with 255 comments
by munificent on 3/31/22, 5:35 PM
> Unlike most other package managers files, Go modules don’t have a separate list of constraints and a lock file pinning specific versions. The version of every dependency contributing to any Go build is fully determined by the go.mod file of the main module.
I don't know if this was intentional on the author's part, but it reads to me like it's implying that, with package managers that do use lockfiles, a new version of a dependency can automatically affect a build.
The purpose of a lockfile is to make that false. If you have a valid lockfile, then fetching dependencies is 100% deterministic across machines and the existence of new versions of a package will not affect the build.
It is true that most package managers will automatically update a lockfile if it's incomplete instead of failing with an error. That's a different behavior from Go where it fails if the go.mod is incomplete. I suspect in practice this UX choice doesn't make much of a difference. If you're running CI with an incomplete lockfile, you've already gotten yourself into a weird state. It implies you have committed a dependency change without actually testing it, or that you tested it locally and then went out of your way to discard the lockfile changes.
Either way, I don't see what this has to do with lockfiles as a concept. Unless I'm missing something, go.mod files are lockfiles.
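To make that concrete: a complete go.mod writes every dependency version down directly, which is exactly what a lockfile does. A minimal sketch, with hypothetical module paths:

```
module example.com/app

go 1.18

require (
    example.com/libfoo v1.4.2
    example.com/libbar v0.9.1 // indirect
)
```

With modern Go (1.16+, where -mod=readonly is the default), a build either uses exactly these versions or fails if the file is incomplete; it never silently picks up a newer release.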
by Tainnor on 3/31/22, 5:48 PM
But Ruby's Bundler, for example, has been refusing to run your code if your lockfile is inconsistent for as long as I can remember.
Locking dependencies is, generally, a solved problem across most ecosystems (despite node botching its UX). Go doesn't get to claim that it's superior here.
But of course, supply chain attacks are still possible with lock files. Because somebody is going to update your dependencies at some point (often for security reasons). And at that point you might be pulling in a malicious dependency which you haven't carefully vetted (because nobody has time to vet all their dependencies thoroughly nowadays).
That's still an unsolved problem, as far as I know. I don't think that Go has solved it.
by vlunkr on 3/31/22, 7:32 PM
One thought I've had to "reboot" the npm culture is to somehow curate packages that are proven to have minimal and safe dependencies, and projects can shift to using those. I imagine there has to be some sort of manual review to make that happen.
by glenjamin on 3/31/22, 5:07 PM
Part of the churn and noise in the Node.js dependency ecosystem actually stems from security-related issues being noted in a low-level module, and the ripple effects caused by that when a bunch of maintainers have to go around bumping lockfiles and versions.
by infogulch on 3/31/22, 6:53 PM
You can see this tension in virtually every discussion: users resist using packages that aren't in the standard library for fear of attacks and poor quality, and maintainers resist publishing in the standard library for fear of changing requirements and the appearance of better designs. Sure, there are admissible entitlement/responsibility arguments against these respective positions, but that's mostly a distraction, because both have a valid point.
The problem is that there's no space for intermediate solutions. We need packaging tools to aggregate and publish groups of packages that relate to a particular domain, and organizational tools to ensure quality and continuity of these package groups over time. This mitigates users' fears and reduces their cognitive load by curating the solution space, and it mitigates maintainers' fears of ossification and backcompat hell by enabling them to create new package groups.
I'm saying there's an entire dimension of valid tradeoffs in this space, but the current design trend of package managers forces us into one extreme or the other.
by codeflo on 3/31/22, 5:53 PM
Here’s the thing: You can argue about being secure by default and encouraging better CI practices. I’d fully agree it isn’t great that one has to know a somewhat obscure flag to get a secure CI build in those environments.
But claiming, in what I perceive to be a somewhat grandiose tone, to have reinvented the wheel when you're just describing a standard approach can make you sound uninformed.
by nu11ptr on 3/31/22, 6:40 PM
The weird thing about the Go devs is there is always that little bit of elitism under the surface that I detect in their writing (whether it be colors in the playground, the GC, etc). I spent years writing Go and have now moved to Rust. What I find odd is the Rust team has done (IMO) one of the greater achievements in PL history and yet they seem to not have this elitism thing going on (or maybe I just haven't noticed). Go on the other hand, IMO, made some "interesting" language choices (like keeping null) and they seem to want to be celebrated for it and claim their achievements as new and novel.
EDIT: To clarify, I'm talking about the core Go devs - those that work on stdlib and the compiler
by 37ef_ced3 on 3/31/22, 5:09 PM
The peanut gallery loves to complain about superficial aspects of Go. Typically these are people with little or no actual experience using the language and tools. They fixate on imagined problems that don't matter in practice.
But anyone who has used Go full-time for a few years is likely to deeply respect and appreciate it.
by unixbane on 3/31/22, 10:08 PM
by verdverm on 3/31/22, 6:32 PM
by alasdair_ on 3/31/22, 7:00 PM
by strken on 4/1/22, 2:07 AM
package evil

import "fmt"

// the echo is intentional, in case someone actually tries this for some reason
//go:generate echo rm -rf /

func PretendGood() {
    fmt.Println("I am good")
}
When they say fetching and building code doesn't execute it, that's specific to go get and go build. There's no guarantee that every go subcommand is safe. This is pretty obvious if you know how go generate works, and it isn't a flaw of the language, but if I were new to Go, this is the kind of article I'd read but still not understand exactly what was safe and what wasn't.
by tgsovlerkhgsel on 4/1/22, 1:06 AM
As I understand it, dependencies-of-dependencies are fetched according to the go.mod of the library that requires them. In the case of a transitive dependency, everyone between you and the vulnerable package needs to update, and they need to do so in order (i.e. if you depend on A, A depends on B, B depends on C, and C fixes a vulnerability, then first B and then A have to update before you can pull in the change).
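For what it's worth, Go's minimal version selection picks the highest of all required minimum versions across the module graph, so the main module can also force the patched version of a transitive dependency directly (e.g. via go get) without waiting for the intermediaries to update. A sketch with hypothetical module paths:

```
module example.com/myapp

go 1.18

require (
    example.com/a v1.2.0
    // direct requirement on the transitive dep, added with
    // `go get example.com/c@v1.5.1`; MVS will use this version
    // even though A and B still declare an older minimum
    example.com/c v1.5.1
)
```

That doesn't help downstream consumers of A, of course; they still need the chain of updates the comment describes unless they pin C themselves.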
by morelisp on 4/1/22, 9:08 PM
by Friday_ on 3/31/22, 4:59 PM
by Jon_Lowtek on 4/1/22, 3:41 AM
Go doesn't do jack shit to mitigate supply chain attacks. Version pinning with checksums, and that's it. But what could Go do? Solve supply chain attacks as a language feature? That doesn't even make sense.
Application developers using Go must prevent supply chain attacks against their applications. So go get some SAST for your pipeline.
Sure, there is truth in saying: always verify your dependencies (and their dependencies) yourself with a code review on every update. But you should not do that alone, so let's talk about collaborative vulnerability management. (There is more to SAST than vulnerability assessment, but we have to start somewhere.)
Let's say repositories that publish Go modules should also publish a curated list of known vulnerabilities (including known supply chain attacks) for the modules they publish. This curation is work: reports must be verified before being included in the list, and they must be verified quickly. This work scales with the number of packages published. And worse, modules can be published in more than one repository, module-publishing repositories can be different from the modules' source code repositories, and vulnerability lists can exist independently of these repositories, so reports should be synced between different list providers. Different implementations and a lack of common standards make this a hard problem. And implicit trust for bulk imports could open the door for takedown attacks.
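One common machine-readable shape for such a shared report is the OSV schema, which several ecosystems use for exactly this kind of cross-provider syncing. A rough, hypothetical entry (the ID, module path, and versions are made up for illustration):

```json
{
  "id": "EXAMPLE-2022-0001",
  "summary": "malicious code injected into release",
  "affected": [
    {
      "package": { "ecosystem": "Go", "name": "example.com/libfoo" },
      "ranges": [
        {
          "type": "SEMVER",
          "events": [
            { "introduced": "1.4.0" },
            { "fixed": "1.4.1" }
          ]
        }
      ]
    }
  ]
}
```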
There is an argument that vulnerability listing should be split from source and module publishing, each focusing on its core responsibility. For supply chain attacks, this split in responsibilities also makes it harder for an attacker to both attack suppliers and suppress reports. But for all other issues it increases the distance, as reports must travel upstream. And it creates perverse incentives, like trying to keep reports exclusive to paying customers.
To pile on the insanity: reports can be wrong. And there are unfixed CVEs that are many years old (well, OK, maybe not for Go... yet). Downstream there are "mitigated" and "won't-fix" classifications for reports about dependencies, and many SAST tools can't parse those for transitive dependencies.
Really, supply chain attacks are the easy case in vulnerability management, because they are so obviously a "must-fix" when detected. (And to please the never-update crowd: for a downstream project, "fix" can mean not updating a dependency into an attacked version. You are welcome.)
Long story short: go get some SAST in your pipelines to defend against supply chain attacks. Don't pretend that pinning versions and half-assing a code review when you update them actually solves supply chain attacks. Don't tell me everyone who uses Go can find a sophisticated data bomb or intentional RCE in some transitive dependency of some lib they update for a new feature release. And don't give me some "well, if it's transitive then the lib dev should have." Should have doesn't solve shit.
Examples for Vulnerability Assessment in GO dependency graphs are GitLabs Gemnasium ( https://gitlab.com/gitlab-org/security-products/gemnasium-db... ) or GitHubs Dependabot ( https://github.com/advisories?query=type%3Areviewed+ecosyste... ) among many, many others. Not recommendations, just examples!
Tools like these help you sort out supply chain attacks that other people have already found, before you update into them and push them downstream. Collaboration is useful. Sure, you are still left with reading the source changes of every dependency update, because who knows, you may be the first one to spot an attack, but hey, good for you.
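The collaborative model described above ultimately boils down to matching your pinned dependency versions against a shared advisory list before you update into an attacked release. A toy sketch in Go (the module paths, versions, and advisory data are all hypothetical; real tools consume a feed like Gemnasium's or GitHub's rather than a hardcoded map):

```go
package main

import "fmt"

// toy advisory database: module path -> versions with published advisories
// (hypothetical module paths and versions, for illustration only)
var advisories = map[string][]string{
	"example.com/libfoo": {"v1.4.0"},
	"example.com/libbar": {"v0.9.2", "v0.9.3"},
}

// audit flags any dependency pinned at a version with a known advisory.
func audit(deps map[string]string) []string {
	var hits []string
	for mod, ver := range deps {
		for _, bad := range advisories[mod] {
			if ver == bad {
				hits = append(hits, mod+"@"+ver)
			}
		}
	}
	return hits
}

func main() {
	// pinned versions, as they would appear in a go.mod
	deps := map[string]string{
		"example.com/libfoo": "v1.4.0", // attacked release
		"example.com/libbar": "v0.8.1", // fine
	}
	for _, hit := range audit(deps) {
		fmt.Println("advisory:", hit)
	}
}
```

The value is entirely in the shared data, not the matching logic: the matching is trivial, but it only helps if reports are verified and synced quickly, which is the hard part argued above.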
by synergy20 on 3/31/22, 5:56 PM
by tester756 on 3/31/22, 9:34 PM
by staticassertion on 3/31/22, 7:42 PM
by kubanczyk on 3/31/22, 5:22 PM
go mod edit and go work, both of which are deliberately designed counter-mitigations, i.e. they exist to poke small holes in the pin-everything wall.
I agree with the spirit of the message though; the surface is much smaller with Go, and it shows much planning went into that.