Go Packaging Proposal Process (docs.google.com)
171 points by zalmoxes on July 29, 2016 | 93 comments


The Go ecosystem is a complete mess. I understand what the Go maintainers tried to do, but Go isn't like a GNOME lib or a Linux core lib where dependencies are stable. And the "just vendor" stuff pushed by some Go pundits is stupid. Clutter and GTK don't vendor GDK or GLib. So Go right now is about everybody reinventing the wheel, because with go get you can't write reproducible builds with 3rd party packages, and thus can't create an ecosystem. The solution? Deprecate go get in favor of a new tool that at least supports fetching tags for packages. Project-based build tools (like gb) work for apps; they don't work for libraries. Glide and co. have a big problem: everybody is using a different package manager that is incompatible with the others.

The question hasn't been solved yet because solving it seriously would require changing how the language works in the first place.


> And the "just vendor" stuff pushed by some Go pundits is stupid.

I see it as an excellent way to ensure that, no matter what, my project continues to build and work. And don't say "just pin a version", because there have been packages that lose or update their tags and break things, or the distribution channel (npmjs, GitHub, whatever) goes down and suddenly I am up a creek for a deploy/build.

When I vendor, I have what I need to build and deploy my code with a lot fewer points of failure.


...until the unfortunate end user who needs to run tens of systems runs into a security issue. Then the admins cannot possibly learn how to fork, patch, rebuild, test, and deploy in 20 different languages. And they cannot rely on security updates from Linux distributions, because they installed vendored code blobs.


This has more to do with the lack of dynamic linking than vendoring.


The two are very connected. Yet, even with dynamic linking, the libraries could still be bundled only with the application, and the application could be designed to work against the bundled versions and never even be tested against other versions. On a practical level this makes it extremely expensive to perform updates.


Vendoring is what Linux distributions do. They also provide multi-language automatic builds, security updates and all the other things you mention.

So the problem is not vendoring itself. For a large organization like a Linux distribution, it's a valid technique. However, nobody is maintaining a distribution for the Go ecosystem, or at least not one that's publicly available.


Vendoring is a solution to the problem of reproducibility, not the problem of distribution. Vendoring, particularly for application code, should be at least an optional part of whatever solution is chosen by the Go team. Vendoring is not the only solution in this space; at $work we mirror any third-party repository we use.

However, vendoring doesn't solve the problem of getting a library that I have reasonable guarantees will work with the rest of the ecosystem. You can use vendoring in libraries, but you can only use the vendored types internally; if you put them on your API boundaries you will run into compatibility issues.


Most systems have package managers with versions and a central repo and dependency tracking.

Problems with fetching builds are rare, but if you care, package managers frequently have some (not exactly recommended) plugin to allow vendoring.

Plus with the package manager you can update the optionally vendored dependencies quickly.

Central repos are less likely to be deleted than GitHub repos, and some, like crates.io, have a policy not to remove crates (or allow authors to remove crates) unless there are legal issues.

A better solution (than vendoring) for larger organizations is also usually available, in the form of mirroring or partial mirroring of the central repository (and pulling from that instead).


> The question hasn't been solved yet because solving it seriously would require changing how the language works in the first place.

And the Go 1 compatibility promise [0] would prevent any change.

In theory, you could define a GOPATH to install custom packages into. You install half a dozen or so packages, and it's not trivial to determine everything you installed. A Go update arrives, and now some of those packages fail to update. Either you spend hours debugging and reporting bugs, or you rm -rf $GOPATH and reinstall everything.

The language is wonderful in so many ways, but a number of seemingly silly things make it annoying.

[0] https://golang.org/doc/go1compat


Any change will be in the tools, not the language. The compatibility promise explicitly states that the tools can change (https://golang.org/doc/go1compat#tools).


> And the Go 1 compatibility promise [0] would prevent any change.

For example? I cannot see how the language itself could be the problem here.


I understand that the tools will need to be changed, but what needs to be changed in the language? Because import paths are strings interpreted by the tools and not the language, there seems to be good separation between the language and issues of source code organization, packaging, etc.


"The question hasn't been solved yet because solving it seriously would require changing how the language works in the first place."

This doesn't make any sense. What problem are you talking about?


> Go ecosystem is a complete mess.

And yet, people manage to get work done in it.


Not sure how it is a complete mess, maybe a small mess.

Who is reinventing wheels anyway?


Hi folks. My name is on that list, and I work on one of the Go package managers (Glide: it's like npm, crates, composer, bundler, etc., but for Go. https://glide.sh).

If you have questions or input, those who are working on this problem are listening.


[This question is purely informational and doesn't directly relate to golang, but I want to ask someone who has been involved in domain-specific package management.]

Why is the protocol side of package management not a solved problem already?

At least Go reuses the git protocol to its advantage. But the general problem of package management applies to a lot of things.

"There is a package with a name and a description. That package has versions, which can be greater/lesser than others. Those versions may have deltas from one-another. They contain one or more files. Those files must have their integrity checked. etc, etc..." -> Why does every single language reimplement that?

Never mind languages, actually. Why does everyone reimplement that? Firefox/Chrome extensions, Android/iOS app stores, a bazillion different game addons and app-specific stores...

We have a mostly-identical problem being solved a million different ways. PyPI, npmjs, Glide: they should be running the same server software. It'd be a first step towards standardization, and it would avoid a bunch of mistakes being repeated every time a new solution comes out.

Is anyone even trying to do something about this? (and before anyone says "do it yourself", a package management protocol is something I used to work on, not anymore. It keeps getting worse - it's not just a technical problem, also a political one.)


This is a good question. I have a few thoughts.

First, different languages have subtleties that can make a difference in package management. For example, the way nested dependencies work in JavaScript, Go, and PHP is different in each. Or the way you can scan a codebase to determine things is different.

It would be great if different languages had virtually the same interface. That way metadata and interaction could be about the same. Like driving a car from different manufacturers. In many ways the package managers for different programming languages are similar enough that we're sorta there (but not all that well).

Second, there is the cultural problem. Many languages are designed on islands, apart from the others. Reinventing things, trying different things, and wanting to be their own special flower. Sometimes this is good: a new language is not like the others, and that can be a good thing. Other times you reinvent a fairly common wheel.

I'm not aware of anyone solving this in a common way.


At least for the dependency solving part, a generic exchange format has emerged: CUDF (http://www.mancoosi.org/cudf/).

Several efficient solvers have been implemented that support this input format. You can read more about it in http://opam.ocaml.org/doc/Specifying_Solver_Preferences.html...

[1]: https://www.youtube.com/watch?v=E-gtFnbHcv0&list=UUP9g4dLR7x...

[2]: http://ocaml.org/meetings/ocaml/2014/ocaml2014_17.pdf

This is at least used in OPAM (the OCaml package manager) and is also available for apt. If you are implementing a SAT solver for your package manager nowadays, you are doing it wrong. :)


Sam Boyer, one of the other people in the document wrote a nice article that dives into the complexity of dependency management.

https://medium.com/@sdboyer/so-you-want-to-write-a-package-m...

Discussion https://news.ycombinator.com/item?id=11088125

Sam is currently working on a library for Go https://github.com/sdboyer/gps


In fact, here's Sam and wycats (of Cargo) talking about how a generic library (obviously Go is not the best choice for that) would be nice: https://twitter.com/sdboyer/status/755036174016806912


> At least Go reuses the git protocol to its advantage.

I think this is a mistake. Packaging should come from a central server (that can be overridden with a mirror). Allowing downloads from ad hoc git repositories is a real pain. First, it makes mirroring the package repository impossible. Second, it makes it difficult for companies to disallow certain packages in a centralized fashion. Third, it makes distro packaging a royal pain. And finally, Git simply wasn't built for this kind of work; compressed tarballs happen to work great for this purpose. Git's mutable history also presents other challenges for pinning to specific versions of packages. Simply put, sometimes a central authority has advantages.

That all said, vendoring alleviates most of these problems for apps and some of them for libraries. Reusing tools does make it a little easier to get started with too. I'd hate to see these features removed in whatever go get becomes in the long term.


> That package has versions

This is a statement that doesn't always apply.

For example, some packages don't have versions, they only have a single version: the latest version. It's a moving target, but it's there. Older versions are just an unimportant and unsupported implementation side-effect.

Of course, many packages do have versions, but this one factor makes a big difference on how you look at vendoring.


First, your described software still has versions. Each daily snapshot or git commit or whatever is a new version. There's no reason you can't put an auto-incrementing version number on each release and call that your version number. You can even stick a "0." on the front of it and be SemVer compliant.

Second, just because some people do this doesn't mean it's a good idea. If you're writing a full application and just want to distribute it, do whatever you want with your versioning (although using a dependency management tool to distribute your webapp or CLI tool is probably not a great idea in the first place), but if you are writing something to be included in another project please use a sane versioning system. Anything else is selfish and disrespectful of your users.


In Go, each (major) version corresponds to a given import/package path.

If you want to change version (major), you need to create a new package.

The advantage is that you do not have to download a manifest such as package.json or whatever. It also provides a constraint on library authors to provide a backward compatible API for their published package.

Eventually, I guess we could use go/types to enforce/check the API backward compatibility requirement.

The main issues for now are:

* make sure that packages importing the same vendored dependency are able to share the same latest minor version in use.

* allow a project-based approach to go get for people who need reproducible builds (or alternatively, allow for multiple $GOPATHs and make switching between them easy).


Without versions you don't have package management, you have a download script.


Agreed, and you also can't guarantee reproducible builds. Builds that depend on when you compile them, even if no changes have been made, due to ephemeral dependencies, are a nightmare.


There are many people who need versions and don't run the tip of master. Even for projects that always try to keep the tip of master as a stable build.

I think of OpenStack here. Multiple times per year they produce releases because many people need them. Yet, the CI/CD system keeps it in such shape you can always run the tip of master and some people do that.

Both options need to be available.

To not have release versions means you're not willing to support a certain class of users. It says something about your intent.


There's Molinillo used in CocoaPods and Bundler: https://github.com/CocoaPods/Molinillo


When I wrote "Seeing How The Other Side Lives, A Package Manager Overview For Go Developers"[1] I noted CocoaPods. Was not aware of Molinillo though. Thanks for sharing.

[1] http://engineeredweb.com/blog/2015/pkg-mgr-overview/


Don't ignore nix/guix.

Personally I don't think we need yet another package manager with painful external dependencies, non-reproducible builds, conflicts, lack of rollbacks, etc. We have that with "go get". What we need is to actually solve all those things. Make sure there are no fragile external dependencies, and that everything is in the package manager, even gcc and binutils. You get the idea.


nix/guix was not ignored.

There are a host of other reasons that we could happily debate for years (no thanks), but bottom line: you can't use it everywhere you can write/compile Go code.


While it's true that you can't use nix/guix everywhere, the ideas are orthogonal to where you can write/compile Go code.


Not looking for a debate: Why can't you use it everywhere you could write or compile Go code?


Nix barely supports Windows, for one thing. I don't think Go developers would consider installing cygwin just to run a broken version of Nix. And it doesn't officially support any BSDs other than OS X, I think.


Go package management needs to work with Git, Bzr, Hg, and Svn as it does today. Do you have suggestions on building a tool cross platform that works with packages in all of these that doesn't have external dependencies?


It doesn't matter where the code is stored, certainly not for nix.


I'm sorry, I don't understand. `go get` or any package manager needs to be able to fetch code remotely. Not interacting with external binaries would mean the Go tool would need custom code to retrieve the codebase and interact with the VCS metadata, such as tags. Are you suggesting custom Go code for that instead of using dependencies like git?


No, git would have to be a package in the same package manager, as a build dependency. Think OS package manager, but limited to what developer needs, i.e. just software, no services, no superuser privileges, etc.


Talk about feature creep! It sounds like you want a package manager and ecosystem that can install half an OS, just so you don't have to fuss with installing git on your own...


It's going to be "half an OS" either way. But it can be nice and reliable, if you go the nix road.


More specifically, there is already a package manager and ecosystem that can install all of an OS, and you don't have to fuss with installing anything on your own. This package manager and ecosystem could subsume the problem of language-specific package managers entirely.


What manager? Can it run on Windows? Linux? Mac? If it can't do all 3, it's already failed the first requirement...


Hi

(I've tried Glide, but its slowness and a few bugs made me move to Govendor. Nothing personal, it just didn't work out that well for me.)

A few questions:

- Is the intention to disincentivize the use of vendoring?

- Is the intention to make a single solution, and then deprecate existing tools (Glide, Govendor and like)?

- Do you have the intention to make a central registry for Go packages? Or will the solution use GitHub and the like, as it does today? That would not be backward compatible, but the document doesn't make this clear.


I think the intention is to go figure it out. I don't know that we can say what it will be just yet.


What are some of the proposed solutions other than vendoring, which tends to be very difficult for large distributions such as Fedora? I was just reading a few days ago about some of the extreme challenges Fedora is having packaging the golang ecosystem.


Can you point me to those Fedora discussions? I'd be curious to learn more.

I understand how operating system package managers deal with things differently than application developers. I've heard of problems with the way PHP apps are handling their dependencies these days and how Linux distro package managers aren't happy about it.

I'm curious to know more in this case.


This email was from July 25th: https://lists.fedoraproject.org/archives/list/golang@lists.f...

But there have been several regarding the challenges Fedora has had as a large distribution while trying to package many Go apps, each with potentially hard requirements on different versions of the same golang package to import. Most of the conversation about it is in the golang sig (special interest group) that I've linked to their mail.

May I suggest you subscribe, send an email out introducing yourself, and ask what issues we are facing? I suspect it would be extremely well received having someone working on a language specific solution asking on the Fedora go sig mailinglist. Just a thought.


Dumb question but what are the reasons a solution like cargo or bundle (never used glide) would be rejected by the go committee?


I think cargo does too much. For example, it has a bunch of meta information which could be extracted from the source code (i.e. package documentation), and the imports must be manually maintained in a config file (the Go way is to extract them from the source code automatically and snapshot the version). Shortly put, I feel cargo is bloated and verbose, and it wouldn't fit well with the Go philosophy (minimalist, convention over configuration). Why do you need both Cargo.toml and Cargo.lock? I also doubt Cargo would fix issues not fixed by the existing vendoring tools (e.g. libraries that expose common types of different versions).


I never understood why Go conflates where something is (its Github url) with what something is (a specific version of a specific bit of code, what Maven calls an "artifact," and what lots of other solutions call a "package," or "module," or "gem", which has a version and never changes).

This creates a lot of problems that simply don't need to be there:

1. Versioning dependencies is pretty much impossible unless you make your own copy of them, and keep them updated. Good dependency management solutions do not require you to "vendor" your dependencies. They enable reproducible builds by keeping every version of an artifact around.

2. Github could change its URL schema, and most Go projects would stop compiling.

3. The maintainer could delete the Github repo.

4. Source control, specifically Github, is tightly coupled with dependency management, and that should not be necessary.

These are not unsolved problems. Maven solved every single one of these problems years ago. I don't expect people to use Maven to build Go applications, but clearly there is an opportunity to use Maven's good ideas and build a better dependency management system for Go.

I have to say, in defense of Maven, that it is very robust: artifacts are mirrored in many places, including in your own local repo; versioning works well, including availability of older versions; and nobody can screw you by deleting a package or changing its GitHub URL. My company uses Maven for Node.js as well as Java, because we've had so many problems with NPM.

I should also point out that the expectation that the core Go team should develop a dependency management system is a bit odd. The dependency management systems for Java, Ruby, Python, Node.js, and Perl were all developed by the community, not the core team. Some of them were later absorbed by the core team, but the point still stands.


You, too, seem to misunderstand the concept of a Go import path. It's a string that identifies a package. It's not a URL. To provide for sane namespacing, people tend to use DNS names and URL paths to name their packages, just like they do with Java package names. There's no requirement that there's any correlation between the package import path and where you fetch that package's code from.

Thus, your statement: 2. Github could change its URL schema, and most Go projects would stop compiling.

is wrong. No single Go project that compiled before would now fail to do so. The common build tools use the import path to determine a path on the local filesystem where source or object code may be found. Neither a change of Github's URL schema, nor a repo deletion by some maintainer would cause your build to fail.

The thing that would break is of course the re-fetching of source code from that repo, but that's kinda obvious, don't you think? You can't clone from a deleted Git repo. A common solution to that problem (language-independent) is to mirror the code somewhere else.

Now, there's one single thing in the standard Go tooling universe that interacts with remote Git repos. It's 'go get', a small convenience wrapper around several VCS commands. It allows you to clone a remote repo to the right place on your disk while using the same syntax for Git, Mercurial and SVN alike. That's basically all it does. Yet, some people expect it to be some kind of elaborate package management system. It's not.


Looong overdue...

With Go being a new language (1.0 in 2012), there is no excuse for not having had a solid pkg solution in 1.0.

It's one of the things holding Go back, in my opinion, producing makeshift solutions like Glide and gopkg.in, which are only minimally adopted.

I hope they also specify some mandatory versioning scheme (like npm with semver), because there are so many Go packages out there which don't even do any versioning!


The idea that somehow every single language and community needs its own crappy package management systems is just utterly broken.

There's no reason you can't use the same package format and tools for code written in multiple languages. You might get some additional convenience by having small language-specific plugins (like the automatic dependency extractors in RPM) but that's just a tiny amount of the necessary tooling. The vast majority is not language-specific at all.

With these language-community systems, there's also the problem of depending on code written in another language. My Go program might well depend on a C library, just as my C program can require a shell script.


Unfortunately the idea that a new language meant for widespread use can ship without its own package management systems is even more broken.

Generally languages need a good tool to allow (nested) dependencies that has a simple (preferably declarative) dependency file.

Would it be nice to have a cross-language platform tool (JVM tools do generally play nice across languages, in my understanding), like nix, that also allows setting up other non-language-specific dependencies? Yes. But I'm not sure any are good enough/simple enough yet. Also, language ecosystems tend to avoid non-language dependencies (unless they're so common they're typically installed with the language, like lxml/libxml2), or they have crappy ways to try to build C dependencies. So this is, if not solved, at least partially worked around.

These package managers are mostly for development, not deployment although frequently they are also used to build or even deploy code.

It would be nice for each language package manager to have plugins to generate self-contained deb/rpm/etc. for deployment, or even fancier, to allow some dependencies to be satisfied by system packages (assuming it's not a language that prefers static linking/bundling). But these take time and effort to write.


But the solutions don't offer much over "go get"; what would be the point of using them?


I thoroughly disagree.

The Golang ecosystem is in an incredibly weak state when you compare it with other modern languages.

Major issues:

-)

Due to the lack of a proper package system, there's also no central pkg repository, which really hurts discoverability and growth of the ecosystem a lot. Just compare it to solutions like NPM for node, https://crates.io for Rust, Hackage for Haskell, etc...

Github + https://godoc.org/ is not a replacement for a proper package repository.

-)

Like I mentioned, way, way too many Go packages don't follow a proper release process with versioning and release notes. AT ALL. You often need to mirror repos for your own project, and closely track what upstream projects are doing. git clone + go get for your big projects just doesn't work.

This is incredibly problematic when developing big projects. A proper pkg solution should enforce versioning and release best practices.


Can you explain why the lack of a central package repository is an issue? All of the languages I use in my daily work have central package repositories, but I don't use them for discovery, I use google, GitHub, etc., because to feel comfortable using a package I need to answer some meta-questions about the package ("is this package used by a lot of people?", "is it updated frequently?", "are there are a lot of reported issues and do the owners respond to issues diligently?")

It seems to me that while I'm on the GitHub page anyway, it's reasonably convenient to just copy-paste the URL in a file so I can pull in the package and use it. I can't imagine just using a package repo for discovering new packages, that would be tantamount to just pulling whatever random code someone wrote that looks like it might solve my problem.


One small example: someone could delete that github repository, and you'd be hosed. But many centralized systems (like crates.io) do not let you remove packages. The only way that things accidentally break there is if the government gets involved.


As a package user you could use your own git mirroring server, or just use a fork. Even simpler, just vendor it and you are safe from package removal. On the other hand, a central distribution model is more problematic. As a package author I want a certain degree of control over my code. If crates.io gets compromised or goes down, everyone suffers. If crates.io receives a DMCA notice (it happened on npm), my package can be removed. I put some effort into that package, so if for some reason I want to remove it, it should be my choice. Why do you want to keep it hostage if it's not yours? I thought at least Mozilla had learned that distributed is better than centralized. Do you want a place to find all the Rust packages? Make a search engine (see godoc.org). Less is more!


Sure, those are all various options. I was replying specifically to "why not GitHub".


That's certainly true -- unrelated to the discovery point I mentioned, but it is true. The different tradeoffs are well-understood at this point IMO.


I made some arguments about this in another place in the thread: https://news.ycombinator.com/item?id=12188869



How do you find the main vs the forks of a package? What about useful metadata to help you make a decision? Compare godoc.org projects to https://crates.io/crates/bitflags or https://packagist.org/packages/masterminds/html5. Look at the metadata to help you make a decision.

Just the docs for the tip of master are far from what people expect.


You can see the number of imports and the import graph on godoc. I don't see that on crates.io. You can also see if it's a fork on GitHub. What is on a crates.io page that is missing on godoc? More colours? The package author chooses how much documentation is necessary for a given package. Given that html/js is a mess, I expect more documentation to figure out how it works. Here is the godoc of a package with more text, if that's what you are looking for: http://godoc.org/google.golang.org/appengine/datastore On top of this, the documentation is source code, so you don't need a bunch of other files to keep the documentation in sync.



Yes, there are solutions out there, but the very first sentence on gonuts proves my point:

"Serving 28 versions of 12 nuts published by 80 registered users."

There's just very little adoption for these, and there won't be unless it's official.


I'm new to Go and I read some documents [1][2][3] on what 'vendoring' means but I still don't understand.

What problem does it solve? Why is it needed?

Is there an 'explain for a 5-year-old' version, or better yet, an 'explain for someone who is used to other packaging systems that may not be entirely elegant but work pretty okay most of the time, like Java's Maven or Node's npm' version?

[1] https://engineeredweb.com/blog/2015/go-1.5-vendor-handling/ [2] https://github.com/golang/go/wiki/PackageManagementTools [3] https://docs.google.com/document/d/1rMoZ0rmpxw6dShalHnkuPtia...


It's common practice for folks building Go binaries (but not libraries) to check in the code for their dependencies. So whenever you build Docker, you get the same thing unless Docker made a change (either by changing Docker itself or explicitly checking in an upgrade to a dep).

There are a few nice things about it; vendoring means you won't have a dependency on other repos or archives being available, and if you're a binary maintainer you can see source changes in your own code and that of your dependencies side-by-side if you're, say, investigating a bug that came up between certain revisions.

The not-so-nice things are pretty much what you'd think. There isn't a command to say "give me upgrades, but not ones that the author flagged as breaking changes". If you're writing a _library_ that is depended on by other _libraries_ there is no one conventional way to safely do breaking changes, so there's a lot of effort to avoid those changes plus sometimes hacks like different repo paths for different versions.

Part of the history is that, internally, Google's got one version of third-party code in the latest rev of their source control tree, by company policy. You upgrade something for one, you upgrade it for the whole codebase. Sounds daunting, but they have lots of testing, canary servers, and so on to catch problems. There is a great talk about some of this by Rachel Potvin (can't recall if it covers the third_party/ stuff specifically):

https://www.youtube.com/watch?v=W71BTkUbdqE

So Go's initial model didn't consider multiple versions of packages because Google had approached upgrades a different way internally. Of course, unlike Google, Go users on the outside will never have any central mechanism to keep all the world in sync when there are API changes. Hence the conversation today about package management.


I think another reason they didn't consider proper versioning is that only a few of the Go core developers have a background in languages with a good dependency management ecosystem. C and C++ have never really had that, and Python never really has either, despite what fans of it seem to think.

I think the bottom line is that the Go team simply doesn't know how good the ecosystem is in a lot of newer languages. I see this reflected in other decisions in Go as well, such as the absurd amount of boilerplate needed to manage goroutines, and the way strings are handled.


> I see this reflected in other decisions in Go as well, such as the absurd amount of boilerplate needed to manage some of the goroutines, and the way strings are handled.

Can you expand on this?


You forgot to mention these modern languages with a good dep system.


It solves the problem of reproducible builds when using third party code. You can record state (v 1.3 of pkg foo) or check in code (vendor).
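
The "record state" option usually takes the form of a manifest or lock file. A hypothetical sketch (real tools like Glide differ in schema details):

```yaml
# Hypothetical lock-file sketch, not any real tool's exact format:
imports:
- name: github.com/foo/bar   # the "pkg foo" of the example
  version: 1.3.0             # pinned release tag
```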

It only really gets complex on big projects with lots of dependencies which themselves have dependencies.


After more searching, I found a StackOverflow question asking the same thing [1], and the answer appears to be:

"Vendoring is the act of making your own copy of the 3rd party packages your project is using. Those copies are traditionally placed inside each project and then saved in the project repository."

Being familiar with both Maven and npm: in those communities, checking dependencies into source control is a controversial and hotly-debated topic, but more often than not (please bear with my point) it is considered an anti-pattern, because both package managers provide alternatives [2][3][4]: a project's declarative list of dependencies, along with a baseline (but often overridable) conflicting-version-resolution algorithm.

Of course, the more I understand the issue, the more questions I have.

Why does Go have multiple, competing package managers? Some, like Glide [5] (without judgements about all the others), encourage the type of declarative version management I quoted.

Why does Go even ship with the built-in package fetcher 'go get'? Why would anyone build a package fetcher for a publicly-released ecosystem with no concept of versions, commit IDs, content-addressed hashes, or any other mechanism for pinning state in something as inherently mutable as a software package?

I have no desire to point fingers, to encourage bikeshedding, or to turn this sub-thread into yet another debate that I'm certain has been had (over and over again) in the past. I simply feel that 'go get' is fundamentally broken in its understanding of how people in the wild actually write software. I'm suddenly beginning to understand why there are concerted efforts [6][7][8] to address this issue in a community-led, officially-sanctioned way.

[1] http://stackoverflow.com/questions/35109393/ [2] http://maven.apache.org/guides/introduction/introduction-to-... [3] https://github.com/ning/maven-dependency-versions-check-plug... [4] http://bytearcher.com/articles/semver-explained-why-theres-a... [5] https://github.com/Masterminds/glide [6] https://docs.google.com/document/d/18tNd8r5DV0yluCR7tPvkMTsW... [7] https://docs.google.com/document/d/1M4Ldgtxr9vC8wpyTldQe5oUy... [8] https://github.com/sdboyer/gps/wiki/Rationale


go get does what it says and no more. It's sufficient for many large companies, especially those with monorepos.

Vendoring also works just fine for many uses and has the virtue of making dependencies explicit.

Package management is useful if you have a large ecosystem of rapidly changing libraries with many fragile, overlapping dependencies. It can encourage overuse of external code, though (e.g. left-pad).

It is not a panacea or the one true way.


I'm excited to see this progressing.

I'm grateful for anyone who serves on the committee, as it could be a thankless job (some people may not like the agreed-upon tool/process, but hopefully only a small group, and one that doesn't feel entirely disenfranchised).

I really hope whatever they come up with is welcomed by the community and projects migrate to using it. Basically becoming the "Standard Go Way" to do things.


I hope you will find a "Go solution". I mean stupid-simple (most of the time).


Actually though, I'm hoping for exactly this.

One of the things I love about Golang is that it's opinionated; gofmt forces your code to conform to a uniform style, the linters are top-notch, and the go command line tool builds your code in exactly one way. Some of the decisions made (the lack of generics sticks out) are frustrating at times, but the language authors having opinions stops many developer quibbles from becoming a problem. Now it's time to add packaging to this list.

When we use Go at work, everyone conforms to the same coding standards, everyone uses the same documentation tools, everyone deals with the same build system, and everyone (for the most part) writes cleaner, more uniform code because of it. I get why some developers hate Golang: it forces lots of its own opinions on you, but in exchange it almost completely removes the bikeshed problem; everyone has to paint their bike shed the same color, so we can move on to the meat of solving the problems.

I personally have been fine with go get and the vendoring tools people have created, because we're using almost entirely internal libraries, so getting the latest version of the master branch is usually the right choice. But I understand the need for a better package management solution.

I just hope, like the parent commenter, that the Go package solution is as opinionated as the rest of the language; give us an easy, sensible default so that the tooling gets the hell out of my way. I hate working on C++ and Java projects at my job because of how heavy the tooling often is (CMake and Maven, respectively), and that's just for the build system; forget autocomplete or linting or refactoring support unless you're in a commercial IDE. On the other hand, Go is simple enough that emacs can take care of every IDE feature I could ever want. Hell, the command line and nano are plenty to edit Go code, because the build system works exactly one way, and that is so incredibly important to me.

So yeah, I want to see a good solution. But I will take a less complete solution over one that adds more complexity.


btw, I also wrote up a brief "State of Go Packaging" doc you might be interested in reading: https://docs.google.com/document/d/1M4Ldgtxr9vC8wpyTldQe5oUy...


This is needed badly. I am a huge Golang advocate, but I refuse to use the "github.com" package naming/folder setup mess. It's just not elegant.


Projects I care about use a domain I control. It's not that hard to set up: https://jve.linuxwall.info/blog/index.php?post/2015/08/26/Ho...

We even do it with just an S3 bucket for go.mozilla.org: https://go.mozilla.org/


I'm a fan of that as well. Had not considered using S3 for that. Do you know of a good writeup for handling "?go-get=1" paths differently from HTML paths in S3? I should likely know this, but a quick pointer would be helpful.


I have not looked into this.

The S3 method is kinda experimental. It works for first-level packages but fails for nested packages. We need to look into setting up redirects for those.

The best way to do this yourself is to run an instance of https://github.com/niemeyer/gopkg, but that's not #serverless anymore ;)


You spelled Go incorrectly.


If you want to model a real package manager, don't do a comparison of existing tools, all of which have flaws. Start with the feature set of RPM, and then ask a professional software packager + developer what is missing.

In order to have a feature-complete package manager (that is, one that does everything you will conceivably need) you need at least the following:

  - Unique file/object tracking
  - Upgrade & Rollback
  - Dynamic string interpolation (file paths, for example)
  - Identification of architecture and OS
  - ACL and extended permission support
  - Tracking of both source and binary dependencies & versions
  - Before & after script hooks for every single operation
  - Cryptographic signing & verification
  - Support for multiple central & distributed repositories

This is, of course, completely different from a build system, but has a few features in common. You may not need a package manager at all; you may just need a feature-complete build system, which is how you end up with CPAN. (There is no Perl package manager, though PPM attempts one, and Carton is just a bundler.)


I hate using the url for the namespace.


Do you prefer :: instead of / ? Btw, Go doesn't use URLs, just a domain and path (e.g. randomsofr.io/pkg/subpkg).


So what's wrong with vendoring all the deps and version controlling everything?

Edit: Seriously, I'm not being a jerk.


Vendoring is dependency storage. Management, especially when you have transitive dependencies (deps of deps), is more than storage. We need to solve the non-storage-related problems.

Are you familiar with the diamond dependency problem? That's one example.


I'm not familiar with the name, so I looked it up, thank you.

Program A depends on B and C, which both depend on the stdlib (D). That seems pretty common. I actually have this going on now, but so far so good.

I suppose D' (the new version of D) might break B while improving C and then I'd have a problem, right?


I wonder why I was able to edit the document on first open. :o Seems to have been solved quite quickly.


Heh. Thanks for the warning. I just changed my document (mentioned at the bottom) to comment-only too.



