
Ugh, I really wish this had been written in Go or Rust. Just something that produces a single binary executable and doesn't require you to install a runtime like Node.


Projects like this have to update frequently, and having a mechanism like npm or pip to handle that automatically is probably easier. It's not like the program is doing heavy lifting anyway; unless you're committing outright programming felonies, there shouldn't be any issues on modern hardware.

It's the only argument I can think of; otherwise something like Go would be goated for this use case in principle.


> having a mechanism like npm or pip or whatever to automatically handle that is probably easier

Re-running `cargo install <crate>` will do that. Or install `cargo-update`, then you can bulk update everything.
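For reference, the bulk-update flow looks like:

  cargo install cargo-update   # one-time: adds the install-update subcommand
  cargo install-update -a      # update everything installed via `cargo install`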

And it works hella better than using pip in a global Python install (you really want pipx/uvx if you're installing Python utilities globally).

IIRC you can install Go stuff with `go install`, dunno if you can update via that tho.
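(For the record, re-running it with an explicit `@latest` suffix does update the installed binary; placeholder module path here:

  go install github.com/example/tool@latest

)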


This whole thread is a great example of the developer vs. user convenience trade-off.

A single, pre-compiled binary is convenient for the user's first install only.


> A single, pre-compiled binary is convenient for the user's first install only.

It's not.

It's convenient for CIs, for deployment, for packaging, for running multiple versions. It's extremely simple to update (just replace the binary with another one).

Now, e.g. "just replacing one file with another" may not have convenience commands like "npm update". But its not hard.

My point is that a pre-compiled binary is extremely more convenient for *everyone involved in the delivery pipeline* including the end-user. Especially for delivering updates.

As someone who's packaged JavaScript (Node), Ruby, Go, and Rust tools in .debs, snaps, and rpms: packaging against a dynamic runtime (node, ruby, rvm, etc.) is a giant PITA that will break on a significant number of users' machines, and will probably break on everyone's machine at some point. Whereas packaging a binary is as simple as it can get: most such packages need only one dependency that everyone and his dog already has: libc.


> My point is that a pre-compiled binary is extremely more convenient for *everyone involved in the delivery pipeline* including the end-user. Especially for delivering updates.

The easiest is running `sudo apt update && sudo apt upgrade` and having my whole system updated, instead of writing some script to fetch binaries from some GitHub releases page and hoping it's not hijacked.

Having a sensible project is what makes it easy down the line (including not depending on GNU libc if not needed, as some people use musl). And I believe it's easy to set up a repository if your code is proprietary (you just need to support the most likely distributions, like Ubuntu, Fedora, SUSE's Tumbleweed, ...).


I don’t think that’s true. For instance, uv is a single, pre-compiled binary, and I can just run `uv self update` to update it to the latest version.


I literally wouldn’t use uv if it weren’t available via pip.

Reasoning: it’s a Python tool, therefore it shouldn’t require anything (any 3rd party package manager) beyond Python.


It’s a standalone binary. It doesn’t require anything at all. It’s literally just one file you can put anywhere you like. It doesn’t need a third-party package manager.


Sure it does. When you download it manually you become that package manager.


Unless you build self-updating in, which Google certainly has experience with, in part to avoid clients lagging behind. Because aside from being a hindrance (refusing to start and telling the client to update), there's no way you can actually force them to run an upgrade command.


How so? Doesn't it also make updates pretty easy? Have the precompiled binary know how to download the new version. Sure, there are considerations for backing up the old version, but it's not much work, and it frees you up from being tied to one specific ecosystem.
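A sketch of what such a built-in updater boils down to, with a hypothetical URL and install path (real ones would also verify a checksum or signature):

  curl -fsSL https://example.com/tool/latest -o ~/.local/bin/tool.new   # download next to the target
  chmod +x ~/.local/bin/tool.new
  cp ~/.local/bin/tool ~/.local/bin/tool.bak    # keep the old version for rollback
  mv ~/.local/bin/tool.new ~/.local/bin/tool    # rename is atomic on the same filesystem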


No, it doesn’t. At work everything is locked down and you either need to have separate mechanism to deliver updates or use pip.


That's not an argument about the relative difficulty of "updating a binary file" vs "updating via pip"; it's merely addressing what your workplace deems important and possible.

(Aside from the fact that allowing "use pip" completely defeats the purpose of any of the other mechanisms, so it's a poster-child example of security theater.)


> Re-running `cargo install <crate>` will do that. Or install `cargo-update`, then you can bulk update everything.

How many developers have npm installed vs cargo? Many won't even know what cargo is.


Everyone in the cult knows what cargo is.


Not even "cargo install" is needed.

Just `wget -O ~/.local/bin/gemini-cli https://ci.example.com/assets/latest/gemini-cli && chmod +x ~/.local/bin/gemini-cli` (or the curl version thereof). It can pick the file off GitHub, some CI's assets, a package repo, a simple FTP server, an HTTP fileserver, over SSH, from a local cache, etc. It's so simple that one doesn't need a package manager. So there commonly is no package manager.

Yet in this thread people are complaining that "a single binary" is hard to manage/update/install because there's no package manager to do that with. It's not there because managing/updating/installing is so simple that you don't need a package manager!


> is so simple, that you don't need a package manager!

You might not know the reason people use package managers. Installing this "simple" way makes it quite difficult to update and remove compared to using package managers. And although those steps are also "simple", it's quite a mess to manage packages manually in place of such battle-tested systems.


> You might not know the reason people use package managers.

People use package managers for the following:

- to manage dependencies
- to update stuff to a specific version or the latest version
- to downgrade stuff
- to install stuff
- to remove stuff

Any of these, except for the dependency management, is a single command, or easy to do manually, with a single compiled binary. They are so simple that they can easily be built into the tool. Or handled by your OS's package manager. Or with a "shell script" that the vendor can provide (instead of, or next to, the precompiled binary).
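For instance, with a hypothetical download URL, each of those tasks is one line:

  curl -fsSL https://example.com/tool -o ~/.local/bin/tool && chmod +x ~/.local/bin/tool  # install (or update to latest)
  curl -fsSL https://example.com/tool-1.2.3 -o ~/.local/bin/tool                          # pin or downgrade to a specific version
  rm ~/.local/bin/tool                                                                    # remove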

I did not say manually; you inferred that. But I never meant that. The contrary: because it's so simple, automating it, or having your distro, OS, or package manager do it for you, is trivial. As opposed to that awful `curl example.com/install.sh | sudo sh` or those horrible built-in updaters (that always start nagging when I open the app - the one moment that I don't want to be bothered by updates, because I need the app now).

The only reason one would then need a package manager is to manage dependencies. But a precompiled binary like Go's or Rust's is typically statically compiled, so it has no (or at most one) dependency.

Imagine the ease of a single ".tar.gz" or so that includes the correct Python version, all pips, all ENV vars, config files, and is executable. If you distribute that - what do you still need pip for? If you distribute that, how simple would turning it into a .deb, snap, .dmg, Flatpak, AppImage, brew package, etc. be? (Answer: a lot easier than doing this for the "directory of .py files". A LOT.)
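To make that concrete, the launcher inside such a hypothetical tool.tar.gz could be as small as this (assuming a bundle layout with the interpreter under python/ and deps under site-packages/):

  #!/bin/sh
  # run.sh at the bundle root: point Python at the bundled runtime, then exec
  HERE="$(cd "$(dirname "$0")" && pwd)"
  export PYTHONHOME="$HERE/python"          # the bundled interpreter
  export PYTHONPATH="$HERE/site-packages"   # the vendored pips
  exec "$HERE/python/bin/python3" "$HERE/main.py" "$@"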


> Imagine the ease of a single ".tar.gz" or so that includes the correct Python version, all pips, all ENV vars, config files, and is executable. If you distribute that - what do you still need pip for?

pip is there so you don't need to do that. In the deployment world, you really want one version per system for everything, and to know that everything is in sync. To get that, the solution was a distribution of software and a tool to manage it. We then extended that to programming-language ecosystems, and pip is part of the result.

But on workstations, a lot of people want the latest, so the next solution was to abstract the programming-language ecosystem from the distribution (and you may not have a choice in the case of macOS). So what we get is directory-restricted interactions (go, npm, ...) or shell magic so that the tooling thinks it's the system (virtualenvs, ...).

It's a neat trick, but the only reason to do so is if you want to distribute a compiled version of the software to customers. If the user has access to the code, it's better to adapt the software to the system (repositories, Flatpak, ...) or build a system around it (VMs, containers, ...).


You'd think that, but a globally installed npm package is annoying to update, as you have to do it manually, and since I very rarely need to update other global npm packages, I always forget to do it.


I also used to have outdated versions, until I started using mise. `mise use -g npm:@google/gemini-cli`, and now `mise up` will update it. It handles cargo, pip, etc. too.


I feel like Cargo or Go Modules could absolutely do the same thing as the mess of build scripts they have in this repo, and arguably do it better.


I don't think that's the main reason. I just installed this and peeked in node_modules. There are a lot of random deps, probably for the various local capabilities, and it was probably easier to find those libs in the Node ecosystem than elsewhere.

Also, react-reconciler caught my eye. Apparently that's a dependency of ink, which lets you write text-based UIs in React.

That, and OpenTelemetry, whatever the heck that is.


Build updating into the tool, e.g.:

  uv self update
  yt-dlp --update
etc.


If you use Node.js your program is automatically too slow for a CLI, no matter what it actually does.


If you even sneeze into a performance discussion without providing benchmarks first, you're a tool.


You don't have to believe me if you don't want to. But I strongly advise everyone who still uses prettier to try a formatter written in Rust, for example dprint. It's a world of difference.


If you told me the Rust-based code linter runs 10X faster, I'd believe it, but it wouldn't matter even if it were 100X.


So are you saying the Gemini CLI is too slow, and Rust would remedy that?


Yes


Ask Gemini CLI to re-write itself in your preferred language


Unironically, not a bad idea.


Contest between Claude Code and Gemini CLI, who rewrites it faster/cheaper/better?


I have already been asking the Gemini CLI questions about itself.


Let's not pretend this stuff actually works


Not sure why this was flagged, because I think the point is that one-shot works terribly, which it does


This isn't about quality products, it's about being able to say you have a CLI tool because the other ai companies have one


Fast following is a reasonable strategy. Anthropic provided the existence proof. It’s an immensely useful form factor for AI.


The question is whether what makes it useful is actually being in the terminal (limited, glitchy, awkward interaction) or whether it's being able to run next to files on a remote system. I suspect the latter.


Yeah, it would be absurd to avoid a course of action proven productive by a competitor.


> This isn't about quality products, it's about being able to say you have a CLI tool because the other ai companies have one

Anthropic's Claude Code is also installed using npm/npx.


Eh, I can't see how your comment is relevant to the parent thread. Creating a CLI in Go is barely more complicated than in JS. Rust probably is, but people aren't asking for that.


They wrote the CLI "GUI" in React using ink, which is all JS-only. I don't know what the Golang way of doing this would be, but maybe it's harder if you want the same result.


There are many TUI-building libraries in Go. Sure, you wouldn't be writing JSX (and I agree it's an interesting idea), but that doesn't mean it's any more work to get things rendered in a terminal with other approaches, especially with these AI assistants to help you finish the boring parts.


If the UI is complicated at all, React is a well-established way to do that easily. The one-off tools will be harder, and even the AI won't know them as well as it knows React.


Writing it in Golang or Rust doesn't really make it better


Meanwhile, https://analyticsindiamag.com/global-tech/openai-is-ditching...

I really don't mind either way. My extremely limited experience with Node indicates they have installation, packaging and isolation polished very well.


Node and Rust both did packaging well; I think Golang did too. It's a disaster in Python.


Looks like you could make a standalone executable with Bun and/or Deno:

https://bun.sh/docs/bundler/executables

https://docs.deno.com/runtime/reference/cli/compile/

Note, I haven't checked that this actually works, although if it's straightforward Node code without any weird extensions it should work in Bun at least. I'd be curious to see how the exe size compares to Go and Rust!
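Untested, but going by those docs the invocations would be roughly this (entry-point file name assumed):

  bun build ./index.ts --compile --outfile gemini
  deno compile --allow-all --output gemini ./index.ts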


You can also do this natively with Node, since v18: https://nodejs.org/api/single-executable-applications.html#s...
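Per those docs, the flow on Linux is roughly this (assuming the app is first bundled into a single main.js):

  echo '{ "main": "main.js", "output": "sea-prep.blob" }' > sea-config.json
  node --experimental-sea-config sea-config.json   # generate the blob to inject
  cp "$(command -v node)" gemini                   # start from a copy of the node binary itself
  npx postject gemini NODE_SEA_BLOB sea-prep.blob \
    --sentinel-fuse NODE_SEA_FUSE_fce680ab2cc467b6e072b8b5df1996b2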


I was going to say the same thing, but they couldn’t resist turning the project into a mess of build scripts that hop around all over the place manually executing node.


Oh, man!

I guess it needs to start various processes for the MCP servers and whatnot? Just spawning another Node is the easy way to do that, but a bit annoying, yeah.


A Bun "hello world" is 58Mb

Claude also requires npm, FWIW.


That is a point, not a line. An extra 2MB of source is probably a 60MB executable, as you are measuring the runtime size. Are two "hello worlds" 116MB? And who measures executables in megabits?


> A Bun "hello world" is 58Mb

I've forgotten how to count that low.


What's a typical Go static binary size these days? Googling around, I'm seeing wildly different answers -- I think a lot of them are outdated.


It depends a lot on what the executable does. I don't know the hello-world size, but anecdotally I remember seeing several Go binaries in the single-digit-megabyte range. I know the code size is somewhat larger than one might expect, because Go keeps some type info around for reflection whether you use it or not.


Ah, good point. I was just wondering about the fixed overhead of the runtime system -- mainly the garbage collector, I assume.


The Golang runtime is big enough by itself that it makes a real difference for some WASM applications, and people are using Rust instead purely because of that.


Yeah, this just seems like a pain in the ass that could've been easily avoided.


From my perspective, I'm totally happy to use pnpm to install and manage this. Even if it were a native tool, NPM might be a decent distribution mechanism (see e.g. esbuild).

Obviously everybody's requirements differ, but Node seems like a pretty reasonable platform for this.


Also throwing Volta (written in Rust, because of course it is) into the ring. It's the uv of the Node world.


It feels like you are creating a considerable fraction of the pain by taking offense at simply using npm.


As a longtime user of NPM, but an overall fan of JS and TS and even their runtimes: NPM is a dumpster fire, and forcing end users to use it is brittle, lazy, and hostile. A small set of dependencies will easily result in thousands (if not tens of thousands) of transitive dependency files being installed.

If you have to run endpoint protection, that will blast your CPU with load, and it makes moving or even deleting that folder needlessly slow. It also makes the hosting burden on NPM scale with the number of users, who must all install the dependencies, instead of the number of CI instances, which isn't very nice to our hosts. Dealing with that once during your build phase and then packaging that mess up is the nicer way to go about distributing things that depend on NPM to end users.


I ran the npm install command in their README; it took a few seconds, then it worked. Subsequent runs don't have to redownload stuff. It's 127MB, which is big for an executable but not a real problem. Where is the painful part?


> and doesn't require you to install a runtime like Node.

My exact same reaction when I read the install notes.

Even Python would have been better.

Having to install that JavaScript cancer on my laptop just to be able to try this is a huge no.


Bun can compile to a single executable. Not sure if Node has the same feature. Point is, it's very doable with JS.


Language choice is orthogonal to distribution strategy. You can make single-file builds of JavaScript (or Python or anything) programs! It's just a matter of packaging, and there are packaging solutions for both Bun and Node. Don't blame the technology for people choosing not to use it.


Why would you want a single-file build anyway, to make it easier to move around on disk? There are reasons for the dep filetree.

Btw, the largest deps in this are React and OpenTelemetry.


You mean something like this:

https://www.npmjs.com/package/pkg

or perhaps this one:

https://www.npmjs.com/package/nexe


Node can also produce a single binary executable: https://nodejs.org/api/single-executable-applications.html


My thoughts exactly. Neither Rust nor Go, not even C/C++, which I could accept if there were some native OS dependencies. Maybe this is a hint about who its main audience could be.


> Maybe this is a hint on who could be its main audience.

Or a hint about the background of the folks who built the tool.


Definitely not neckbeards who care about such trivia.


See gemmafile, which gives you an airgapped version of Gemini (which Google calls Gemma) that runs locally in a single file without any dependencies.

https://huggingface.co/jartine/gemma-2-27b-it-llamafile



