
Why would the dev footprint matter? Whether it depends on 10 or 10,000 packages, if you're managing your project sensibly, `parcel` is a build-only dependency, installed only on your machine and on CI instances (most of which can take advantage of cached dependency lists already anyway).
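For illustration, that build-only split looks something like this in package.json (the version here is just an example):

    {
      "devDependencies": {
        "parcel": "^2.9.0"
      },
      "scripts": {
        "build": "parcel build src/index.html"
      }
    }

A production install (`npm install --omit=dev`) then skips it entirely.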


Those dependencies, many probably written by unknown authors with 0 stars on GitHub, have complete access to your computer and can execute arbitrary code. Even though it shouldn't affect production, that's still a big problem for your own machine.

To put it another way, would you willingly download and execute 730 programs from unknown authors on your computer?


This is a systemic problem with JS dev in general. I know of zero projects that keep the number of dependencies low enough to enable manual audit. Pretty much everyone out there is blindly installing hundreds, and often thousands, of packages, no matter what setup they choose.
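For a sense of scale, counting what actually lands in the tree is a one-liner (flags per npm 7+):

    npm ls --all --parseable | wc -l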


Spitballing... couldn't there be a cloud-based Bundler as a Service? Mind you, there's a risk in letting someone else "see" your code, but having 700+ applications of unknown origin isn't any better.

If the host of the BaaS could be trusted, and they constantly vetted all packages, wouldn't that be less risky?


Especially with minified JS, how would you be sure that you're getting a minified version of your actual code, and not one that additionally does something you might not want?

I'm not saying that running 700+ apps is better, just noting that bundling as a service might not be a perfect solution either.
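One partial check, sketched below, would be to also build locally and compare output hashes; that only helps if the service's build is byte-for-byte reproducible, which minifiers don't generally promise. The paths here are hypothetical:

    // compare-bundles.js -- a sketch, not a real workflow
    const { createHash } = require('crypto');
    const { readFileSync } = require('fs');

    const sha256 = (path) =>
      createHash('sha256').update(readFileSync(path)).digest('hex');

    // hypothetical outputs: one built locally, one fetched from the service
    if (sha256('dist/local-bundle.js') !== sha256('dist/service-bundle.js')) {
      throw new Error('service bundle differs from the local build');
    }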


What about bundling in a local VM or Docker container?
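For instance, something like this (the image tag and entry file are placeholders):

    docker run --rm -v "$PWD":/app -w /app node:20 \
      sh -c "npm ci && npx parcel build src/index.html"

That narrows the blast radius: the container sees the mounted project directory, not your SSH keys or browser profiles. Anything inside the mounted directory is still exposed, though.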


I believe malicious code could still be added to the bundle by these dependencies, regardless of where the build runs.


I was mostly addressing the part regarding malicious dependencies gaining access to your local filesystem.

If we are talking about the final bundle itself being compromised, there is not really a technical solution to that other than not using dependencies.


I might be missing something here, but that cloud-based service would either need to run webpack, Rollup, or Parcel on the files you send them, or write a new application from scratch (without dependencies). I guess businesses are gonna business, but if you're writing that application anyway, why not just release it for devs to run on their own machines and CI servers?


Yes, same flow/process. The difference is that the service provider vets the packages. Certainly, given the risks, there's a market at the enterprise level; security is a real concern for them.


That's a universal dependency problem, and is why you're advised to always run any not-a-throwaway project with vulnerability monitors. Heck, if you use something like GitHub, you even get that for free these days.
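On GitHub the vulnerability alerts are a repository setting; the related Dependabot version updates are configured with a small file, roughly this minimal sketch of .github/dependabot.yml:

    version: 2
    updates:
      - package-ecosystem: "npm"
        directory: "/"
        schedule:
          interval: "weekly"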


More dependencies mean a greater attack surface, as we've seen with npm. I agree with your view, but there is a reason to prefer fewer deps even in dev.


Less code, fewer bugs.


This is especially true as projects age; I recently ran into yet another bug installing dependencies for a Rails app, because a patch-version change in a small library had a bug that manifested with our otherwise old/legacy stack.

I could see a similar situation for projects stuck on an older version of Node, lodash, or whatever, where some tiny component breaks everything.

It's always fun when you have a known-good build, make a change, and trigger an error, only to realize the error is in your build system/dependency graph because someone else tested against a different subset of versions than you need, not because of the change you just made.


That sounds like you forgot to peg your dependency versions, though. No matter the age of your code, proper version pegging in an ecosystem that does not allow version deletions (something npm learned the hard way, but amazingly, something PyPI allows to this day) ensures that things don't break until you intentionally bump versions.
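Concretely, pegging means exact versions instead of ranges in package.json (the package and version here are just examples), ideally backed by a committed lockfile and `npm ci` on CI:

    {
      "dependencies": {
        "lodash": "4.17.21"
      }
    }

With a caret range like ^4.17.21 instead, npm is free to pick up any later 4.x at install time.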


I generally expect patch version bumps (0.8.0 to 0.8.1) to give me fixes I want, without regressions or new bugs.


Note that there aren't a lot of ecosystems where everyone follows semver rules. Even on the npm registry, patch version bumps can still very much break your code because there's no validation during publication. It's still on you to make sure you have tests set up that run when code or dependencies change, even for something as simple as a patch version bump.

Although, specifically to your example of 0.8.0 to 0.8.1: that's exactly the kind of bump semver explicitly refuses to make guarantees about. Major version 0 is the "unstable" range, and the minor/patch compatibility rules do not apply to it (see https://semver.org/#spec-item-4).
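npm's tooling does layer its own convention on top of the spec: caret ranges on 0.x versions only admit patch-level changes. A quick check with the `semver` package (`npm install semver`):

    const semver = require('semver');

    // caret ranges special-case major version 0:
    // ^0.8.0 means >=0.8.0 <0.9.0, so only patch bumps match
    console.log(semver.satisfies('0.8.1', '^0.8.0')); // true
    console.log(semver.satisfies('0.9.0', '^0.8.0')); // false

    // after 1.0.0, caret allows minor bumps as well:
    // ^1.8.0 means >=1.8.0 <2.0.0
    console.log(semver.satisfies('1.9.0', '^1.8.0')); // true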


Parcel, webpack and others are build tools very similar to compilers. They take code, process all of it and spit something out, something that would be distributed to the end users.

Now here is a very old and fascinating story - https://www.quora.com/What-is-a-coders-worst-nightmare/answe... - and its basis, the seminal Ken Thompson Hack - https://wiki.c2.com/?TheKenThompsonHack

Sound dangerous? It should. It is very easy to inject code into a small unknown dependency out of those thousands and effectively recreate the Ken Thompson hack.
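On npm the injection point doesn't even have to be in code you bundle: any dependency can declare lifecycle scripts that execute arbitrary commands at install time. A deliberately harmless sketch of the mechanism, with a made-up package name:

    {
      "name": "innocent-looking-helper",
      "version": "1.0.0",
      "scripts": {
        "postinstall": "node -e \"console.log('arbitrary code, running as you')\""
      }
    }

Installing with `npm install --ignore-scripts` (or ignore-scripts=true in .npmrc) disables those hooks, at the cost of breaking packages that legitimately need them.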


Sure, but let's also take the code-hosting situation into account: npm now comes with security audits during install, and GitHub now comes with free dependency vulnerability monitoring. While "fewer deps means fewer vectors" is true, the security landscape has changed a great deal, and for the better, since that article was written.
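For reference, the same checks can also be run on demand:

    npm audit
    npm audit fix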


> npm now comes with security audits during install, and GitHub now comes with free dependency vulnerability monitoring

Ultimately, these are solutions to problems that should not exist in the first place.


This one is even better (sci-fi, but possible for humans to execute).

https://www.teamten.com/lawrence/writings/coding-machines/



