Hacker News

I don't agree with this at all. I think the reason GitHub is so prominent is the social network it has built around Git, which created strong network effects that most developers are unwilling to part with. Maintainers don't want to lose their stars, and users don't want to lose the collective "audit" performed by GitHub users.

Things like number of stars on a repository, number of forks, number of issues answered, number of followers for an account. All these things are powerful indicators of quality, and like it or not are now part of modern software engineering. Developers are more likely to use a repo that has more stars than its alternatives.

I know that the code should speak for itself, and that one should audit their dependencies rather than depend on GitHub stars, but in practice that's not what happens; we rely on the community.





These are the only reasons I use GitHub. The familiarity to students and non-developers is also a plus.

I have no idea what the parent comment means by a "well-formed CI system." GitHub Actions is easily the worst CI tool I've ever used. There are no core features of GitHub that haven't been replicated by GitLab at this point, and in my estimation GitLab did all of it better. But if I put something on GitLab, nobody sees it.


I am surprised by the comments about GH CI. I first started using CI on GL, then moved to GH and found GH's to let me get things done more easily.

It's been years though, and the ease of doing simple things is not always indicative of difficult things. Often quite the contrary...


From what I gather, GH Actions is good for easy scenarios: single-step builds, unit tests, etc. When your CI pipeline starts getting complicated or has a bunch of moving parts, not only do you need to rearchitect parts of it, but you lose a lot of stability.

Bingo. GH Actions is great if you're deploying vanilla web stuff to a vanilla web server. I write firmware. GH Actions is hell.

Easy and good are radically different things.

And this is the core problem with the modern platform internet. One victor (or a handful) takes the lead in a given niche, and it becomes impossible to get away from them without great personal cost, whether literal, moral, or in labor, and usually a combination of all three. And then that company has absolutely no motivation to prioritize the quality of the product, merely to extract as much value from the user base as possible.

Facebook has been on that path for well over a decade, and it shows. The service itself is absolute garbage. Users stay because everyone they know is already there and the groups they love are there, and they just tolerate being force-fed AI slop and being monitored. But Facebook is not GROWING as a result; it's slowly dying, much like its aging user base. And Facebook doesn't care, because no one in charge of any company these days can see further than next quarter's earnings call.


This is a socio-economic problem; it can happen with non-internet platforms too. It's why people end up living in cities, for example. Any system that has addresses, accounts, or any form of identity has the potential for strong network effects.

I'd say your comment complements mine, and I agree with it. This is another reason for GitHub's popularity.

That said, it doesn't negate the convenient features I originally wrote about.


Github became successful long before those 'social media features' were added, simply because it provided free hosting for open source projects (and free hosting services were still a rare thing back in the noughties).

The previous popular free code host was SourceForge, which eventually entered what's now called its "enshittification" phase. GitHub was simply in the right place at the right time to replace SourceForge, and the rest is history.


There are definitely a few phases of GitHub, feature- and popularity-wise.

   1. Free hosting with decent UX
   2. Social features
   3. Lifecycle automation features
In this vein, it doing new stuff with AI isn't out of keeping with its development path, but I do think they need to pick a lane and decide if they want to boost professional developer productivity or be a platform for vibe coding.

And probably, if the latter, fork that off into a different platform with a new name. (Microsoft loves naming things! Call it 'Codespaces 365 Live!')


Technically, so was BitBucket, but it initially chose Mercurial over Git. If you are old enough, you will remember articles comparing the two, with Mercurial getting slightly more favorable reviews.

And for those who don’t remember SourceForge, it had two major problems in DevEx: first, you couldn’t just get your open source project published; it had to be approved. And once it was, you had an ugly URL. GitHub had pretty URLs.

I remember putting up my very first open source project back before GitHub and going through a huge checklist of what a good open source project must have. Then I saw that people just tossed code onto GitHub as-is: no man pages, little or no documentation, build instructions that resulted in errors, no curated changelog, and I realized that things were changing.


GitHub was faster than BitBucket, and it worked well whether or not JavaScript was enabled, though that seems to be regressing as of late. I have tried a variety of alternatives, and they have all been slower, but GitHub itself does seem to be regressing.

> Technically so was BitBucket

The big reason I recall was that GitHub provided free public repos and limited private, while BitBucket was the opposite.

So if you primarily worked with open-source, GitHub was the better choice in that regard.


Mercurial was/is nice and, imho, smooths off a lot of Git's unnecessarily rough edges.

But VCS has always been a standard-preferring space, because its primary point is collaboration, so using something different creates a lot of pain.

And the good ship SS Linux Kernel was a lot of mass for any non-git solution to compete with.


And GitHub got free hosting and support from Engine Yard when they were starting out. I remember it being a big deal when we had to move them from shared hosting to something like 3 dedicated supermicro servers.

> Things like number of stars on a repository, number of forks, number of issues answered, number of followers for an account. All these things are powerful indicators of quality, and like it or not are now part of modern software engineering.

I hate that this is perceived as generally true. Stars can be farmed and gamed, and the value of a star does not decay over time. Issues can be automatically closed, or answered with a non-response and closed. Number of followers is a networking/platform thing (flag your significance by following people with significant follower counts).

> Developers are more likely to use a repo that has more stars than its alternatives.

If anything, star numbers reflect first-mover advantage rather than code quality. People choosing among competing packages for their product should consider a lot more than just the star count. Sadly, time pressures on decision makers (and their assumptions) mean that detailed consideration rarely happens, and star count remains the major factor in deciding whether to include a repo in a project.


Stars, issues closed, PRs, commits, all are pointless metrics.

The metrics you want are mostly ones they don't and can't have. Number of dependent projects for instance.

The metrics they keep are, as others have said, just a way to gamify the platform and keep people interested.


So number of daily/weekly downloads on PyPI/npm/etc?

All these things are a proxy for popularity, and that is a valuable metric. I have seen projects with amazing code quality, but if they are not maintained they eventually stop working due to updates to dependencies, external APIs, the runtime environment, etc. And I have seen projects with meh code quality that are so popular that every quirk and weird issue has a known workaround. Take ffmpeg, for example: its code is... arcane. But would you instead choose a random video transcoder written in JavaScript, last updated in 2012, just because of its beautiful code?


It is fine if a dependency hasn't been updated in years, if the number of dependent projects hasn't gone down. Especially if no issues are getting created. Particularly with cargo or npm type package managers where a dependency may do one small thing that never needs to change. Time since last update can be a good thing, it doesn't always mean abandoned.

I agree with you. I believe it speaks to the power of social proof as well as the time pressures most developers find themselves with.

In non-coding social circles, social proof is even more accepted. So, I think that for a large portion of codebases, social proof is enough.


> Things like number of stars on a repository, number of forks, number of issues answered, number of followers for an account. All these things are powerful indicators of quality

They're NOT! Lots of trashy AI projects have 50k+ stars.


You don't need to develop on Github to get this, just mirror your repo.

That's not enough; I still have to engage with contributors on GitHub, on issues and pull requests at a minimum.

Unfortunately the social network aspect is still hugely valuable though. It will take a big change for anything to happen on that front.

> Things like number of stars on a repository, number of forks, number of issues answered, number of followers for an account. All these things are powerful indicators of quality

Hahahahahahahahahahahaha...


OK, indicators of interest. Would you bet on a project nobody cares about?

I guess if I viewed software engineering merely as placing bets, I would not, but that's the center of the disagreement here. I'm not trying to be a dick (okay, maybe a little, sue me); the grandparent comment mentioned "software engineering."

I can refer you to some github repositories with a low number of stars that are of extraordinarily high quality, and similarly, some shitty software with lots of stars. But I'm sure you get the point.


You are placing a bet that the project will continue to be maintained; you do not know what the future holds. If the project is of any complexity, and you presumably have other responsibilities, you can't do everything yourself; you need the community.

There are projects, or repositories, with a very narrow target audience; sometimes you can count their users on one hand. These are important repositories for the few who need them, and there aren't any alternatives: things like decoders for obscure, undocumented backup formats.

Most people would be fine with Forgejo on Codeberg (or self hosted).

> Maintainers don't want to loose their stars

??? Seriously?

> All these things are powerful indicators of quality

Not in my experience....


Why are you so surprised?

People don't just share their stargazing plots "for fun", but because it has meaning for them.


In my 17 years of having a GitHub account I don’t think I’ve ever seen a “stargazing plot”. Have you got an example of one?

> People don't just share their stargazing plots "for fun", but because it has meaning for them.

What's the difference?



