chungy's comments (Hacker News)

I have Windows 95 on a Pentium 120 MHz and calc.exe is instantaneous enough that it's probably much less than 300ms to launch.

XP's calculator is hardly any different from 95's. It's easy to believe that launching it on a Core 2 Duo, of all things, is also instant.


You’re both kind of right.

On average consumer hardware at launch, 95 and XP were slow, memory-hungry bloat. In fact, everything people say about Windows 11 now was even more true of Windows back then.

By the end of Windows 95's and XP's lives, hardware had overtaken them and Windows felt snappier.

There was a reason I stuck with Windows 2000 for years after the release of XP, and it wasn’t because I was too cheap to buy XP.



Let me expand on this with a link to the article "Rebase Considered Harmful" [0].

I also prefer Fossil to Git whenever possible, especially for small or personal projects.

[0] https://fossil-scm.org/home/doc/trunk/www/rebaseharm.md


> Surely a better approach is to record the complete ancestry of every check-in but then fix the tool to show a "clean" history in those instances where a simplified display is desirable and edifying

From your link. That's the actual issue people ought to be discussing in this comment section, imo.


THIS is the hill I will die on.

Why do we advocate destroying information/data about the dev process when in reality we need to solve a UI/display issue?

The number of times in the last 15-ish years I've solved something by looking back at the history and piecing together what happened (e.g. a refactor from A to B as part of a PR, then tweaking B until it eventually became C before merging, where important details only resulted because of B, and you don't realize they matter until two years later) is high enough that I consider it very poor practice to remove the intermediate commits that actually track the software development process.


Because nobody cares about the dev process. I’ve lost count of the times I’ve looked back in the history and seen a branch with a series of twenty commits labeled “fix thing”, “oops”, “typo”, “remove thing I tried that didn’t work”, or just a chain of WIP WIP WIP WIP. It’s useless, irritating, and pointless.

One commit per logical change. One merge per larger conceptual change. I will rewrite my actual dev process so that individual commits can be reviewed as small, independent PRs when possible, and so that bigger PRs can be reviewed commit-by-commit to understand the whole. Because I care about my reviewers, and because I want to review code like this.

Care about your goddamn craft, even just a little bit.
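One lightweight way to get there (a sketch only; the repo, file, and commit messages below are invented for illustration) is to collapse the WIP chain into a single commit before opening the PR:

```shell
#!/bin/sh
# Sketch: collapse a chain of WIP commits into one reviewable commit.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo base > f; git add f; git commit -qm "initial"
echo a >> f; git commit -qam "WIP"
echo b >> f; git commit -qam "oops"
echo c >> f; git commit -qam "typo"

# Soft reset keeps all the work staged; recommit it as one logical change.
git reset -q --soft HEAD~3
git commit -qm "f: one logical change"

git log --oneline    # now just two commits: initial + the real one
```

`git rebase -i` does the same job when you want to keep several logical commits out of the pile rather than squashing everything into one.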


Isn't this just `--first-parent`? I think that should probably be the default in git. Maybe the only way this will happen is with a new SCM.

But the git authors are adamant that there's no convention for linearity, and somehow extended that into an argument against a "theirs" merge strategy to mirror "ours" (writing it out, it makes even less sense: "theirs" is what you'd want in a first-parent-linear repo, not "ours").
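A minimal sketch of what `--first-parent` gives you (the repo and commit messages are invented for illustration):

```shell
#!/bin/sh
# Sketch: --first-parent hides the individual commits of merged branches,
# showing only the mainline plus merge commits.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo one > file; git add file; git commit -qm "initial"
git checkout -qb feature
echo two >> file; git commit -qam "WIP"
echo three >> file; git commit -qam "WIP again"
git checkout -q -                          # back to the default branch
git merge -q --no-ff -m "merge feature" feature

git log --oneline | wc -l                  # 4: every commit, WIPs included
git log --first-parent --oneline | wc -l   # 2: initial + the merge
```

As for strategies: git does ship `-X theirs` as an option to the default strategy, but there is no standalone `-s theirs` to mirror `-s ours`, which is the asymmetry in question.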


I believe the parent is referring to how GNOME 3.0 had some really bad resizing grabs. Single-pixel widths at the edges, and almost impossible to hit corners.

Later versions significantly improved it.


This has been a major issue for me with Xfce and GNOME over the years; mostly I've just switched window managers.

Xfce is just ridiculous: it has a 1px-thin grab area, and last time I checked the developers just said you should use Alt+right-click instead.

I was about to suggest Xfce as an example where window resizing is effortless due to the <super>+<right click> behavior. You can just grab the rough sector of a window to resize it.

Any reason why you're not using it?


Fedora and Debian have been shipping RISC-V versions of stable releases for a while. I don't think anyone is really struggling.

Arch is, but Arch also has some woes producing even amd64_v3/v4 builds, arm64 aside.

It is not private; it is merely "reserved". If/when that range opens up for Internet addresses, you'll be in a world of hurt for having used it.

IPv6 is much clearer about what you can use: fc00::/7 is actually designated for private use.
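Python's stdlib `ipaddress` module happens to encode the same distinction, if you want a quick check (it reflects the current IANA registries, not any future IETF decision):

```python
import ipaddress

# 240.0.0.0/4 ("Class E") is reserved for future use, unlike the
# RFC 1918 private ranges such as 10.0.0.0/8.
print(ipaddress.ip_address("240.0.0.1").is_reserved)   # True
print(ipaddress.ip_address("10.0.0.1").is_reserved)    # False

# IPv6 unique local addresses (fc00::/7) are explicitly private use.
ula = ipaddress.ip_network("fc00::/7")
print(ipaddress.ip_address("fd12:3456:789a::1") in ula)          # True
print(ipaddress.ip_address("fd12:3456:789a::1").is_private)      # True
```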


That will never, ever happen. Making 240/4 public would break Amazon (and many others), which use it privately. The software updates needed to route it across the net would be taxing. When making it public was suggested years ago, the IETF saw the proposal as encouraging IPv4 and refused to entertain it.

In short: The market has already decided and it's private. It's far from the first time an unofficial arrangement is the de facto standard.


MAME intentionally avoids making any release "1.0": the project's goal is to "emulate everything", which is effectively impossible to reach, so the version number is never going to be set at 1.0.


Witcher 2 had a Linux native build, but never Witcher 3.


The problem is the word "emulator" itself. It's a very flexible word in English, but applied to computing it very often implies emulating foreign hardware in software, which is inherently slow. Wine doesn't do that and was wise to step away from the connotation.


No. Ruby exists.


Ruby is now faster than Python, last I saw a comparison, though it used to be the other way around.


You might as well use Lazarus and LCL. It'll give the best of all worlds.

