danilafe's comments | Hacker News

I'm over at https://danilafe.com.

It's a blog where I write mostly about compilers, formal verification, and programming languages. Occasionally some web design (with Hugo) sneaks in.


It doesn't have to be one or the other. Both ethical consumption and going vegetarian reduce one's environmental impact, and they're independent of one another. So, while someone "truly" optimizing for environmental impact would better spend their time avoiding meat, someone who enjoys meat can still reduce their environmental impact without becoming miserable. Variables like "income" and "environment" are just parts of the equation for the more important heuristic of happiness.

A lot of the activities on that list are like this. Reading the news has a non-zero impact (hey, I'm on HN, and it definitely helps me keep up to date), and it's "easy" in that it fits into my heuristic for happiness. Same with using a metal straw, and same with picking between credit cards.

In a sense, these activities are "free" in terms of their perceived difficulty, but have a positive, if small, impact. If they're "free", why not do them?


> going vegetarian reduce one's environmental impact

Mmm, yes and no.

It depends where your meat comes from. If you buy meat the way it's produced in the US, where you have great big sheds full of cattle in the desert with everything trucked in, then yes.

If you want permaculture, you absolutely must have livestock.

If you want arable farming of any sort, you absolutely must have livestock.

The whole thing breaks down very quickly if you don't have grass and clovers growing in fields, and ruminants eating them, breaking down the tough cellulose, and then shitting it out and trampling it in.


The amount of livestock we'd actually need in that case is probably around 5% of what we currently have (in the US). So the point still stands if half of all people became vegetarian and the rest cut their meat consumption to “special occasions only”.

Keep in mind that a lot of our current agriculture is growing feed for livestock as well, so we could cut back on plant farming by a huge amount as well, if we greatly reduced livestock.


> Keep in mind that a lot of our current agriculture is growing feed for livestock as well, so we could cut back on plant farming by a huge amount as well, if we greatly reduced livestock.

Yes and no.

We'd also cut down massively on the amount of food we have available for humans.

What would be better is if people stopped eating soya.


The number of cattle required to maintain pasture is far lower than what we have right now. From a CO2 perspective, factory-farmed cattle tend to look a little better than "free-range", mostly due to reduced land-use changes (though obviously worse from a cruelty perspective). Finally, we can still have farm animals without eating them!


You might be right, but I was taking that as a given, since the article made that claim. I think the general point (taking smaller actions in lieu of more effective but costly ones) matters more than any individual "vanity activity".


Won’t that work better without killing & eating them young?


They're not free because they consume your time, which is valuable.


Yes, but only if you would spend that time on something that is more valuable (according to your happiness+ heuristic).


Right, but I think that was the author's point: many of these activities are seen by their participants as "productive", rather than just "this makes me happy". That was a specific point of the post.


I keep seeing Ghostty in the news, and I've tried it, but it feels like just another terminal emulator to me. This is coming from someone who spends 90% of the workday in the terminal.

Asking in good faith -- could someone tell me what's special about Ghostty compared to alternatives?


Here is the full about page for it: https://ghostty.org/docs/about

Zero trolling when I say this: Two things also make it (more) popular on HN: (1) Mitchell Hashimoto (a well respected hacker who got rich, then kept on hacking) and (2) Zig. (Only Rust could attract more attention.)


To me it has a couple advantages over the other options on the market.

1. Feels 'native' and is built for each platform. This means I can use, for example, the familiar right-click context menus and tabs that I find in every other app. I have the option to use the mouse as well as the keyboard, which I appreciate.

2. It has sensible defaults with a "Zero Configuration Philosophy" meaning that many of the things I would usually need to fiddle with are already set.

3. It performs comparably to advanced terminal emulators such as kitty.

The combination of all three (and especially the first) is why I use it.


I tried it a few months ago, and didn't perceive any of those items as particular benefits.

1. Terminal emulators that feel 'native' are ubiquitous. Sure, there are also a lot that have idiosyncratic UIs, but I'm not generally using those -- my go-to when working within a GUI environment is xfce4-terminal, which is about as native as I can imagine, given that I'm using XFCE as my primary desktop environment.

2. Sensible defaults may be good for new users, but I already have my terminal emulators configured exactly as I like them, and my benchmark for switching from one tool to another within the same category isn't how welcoming it is to novice users out of the box; it's how easily I can adjust its configuration to match my long-established preferences. The "zero configuration" philosophy is actually a detriment here, as it leads to configurability being obscured to some extent.

3. When I tested it, its performance was worse than xfce4-terminal, both objectively and subjectively. Its memory consumption was higher and it felt laggier in responding to input.


What's an example of a thing you usually need to fiddle with? I feel like 1 and 3 are already true for my existing native terminal emulators, which I also never configure. So presumably there are things Ghostty does that aren't enabled by default on my other ones that I could take advantage of?


I've been using Kitty terminal and they are absolutely comparable in the feature set: GPU-based rendering, speed, customisation. OS-native controls in Ghostty are not very important to me, I like terminal being a terminal, not a GUI.


I've been using Tabby on MacOS but Kitty looks neat, I'll give it a go. Thanks for mentioning it!


Ghostty is faster than kitty (MBP M4 Max); I did not expect that.


It's good at what it does. Starts quickly, doesn't get bogged down when dumping large amounts of text into it, got some nice themes.

I find it a bit messy to build, but I'm not exactly a compile-binaries kind of person anymore, so it's probably a good sign that I still managed to figure it out. If stuff like Zig is your thing you'll probably enjoy this part.

My main terminal emulator is the bog slow but reliable Terminator, though in a while I'll probably flip the i3 commands and move over entirely to Ghostty.


Ghostty is quite slow to start on my Linux machine, very close to the first start of GNOME Terminal (or Terminator). Maybe because I'm on Wayland? Are you on X?


I had this issue a few years ago with certain applications and came to find out that it had to do specifically with them using GTK. I googled for it and found the fix, and after all the same apps started practically instantly. Could this be what you're running into?

(I haven't used ghostty so I wouldn't know whether it's actually fast to start up, but what you wrote reminded me about this particular issue.)


Maybe? I've tried removing xdg-desktop-portal-gnome as it seems to cause slow startup for other people but that doesn't seem to fix it.


Yeah, I'm still on X. I'll make the switch when I haven't heard a friend get really annoyed by something Wayland related in a year or so. It's a Debian system. Not sure if it matters but there's an RTX A500 in the machine.

So, perhaps? For a while I was on a local compile of 1.0.0, and a while ago I started pulling the nightly sources and building from those.


It's just better in every way than anything else I have ever tried (at least on a Mac). The author cares and it shows.

BTW, I recently discovered shaders and cursor_blaze is absolutely awesome.


It would be helpful if you actually listed the ways and the else


For one, it's way faster than both iTerm and Terminal.app, the two most used terminal apps on macOS.


Yes, it would.


Have you tried kitty?


As another person who spends the whole day in the terminal, it's sad to see there is no Windows version. I don't understand why I would need GPU acceleration for a terminal, but I would still try it.

I use a company-managed/provided machine that runs Windows, so I don't have to bother maintaining it. All I use is basically Firefox and MinGW for a bash shell.


I like that ghostty supports bitmap fonts. Kitty doesn’t (and won’t) support those.

I also like the “ghostty +list-themes” command and the splash page animation on their site.


I used iTerm2 a lot and configured it to my liking, but then I tried Ghostty out of curiosity and for some reason I stuck with it. I think it's just cleaner and leaner, and the default looks pretty cool and minimalistic; I don't really miss anything from iTerm.

Yes, it's just another terminal emulator, but a pretty solid one that just works.


Their most recent update replaces all this with a list of recently updated PRs and issues. I've been leaning on it heavily since it came out. One of the few recent changes that really feels like a clear improvement.


oh wow. I just had to press the "Try the new experience!" button about ten times for it to finally load the new experience, but I like it


As a sibling comment said, it's a C major chord, but voiced one note at a time. "Usually", in pop, you hear all the notes at once.


> but voiced one note at a time

I think OP's point is that the very definition of a chord is a bunch of notes played at the same time.


Whereas when played separately it would be referred to as an arpeggio. But in harmony we might still refer to it as a chord, as in saying "arpeggiate the C# minor (chord)" to start the Moonlight Sonata.

This might better be described as arpeggiating C#m in second inversion, or even C#m/G# in the right hand over C# in the left...

This is getting possibly-weird but you could call it an arpeggiation of G#sus4(#5)/C#


I think chords are at least three notes played at once, with the possible exception of power chords. Using your definition, every piece with two or more notes has chords :)

https://www.britannica.com/art/chord-music


As per my knowledge, and as per Britannica, a chord actually uses three or more notes. A two-note structure is called a dyad, which implies a bit of confusion in the term "power chord" (written as 5, as in G5, which == G D == 1 5)... as it is not by definition a chord but a dyad.

This may be a pedantic clarification, but that is the definition.


TBH "definition" depends on the theory from which you're looking at the notes.

In the eyes of the Common Practice two simultaneous notes are not chords; in rock they most definitely are; in EDM you don't even care, since timbre is all that matters; in jazz you'd say "it depends" (e.g. might even be a triad with an omitted 5th... depending on context!)

Music theory is too post-hoc.


I've had a reMarkable 2 since 2020 or so. To be honest, the only area of the device I have ever wanted to be hackable was the sync API. I am completely satisfied with the gestures, e-reader and pretty much everything else. But what I'd love to be able to do is to access my files, stored in the cloud, automatically. My use case in particular would be something that passively converts my scribbled annotations into other things.

The API hacking scene is very much dead. Most API implementations have been unmaintained for years now and no longer work. It's a real shame.


Indeed, what sucks the most is that you almost can't sync it locally in any convenient way without having to use "their" cloud.


This is a strange article to me. I've not seen any class that teaches Prolog place these constraints (use recursion / don't add new predicates) or even accidentally have the outcome of "making prolog look tedious". What's the joke here?

That aside, I wonder if getting the reverse solution (sum(?, 10)) is better served by the straightforward or the tedious approach. I suspect both would work just the same, but I'd be curious if anyone knows otherwise.
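
If I had to guess, something like the following sketch would do it (this assumes SWI-Prolog's clpfd library, which the article doesn't use, so treat it as speculation): swapping is/2 for #=/2 seems to let the straightforward version run "backwards":

  :- use_module(library(clpfd)).

  % Straightforward recursion, but with constraint arithmetic
  % instead of is/2, so the list may contain unbound variables.
  sum([], 0).
  sum([X|Xs], S) :- S #= X + S1, sum(Xs, S1).

Forwards, sum([1,2,3], S) gives S = 6; backwards, a query like

  ?- length(Ls, 2), Ls ins 0..10, sum(Ls, 10), label(Ls).

enumerates [0,10], [1,9], and so on. I suspect the tedious version would behave the same once its arithmetic goes through #=/2.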


It's been a long time since I took a class like this, but I definitely had a similar experience to the author.

Ideas like fold and map were _never_ mentioned in Lisp (to exaggerate: every function had to have the recursive implementation with 5 state variables, and then a simpler form for the initial call). At no point did higher-order functions or closures make an appearance while rotating a list by 1 and then 2 positions.

The treatment of Prolog was somehow worse. Often the code only made any sense once you reversed what the lecturer was saying, realizing the arrow meant "X given Y" not "X implies Y", at which point, if you could imagine the variables ran "backwards" (unification was not explained) the outcome might start to seem _possible_. I expect the lecturer was as baffled by their presentation as we were.

In general, it left the rest of the class believing quite strongly that languages other than Java were impossible to use and generally a bad move. I may have been relatively bitter in the course evaluation by the end.


The irony is palpable. I had the (mis)fortune of only being (mis)taught procedural languages, by professors who thought computers were big calculators that could never be understood, but could be bent to your will by writing more code and maybe by getting a weird grad student to help.

Patterns might appear to the enlightened on the zeroth or first instance, but even mortals must notice them after several goes. The magic of Lisp is that if you notice yourself doing anything more than once, you can go ahead and abstract it out.

Not everything needs to be lifted to the functional Valhalla of course, but not factoring out e.g. map and filter requires (imho) a wilful ignorance of the sort that no teacher should countenance. I think it's bad professional practice, bad pedagogy, and a bad time overall. I will die on this hill.


If you are only used to Java (the bad, old, ancient version), you don't even notice that you can factor out map and filter.


Having been on the other side of this (teaching undergrads), I do get why courses would be structured like this. If you actually try explaining multiple things, lots of students freeze up and absorb nothing. Certainly there’s a few motivated and curious students who are three lectures ahead of you, but if you design the class for them, 60% of students will just fail.

So I get why a professor wouldn’t jump in with maps and folds. First, you need to make students solve a simple problem, then another. At the third problem, they might start to notice a pattern - that’s when you say "gee, you're right, there must be a better way to do this", and introduce maps and folds. The top 10% of the class will have been rolling their eyes the whole time, thinking "well duh, this is so boring". But most students seem to need their hand held through this whole process. And today, of course, most students are probably just having LLMs do their homework and REALLY learning nothing.


Ah, yes. Like in the class where we learned Moscow ML, where loops don’t and variables ain’t, and Godspeed!


I found the article to be hilarious.

Its existence obviously means some professor didn't allow them to use the standard library, and they show, tongue in cheek, how irritating that is.

I'm sure it's possible they made it up, but we had similar restrictions: I could use an Arduino for my engineering Senior Design Project in college, but no Arduino or module libraries - just our own C. We passed, but barely. It was also weird, as it didn't match how normal people would solve the problem.


Yeah, this line made it click:

> Now this is starting to look like Professor of Programming Languages code!

A lot of tier 2/3 CS schools with programming language courses (a survey of multiple languages across various paradigms) teach Prolog this way (and I've seen some fantastically terrible examples of Lisp code coming out of those same courses). It's unfortunate because, at these schools, this is often the only formal course where students branch out of C/Java/C++/C#/Python/JS (and they may only get 2-3 of those over the rest of their courses). It leaves the students with a gross misunderstanding of the languages and dread when they see them again.


>I'm sure it's possible they made it up, but we had similar restrictions where I could use an Arduino for my engineering Senior Design Project in college, but no Arduino or module libraries - just our own C.

When I started compsci, it was the first year of an all-new curriculum.

Literally the first week: We had to program an Arduino to play a song on a speaker by toggling GPIO, within a week. No assembly or any of that high-level mumbo-jumbo. We had to read the datasheet, look at the instructions and write the hex representation for each instruction in a txt file.

We had a "linker" tool: It took a basic bootloader (containing some helper functions for GPIO), turned our hex-txt file into actual binary, then just copies that to a set address. The bootloader did nothing more than init the GPIO pins and jump into our code.

We were given the locations of where these helper functions lived, nothing else.

It was to give a quick and intimate understanding of how CPUs actually work and why we use compilers and stuff. It worked really well, but it was so much of a pain that they changed it to allow students to use asm a year or two later.


<Four Yorkshiremen accent>

You were lucky! We had to design and build the computer out of Z80-family components first before we could play around in binary.

There was actually a reason - this was back in '83/84 in a physics lab, and the idea was to be able to build your own logging / control device. PCs existed, but were terribly expensive.


> It also was weird as it didn't match how normal people would solve the problem.

If I can solve a problem by building from preexisting LEGO blocks then I'll probably do that, but it wouldn't be a valuable learning exercise. Students aren't being given problems in need of an effective solution; these problems are tools to facilitate understanding.

What you described could be pointless if it made you work on reimplementing trivial stuff that you'd already mastered long time ago instead of focusing on actual problem. Writing your 100th hashmap implementation or yet another I2C driver isn't particularly valuable. Since you mentioned "barely passing", I don't think that was the case though.


It's one of those projects where you never have enough time throughout the year. Coding what we did from scratch wasn't very easy either. I could now probably do the coding in a weekend with the right library code. You learn something either way I guess.


There's nothing wrong with teaching like that. If the class is supposed to teach you something that isn't just "use the stdlib", then it makes sense to ban it.


I'm guessing the author is aware and just venting. It doesn't make it less frustrating though. It's almost like someone asking you to dig a hole with chopsticks.


It reminds me of a satirical post called “Evolution of a Haskell Programmer” in which a nice simple implementation of the Fibonacci function becomes more and more convoluted, abstract, and academic [1].

[1] https://people.willamette.edu/~fruehr/haskell/evolution.html


*factorial function


Or maybe just a very bad implementation of the Fibonacci sequence.


I was extremely confused for a moment and pondered returning my MSc.


>getting the reverse solution (sum(?, 10))

Doing an "all modes" predicate:

  sum_list(List,Sum)
...where...

  sum_list([1,2,3], 6)
...is "True" as expected would probably be a pretty interesting exercise if you wanted to be able to get all of the lists. You'd probably need to do an "enumerate the rationals" type of thing, since you need to go to infinity in a couple of different directions at the same time. That is, you can't do nested loops/recursing:

   for(x=-∞; x<∞; ++x) {
       for(y=-∞; y<∞; ++y) {
i.e.

   sum_list([],0).
   sum_list([L|Ls],BigSum) :- sum_list(Ls, LittleSum), add(L,LittleSum,BigSum).
...with an all-modes "add/3". Since there is an infinite supply of pairs that add up to, say, 10:

       ⋮
    [12,-2]
    [11,-1]
    [10, 0]
    [ 9, 1]
    [ 8, 2]
       ⋮
    [ 0,10]
    [-1,11]
    [-2,12]
       ⋮
...and you can also go to arbitrary list lengths:

  [10]
  [1,9]
  [1,1,8]
  [1,1,1,7]
   ⋮
  [1,1,1,1,1,1,1,1,1,1]
   ⋮
  [0,1,1,1,1,1,1,1,1,1,1]
  [1,0,1,1,1,1,1,1,1,1,1]
   ⋮
  [1,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0,0]
   ⋮
  [-100,-99,-98,...,-12,-11,-9,-8,...,0,1,2,3,4,5,6,7,8,9,10,11,...,99,100]
...your life probably gets easier if you limit the domain to positive integers (without zero), since then the list length doesn't diverge. Then a declarative-looking solution probably just has an all-modes equivalent to "sort" somewhere within? Certainly interesting to think about.
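
To sketch that positive-integers idea concretely (using SWI-Prolog's clpfd rather than a hand-rolled all-modes add/3, so this is just one possible take): the X #>= 1 constraint bounds the list length, so a bounded enumeration terminates:

  :- use_module(library(clpfd)).

  pos_sum([], 0).
  pos_sum([X|Xs], S) :- X #>= 1, S #= X + S1, pos_sum(Xs, S1).

Then a query like

  ?- between(1, 10, N), length(Ls, N), pos_sum(Ls, 10), label(Ls).

walks every list of positive integers summing to 10, from [10] up through [1,1,1,1,1,1,1,1,1,1].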

https://www.cs.ox.ac.uk/people/jeremy.gibbons/publications/r...

https://prolog-lang.org/ImplementersForum/0103-modes.html

https://www.swi-prolog.org/pldoc/man?predicate=permutation%2...


Woah, it's amazing to hear from you in person.

> Gwern.net has it! It's just that because we use both margins already, there is usually not enough horizontal space.

Upon closer inspection, I do. I should say that I have missed it up until now, though. Some sort of selective blindness, probably.

> Chrome has a 'link to text fragment' feature for this which got standardized and which Firefox is supposed to support at some point. So not so necessary as it used to be. (The unique IDs are often not so unique anyway or unstable.)

Thank you for reminding me of this; I stumbled across it some time ago and was quite surprised. I myself use Firefox, so until it adds support for this feature, I don't think I'll be leaning on it myself (and thus will try to provide anchors).

I would say that another microfeature I love is having stable and unique element IDs :-)

> Links.js hasn't been used in years on Gwern.net.

Thank you, I was failed by GitHub search. I will update the post. I struggled to find the Lorem link when looking for it; I would've expected to at least find something on /about, but did not. I understand that it's a test page, but it does currently have the only description of what the icons mean, and as a reader I definitely could've used a description.


Thank you for your thoughtful comment!

> But again, the page doesn't make use of the feature it is extolling the virtues of!

I said I loved the feature, not that I had the energy to implement it -- though I definitely should. :-)

> Yes, though I might argue that such links, or a link to a ToC page for the group, belong in a more general list of related links.

This is a good idea, and in my ideal world my site (and others) would support this, but it's more difficult to do automatically. Stitching together a series TOC and connecting sequences of articles is easier to automate.

> I'd stay away from an icon per target site/type though - that could quickly get confusing as you essentially have many icons for the same thing (an external link), and every site will use its own preferred set of icons, so the difference won't help generally identify things at a glance.

Yes, I'm not sure that I'd advocate for this in general. Another issue is that I personally get a little confused by some of the icons (what does "SAMA" stand for, if you don't know about OpenAI / Sam Altman already?).

> I'd like all articles to carry an obvious date of publishing and/or last update

Definitely, though as others in the comments have suggested, it may be that Google downranks pages like this. Not that I care much for my search ranking.


> I'd like all articles to carry an obvious date of publishing and/or last update

> Definitely, though as others in the comments have suggested, it may be that Google downranks pages like this. Not that I care much for my search ranking.

I've heard the same, but I've not seen trustworthy documentation stating that it is an issue. It might be plausible that a date older than a certain point will make something be considered out of date, or more so than an article with a newer date, but if you think about it more deeply, it could be a bad indicator. How do you compare against articles without dates? I'd argue a dated article is better, all else being equal: the extra context makes it potentially more informative, some information doesn't date badly (at least not quickly), and how would the algorithms know what to do with pages containing multiple dates?


My site (OP) actually has a content graph as well, though it has a dedicated page. I didn't want to force users to run arbitrary JavaScript on every page just for aesthetics.

https://danilafe.com/graph/

