
Except social media feeds are designed to addict. A smoker will spend their time smoking instead of not smoking. Does that mean smoking is good? Why else would they do it, if not smoking were better? It's not that simple. When we blame the users, we forget that tech monopolies are spending billions to engineer systems that steal our time.


Seems like these are the Asahi Linux articles this was taken from:

https://asahilinux.org/2023/03/road-to-vulkan/

https://asahilinux.org/2022/11/tales-of-the-m1-gpu/

Also, if you're looking for quality content about GPUs, this series of blogposts is a little outdated but still relevant:

https://fgiesen.wordpress.com/2011/07/09/a-trip-through-the-...

Seems like this is someone who just started learning GPU programming and is throwing stuff onto the internet. See their [recent blogpost, which is just a list of unwritten Vulkan tutorials](https://hyeondg.org/vulkan_tutorial/0).


Funny thing about that `n[static M]` array-checking syntax: it was considered bad even in 1999, when it was included:

"There was a unanimous vote that the feature is ugly, and a good consensus that its incorporation into the standard at the 11th hour was an unfortunate decision." - Raymond Mak (Canada C Working Group), https://www.open-std.org/jtc1/sc22/wg14/www/docs/dr_205.htm


It wasn't considered bad; it was considered ugly, and in the given context that's a major difference. The alternative proposed in that post is, to me, even uglier, so I would have agreed with the option that received the most support: to leave it as it was.


It was always considered bad not (just) because it's ugly, but because it hides potential problems and adds no safety at all: a `[static N]` parameter tells the compiler that the parameter will never be NULL, but the function can still be called with a NULL pointer anyway.

That is the current state of both gcc and clang: they will both happily, without warnings, pass a NULL pointer to a function with a `[static N]` parameter, and then REMOVE ANY NULL CHECK from the function, because the argument can't possibly be NULL according to the function signature, so the check is obviously redundant.

See the example in [1]: note that in the assembly of `f1` the NULL check is removed, while it's present in the "unsafe" `f2`, which makes `f2` actually the safer of the two.

Also note that gcc will at least tell you that the check in `f1()` is "useless" (yet no warning about `g()` calling it with a pointer that could be NULL), while clang sees nothing wrong at all.

[1] https://godbolt.org/z/ba6rxc8W5
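
For reference, a minimal sketch of the kind of code in [1] (the function names match the thread, but the bodies are my reconstruction, not the linked example verbatim):

  #include <stddef.h>

  /* `[static 1]` promises the argument points to at least one valid int,
     so both gcc and clang treat the NULL check as dead code and delete
     it under optimization. */
  int f1(int p[static 1]) {
      if (p == NULL) return -1;  /* silently removed */
      return p[0];
  }

  /* Plain pointer parameter: no promise is made, so the check survives. */
  int f2(int *p) {
      if (p == NULL) return -1;
      return p[0];
  }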


Interesting, I wasn't aware of that and thought the compiler would at least throw up a warning if it had seen that function prototype.


It's not intuitive, although it arguably conforms to the general C philosophy of not getting in the way unless the code has no chance of being right.

For example, both compilers do complain if you try to pass a literal NULL to `f1` (because that can't possibly be right), the same way they warn about division by a literal zero but give no warnings about dividing by a number that is not known to be nonzero.
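
For instance, continuing the `f1`/`g` example from upthread (a sketch, not the exact godbolt code):

  #include <stddef.h>

  int f1(int p[static 1]);  /* as defined above */

  int g_literal(void) {
      return f1(NULL);  /* both gcc and clang complain: a literal NULL
                           can't possibly satisfy [static 1] */
  }

  int g_runtime(int *p) {
      return f1(p);     /* no warning, even though p may be NULL at runtime */
  }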


Right, so if the value is known at compile time it will flag the error, but if it only appears at runtime it will happily consume the NULL and wreak whatever havoc that leads to further down the line. OK, thank you for pointing this out; I must have held that misconception for a really long time.


Note that the point of [static N] and [N] is to enforce type safety for "internal code". Any external, ABI-facing code should not use it, and arguably there should be a lint/warning for its usage across an untrusted interface.

Inside a project that's all compiled together, however, it tends to work as expected. It's just that you must make sure your nullable pointers are being checked (which of course one can enforce with annotations in C).

TLDR: explicit non-null pointers work just fine, but you shouldn't use them on external interfaces, and if you do use them, you should annotate and/or explicitly check your nullable pointers as soon as they cross your external interfaces.
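
For example, with clang's nullability qualifiers (one flavor of the annotations I mean; `__attribute__((nonnull))` is another):

  /* -Wnullability-completeness and -fsanitize=nullability can then be
     used to enforce the contract at compile time and run time. */
  int sum_n(const int *_Nonnull xs, int n);    /* caller must not pass NULL */
  int try_sum(const int *_Nullable xs, int n); /* callee must check first  */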


Wow, that's crazy. Does anyone have context on why they didn't fix this by either disallowing NULL or not treating the pointer as non-nullable? I'm assuming there is code that was expecting this not to error, but the combination really seems like a bug, not just a sharp edge.


Treating the pointer as non-nullable is precisely the point of the feature, though. By letting the compiler know that there are at least N elements there, it can do things like moving that read around, or even prefetching, if that makes the most sense.
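
A sketch of what that enables (my own example, not from the thread):

  /* With [static 4] the compiler knows all four reads are safe, so it's
     free to reorder or hoist them, or fold them into a single wide load. */
  int sum4(const int v[static 4]) {
      return v[0] + v[1] + v[2] + v[3];
  }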


Indeed, at a minimum you should be able to enforce that check using a compiler flag.


You can add that check using -fsanitize=null (and you may want to turn the diagnostic into a run-time trap).


The "development time is more important than performance" motto treats bad performance as the problem with software, when in reality poor performance is a symptom of growing software complexity. I'm sure that each individual coder who has contributed to software bloat believed that their addition was reasonable. I'm sure everyone who has advocated for microservices fully believes they are solving a real-world problem. The issue is that we don't treat complexity as a problem in-and-of-itself.

In physical disciplines, like mechanical engineering, civil engineering, or even industrial design, there is a natural push towards simplicity. Each new revision is slimmer & more unified–more beautiful because it gets closer to being a perfect object that does exactly what it needs to do, and nothing extra. But in software, possibly because it's difficult to see into a computer, we don't have the drive for simplicity. Each new LLVM binary is bigger than the last, each new HTML spec longer, each new JavaScript framework more abstract, each new Windows revision more bloated.

The result is that it's hard to do basic things. It's hard to draw to the screen manually because the graphics standards have grown so complicated & splintered. So you build a web app, but it's hard to do that from scratch because the pure JS DOM APIs aren't designed for app design. So you adopt a framework, which itself is buried under years of cruft and legacy decisions. This is the situation in many areas of computer science–abstractions on top of abstractions and within abstractions, like some complexity fractal from hell. Yes, each layer fixes a problem. But all together, they create a new problem. Some software bloat is OK, but all software bloat is bad.

Security, accessibility, and robustness are great goals, but if we want to build great software, we can't just tack these features on. We need to solve the difficult problem of fitting in these requirements without making the software much more complex. As engineers, we need to build a culture around being disciplined about simplicity. As humans, we need to support engineering efforts that aren't bogged down by corporate politics.


We do see the same in physical engineering. At some point some products plateau and there's no more development. But you still need to sell more: the designers need to get paid, the products are used as status symbols, and so on.

One example is skirt length. Fashion's only constant is change. If everybody's wearing short skirts, then longer skirts need to be launched in fashion magazines, manufactured, and sent to shops in order to sell more. The actual products haven't functionally changed in centuries.


Or people just start ignoring the trends and only replace their clothes when they wear out.


I don't think fashion trends are comparable. They're fine in concept: things get old and we switch things up. It's the way the human superorganism evolves new ideas. Unfortunately, capitalism accelerates these changes to an unreasonable pace, but even in Star Trek communism, people get bored. The cultural energy that birthed one style is no longer present; we always need something new that appeals to the current time.

But clothes still have to look nice. Fashion designers have a motivation to make clothes that serve their purpose elegantly. Inelegance would be adding metal rails to a skirt so that you could extend its length at will. Sure, the new object has a new function, and its designer might feel clever, but it is uglier. But ugly software and beautiful software often look the same. So software trends end up being ugly, because no one involved had an eye for beauty.


Speaking of using custom CSS with YouTube, I do the following for my experience:

- Completely hide the recommended tab

- Make every thumbnail grayscale (to mitigate eye-catching thumbnails)

- Make every video title lowercase (to mitigate eye-catching titles)

Here's my code, although I have to update it every once in a while when YouTube changes:

  yt-thumbnail-view-model { filter: grayscale(); }
  h3[title] { text-transform: lowercase; }
  .ytd-watch-flexy #secondary { display: none !important; }
It's amazing how much difference a couple of small changes can make to your browsing experience. The companies that own these products have a huge incentive to make every element purposefully addictive. I've also patched the iOS Instagram app to remove all Reels (using FLEXtool & Sideloadly), so I can keep up with my friends without falling into the traps. As developers, we have the ability to target these manipulative tactics and remove them, and I encourage you to do this as much as possible.


If you disable History, it automatically removes Recommendations across your devices.


What do you get instead of recommendations? Random junk or just nothing?

I find YouTube recommendations very useful. I only get what I'm interested in or adjacent topics, no junk, no ragebait.


Nothing, you will get an empty start page.


The issue for me is that I really want History (sometimes I need to go back to a video I know I watched 3 weeks ago). It's bs that they need history disabled to also disable recommendations.


Can you explain how you patched the iOS IG app? Seems massively useful if it's not too much of a pain. Please share!


I think CSS tools like that appeal to people who learned web development in a kind of ad-hoc way. When I first started, I just wanted to make designs I had in my head. I kind of went from "header, text, and image on the page" to "how do I center this?", "how do I change this color?", "how do I space these elements out?" It wasn't long before I had developed a toolkit of CSS ideas, but once you do that, you lose out on a lot of the finer details that make CSS work well. I knew how to work around weird issues using position: absolute and transform, but I wasn't familiar with block formatting contexts, or the intricacies of the box model. When all of your CSS knowledge is just band-aids placed on your other shoddy CSS knowledge, you're running on fumes. At that point, you can imagine the appeal of grabbing a prebuilt toolkit of composable styles that takes away your access to a lot of the available CSS footguns.

What changed things for me was reading a short online book-style series about learning HTML/CSS from the ground up. It introduced everything from first principles and explained why things were the way they were. They didn't just give you their "top 10 ways to center a div" and ask you to leave. I read the whole thing in an afternoon and it changed the way I think about web development. For the life of me, I can't remember what the book was called. If anyone's read something similar, I'd love a link; it was a while ago now and I'd still like to refer back to it. I specifically remember them saying "display: block is like a word document, and display: flex is like how you'd expect things to work," which illuminated a lot for me, not just about the display property, but about the way HTML & CSS were designed generally.


> What changed things for me was reading a short online book-style series about learning HTML/CSS from the ground up.

Do you remember which one? Your experience sounds a lot like mine and I’d love to learn CSS from first principles.


Unfortunately, no :( I was hoping someone would recognize the book and reply...


That's a really good way to think about it! (Bonus points for ASCII diagrams.) IIRC, I had some similar visualizations in the article but I cut them. And nice paper, interesting way to solve that problem.


JavaScript! It's not perfect, but it's the easiest language to run in the browser :)


OK, I see. Making it work in a browser sure is a benefit.


Original author here. I've been reading this website for years. Imagine my shock when I saw my own article on the front page! I'm glad people are enjoying it.

Quick fact about the way the interactivity is done: all of the code for it is in this blogpost.js file, https://aaaa.sh/creatures/blogpost.js, which is only about 100 lines long. Each block has a list of scripts that it pulls from, like so:

<div class="code-example" scripts="grid-sm 2d-vector-gfx-lib draw-grid full-algo-intro feather-canvas-edges"></div>

and then there's a set of script tags with those ids. I figured it was a nice solution!


Really pretty outputs! And impressive that the code is shown so elegantly and also interactive.

This algorithm is one I use to demo some features in a language I'm making called calculang [0][1]. I like the way you step through the logic.

My only suggestion would be to include a raycasted scene because you can really grab a wide(r) audience (and I'd love to see how you get to voxel scenes!).

Either way, thanks for adding a neat new resource for this algorithm; I'm definitely taking notes re: your clean presentation of the details!

[0] https://next-calculang-gallery.netlify.app/raycasting

[1] https://www.youtube.com/watch?v=hKVXRACCnqU


Haha, I don't know why I didn't think to include a raycast scene, especially since that's what I was using the algorithm for!! Glad you liked it.


This is great. You should add an og:image / social image using one of the interactive bits to help with sharing on other platforms / discord etc!


Great job, thanks for breaking it down the way you did.


That's just really fantastic. Well done, indeed.


For me, the point of writing something in C is portability. There were C compilers 30 years ago, there are C compilers now, and there will almost certainly be C compilers 30 years from now. If I want to write a good, portable library that's going to be useful for a long time, I'll do it in C. This is, at least, the standard in many gamedev circles (see: libsdl, libfreetype, the stb_* libraries). Under that expectation, I write to a standard, not a compiler.


The bad news is that C23 broke a lot of existing code [1]. We had to do a lot of work in Fedora to fix the resulting mess.

[1] https://gcc.gnu.org/gcc-15/porting_to.html#c23


For the cases in the linked doc, does adding -std=gnu17 to packages not suffice?

I would consider the union initializer change (which requires adding -fzero-init-padding-bits=unions for the old behavior) much more hidden and dangerous, and it's not even directly related to the ISO C23 standard.
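
A sketch of the change as I understand it from the GCC 15 porting notes (the union and its members are my own example):

  union u { char tag; long payload; };

  /* GCC 15: `{ 0 }` initializes only the first member (`tag`); the rest
     of the union's bytes are no longer guaranteed to be zeroed. Older
     GCC releases zeroed the whole union. Building with
     -fzero-init-padding-bits=unions restores the old behavior. */
  union u x = { 0 };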


It's true that it does, yes. However, that would still require changes to the build system. In any case, for the vast majority of the packages we decided to fix the code (if you think this is a fix!).


>if you think this is a fix

I would count it as doing maintenance work for the upstream; kudos for doing this!


> Note that the bool type is not the same as int at ABI level,

Huh. Wonder what the benefits of that are. Bools being ints never struck me as a serious problem. I wonder if this catches people who accidentally assign rather than compare tho....


To access any post-1970s operating system feature (e.g. async IO or virtual memory), you already cannot use standard C (and POSIX is not the C stdlib).

The libraries you listed are all full of platform-specific code, and they also have plenty of compiler-specific code behind ifdefs (for instance, the stb headers have MSVC-specific declspec declarations in them).

E.g. there is hardly any real-world C code out there that is 'pure standard C'; if the code compiles on different compilers and for different target platforms, that's because the code specifically supports those compilers and target platforms.
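
For example, a pattern you'll find in many such headers (a generic sketch; the exact macro names vary from library to library):

  /* Compiler-specific symbol-visibility boilerplate hidden behind ifdefs. */
  #if defined(_MSC_VER)
  #  define MYLIB_API __declspec(dllexport)
  #elif defined(__GNUC__)
  #  define MYLIB_API __attribute__((visibility("default")))
  #else
  #  define MYLIB_API
  #endif

  MYLIB_API void mylib_init(void);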


My argument is that using these non-standard extensions to do important things like memory management in a C library is malpractice: it effectively locks the library to specific C compilers. I'm sure that's fine if you're writing to clang specifically, but at that point, you can just write C++. libfreetype & the stb_* libraries are used, and continue to be used, because they can be relied on to be portable, and using compiler-specific extensions (without ifdefs) defeats that. If I relied on a clang-specific `defer`, I'd be preventing my library from being compiled by a future C compiler, let alone the compilers that exist now. To me, that's the point of writing C instead of C++ for a library (unless you're just a fan of the simplicity, which is more of an ideological, opinion-based reason).


If I touch C, it's to have control over allocations and memory layout, and to wrap low-level code into functions I can call from other languages.

I'd target the latest C standard and wouldn't even care to know how many old, niche compilers I'm leaving out. These are vastly different uses for C, and obviously your goals drastically change which standard or compiler you target.

