Hacker News | dllu's comments

That reminds me of Paquerette Down the Bunburrows [1], a pathfinding game where the bunnies pathfind to run away from you. It's not exactly what you described, but it is very fun and surprisingly deep and challenging.

[1] https://store.steampowered.com/app/1628610/Paquerette_Down_t...


You can think of it as: linear regression models noise only in y and not in x, whereas the leading PCA eigenvector (total least squares) models noise in both x and y.
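
A minimal numpy sketch of the difference (the data here is synthetic, and this is just an illustration, not a production-grade fit): ordinary least squares minimizes vertical residuals only, so noise in x biases its slope toward zero, while the leading PCA eigenvector minimizes perpendicular distance and treats noise in both axes symmetrically.

```python
import numpy as np

rng = np.random.default_rng(0)
true_slope = 2.0
t = np.linspace(0, 10, 200)

# Add noise of equal magnitude to BOTH x and y.
x = t + rng.normal(0, 1.0, t.shape)
y = true_slope * t + rng.normal(0, 1.0, t.shape)

# Ordinary least squares: models noise in y only.
ols_slope = np.polyfit(x, y, 1)[0]

# Total least squares via PCA: the leading eigenvector of the
# centered data's scatter matrix models noise in both axes.
X = np.column_stack([x - x.mean(), y - y.mean()])
_, eigvecs = np.linalg.eigh(X.T @ X)   # eigenvalues ascending
v = eigvecs[:, -1]                     # eigenvector of largest eigenvalue
tls_slope = v[1] / v[0]

# OLS is attenuated by the x noise; TLS stays near the true slope.
print(ols_slope, tls_slope)
```

With equal noise in both axes, the OLS estimate comes out noticeably below 2 (attenuation bias), while the PCA/TLS slope lands close to the true value.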


That brings up an interesting issue, which is that many systems do have more noise in y than in x. For instance, time series data from an analog-to-digital converter, where time is based on a crystal oscillator.


Well yeah, x is specifically the thing you control, y is the thing you don't. For all but the most trivial systems, y will be influenced by something besides x which will be a source of noise no matter how accurately you measure. Noise in x is purely due to setup error. If your x noise was greater than your y noise, you generally wouldn't bother taking the measurement in the first place.


> If your x noise was greater than your y noise, you generally wouldn't bother taking the measurement in the first place.

Why not? You could still do inference in this case.


You could, and maybe sometimes you would, but generally you won't. If at all possible, it makes a lot more sense to reduce the x noise, either by improving your setup or by changing your x to something you can better control.


This fact underlies a lot of causal inference.


I’m not an SME here and would love to hear more about this.


So when fitting a trend, e.g. for data analytics, should we use eigenvector of the PCA instead of linear regression?


(Generalized) linear models have a straightforward probabilistic interpretation -- E(Y|X) -- which I don't think is true of total least squares. So it's more of an engineering solution to the problem, and in statistics you'd be more likely to go for other methods such as regression calibration to deal with measurement error in the independent variables.


Is there any way to improve upon the fit if we know that e.g. y is n times as noisy as x? Or more generally, if we know the (approximate) noise distribution for each free variable?


> Or more generally, if we know the (approximate) noise distribution for each free variable?

This was a thing 30 odd years ago in radiometric spectrometry surveying.

The X var was the time slot, a sequence of (say) one-second observation accumulation windows; the Yn vars were 256 (or 512, etc.) sections of the observable ground gamma-ray spectrum (many low-energy counts from the ground: uranium, thorium, potassium, and associated breakdown daughter products; some high-energy counts from the infinite cosmic background that made it through the radiation belts and atmosphere to near-surface altitudes).

There was a primary NASVD (Noise Adjusted SVD) algorithm (a simple variance adjustment based on expected gamma event distributions by energy level) and a number of tweaks and variations based on how much other knowledge seemed relevant (broad-area geology, radon expression by time of day, etc.).

See, e.g.: Improved NASVD smoothing of airborne gamma-ray spectra, Minty / McFadden (1998) - https://connectsci.au/eg/article-abstract/29/4/516/80344/Imp...


Yeah, you can generally "whiten" the problem by rescaling each axis until the noise variance is the same in each dimension [1]. What you describe is the case where x and y have a covariance matrix like

    [ σ², 0;
      0,  (nσ)² ]
but whitening works in general for any arbitrary covariance matrix too.

[1] https://en.wikipedia.org/wiki/Whitening_transformation
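
As a concrete sketch for the diagonal case (synthetic data; this assumes the noise standard deviations are known): divide each axis by its noise standard deviation, fit the TLS/PCA line in the whitened space, then map the slope back to the original units.

```python
import numpy as np

rng = np.random.default_rng(1)
sx, n = 0.5, 3.0          # x noise stddev; y is n times as noisy
sy = n * sx

t = np.linspace(0, 10, 500)
x = t + rng.normal(0, sx, t.shape)
y = 2.0 * t + rng.normal(0, sy, t.shape)

# Whiten: scale each axis by its inverse noise stddev,
# so both axes have unit noise variance.
xw, yw = x / sx, y / sy

# TLS fit in the whitened space via the leading eigenvector.
X = np.column_stack([xw - xw.mean(), yw - yw.mean()])
_, eigvecs = np.linalg.eigh(X.T @ X)
v = eigvecs[:, -1]
slope_whitened = v[1] / v[0]

# Undo the whitening to recover the slope in the original units.
slope = slope_whitened * sy / sx
print(slope)
```

Despite y being three times noisier than x, the whitened TLS fit recovers a slope close to the true value of 2.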


It might be cool to train a neural network by minimizing error under the assumption that there's noise on both inputs and outputs.


Ah yeah, I recently changed the url scheme for my blog posts, from

    https://daniel.lawrence.lu/blog/y2025m09d21
to

    https://daniel.lawrence.lu/blog/2025-09-21-line-scan-camera-image-processing/
so maybe that's why OP didn't realize that it had already been posted recently. With the older scheme, not only was the SEO bad, but it was really hard to remember which date corresponds to which blog post, and people could easily brute-force search for my hidden (unpublished) blog posts.


The gain is fixed. I think the column variation arises from unstable oscillator frequency and maybe some electrical bug/crosstalk between pixels. Not sure exactly.


When converting video to gif, I always use palettegen, e.g.

    ffmpeg -i input.mp4 -filter_complex "fps=15,scale=640:-2:flags=lanczos,split[a][b];[a]palettegen=reserve_transparent=off[p];[b][p]paletteuse=dither=sierra2_4a" -loop 0 output.gif
See also: this blog post from 10 years ago [1]

[1] https://blog.pkh.me/p/21-high-quality-gif-with-ffmpeg.html


In many cases today, "gif" is a misnomer anyway and mp4 is a better choice. Not always, though: not everywhere supports actual video.

But one case I see often: If you’re making a website with an animated gif that’s actually a .gif file, try it as an mp4 - smaller, smoother, proper colors, can still autoplay fine.


I've been thinking of integrating pngquant as an ffmpeg filter; it would make it possible to generate even better palettes. That would get ffmpeg on par with gifski.


Does ffmpeg's gif processing support palette-per-frame yet? Last time I compared them (years ago, maybe not long after that blog post), this was a key benefit of gifski allowing it to get better results for the same filesize in many cases (not all, particularly small images, as the total size of the palette information can be significant).


I use `split[s0][s1];[s0]palettegen=max_colors=64[p];[s1][p]paletteuse=dither=bayer` personally. Limiting the number of colors is a great way to transparently (up to a certain point; try different values) improve compression, as is bayer (ordered) dithering, which is almost mandatory to keep output file sizes from exploding.
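
A full command built around that filter might look like this (hypothetical filenames; `max_colors` and the optional `bayer_scale` paletteuse parameter are knobs to tune, trading dither-pattern visibility against banding and file size):

```shell
ffmpeg -i input.mp4 -filter_complex \
  "fps=15,scale=480:-2:flags=lanczos,split[s0][s1];\
[s0]palettegen=max_colors=64[p];\
[s1][p]paletteuse=dither=bayer:bayer_scale=3" \
  -loop 0 output.gif
```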


Gifski (https://gif.ski/) might be a good alternative to look to that's gif-pallete aware.


It’s a shame this isn’t the default.


Those command flags just roll off the tongue like two old friends catching up!

/s


Agreed. On the computer hardware side:

* x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance and power efficiency

* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's to hoping that they "turn over a new leaf" with the X2.

Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great: there were some weird regressions, and newer chips have caught up anyway [1].

On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points. Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.

[1] https://www.phoronix.com/review/snapdragon-x-elite-linux-eoy...

[2] https://news.ycombinator.com/item?id=46375174


I have no faith in Qualcomm to make even basic gestures towards the Linux community.

All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.

A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.


The fact is that Linux support has accelerated heavily, from both Qualcomm and from Linaro on their behalf. Anyone who watches the Linux ARM mailing lists can attest to that.

Things have definitely changed, a lot.


The hardware has already been out for a year. Outside a custom spin by the Ubuntu folks, even last year's notebooks aren't well supported out of the box on Linux. I have a Yoga Slim 7x and I tried the Ubuntu spin at some point: it required me to first extract the firmware from the Windows partition because Qualcomm had not upstreamed it into linux-firmware. Hard to take Qualcomm seriously when the situation is like this.


Qualcomm _does_ upstream all their firmware, but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load. This is an actual security feature, believe it or not. Besides, chances are it wasn't even Qualcomm's firmware, but rather Cirrus for sound or display firmware, etc.

I get the hate on Qualcomm, but you're really one LLM question away from understanding why they do this. I should know, I was also getting frustrated before I read up on this.


I get where you're coming from, but I think the job of a company pushing a platform is to make it "boring", i.e. it should work out of the box on Debian/Fedora/Arch/Ubuntu. The platform vendor (Qualcomm) is the only one with enough sway to push the different laptop manufacturers to do the right thing. This is the reason why both Intel and Windows push compliance suites with a long list of requirements before anyone can put the Windows or Intel logo on their device. If Qualcomm is going to let Acer / Lenovo decide whether things work out of the box on Linux, then it's never going to happen.


Fantastic.

Can you please let me know if there is an ISO to get any mainstream Linux distro working on this Snapdragon laptop?

ASUS - Vivobook 14 14" FHD+ Laptop - Copilot+ PC - Snapdragon X

It's on sale for $350 at Best buy and if I can get Linux working on it it would definitely be an awesome gift for myself.

Even if there's some progress being made, it's still nearly impossible to install a typical Linux distro on one of these. I've been watching this space since the Snapdragon laptops were announced. Tuxedo giving up and canceling their Snapdragon Linux laptop doesn't instill much confidence.


There's an Ubuntu release specifically targeting new Qualcomm Elite based laptops: https://discourse.ubuntu.com/t/ubuntu-concept-snapdragon-x-e...

This includes Vivobook S15, not sure about the 14.


That covers the Elite, not the cheaper Snapdragon X laptops such as the ASUS Vivobook 14 (X1407QA).

I've followed that thread for almost a year. It's a maze of hardware issues and poor compatibility.

From your other response.

>but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load.

This makes the install process impossible without an existing Windows install. It's easier to say it doesn't work and move on.

It's going to be significantly easier to buy an x86 laptop and run Linux on it.

Not to mention that no out-of-the-box Linux Snapdragon Elite laptop exists. It's a shame, because it would probably be an amazing product.


This sounds a lot like how AMD's approach supposedly changed on Linux, and still everyone I know who wants to use their GPU fully uses Nvidia. For a decade or more I've heard how AMD has turned over a new leaf and their drivers are so much better. Even geohot was going to undercut Nvidia by just selling tinygrad boxes on AMD.

Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.

AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.


Maybe it's just that you're mostly viewing this through the LLM lens?

I remember having to fight with fglrx, AMD's proprietary Linux driver, for hours on end, just to get hardware acceleration for my desktop going! That driver was so unbearable I bought Nvidia just because I wanted their proprietary driver. It cut the fiddling time from many hours to maybe 1 or 2!

Nowadays, I run AMD because their open-source amdgpu driver means I just plonk the card into the system, and that's it. I've had to fiddle with the driver exactly zero times. The last time I used Nvidia is the distant past for me. So - for me, their drivers are indeed "so much better". But my usecase is sysadmin work and occasional gaming through Steam / Proton. I ran LMStudio through ROCm, too, a few times. Worked fine, but I guess that's very much not representative for whatever people do with MI300 / H100.


> and occasional gaming through Steam / Proton

And how does that work on AMD? I know the Steam Deck is AMD but Valve could have tweaked the driver or proton for that particular GPU.


I've been playing lots of games on an AMD GPU (RX 7600) for about a year, and I can't remember a game that had graphical issues (e.g. driver bugs).

Something probably hasn't run at some point, but I can't remember what; it's more likely to be a Proton "issue". Your main problem will be some configuration of anti-cheat for some games.

My experience has been basically fantastic and no stress. Just check that games aren't installing some Linux build, which are inevitably extremely out of date and probably won't run. E.g.: Human Fall Flat (very old, won't run), Deus Ex: Mankind Divided (can't recall why, but I elected to install the Proton version; I think performance was poor or mouse control was funky).

I guess I don't play super-new games so YMMV there. Quick stuff I can recall, NMS, Dark Souls 1&2&3, Sekiro, Deep Rock Galactic, Halo MCC, Snow runner & Expeditions, Eurotruck, RDR1 (afaik 2 runs fine, just not got it yet), hard space ship breaker, vrising, Tombraider remaster (the first one and the new one), pacific drive, factorio, blue prince, ball x pit, dishonored uhhh - basically any kind of "small game" you could think of: exapunks, balatro, slay the spire, gwent rougemage, whatever. I know there were a bunch more I have forgotten that I played this year.

I actually can't think of a game that didn't work... Oh this is on Arch Linux, I imagine Debian etc would have issues with older Mesa, etc.


Works very well for me! YMMV maybe depending on the titles you play, but that would probably be more of a Proton issue than an AMD issue, I'd guess. I'm not a huge gamer, so take my experience with a grain of salt. But I've racked up almost 300 hours of Witcher3 with the HQ patch on a 4k TV display using my self-compiled Gentoo kernel, and it worked totally fine. A few other games, too. So there's that!


Don’t know what LLM lens is. I had an ATI card. Miserable. Fglrx awful. I’ve tried various AMDs over the last 15 years. All total garbage compared to nvidia. Throughout this period was consistently informed of new OSS drivers blah blah. Linus says “fuck nvidia”. AMD still rubbish.

Finally, now I have 6x4090 on one machine. Just works. 1x5090 on other. Just works. And everyone I know prefers N to A. Drivers proprietary. Result great. GPU responds well.


Well, I don't know why it didn't work out for you. But my AMD experience has improved fundamentally since the fglrx days, to the point where I prefer AMD over Nvidia. You said you don't know why people say that AMD has improved so much, but it definitely rings true for me.

I said "LLM lens" because you were talking about hardware typically used for number crunching, not graphics displays, like the MI300. So I was suggesting that the difference between what you hear online about the driver and your own experience might result from people like me mostly talking about the 2d / 3d acceleration side of things while the experience for ROCm and stuff is probably another story altogether.


I see. I see. I got tripped up by 'LLM' since I got the GPUs for diffusion models. Anyway, the whole thing sounds like the old days when I had Ubuntu Dapper Drake running flawlessly on my laptop and everyone was telling me Linux wasn't ready: it's an artifact of the hardware and some people have great support and others don't. Glad you do.


Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.


Isn't Google moving to Fuchsia?


I don't think these are mutually exclusive, they're just unifying ChromeOS and Android for now.


On bare metal or pKVM?


Fuchsia is dead sadly


It's very alive. It's being used for Google Nest Hub devices. Though for HN, that might as well be dead, it seems.


“Rumors of my death are greatly exaggerated”

Google folks pop up here and there and say it’s actively worked on. Unless you have more recent information, I believe the project is still alive.


I thought it was the opposite and that it would replace Linux for Google products


Citation needed? I don’t disbelieve you but I haven’t seen anything concrete.


I bought a refurb gen 4 thinkpad on amazon for like $350 and it arrived almost brand new.

Installed Arch, set up some commands to underclock the processor on login and easily boost it when I'm compiling.

Battery life is great but I'm not running a GUI either. Good machine for when I want to avoid distractions and just code.


My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.


FWIW if you buy new from Lenovo, getting a more high-res display has been an option for years.

I'm on the other side where I've been buying Thinkpads partly because of the display. Thinkpads have for a long time been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life and performance above moar pixels. Sure I want just one step above FHD so I can remote 1080p VMs and view vids in less than fullscreen at native resolution but 4K on a 14" is absolute overkill.

I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.

> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.

Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.


I have a 14” FHD panel (158 dpi) on an old (7 year) laptop and there’s more issues with low resolution icons and paddings than with font rendering. I wouldn’t mind more, but it’s not blurry.


I just learned on Reddit the other day that people replace those screens with third party panels, bought from AliExpress for peanuts. They use panelook.com to find a compatible one.


If you buy an X1 from Lenovo, the screen is definitely going to be better. If not, you can simply change the screen on most of the other models.


Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.


> x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance

Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?


Not in raw performance, no, but they're only beat out by i9s and the like, which are very power hungry. If you care even a little bit about performance per watt, the M series are far superior.

Have a look at Geekbench's results [1]. Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s in between.

And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad, is somewhat embarrassing on its face, and not for Apple.

[1] https://browser.geekbench.com/v6/cpu/singlecore


Yes, the M4 is still outperforming the desktop 9950X in single-threaded performance on several benchmarks like Geekbench and Cinebench 2024 [1]. Compared to the 9955HX, which is the same physical chip as the 9950X but lower clocked for mobile, the difference is slightly larger. But the 16 core 9950X is obviously much better than the base M4 (and even the 16 core M4 Max, which has only 12 P cores and 4 E cores) at multithreaded applications.

However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single thread performance.

[1] https://nanoreview.net/en/cpu-compare/apple-m4-8-cores-vs-am...


Does an i9 P-core or Ryzen 9X run on 3.9 W while posting on HN?


That's irrelevant to that claim being true or not. The fact that M series win in power efficiency is already addressed.


The closest laptop to MacBook quality is surprisingly the Microsoft Surface Laptop.

As to x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode. (Well, more accurately, since the world took notice; wide decode happened long before the M1.) It still likely won't match the M5, or even the M4, in single-threaded performance per watt, but hopefully it will come close.


  > Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
Ohh, thanks for that link; I was thinking about updating to the latest on my Asus S15, but I think I'll stick with the current Ubuntu concept for now... saved me some trouble!


Honor, strangely enough, doesn't make any effort to really support Linux.

The machine quality is pretty damn good, but Huawei machines are still better. Apple level of quality. And Huawei releases their machines with Linux preinstalled.

The company to watch is Wiko. It's their French spin-off to sidestep the chip ban. They might put out some very nice laptops, but it's a bit TBD.


Dealing with Honor support is a pain. They don't understand anything at all, and it's impossible to get them out of their script if you have a problem.

I have a Honor 200 pro, and the software is buggy and constantly replaces user configurations with their defaults every 3 or 4 days.

I would avoid anything Honor in the future at any cost.


> On the build quality side, basically all the PCs are still lagging behind Apple,

This is an oft-repeated meme, but not really true. Thinkpads, high-end lightweight gaming laptops like the Asus G14... There are many x86 laptops with excellent build quality.


Marcan (Hector Martin) resigned from Asahi Linux early this year [0].

Asahi Lina, who also did tons of work on the Asahi Linux GPU development, also quit as she doesn't feel safe doing Linux GPU work anymore [1].

[0] https://marcan.st/2025/02/resigning-as-asahi-linux-project-l...

[1] https://asahilina.net/luna-abuse/


marcan and Asahi Lina are the same person


GP's LKML link is very recent unlike your two links, implying something could've changed.


I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.


Apple does tons of optimizations for every component to improve battery life. Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.


Very neat. I recently went to the Waymo depot in Bayshore (Toland St) and snapped a couple of pictures of the new Zeekrs for Wikipedia.

[1] https://commons.wikimedia.org/wiki/File:Waymo_Zeekr_Vehicle_...

[2] https://commons.wikimedia.org/wiki/File:Waymo_Zeekr_Vehicle_...


The line scan photo of the Shinkansen train is amazing!

https://commons.wikimedia.org/wiki/File:Line_scan_photo_of_S... (If you're confused by that page, then so was I. It contains the full photo but it's so long that it gets compressed into a line a few pixels high.)


How is Google allowed to get Zeekrs? Are they pre-tariff, or do they have some corporate loophole?


Tariffs are easy: just pay them. Federal Motor Vehicle Safety Standards are harder... But maybe there's a loophole for commercial transport? Or maybe they paid to have the testing done?


They do have a loophole; they import them as kits and “build” them at a Magna facility in Arizona (similar to how early Sprinter vans were re-assembled in the US and sold as Freightliners). But, they are FMVSS compliant (besides steering wheel) and have had several NHTSA organized recalls like any other compliant car might.


Whoah cool, did not know Magna had a factory there!


> some loophole being corporate

Presumably they use the loophole called "paying the tariff".


Which isn’t even really that prohibitive because Chinese vehicles beat Western pricing by five figures.

Plus, all the sensor equipment is made in China anyway. There’s almost certainly no way to have it manufactured in the US.

On top of that, fleet sales don’t have to deal with the antiquated dealer network laws in the US.

And of course American market car manufacturers refuse to produce vehicles that are like this one: space efficient and reasonably sized, instead opting for gigantic bean shaped SUVs with sloping rear roofs that rob you of cargo space while taking up maximum curb real estate.


I would pay lots of money for an electric minivan (or van I guess) with removable seats that can fit a 4 ft x 8ft board.


Ford E-Transit is an electric van for a lot of money. But it looks like Ford wants to stop making them, and 2 seat models look much easier to find. But you'd be able to fit your board no problem.


I think the VW ID buzz is probably the closest thing you’ll find in the US.

Ignore the crazy high MSRP, they are selling poorly and you should be able to get one brand new or lightly used one in the 40s.

I think in a short couple of years they could be a steal on the used market.


Not sure if it's sold in the US (assuming you are from there), but the Kia PV5 is probably your best bet. On top of that it's very reasonably priced (in contrast to the ID buzz)


Not sold in the US.

On the bright side the ID Buzz is deeply discounted, it is really not selling.

The problem is that the Kia EV9 beats it in basically every spec at roughly the same price.


I know nothing about photography, but I really enjoyed your work, especially the line scan cable car and "1390 Market Street". Thanks for sharing.


How did you get in?


These were parked on Hudson Ave, which is a public street, and not inside the fenced area of the depot. So I just walked up to them.


Your Transamerica pyramid picture is incredible among really cool pictures you have there. Quite cool to photograph for wikipedia like this, the world needs more people like you!


Indeed. May I ask the GP, how did you produce those scrollable images of the Shinkansen?


It was done with a line scan camera. More on the technique here: https://daniel.lawrence.lu/blog/2025-09-21-line-scan-camera-...


This was a great read and would be worth a submission of its own if it hasn't been posted recently.


Thanks, I posted it 4 months ago already: https://news.ycombinator.com/item?id=44996938


Upvoted!


The fact that so many people use FFmpeg and QEMU suggests that he is quite good at documenting, collaborating, and at least making his code remarkably clean and easy to follow. This already puts him way ahead of the average Silicon Valley senior software engineer that I've worked with. However, he does value independence, so I don't think he would have been happy working at a FAANG-type company for long.


Not really. https://codecs.multimedia.cx/2022/12/ffhistory-fabrice-bella...

>Fabrice won International Obfuscated C Code Contest three times and you need a certain mindset to create code like that—which creeps into your other work. So despite his implementation of FFmpeg was fast-working, it was not very nice to debug or refactor, especially if you’re not Fabrice

