Yeah, it's fun how every few Google searches in iOS Safari, Google displays a full-page ad for Chrome that I have to dismiss before seeing the results.
Well the "solution" for that will be the GPU vendors focusing solely on B2B sales because it's more profitable, therefore keeping GPUs out of the hands of average consumers. There's leaks suggesting that nVidia will gradually hike the prices of their 5090 cards from $2000 to $5000 due to RAM price increases ( https://wccftech.com/geforce-rtx-5090-prices-to-soar-to-5000... ). At that point, why even bother with the R&D for newer consumer cards when you know that barely anyone will be able to afford them?
KDE started doing a similar thing in 2024. They pop up a notification asking for donations once yearly. Whether you click "Donate" or "No Thanks" on the pop-up, it will go away until the next year. I don't mind them doing this, as it clearly works (see https://pointieststick.com/2024/12/02/i-think-the-donation-n... and https://pointieststick.com/2025/12/28/highlights-from-2025/ ).

Historically, contributions to KDE mainly came from companies/government agencies funding work on specific technologies/parts of the desktop, and volunteers working on their special interests. This meant there was a giant blind spot for work on areas that weren't relevant for corporations/governments and weren't fun to work on in someone's free time. All the small individual donations make it possible for KDE to act independently of these large companies/government bodies and hire its own developers to work on tasks that may not be commercially relevant or fun, but are important to the project.
IMO it's only fine as long as it respects the user's choice and doesn't keep on asking. If I choose not to donate, do not nag me about it the next year either. If I choose to donate, do not remind me to do it again. I will do it myself if I decide to.
Perhaps it's cultural: where I live, repeatedly asking for money is highly frowned upon and only lowers the reputation of the non-profit doing it. The non-profits that only ask once are much more likely to receive multiple donations from the same person.
> It's 2025 and I would have expected the linux foundation or canonical to at least create a label "linux compatible" or "linux tested", so that brands can license it, and maybe spend money to collaborate with hardware vendors so they can write correct drivers, but that has not happened.
A few distros do have something like this. Ubuntu has the "Ubuntu Certified" program https://ubuntu.com/certified and Fedora has "Fedora Ready" https://docs.fedoraproject.org/en-US/marketing/ready/list/ . For a situation like this, that doesn't really matter though. Linux does run on the laptop and Lenovo does officially support running Linux on it. If there's a problem with the CPU scheduling or something for that line of processors, Intel would have to fix it, not Lenovo.
> Open source/linux folks are so politicized against capitalism, proprietary software and patents that they excluded themselves from the economy. Only valve and the steam machine might have a chance of changing that situation but it's not even guaranteed.
I don't know what you're talking about here. The vast majority of Linux kernel development is done by companies, not unpaid volunteers. This has been the case since at least the mid-2000s.
It may also be due to legacy reasons. Japan was a pioneer in adopting HD TV years before the rest of the world, but early HD cameras and video formats like HDCAM and HDV only recorded 1080i at 1440x1080. If their whole video processing chain is set up for 1440x1080, they’d likely have to replace a lot of equipment to switch over to full 1920x1080i.
> There was a whole cottage industry of folks modding these CPUs as a small side hustle for people who were not comfortable with soldering onto CPU pins if you wanted to put these into a SMP system.
When Intel switched from Slot 1 to Socket 370, there was a market for "slocket" adapters that allowed Slot 1 motherboards to take Socket 370 CPUs. The best of these adapters worked out a way to re-enable SMP on Celerons by tweaking the pin layout to disable the lock Intel had added. What made the BP6 so popular was that it was a native dual-socket Socket 370 motherboard with this modification built in, so it could use unmodified dual Celerons out of the box.
> Performance really did mostly scale linearly with clock speed back then - but for a single CPU. The dual CPU setups were not nearly as efficient due to software not being as multi-threaded as it is today. The big win were folks with two monitors (rare!) who could run apps on their second monitor while playing games on the first. Typically you would only see frame-rate increases with CPU clock - and of course the very start of the serious 3D accelerator (3dfx, nvidia, ATI) scene back then.
Even if you only had one monitor, multitasking was FAR better on a dual-CPU machine than on a single-CPU system. For example, if you were extracting a ZIP file, one CPU would get pegged at 100%, but the system stayed responsive because the second CPU was sitting idle. A dual-Celeron BP6 system felt much nicer and more modern to use than a single-PII system, even when the PII had a faster clock and more cache.
Unfortunately you had to run Windows NT, which was a leap for most folks and had poor support for games and some other software written for Windows 9x (e.g. software that required DOS mode and wasn't compatible with NTVDM). Windows 2000 (Pro) was a bit more approachable, and then of course Windows XP (Pro) smoothed out most of the remaining wrinkles.
I ran Slackware on my BP6 while I was in college. Of course CONFIG_SMP wasn't set in the default kernel config at the time so you had to build your own. Great for running bind, apache, sendmail, etc., and of course NetQuake servers. :)
I think the situation has flipped in the past few years. Since PipeWire came out, I haven't had any problems with audio on Linux, and I can dial the latency down to single-digit ms. Meanwhile, on the Mac, audio has gotten far worse, especially since Tahoe. The latency is tens of ms, and I get crackling and skipping when there's high CPU usage.
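To put rough numbers on it: PipeWire's output latency is roughly one quantum (buffer size in samples) divided by the sample rate, so single-digit milliseconds just means running with a small quantum (which, if I remember right, you can force via the clock.force-quantum setting). A back-of-the-envelope sketch in Python, with 48 kHz and the quantum values picked purely for illustration:

    # Approximate PipeWire latency: one quantum of samples at the given rate.
    # Real-world latency also includes device/driver buffering, so treat this as a floor.
    def quantum_latency_ms(quantum_samples, sample_rate_hz=48_000):
        return quantum_samples / sample_rate_hz * 1000

    for quantum in (1024, 256, 64, 32):
        print(f"quantum {quantum:>4} @ 48 kHz ~= {quantum_latency_ms(quantum):.2f} ms")
    # quantum 1024 @ 48 kHz ~= 21.33 ms
    # quantum  256 @ 48 kHz ~= 5.33 ms
    # quantum   64 @ 48 kHz ~= 1.33 ms
    # quantum   32 @ 48 kHz ~= 0.67 ms

At 64 samples you're already well under 2 ms before the hardware adds its own buffering.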
Audio is still broken pretty regularly in DaVinci Resolve on Linux. Sometimes I need to restart the application to make audio work. And I can’t record sound within Resolve at all.
It doesn’t help that they only officially support Rocky Linux. I use Mint. I assume there are some magic PipeWire / ALSA / PulseAudio commands I can run that would glue everything together properly. But I can’t figure it out. It just seems so complicated.
This sounds like a hardware / firmware problem specific to your particular sound chip / card.
Similarly, Bluetooth on my Thinkpad T14 is slightly wonky, and it sometimes fails to register a Bluetooth mouse on wake-up (I have to switch the mouse off and back on). This mouse registers fine on my other Linux machines. The logs show a report from a kernel driver saying that the BT chip behaved weirdly.
Binary-blob firmware, and physical hardware, do have bugs, and there's little an OS can do about that, Linux or otherwise. Macs have less hardware variety and higher prices, which makes their hardware errata lists shorter, but not empty.
That’s possible, but the hardware (a Rodecaster Pro 2 connected over USB) works just fine in other Linux apps. I can record audio in Audacity. And I can play back audio in Resolve. I just can’t record audio in Resolve.
I think it’s a software issue in how Resolve uses the Linux audio stack. But I have no idea how to get started debugging it. I’ve never had any problems with the same hardware on Windows, or with the same software (Resolve) on macOS.
It is hard to blame Linux if only one proprietary app has sound issues.
FWIW, I lost sound completely 3 times in the last 2 months on my work Windows laptop, and it would only come back after a reboot. I assumed it was a driver crash.
Yep, adding onto this, Bitwig's native Linux app has amazing PipeWire integration. It works like an ASIO driver plugged right into your desktop's audio, letting you attach channels to windows or apps and handle complex monitor/performance/mixing outputs.
It depends on having a properly good implementation, which will come eventually for most apps.
Whenever people bring this up I find it somewhat silly. Wine originally stood for "Windows Emulator". See old release notes ( https://lwn.net/1998/1112/wine981108.html ) for one example: "This is release 981108 of Wine, the MS Windows emulator."

The name change was made for trademark and marketing reasons. The maintainers were concerned that if the project got good enough to frighten Microsoft, they might get sued for having "Windows" in the name. They also had to deal with confusion from people such as yourself who thought "emulation" automatically meant "software-based, interpreted emulation" and therefore that running stuff in Wine must have some significant performance penalty. Other Windows compatibility solutions like SoftWindows and Virtual PC used interpreted emulation and were slow as a result, so the Wine maintainers wanted to emphasize that Wine could run software just as quickly as the same computer running Windows.
Emulation does not mean that the CPU must be interpreted. For example, the DOSEMU emulator for Linux from the early 90s ran DOS programs natively using the 386's virtual 8086 mode, and reimplemented the DOS API. This worked similarly to Microsoft's Virtual DOS Machine on Windows NT. For a more recent example, the ShadPS4 PS4 emulator runs the game code natively on your amd64 CPU and reimplements the PS4 API in the emulator source code for graphics/audio/input/etc calls.
The problem is the word "emulator" itself. It's a very flexible word in English, but when applied to computing, it very often implies emulating foreign hardware in software, which is always going to be slow. Wine doesn't do that and was wise to step away from the connotations.
There's a really good video that shows it likely happened: https://www.youtube.com/watch?v=-y3RGeaxksY . Rhythm Nation is unique because it uses a nonstandard tuning that shifts the notes to less-used frequencies. The video creator found a paper that studied the resonant frequency of various 2.5-inch laptop hard drives, and found that it matched up with the frequency of the low E note used in Rhythm Nation.
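For a rough sense of the numbers: in equal temperament the low E on a guitar (E2) sits around 82.4 Hz, and any detuning shifts that by a factor of 2^(cents/1200). A quick Python sketch of that math; the +40 cent detune below is made up purely for illustration, the actual figures are whatever the paper and video report:

    # Equal-temperament frequency of a MIDI note, with an optional detune in cents.
    # A4 (MIDI 69) = 440 Hz; each semitone is a factor of 2**(1/12).
    def note_freq(midi_note, detune_cents=0.0):
        return 440.0 * 2 ** ((midi_note - 69 + detune_cents / 100) / 12)

    E2 = 40  # MIDI number of the low E string in standard tuning
    print(f"standard E2:          {note_freq(E2):.1f} Hz")      # 82.4 Hz
    print(f"E2 detuned +40 cents: {note_freq(E2, 40):.1f} Hz")  # 84.3 Hz (illustrative only)

The point being that a slightly off-standard tuning can park a sustained bass note almost exactly on a mechanical resonance that standard-tuned songs would miss.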
Given that they're fine with adding breaking changes to the protocol, I think it's a shame that they're not supporting multi-screen. This will lead to the same problem as Xrandr-based multi-monitor, where you get screen tearing with mixed refresh rate displays. I would prefer to see "traditional X11 multihead, but you can move programs between screens" as the solution for multiple monitors. Even if it worked like Mac OS X, where you can't have a single window span multiple monitors, it would still be better than the current state of X11.