Couldn't find anything in the docs on mapping file sources to resource needs on the host. How much is too much data to dump into the tool on a single workstation?
It depends on the number of rows/columns and the types of the values, but when the application detects that resources are close to being exhausted, it displays a dialog asking whether you want to stop the import before completion.
The software was specifically developed to handle as much data as possible while remaining responsive, so the workstation's resources will likely be the bottleneck here.
On my 32 GB development machine, I can easily load tens of millions of rows with tens of columns.
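For a rough sense of scale, a back-of-envelope estimate (the row/column counts and the 8-bytes-per-value assumption are mine, not from the tool's docs):

    /* Back-of-envelope memory estimate for an in-memory table load.
       Numbers are illustrative only. */
    #include <stdio.h>

    int main(void) {
        double rows = 50e6;          /* "tens of millions" of rows   */
        double cols = 20;            /* "tens" of columns            */
        double bytes_per_value = 8;  /* assume 8-byte numeric values */

        double gib = rows * cols * bytes_per_value / (1024.0 * 1024 * 1024);
        printf("~%.1f GiB of raw values\n", gib);  /* prints ~7.5 GiB */
        return 0;
    }

Real usage will be higher once strings, indexes, and UI overhead are counted, but it shows why a 32 GB machine handles that size comfortably.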
Mozilla has a moment being presented to them right now, and I fear they are going to totally blow it. With Manifest V3 and inaction on tracking cookies, Google is obviously acting in its own interests instead of its users', and Mozilla should be pouncing on this in a big and visible way, but they just aren't.
As a long-time ThinkPad user, the first thing I wondered looking at this was whether there is a BIOS option to swap the flipped Fn and Ctrl keys.
DPUs are also probably a piece you are missing. Vendors are starting to position them as something like PCIe arbiters, with access to both NVMe devices and GPUs, so data can move between them without involving the main CPU.
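The closest host-programmable analogue to that data path today is probably NVIDIA's GPUDirect Storage (cuFile), which DMAs file data from NVMe straight into GPU memory with no bounce buffer in host RAM; a DPU would offload even the orchestration from the main CPU. A rough sketch, with a made-up path and size and most error checking omitted:

    #define _GNU_SOURCE              /* for O_DIRECT */
    #include <cuda_runtime.h>
    #include <cufile.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        const size_t size = 1 << 20;                   /* 1 MiB, arbitrary */
        int fd = open("/data/blob.bin", O_RDONLY | O_DIRECT);
        if (fd < 0) { perror("open"); return 1; }

        cuFileDriverOpen();                            /* bring up the GDS driver */

        CUfileDescr_t descr = {0};
        descr.handle.fd = fd;
        descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
        CUfileHandle_t fh;
        cuFileHandleRegister(&fh, &descr);

        void *gpu_buf;
        cudaMalloc(&gpu_buf, size);
        cuFileBufRegister(gpu_buf, size, 0);           /* pin GPU buffer for DMA */

        /* DMA straight from NVMe into GPU memory, bypassing host RAM */
        ssize_t n = cuFileRead(fh, gpu_buf, size, 0, 0);
        printf("read %zd bytes into GPU memory\n", n);

        cuFileBufDeregister(gpu_buf);
        cudaFree(gpu_buf);
        cuFileHandleDeregister(fh);
        close(fd);
        cuFileDriverClose();
        return 0;
    }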
Cool stuff. At a minimum, data comparison/matching would be great, but it's pretty wild to think what it would be like if the filesystem itself were implemented by the storage firmware as a logical abstraction.
Another paper showing that open offices are garbage for any work that requires focus, and another paper business people will roundly ignore because they don't actually care about productivity enough to spend money on it.
Between the quadratic resource usage on data loading for their infinite scroll and breaking the ability to return to the same place in the feed when hitting the back button, the upstream devs are pretty strongly in the running for this.
I saw this when the Wi-Fi was spotty in my bedroom and the device spent all night switching between Wi-Fi and cellular; I suspect the handshake is expensive on every connection change.
From some browsing through the patches they apply to the packages, it appears they are making extensive use of function multi-versioning. Instead of compiling only for the lowest common denominator of the target arch, they ship pre-compiled versions for each CPU generation and use run-time detection to figure out which one to load.
There's nothing stopping any other distro from copying the approach, other than the detail work and the increased package sizes.
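For anyone unfamiliar with the mechanism, a minimal sketch of function multi-versioning using GCC's target_clones attribute (the function and ISA list are illustrative, not taken from their actual patches): the compiler emits one clone per listed target plus a default, and an ifunc resolver picks the best one at load time based on CPU detection.

    #include <stddef.h>

    /* One binary, several code paths: GCC builds a clone of this function
       per listed ISA and selects the right one when the program starts. */
    __attribute__((target_clones("avx512f", "avx2", "sse4.2", "default")))
    double dot(const double *a, const double *b, size_t n) {
        double sum = 0.0;
        for (size_t i = 0; i < n; i++)
            sum += a[i] * b[i];   /* auto-vectorized differently per clone */
        return sum;
    }

Every clone adds to the binary, which is where the increased package sizes come from; the detail work is deciding which packages and hot functions are worth patching.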
This guy's argument is basically that good coders should be allowed to be garbage people, and that others should have to put up with them because they open-source the code he relies on.
To be fair, I mostly tuned out the moment he mentioned Jordan Peterson.
Forcing somebody out of their job or hobbies because you disagree with their political views is absurd.
As long as an amazing coder keeps contributing great code, I honestly couldn't care less whether they eat meat, support Trump, or play sports in their own _private_ life, and I'd advise you, and the rest of the community who want to force their own views upon others, to do the same.