MrPowerGamerBR's comments | Hacker News

Pretty sure it is this one: https://github.com/ValveSoftware/wine/pull/310

Some news outlets did report on it. However, in my experience, after testing the patch applied on top of Wine 11.0, neither the Creative Cloud installer nor the Photoshop installer worked.

I suppose that the thing that the patch fixes is the "offline" Photoshop installers, which are not provided anymore unless you ask Adobe nicely... or you get them from third party sources. The PR's creator did say that they didn't pay for Creative Cloud, so I think it is likely that this is what happened.

This made me wonder if anyone had actually tested the patches with a legit Creative Cloud/Photoshop installer, or if everyone just ran with the PR saying "look it works now!!!" without bothering to actually test it. The creator did submit their own precompiled Wine version, but that version is meant to be run via Proton, so I wasn't able to make it work because I don't know how to run things via Proton outside of Steam.

I was able to get the Creative Cloud app working in Wine (set to Windows 10 mode) by using some very dubious methods. As in, I asked Claude Code to implement the stub to see what would happen, because if AI is sooo good as people say, it should be able to fix things in Wine... right? And surprisingly, it did actually work.

However, you aren't able to use Photoshop CC 2021 (the earliest Photoshop version you can install from Creative Cloud; newer versions crash during startup) because the activation popup does not render the input controls. The reason I think it is trying to render something is that the activation popup background has the same color as the Adobe website and, if my memory is correct, on Windows that popup is used to ask for your Adobe account credentials.

(Sadly the PR patch does not fix the activation screen)

Of course, if you bypass the activation using... alternative means, Photoshop CC 2021 does work under Wine, which is why you can find a lot of "Photoshop CC 2021 in Wine!" repositories on GitHub.

https://www.reddit.com/r/linux_gaming/comments/1qdgd73/i_mad...

https://bugs.winehq.org/show_bug.cgi?id=57980


You know what's weird? You being late by 3 days and not bothering to read the comments on reddit [0], then going ahead and trying a Wine 10 patch on Wine 11. Like, what exactly did you expect?

[0] https://old.reddit.com/r/linux_gaming/comments/1qdgd73/i_mad...

Edit: Even worse, other people are finding it easier to get creative cloud to run on older wine versions [1], meaning that there are regressions in Wine that aren't being spotted.

[1] https://old.reddit.com/r/linux_gaming/comments/1qg9wgz/creat...

Edit 2: Worse yet, people aren't pirating Photoshop, they copy the files of a working activated Photoshop installation from Windows so they can run it under Wine [2].

[2] https://web.archive.org/web/20251105052117/https://forum.mat...

Edit 3: The guy who claimed to have fixed Creative Cloud has posted an update [3], [4].

[3] https://old.reddit.com/r/linux_gaming/comments/1qgybfy/updat... [4] https://github.com/PhialsBasement/wine-adobe-installers/comm...

Seems like all you did is misrepresent basically everyone involved?


> then going ahead and trying a Wine 10 patch on Wine 11. Like, what exactly did you expect?

You can copy the PR's diffs and apply them on Wine 11.0; it is not like it doesn't work or that OP patched functions that are only available in Proton.

Seeing that people actually got it to work intrigues me; sadly, they didn't say whether they actually used an official Creative Cloud license or downloaded it from third party sources. Because, as I said before, the installers that OP used are not the installers you normally get from Adobe. So, if you know where OP got the installers, please share. :)

Now, it could be that Proton somehow has something else that fixes the installers, or that there is a regression between Wine 10.0 and Wine 11.0 that breaks the creator's patch. But like I said in my own posts that I linked, I can't find the exact installers that OP is using. The only time I've seen similar installers was when I was downloading pirated Photoshop copies to test it out on Wine.

I won't rule out that maybe there's a regression somewhere; I've already reported regressions in Wine before (some of them were even fixed, yay!): https://bugs.winehq.org/buglist.cgi?email1=winehq%40mrpowerg...

> Even worse, other people are finding it easier to get creative cloud to run on older wine versions [1], meaning that there are regressions in Wine that aren't being spotted.

I don't think it is a regression. Hear me out:

The user was installing Photoshop CC 2023 with an installer similar to OP's installer, so I suppose that installer also installs an older Creative Cloud version.

Maybe that Creative Cloud version does not require the stubbed function, nor does it require WebView2.

To get the RECENT Creative Cloud installer, downloaded right off Adobe's website, to work, you need to install WebView2 in your Wine prefix and set "msedgewebview2.exe" to Windows 7 mode. This gets the Creative Cloud installer working up until it tries to start the Creative Cloud app itself, which is what uses the stubbed function.

To work around that, you can set Creative Cloud to Windows 7 mode, because that forces a different code path in the app which does not use the stubbed function (SetThreadpoolTimerEx was only added in Windows 8). However, this makes all apps show as "incompatible on your system", so you can't actually install anything from it.

My own patch DOES fix Creative Cloud in Windows 10 mode, so you are able to install Photoshop directly from Wine.

However, neither OP's patch nor my own patch fixes Photoshop's activation. And let's not rule out that maybe it IS actually a regression.

> Worse yet, people aren't pirating Photoshop, they copy the files of a working activated Photoshop installation from Windows so they can run it under Wine [2].

I'm not sure why you think that linking MattKC's post is a "gotcha", when I explicitly linked that post on my Reddit post AND MattKC's post also says that you need to bypass activation with GenP. So you aren't activating the application in Wine, you are bypassing the activation altogether.

But maybe you didn't notice that, because I've only now noticed that my markdown was broken: I included "(archived link because MattKC's forum is down)" within the URL by mistake, so the link didn't actually work, whoops. I've fixed that now.

I never said that Photoshop doesn't work in Wine. I said that it does work as long as you bypass activation with external tools. If you are using a legitimate copy from a Windows machine, or if you installed it via CC on Wine, or if it is a pirated copy, it doesn't matter, you WILL need to bypass the activation somehow. Which is the point I made in my post.

> The guy who claimed to have fixed Creative Cloud has posted an update

That update was made after my post, and the installers on the creator's post are STILL not the same installers that you can get downloading from Adobe.

Unless I'm missing something and these installers can ACTUALLY be downloaded from Adobe, because I couldn't find them anywhere and the ones that I get from Adobe's website are the ones that I shared the screenshots of on my Reddit post.

___

Now, if you want to prove me wrong, please go ahead and try the creator's patch and try installing the Creative Cloud app, downloaded directly from Adobe's website.

I really want to be proven wrong because it would be really cool if you could get the Creative Cloud app + Photoshop working in Wine without needing external activation tools.


To be SURE that I'm not "misrepresent[ing] basically everyone involved": Right now I tried the Proton build the PR creator made... and Photoshop still does not work. It shows the activation screen with "Loading..." written on it (sometimes it is just a blank box). https://i.imgur.com/QN2rxoO.png

You also aren't able to install Creative Cloud with that fork: the Creative Cloud installer gets stuck in a loading loop, so I needed to copy my Photoshop + Creative Cloud installation from my other Wine prefix.

This is not me throwing the PR's creator under the bus; it is impressive that they were able to fix the installers, even though they aren't the "main" installers, and I'm pretty sure that the PR creator could fix the activation screen too, because I think the issue is similar to the ones they are fixing. They probably just haven't done that yet because they don't know the activation screen is also borked.


Because I really want to be proved wrong, I tried using the patch creator's pre-compiled build with umu-launcher. However, I couldn't get it to work because umu-launcher kept complaining about a missing container runtime after I set PROTONPATH to the pre-compiled build. It also did not work with umu's default Proton fork (it did run something, but even after I tried starting winecfg with umu, it just didn't do anything).

This is probably a skill issue on my part, so someone smarter than me could try getting it to work.

Because umu sets up all of the override DLLs in my Wine prefix after it runs, I tried running the Wine build directly, and I must say that the Photoshop GUI DOES render way better here. However, the activation screen is still an empty white box (sometimes it does show "Loading..." in the box): https://i.imgur.com/Jxnga5W.png

But when doing this, the Creative Cloud app does not work anymore: it says that it needs to be repaired, but the repair fails. https://i.imgur.com/jdQeU4t.png

This is very scuffed; maybe I should try Lutris and see what happens.

Again, I really want to be proven wrong and it would be amazing if someone that ACTUALLY made it work with a PAID Creative Cloud license and used Photoshop CC 2021 WITHOUT bypassing its activation shows up and says "hey look, I got it to work and you are just stupid".


> Of course, if you bypass the activation using... alternative means, Photoshop CC 2021 does work under Wine, which is why you can find a lot of "Photoshop CC 2021 in Wine!" repositories on GitHub.

It's fair game since they don't support Linux.


> Oh!, and the one thing I miss is Affinity Designer.

While I haven't experimented with it that much yet, Affinity (the new one, the one after the Canva acquisition) does work in Wine 10.20.

Now, I won't say it is a smooth experience: one of the workarounds I needed was using Wine's virtual desktop so Affinity's tooltips are rendered correctly instead of being pure black, and the GUI sometimes doesn't render correctly (it renders as white until something causes a redraw).

The Canva global marketing lead did say that Linux support is "being discussed seriously internally": https://techcentral.co.za/affinity-for-linux-canvas-next-big...

This makes you wonder: how hard could it be for a business that already has an 80% working application via Wine to patch the application/Wine to make it work 99+%, and then bundle the application with Wine and say that it has "native Linux support"?


> How hard could it be for a business that already has an 80% working application via Wine to patch the application/Wine to make it work 99+%, and then bundle the application with Wine and say that it has "native Linux support"?

First 80% of a job typically takes 80% of the allocated time. The last 20% of a job typically takes another 80% of the allocated time.


Yeah, the 80/80 rule. That made me smile.


> This makes you wonder: how hard could it be for a business that already has an 80% working application via Wine to patch the application/Wine to make it work 99+%, and then bundle the application with Wine and say that it has "native Linux support"?

CodeWeavers (developers of CrossOver and one of the main contributors and sponsors of Wine and related tools) actually offer something like this as a paid service for companies called PortJump:

https://www.codeweavers.com/portjump


Getting it running on Linux is the easiest part, dev-wise.

It is the rest of the iceberg that causes problems.

- You need your support staff to be able to support Linux, which means they will need training and experience helping people on an entirely new system

- Linux comes in a finite but vastly larger number of combinations than macOS and Windows, which means you are probably going to need to pick something like Ubuntu or struggle with the above

- Gotta track bugs in twice as many places

- Need CI / CD for more platforms

etc


> - Linux comes in a finite but vastly larger number of combinations than macOS and Windows, which means you are probably going to need to pick something like Ubuntu or struggle with the above

This is easily solvable by distributing the app via a distro-agnostic mechanism, such as a Flatpak or AppImage. Using Flatpak also eliminates the need for rolling their own app update mechanism.


AppImage relies on the old, unmaintained, suid-root fuse2. Not a wise choice in 2025.


But most of those issues are because Linux doesn't have enough market share. No one brushes off Windows because they need to support Windows and they need to add CI/CD for Windows.

The combination issue is a real issue though that (as far as I know) is mostly solved with Flatpaks, or in case of games, by using the Steam Runtime.

Of course, it is a "chicken and egg" problem of "we don't want to support Linux because there aren't enough users using it" versus "we don't want to use Linux because there aren't enough businesses supporting it".

Thankfully, with improvements in Wine, the need for "native" Linux support is shrinking, but at the same time there is still a looooong way to go (like the issues I mentioned before with Affinity).


Windows userland compatibility is outstanding. I can run most 30 year old Windows applications on Windows 11 without a problem. This makes it easy for a commercial vendor to support their applications on Windows.

The same is not at all true on Linux.

Right now at work, I’ve got a bunch of commercial apps built for RHEL9 for which I’m chasing vendors for new builds that work on RHEL10, for a variety of reasons. Dependencies like libXScrnSaver have simply been removed, and so apps linked against that library no longer work.


Funnily enough, there are old Windows applications that do work on Wine but don't work on Windows 11.


And then people wonder why Electron became a thing.


Wine has some gaping holes in some of its API implementations. Direct2D, for instance, has existed since Windows 7 but is badly implemented in Wine -- there is no antialiasing and the ArcTo() function draws a line. The MS documentation is not that great either, so fixing Wine isn't necessarily easier than porting to native.


This. OMG Affinity is the ONE piece of software I actively miss. I tried the wine setup for it and it just doesn't work to a usable extent.


Yeah, I thought that Affinity would work pretty well in Wine because I've seen a lot of people pointing to the "just follow the guide (AffinityOnLinux repo) and it will work!" advice, but in my experience it didn't work as well as people were saying.

And the guide itself seems to be outdated: the guide says that you need to install some stubs/shims but doesn't say what happens if you don't do it (I think it would crash), yet at least in my experience it did "work" without them when using an up-to-date Wine version.

Sadly, Photoshop also doesn't work: if you want to follow the rules and use Creative Cloud, it won't work at all; if you decide to sail the seven seas and download an older Photoshop version, it will work, but it also has some annoying bugs (sometimes the canvas doesn't update after an edit until you try to do another edit).

Don't get me wrong, I do think that Wine is an amazing project and I hope that it continues to improve, but sometimes people don't seem to actually point out all the issues that exist when running an application in Wine.


> This makes you wonder: how hard could it be for a business that already has an 80% working application via Wine to patch the application/Wine to make it work 99+%, and then bundle the application with Wine and say that it has "native Linux support"?

I've had cases where running an app under Wine worked better than the native Linux port :/


Don’t they use Qt for their Win/macOS version already?


I have something similar on my website, and my solution was to make server-driven modal/toast responses.

Allow the server to return a modal/toast in the response and, in your frontend, create a "global" listener that listens to `htmx:afterRequest` and checks if the response contains a modal/toast. If it does, show the modal/toast. (Or, if you want to keep it simple, show the content in an alert just like you already do.)

This way you create a generic solution that you can also reuse for other endpoints, instead of needing to create a custom event listener on the client for each endpoint that may require special handling.

If you are on htmx's Discord server, I talked more about it on this message: https://discord.com/channels/725789699527933952/909436816388...

At the time I used headers to indicate whether the body should be processed as a trigger, due to nginx header size limits and header compression limitations. Nowadays what I would do is serialize the toast/modal as JSON inside the HTML response itself and then, on `htmx:afterRequest`, parse any modals/toasts in the response and display them to the user.
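
Something along these lines is what I mean (a minimal sketch; the `data-toast` attribute and the `message` field are just placeholder names, not what my site actually uses):

    // Global listener: runs after every htmx request.
    document.body.addEventListener("htmx:afterRequest", (event) => {
      const xhr = (event as CustomEvent).detail?.xhr as XMLHttpRequest | undefined;
      if (!xhr) return;

      // The server embeds any toast/modal as JSON inside the HTML it returns, e.g.
      // <script type="application/json" data-toast>{"message": "Coupon applied!"}</script>
      const doc = new DOMParser().parseFromString(xhr.responseText, "text/html");
      const payload = doc.querySelector('script[type="application/json"][data-toast]');
      if (!payload) return;

      const toast = JSON.parse(payload.textContent ?? "{}");
      // Keeping it simple: show it in an alert (swap this for a real toast/modal component).
      alert(toast.message);
    });

One global handler like this covers every endpoint that wants to show a toast/modal, which is the whole point.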


Good idea, thanks for sharing! Nice design also


The HX-Trigger response header would handle that cleanly. It fires an event client-side; one global handler shows the error.

Similar to wvbdmp's approach but without needing the extension
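
Roughly like this (the "showToast" event name and payload shape are just an example of how the header can carry JSON):

    // The server adds a response header such as:
    //   HX-Trigger: {"showToast": {"level": "info", "message": "Coupon applied!"}}
    // htmx then fires a "showToast" event that bubbles up to the body,
    // so one global handler is enough:
    document.body.addEventListener("showToast", (event) => {
      const { level, message } = (event as CustomEvent).detail;
      alert(`${level}: ${message}`);
    });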


> Every endpoint returns 3–5 different HTML fragments. Frontend and backend must agree on every scenario — success, validation errors, system errors, partial updates, full reloads.

And why would that differ from React?

When I was building a website with React, I needed to model an "apply coupon" endpoint with different states (coupon applied, coupon does not exist, coupon exists but has reached its max usage limit), and it was so annoying because you needed:

1. The backend route that returns JSON with a different model depending on the coupon state

2. The JSON models for each response type

3. And then, on the frontend, you need to load the data, parse the JSON, figure out which "response state" it is (HTTP status code? a "type" field on the JSON?), convert the JSON to HTML and then display it to the user

In my experience it added a lot of extra "mental overhead". Something that should be extremely simple ends up being unnecessarily complex, especially when you need to do that for every new feature you want to add.

When using htmx, a simple implementation of that would be

1. A backend route that returns HTML depending on the coupon state

2. Some htmx attributes (hx-post, hx-swap) on the frontend to make the magic happen (see the sketch right after this list)
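
For illustration, a rough sketch of what that could look like (the endpoint, target id and markup are made up, not my actual code):

    <!-- The server answers with a small HTML fragment for every coupon state
         (applied, not found, usage limit reached) and htmx swaps it in below. -->
    <form hx-post="/cart/apply-coupon" hx-target="#coupon-result" hx-swap="innerHTML">
      <input type="text" name="coupon" placeholder="Coupon code">
      <button type="submit">Apply</button>
    </form>
    <div id="coupon-result"></div>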

Don't get me wrong, there are places where you wouldn't want to use htmx (heavily interactive components), but that's why htmx recommends the "islands of interactivity" pattern. This way you can build with htmx the boring things that would add unnecessary complexity in React, and then spend the freed-up "mental overhead" on the interactive components. (Which, IMO, makes it a more enjoyable experience.)

At the end of the day it is just choices: Some people may prefer the React approach, some people may prefer the htmx approach. All of them have their own upsides and downsides and there isn't a real answer to which is better.

But for my use case, htmx (truth be told: I use my own custom library that's heavily inspired by htmx on my website, but everything that I did could be done with htmx + some htmx extensions) worked wonderfully for me, and I don't plan on "ripping it all out" anytime soon.


I haven't checked Hetzner's prices in a while, but OVHcloud has dedicated servers in both the US and Canada (I've been using their dedicated servers for years already and they are pretty dang good).


Seems to be broadly the same sadly, but thanks, it's interesting to see they're all hovering quite close to each other.


While true, at least with open source you can actually go into the code and try to fix it if you really want to.

With a closed-source business you are at their mercy as to whether they really want to fix your issue, even if you are a paying customer.


If I had to guess, they were talking about the DeepSeek iOS app: https://apps.apple.com/br/app/deepseek-assistente-de-ia/id67...


Except that if you require anything GPU-related (like gaming, Adobe suite apps, etc.) you'll need a secondary GPU to pass through to the VM, which is not something that everyone has.

So, if you don't have a secondary GPU, you'll need to live without graphics acceleration in the VM... so for a lot of people the "oh, you just need to use a VM!" solution is not feasible, because most of the software that people want to use that does not run under WINE does require graphics acceleration.

I tried running Photoshop under a VM, but the performance of the QEMU QXL driver is bad, and VirGL does not support Windows guests yet.

VMWare and VirtualBox do have better graphics drivers that do support Windows. I tried using VMWare and the performance was "ok", but still not near the performance of Photoshop on "bare metal".


People throw around the ideas of VMs or WINE like it's trivial. It's really not.


On Linux it's quite trivial. KVM is part of the kernel. Installing libvirt and virt-manager makes it really easy to create VMs.

I'd say even passing through a GPU is not that hard these days, though maybe that depends more on the hardware configuration.


“On Linux it’s quite trivial…” giving big

“For a Linux user, you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software.”[1] vibes.

Convenience features in software are huge, and even if a system is well designed, a system that abstracts it all away and does it for you is easier; most new users want that, so it often wins. Worse is better, etc.

[1] https://news.ycombinator.com/item?id=9224


The comment you linked is one of the most misunderstood comments on this site, which makes sense because it's one of the most cited comments on this site.

https://news.ycombinator.com/item?id=23229275

This probably isn't even the best dang comment about the situation, it's just the one I could find quickly.


Perhaps I should have put a larger explanation around it. I am mocking neither sureglymop nor BrandonM, but we can still learn lessons from hindsight.

Sure, it's trivial to set the switch in the BIOS for virtualisation and download a couple of libraries, but people like computers doing things for us; we like abstractions, even if they sacrifice flexibility, because they facilitate whatever real-world application we are attempting.

I think power users of any technology will generally overvalue things that 80% to 95% of the user base simply don’t care about.

I admit that, having touched Windows twice in the last 5 years, I wouldn't know, but I would be willing to wager that WSL has very few drawbacks or shortcomings in the minds of most of its users.


Also, sometimes the harder approach is not as capable as some people make it out to be, and there are some unsolved caveats.


I don't see what's misunderstood about it, but also it's not right to make fun of the user for it.


Because it's only silly sounding because of hindsight. With today's context of file sync applications being a huge industry, that comment seems silly. But that was the prevailing opinion at the time. Check out this blog post: https://www.joelonsoftware.com/2008/05/01/architecture-astro...

>Jeez, we’ve had that forever. When did the first sync web sites start coming out? 1999? There were a million versions. xdrive, mydrive, idrive, youdrive, wealldrive for ice cream. Nobody cared then and nobody cares now, because synchronizing files is just not a killer application. I’m sorry. It seems like it should be. But it’s not.

That's just what a lot of competent people thought back then. It seems hilariously out of touch now.


But it wasn't my opinion at the time, and I didn't hear from those people. I was in middle school, kids were commonly frustrated syncing their homework to/from a flash drive, family members wanted to sync photos, and everyone wanted something like this.

Before Dropbox, the closest thing we had was "the dropbox," a default network-shared write-only folder on Mac. Of course you could port-forward to a computer at home that never sleeps, but I knew that wasn't a common solution. I started using Dropbox the same month it came out.


I'm happy for you :)


The future is rarely made by people who are comfortable with the status quo. That’s the only thing we can get from this.


His comment appears to me to say "please don't bother my friend". Him saying that file sync "wasn't common knowledge at the time"... ok? It is much easier than the solution the commenter proposed. In this thread it's the same: people are proposing a complex solution as if it's trivial just because it is trivial to them.


Even the described FTP-based Dropbox replacement is easier than getting a VM to work properly with DRM'd software and/or GPU acceleration.


Really? With GNOME Boxes it's pretty straightforward. I hear KDE is getting an equivalent soon, too.


You can do GPU passthrough in GNOME Boxes, as in, your VM can see the host's GPU (let's say Nvidia) and it works exactly the same as on the host? Or, another metric: can you run Photoshop in a VM with full hardware acceleration? I haven't tried GNOME Boxes in particular, but this isn't what I'm seeing when I search.


Ah, yeah, seems like I was mistaken and maybe Red Hat's virt-manager was what I was thinking of.

virt-manager is a bit more involved than GNOME Boxes; I'm not sure I could recommend it to someone who doesn't know what they're doing.


Yeah, reading your original comment I was about to go off until I saw GPU pass through with DRM software. Highly cursed.


Yep, regular VMs where you basically only care about the CPU and RAM are easy, provided nothing in the VM is trying to not run in a VM. USB and network emulation used to be jagged edges, but that was fixed. VirtualBox was my go-to. It never had great GPU support, but the rest was easy.

I'm pretty sure there are solutions to assign an entire GPU to a VM, which ofc is only useful if you have multiple. But those are specialized.


Yeah! Even as a dev who can navigate vim, I absolutely don't want to do that on a daily basis. Give me pretty GUIs and some automation!


Not even close. I mentioned a software package that literally offers a full GUI for all your virtualization needs... how is that comparable to the things mentioned in that comment?


That really depends on what you want to run. Dipping into a Linux laptop lately (Mint), there are things, old things (think 1996-1999), that somehow "just work" out of the box on Windows 10, but configuring them to work under WINE is a huge PITA, coming with loads of caveats, workarounds and silent crashes.


The silent crashes get me. Also, running one exe spawns a lot of wine/wineserver/wine-preloader processes.


Tried doing 3d modeling in a Windows VM - couldn't get acceleration to pass through.


What 3D modelling were you doing that couldn't be done on Linux?


Fusion360 doesn't work on Linux. Or at least I tried multiple times and couldn't get it to work


Really? I recall installing it 3 years ago, and aside from some oddities with popups, it worked just fine. I think it was this script [0]. I don't know if they broke it; I switched to OpenSCAD, which meets my needs.

[0] https://github.com/cryinkfly/Autodesk-Fusion-360-for-Linux


Mostly having software better than FreeCAD, AKA everything that exists on Windows and macOS.


I needed to use Rhino 3D specifically because it had an environmental simulation plugin.


None of your business.


I'm hoping that IOMMU capability will be included in consumer graphics cards soon, which would help with this. IIRC there are rumors of upcoming Intel and AMD cards including it.


Quite a lot of people have both integrated Intel graphics and a discrete AMD/NVidia card.


Sadly I'm not one of those people because I have a desktop with an AMD Ryzen 7 5800X3D, which does not have an integrated graphics card.

However now that AMD is including integrated GPUs on every AM5 consumer CPU (if I'm not mistaken?), maybe VMs with passthrough will be more common, without requiring people to spend a lot of money buying a secondary GPU.


Yes, my Ryzen 7600 has an integrated GPU enabled. AMD's iGPUs are really impressive and powerful, but I do not have any idea what to do with it. And despite the fact that I moved to an Nvidia GPU (after 20 years of fanboyism) specifically because I was tired of AMD drivers being terrible on Windows, I STILL have to deal with AMD drivers because of that damn iGPU.

I could disable it, I guess. It could provide 0.05% faster rendering if I ever get back into Blender.


AMD has SR-IOV on the roadmap for consumer GPUs, which hopefully makes things easier in the future for GPU-accelerated VMs:

https://www.phoronix.com/news/AMD-GIM-Open-Source

Windows can run GPU-accelerated Windows VMs with paravirtualization. But I have no use case for two Windows machines sharing a GPU.


There is also native context for VirtIO, but for now Windows support is still not planned.

Also note some brave soul implemented 3D support on KVM for Windows. Still in the works and WinUI apps crash for some reason.


Anything GPU related isn't great in WSL either.


True, but I don't have the need to run applications that require GPU under WSL, while I do need to run applications that require the GPU under my current host OS. (and those applications do not run under Linux)


I don't know why there aren't full-fledged computers in a GPU-sized package. Just run Windows on your GPU, Linux on your main CPU. There are some challenges to overcome, but I think it would be nice to be able to extend your ARM PC with an x86 expansion, or extend your x86 PC with an ARM extension. Ditto for graphics, or other hardware accelerators.


There are computers that size, but I guess you mean with a male PCIe plug on them?

If the card is running its own OS, what's the benefit of combining them that way? A high speed networking link will get you similar results and is flexible and cheap.

If the card isn't running its own OS, it's much easier to put all the CPU cores in the same socket. And the demand for both x86 and Arm cores at the same time is not very high.


Yes, with PCIe fingers on the 'motherboard' of the daughter computer. Like a PCIe carrier for the RPi Compute Module.

Good point about high speed networking. I guess that’s a lot more straightforward.


You may be interested in SmartNICs/DPUs. They're essentially NICs with an on-board full computer. NVIDIA makes an ARM DPU line, and you can pick up the older gen BlueField 2's on eBay for about $400.


> full-fledged computers in a GPU-sized package

... isn't this just a laptop or a NUC? Isn't there a massive disadvantage in having to share a case or, god forbid, a PCIe bus with another computer?


There is ongoing work on supporting paravirtualized GPUs with Windows drivers. This is not hardware-based GPU virtualization, and it supports Vulkan in the host and guest, not just OpenGL; the host-based side is already supported within QEMU.


I completely gave up on WINEing Adobe software, but I didn't know about the second GPU thing; I thought it was totally impossible. Thank you!

I will do anything to avoid Windows but I miss Premiere.


If it is how I think it is, then yes, it is a proof that you attended the event.

I'm not sure how it is in other countries, but in some countries (for example, Brazil) some courses (like Computer Science) require you to have "additional hours", where these hours can be courses, lectures, etc. related to the degree.

To prove to the university that you did these courses, you need a certificate "proving" that you participated. Most of the time it is a PDF file with the name of the event, the date and your name on it.


> One thing I (in general) miss from those days, was how easy it was to get into modding. Whether that be to make your own maps, or more involved game mods.

Another game from that time that was also easy to mod was The Sims 1.

For a bit of context, EA/Maxis released modding tools BEFORE the game itself came out, to let players create custom content for the game (like walls and floors) before it was even released!

And installing custom content was also easy: just drag and drop files into the folders related to what you downloaded, and that's it.

Imagine any game doing that nowadays. Most games nowadays don't support modding out of the box, but of course there are exceptions, like Minecraft resource packs/data packs. I don't think Fortnite and Roblox fit the "modding a game" description, because you aren't really modding a game, you are creating your own game inside of Fortnite/Roblox! Sometimes you don't want to play a new game inside of your game, you just want to add new mods to enhance your experience or to make it more fun. There isn't a "base game Roblox", and while there is a "base game Fortnite" (Battle Royale... or any of the other game modes like Fortnite Festival or LEGO Fortnite), Epic does not let you create mods for the Battle Royale game. You can create your own Battle Royale map, but you can't create "the insert-season-here Battle Royale map & gameplay but with a twist!".

Of course, sadly EA/Maxis didn't release all of the modding tools they could have (there isn't an official custom object making tool, for example, or an official way of editing the behavior of custom objects), but they still released way more modding tools than what current games release.

I think that most modern games don't support that ease of modding because the games themselves are complex. As an example: The Sims 1 walls are, like, just three sprites, so you can generate a wall easily with a bit of programming skill; the skin format is plain text in a format similar to ".obj"; so on and so forth.

Lately I've been trying to create my own modding tools for The Sims 1, and it is funny when you are reading a page talking about the technical aspects of the game file formats and the author writes "well this field is used for xyzabc because Don Hopkins said so".


Factorio has an extensive modding community - one of the community mods was adopted and became an official expansion.


The official toolkit for modding Baldur’s Gate 3 is extremely extensive. You could make an entirely different game on top of the game if you wanted to.


Really? Officially they don't even support modifying or creating new levels: https://baldursgate3.game/news/community-update-27-official-.... You are mostly limited to visual changes, UI mods and balance / tuning changes.


That seems an easy fix until one gets nervous about working a month just to release a single-purchase game engine. That's a journey that you can take, but you might at least consider a license.

