Thoughts on Cocoa (lists.apple.com)
202 points by pier25 on Oct 5, 2019 | 350 comments


This is why, unfortunately, the Mac is dead to me. I would love to support it, but when OpenGL was deprecated it was just game over for me. A company that forces me to learn a completely different language to access their basic APIs, and keeps changing them, is just too much of a hassle. Every hour spent on maintaining a Mac port is an hour not serving the 90% of users who aren't on Mac.

You can complain about OpenGL all you want, and praise Metal, but it doesn't matter how much better it is. I can't rewrite all my shaders and redo the API integration. The fact that the richest company can't hire 10 engineers to maintain compatibility, and is instead pushing all this work onto thousands of small independent developers, is just rude.

With Apple's history, why spend time embracing their new tech if you can't count on it sticking around? For developers who write small one-off apps for iOS it's fine, but for people who make larger, long-term applications, Apple is very developer hostile.


Although I'm not a fan of OpenGL's removal, I think it's somewhat misunderstood. OpenGL is an outdated design for modern GPUs. Its feature set and architecture are too high level, leading to buggier drivers and complicated legacy support. There are basically two things you can do: keep OpenGL around in perpetuity, or move its complexity and responsibility into libraries.

Apple has defined Metal as their single graphics API, kind of but not quite like mainstream Linux, where that role has gone to Gallium3D (i.e. open source drivers implement their OpenGL, Vulkan and sometimes even Direct3D support on top of Gallium3D). I fully expect OpenGL to be dropped from many future Linux drivers too. I'm actually looking forward to the day Linux drivers go Vulkan-only, as hopefully that allows things to get less buggy.

Even when OpenGL gets fully removed from macOS, you can still use OpenGL. You can use Vulkan too. You just need a library for it, like ANGLE or MoltenVK. If that had been Apple's communication, there'd be a lot less fuss about this.

By the way, OpenGL still works on macOS 10.15. Deprecated does not mean it's not working.


> OpenGL is an outdated design for modern GPUs.

The thing is, for most of its lifetime OpenGL was never meant to be a GPU abstraction layer - that only happened around the GeForce 2's time (and only for Nvidia). Even SGI's implementation did a lot of stuff on the CPU (and SGI even had a pure CPU implementation). OpenGL was always meant to be a high-level immediate-mode graphics API (immediate here as opposed to retained/scene graph APIs like Open Inventor).

> By the way, OpenGL still works on macOS 10.15. Deprecated does not mean it's not working.

Yeah, but as can clearly be seen, you cannot rely on Apple when it comes to keeping things working.


And I think it was a big failure not to adopt IRIS Inventor as well.

Every OpenGL newbie has to go through the rite of passage of assembling their own SDK for dealing with fonts, textures, materials, math, meshes,...

Whereas all competing APIs provide such features out of the box in their SDKs, including the console ones.

Plus there is this myth about portability, when any relatively big OpenGL application is full of extensions and multiple code paths to deal with OEM specific features, driver bugs or hardware glitches.

Then there is the little issue that OpenGL, OpenGL ES and WebGL are only superficially alike.


It seems to me then that OpenGL should have been a library instead of a driver all along. Its original intent unfortunately has no bearing on what it has become in the meantime.

> Yeah, but as can clearly be seen, you cannot rely on Apple when it comes to keeping things working.

Well yeah, that's what deprecated means. At the moment developers have a bit longer to get with the times, though, so that's nice, I guess.


> should have been a library instead of a driver all along

This would have required a lower-level interface that different vendors could all implement, which was not the case for early graphics hardware. In the decades since, graphics hardware has evolved to look more or less the same to software (so Vulkan became a nice fit), but that wasn't always the case.
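The idea of a common lower-level contract can be sketched abstractly: once every vendor implements the same thin interface, a portable high-level library can be written once on top of it. A toy illustration in C (all names here are hypothetical, invented for this sketch):

```c
#include <assert.h>

/* Toy sketch: a minimal vendor-neutral "driver" contract. Once every
 * vendor can fill in this table, a portable high-level library can be
 * layered on top -- roughly the role Vulkan (or Gallium3D) plays today. */
struct hw_driver {
    int (*submit)(int n_vertices);   /* hand work to the "hardware" */
};

/* Two hypothetical vendors implementing the same contract. */
static int vendor_a_submit(int n) { return n; }
static int vendor_b_submit(int n) { return n; }

static const struct hw_driver vendor_a = { vendor_a_submit };
static const struct hw_driver vendor_b = { vendor_b_submit };

/* A "high-level API" written once against the contract, portable
 * across both vendors -- the library layer the comment describes. */
static int draw_triangle(const struct hw_driver *drv) {
    return drv->submit(3);   /* e.g. one triangle */
}
```

Early GPUs exposed nothing like a shared `hw_driver` table, which is why OpenGL had to live inside each vendor's driver instead of being a library.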


FWIW, OpenGL is quite broken. It works but it periodically glitches or does something weird in even minor point releases.


What you're describing has nothing to do with the OpenGL standard, but with graphics drivers (OpenGL implementations).


Yes, I’m talking about Apple’s implementation of OpenGL.


This is blatantly false.

You may be thinking about OpenGL 1.x and the fixed pipeline. Modern OpenGL is perfectly capable of running virtually every game and professional app out there.

OpenGL does NOT work on macOS and hasn’t worked for a very long time. Apple only lets you use an ancient version of OpenGL, which is not really my definition of “working”.

Nobody is ever going to remove OpenGL from Linux (or Windows) drivers. OpenGL/D3D powers everything there. Vulkan and Direct3D 12 are just the lowest level and are not a replacement for OpenGL or Direct3D 11. Drivers may implement OpenGL/D3D11 on top of Vulkan/D3D12, but remove them? Never.

Finally, Apple has never told anybody to use MoltenVK, because they don’t want you to. And I would not be surprised if they end up banning such layers if they get popular.


Better start paying attention to what the Vulkan folks are doing; Khronos and friends have been pushing for OpenGL deprecation for quite a while now.

https://xdc2019.x.org/event/5/contributions/329/attachments/..., 2nd slide

This is just one example, here is a nice gathering at latest Reboot Develop Blue 2019

https://www.youtube.com/watch?v=hcgG1RQ-BJM

As for Windows, OpenGL ICD drivers only run in legacy Win32 mode; UWP and Win32 sandbox mode don't allow ICD drivers.


My application does precisely what I want in terms of graphics, and runs at 600 FPS. Explain to me why my users would benefit from me spending months switching to a different API? Life is too short for that. The losers in all this are the Mac users. The reason to support Mac is to be nice to the users, not because it makes financial sense. If Apple is going to actively make it harder, why bother?


> Explain to me why my users would benefit from me spending months switching to a different API?

The idea is that leaving behind old APIs will benefit driver and system stability for the entire ecosystem, benefiting your users in other ways. You might disagree with that view, but Apple is not doing this out of spite alone.

I'm not sure that switching to use a library like MoltenGL for this stuff would take months either, but you're in a better position to judge that I guess.


IMO the issue is not so much that Apple is leaving stuff behind, but that it makes drastic decisions and generally does not communicate its plans.

For example, in one of the email responses someone from Apple says that Carbon was a temporary solution, but quite frankly that was never clear to the outside world. It seems there were even plans to make Carbon 64-bit that were scrapped at the last minute.

Same with the hardware. One day, out of the blue, Apple announced its new laptops without USB-A ports. Even 3 years later, USB-A is still one of the most used peripheral ports in the PC world (including Mac and Windows). Even Apple does not ship peripherals with USB-C cables, other than adapters. AFAIK its mice and keyboards still ship with USB-A cables for charging.

These tactics are maybe ok for the consumer world but not for professionals who need reliability above all else.


Sorry, but I was a teenager, and fairly anti-Apple, at the time, and even then I don't see how Carbon could have ever been seen as anything other than a temporary solution. I would have to go back and watch the original introduction, but I seem to recall that being the general feeling at the event. So I guess I'm just not understanding your viewpoint here.

USB-A still works, you just need a compatibility layer (a dongle).

OpenGL has been deprecated for years. Why are people reacting like this is a new development?


>USB-A still works, you just need a compatibility layer (a dongle).

Not even that, a cable will do. I have zero dongles, and just bought a few tens of dollars' worth of USB-C-to-X cables...


>USB-A still works, you just need a compatibility layer (a dongle).

Assuming you work at a company willing to spend the money on official Apple dongles and not cheap flakey clones.

Unfortunately few companies will do that and now all Apple I/O goes through cheap unreliable dongles.


Or Apple could do it! If it's possible to support using a wrapper, Apple should make the wrapper part of the API. Why push the responsibility for backwards compatibility onto developers?


If you want to argue Apple is handling this poorly from a community perspective, you'll find no argument with me.

The reason Apple is handling this so poorly is that they want to push Metal as the answer. They want to give developers incentives to move to Metal by making their lives on OpenGL more difficult, because even with libraries, those libraries must be maintained and included. They don't want to bear responsibility for this legacy. I'm explaining, not defending.

This is the direct opposite of the Linux ecosystem, where Gallium3D not only facilitates both OpenGL and Vulkan, but has even seen attempted Direct3D implementations in the past. Yet another case where Linux takes the "everything and the kitchen sink" approach, and Apple just flat out dictates a single choice to everybody. That attitude sometimes helps Apple, but sometimes it doesn't.


Mesa's Gallium3D is not an API intended for use by applications. It is a mechanism that allows functionality to be shared among drivers. Yes, it is possible to create state trackers on top of it, but those are part of Mesa. It has no public API.

Even among Mesa drivers, use of Gallium3D is not mandatory. If a driver team thinks it will make their job easier, they use it; if they think it won't, they don't. The i965 driver (the current driver for Intel GPUs) doesn't use Gallium, for example. The new one ("Iris"), currently in the works, will. For a long time, the only driver that used Gallium3D was the AMD one.


You are falling into a common trap these days: there is a difference between an API (remember, the I stands for interface) and an implementation. There is no reason a vendor can't completely change how drivers work without the application having any idea. There is nothing about the API that says it needs to push complexity into the driver, but you are describing these implementations as an inevitability.


>Explain to me why my users would benefit from me spending months switching to a different API?

Because as a user I don't want apps forcing extra baggage, like old APIs, to be supported forever in the OS. Nor do I want apps that are fossilized and don't take advantage of new platform features.

If I didn't care for apps being Mac-specific and taking advantage of what the platform has to offer and moving on with the platform, I'd just as well use Windows or Linux (and vice versa).


If such mandatory "new platform features" don't have anything to offer to make a program better, updating it brings in absolutely nothing for the user but the ability to continue using the program.

I think it is pretty unreasonable to require developers to rewrite their programs periodically just because there is some new feature in some new API.

Also, software that gets updated just for the sake of it usually ends up worse than before.


I think as a user you shouldn't need to see or know anything about APIs or OS internals. Your apps should just work.


And yet there are decades of apps not supporting new UI features, or being unstable or insecure, because they were slow to adopt new APIs. This is a balance: Windows font rendering, color management, and resolution scaling took ages to stabilize because old things continuing to work reduces pressure on vendors to update.


When Mac stopped supporting NVIDIA the Mac users became losers. They missed out on all the AI and other GPGPU work.


Remember Bumpgate? Nvidia shipped faulty GPUs to Apple (among others, including Dell and HP), which caused hardware failures in around 40% of devices, which Apple had to replace. Nvidia refused to acknowledge it was their problem/fault, and even went as far as trying to sue Apple, Dell et al. As a result, no more Nvidia on Apple platforms. I can't say I blame them, to be fair...


There’s more than meets the eye when it comes to the Nvidia/Apple situation. From what I’ve heard, a big part of Apple’s problem has to do with the quality of Nvidia’s drivers — Apple is requiring that a bar be met, and Nvidia isn’t meeting it.

I can’t verify this claim, but given my personal experience it makes perfect sense. I run a 980 Ti alongside a 950 in my hackintosh tower and graphical glitches on the desktop happen pretty regularly despite both cards being perfectly healthy.

The other thing is that, historically, Apple has been unwilling to differentiate drivers between consumer and workstation cards, because generally speaking that concept is kind of silly. Everybody should have workstation-class stability, not just those who shell out 3-5x more cash for a workstation card. It’s obvious why Nvidia would take issue with this.


> I run a 980 Ti alongside a 950 in my hackintosh tower and graphical glitches on the desktop happen pretty regularly despite both cards being perfectly healthy.

The Nvidia web drivers are not great but there are so many factors that could be producing those glitches.

I've built about a dozen hackintoshes since 2010, and while I've experienced many problems, I personally have never seen graphical glitches.

I don't think it's fair to expect the same level of reliability from a frankenstein you've built yourself as from a commercial product.


I would normally agree but at this point, hackintosh users are probably one of the largest groups of people making use of the Nvidia web drivers… the only others are going to be old Mac Pro holdouts and the tiny number of Mac users using Nvidia cards as eGPUs.

I think in my case specifically the graphical glitches are caused by having dual GPUs. That’s a somewhat uncommon configuration, but not so strange as to not test for.


Yes but the web drivers users are probably less than 1% of all Nvidia users.


I don't have any graphical glitches with the Nvidia web drivers. I've used them with a GTX 970, a GTX 980 Ti, a GTX 1070, and a GTX 1080 Ti over the years. They have consistently worked fine.


> There’s more than meets the eye when it comes to the Nvidia/Apple situation. From what I’ve heard, a big part of Apple’s problem has to do with the quality of Nvidia’s drivers — Apple is requiring that a bar be met, and Nvidia isn’t meeting it.

Then Apple needs to dig down into their pocket, maybe they’ve got some extra money there, and solve this problem.


I think it would have been a lot more palatable if they hadn't gone off and made their own API (why not Vulkan?)


> OpenGL is an outdated design for modern GPUs. Its feature set and architecture are too high level, leading to buggier drivers and complicated legacy support.

This is only a problem for game developers who live right on the GPU hardware and need to squeeze absolutely all the performance out of it - probably a small number of admittedly very important developers: game engines and AAA studios. For the rest of us with more modest requirements, I don't really understand what "modern" graphics APIs provide, besides tons of extra boilerplate code and headaches.

When I was a graphics programming newbie, it was great to have the whole fixed function pipeline all set up and ready for me to experiment and learn. If I had to learn from nothing on Vulkan, I'd probably have given up before the first 1000 lines. Drawing a triangle in OpenGL vs. Vulkan: http://sol.gfxile.net/temp/hellotriangle.png

It's so strange--in most of the software world, newer APIs and tools tend to be higher- and higher-level abstractions. "Modern" Electron apps run JavaScript, HTML and CSS in an embedded browser, which itself runs on a high-level native toolkit probably written in C or C++, which compiles to assembly, etc. You can have a web server in one line of Python. In contrast, the more modern graphics APIs get, the lower-level and more verbose they get.


As one of those graphics developers who doesn't need to squeeze absolutely all the performance out of a GPU, I moved to middleware engines, where each API backend can be crazily optimized, while I have the productivity and freedom to just ask for my scene to be drawn, with a nicely organized set of meshes, materials and combined shaders, without feeling like I am doing graphics programming with two sticks.

The anti-managed-language bias in some Vulkan circles also doesn't help, especially when the OpenGL deprecation message keeps popping up in Vulkan-related talks.

Yes, using managed languages will take some performance out of Vulkan, but guess whose fault it is when no other API gets offered as an alternative; not everyone is jumping for joy to use C for graphics programming.

And here Metal, DirectX, LibGNMN, NVN, WebGL and WebGPU take a much more developer-friendly attitude.


Gallium3D is a framework for building drivers, not a graphics API. The Intel drivers do not use Gallium3D (yet), and neither do Nvidia's.


They could implement the OpenGL API on top of Vulkan, so it's just a library; no need to touch drivers.
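That layering can be sketched with a toy translation shim: a GL-style entry point implemented as a plain library that records what would become a Vulkan command, in the spirit of real GL-on-Vulkan projects such as Mesa's Zink. Everything below is invented for illustration; these are not the real APIs:

```c
#include <assert.h>
#include <stddef.h>

/* Toy sketch of an OpenGL-over-Vulkan shim. A GL-flavored call is
 * translated into a recorded command, the way a real shim library
 * would record vkCmdDraw into a Vulkan command buffer. */
enum cmd_kind { CMD_DRAW };

struct cmd {
    enum cmd_kind kind;
    int first, count;
};

#define MAX_CMDS 64
static struct cmd cmd_buf[MAX_CMDS];
static size_t cmd_len = 0;

/* GL-style entry point, living entirely in userspace... */
static void shim_glDrawArrays(int first, int count) {
    /* ...recording what would become vkCmdDraw(cb, count, 1, first, 0). */
    struct cmd c = { CMD_DRAW, first, count };
    if (cmd_len < MAX_CMDS)
        cmd_buf[cmd_len++] = c;
}
```

The point is that nothing here requires kernel or driver changes; the translation is an ordinary library linked into the application.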


On the other hand, as a user, this is one of the reasons I switched to Macs a decade ago. The apps available were of extremely high quality and actively supported.

If something with a GUI hasn’t been updated in 20 years, chances are it’s not gonna be stellar software. Both technology and UI conventions have changed a lot.


No, you don't get it. It's not about maintaining or not, it's about what a developer spends their time on. Do you want the developer to spend months moving the code from one legacy API to a new one, which will give the user no tangible benefit, or do you want them to develop new features requested by the users?


“No tangible benefit” is a very weak assumption. Upgrading to new APIs means familiar UI controls and the possibility of integration with system features - off the top of my head: iCloud copy & paste, versioning, Handoff, low power/data mode, form inputs behaving the same as everywhere else, font rendering, window controls - all very tangible for users.


Except there's no reason those things couldn't be supported in an existing API with less friction to adopt. It's software.

I think the reason software changes abruptly (API switches, rewrites, UX redesigns, etc.) is that doing so employs more software people than incrementally improving old stuff. I don't think this is necessarily a deliberate choice; it's just the way it works out.

It's kind of a perfect business in that way. The downside is that occasionally an important customer says "enough!" They're totally justified in doing that, but it doesn't align with growth, so it's the cost of business.


> there's no reason those things couldn't be supported in an existing API

That’s how you get Windows 10, where you’ll encounter three eras of UI, going all the way back to 1995, just to change your power settings. No thanks.


> Do you want the developer to spend months moving the code from one legacy API to a new one

If the situation demands it I would say yes. I treat it as one big refactoring job. There are lots of shims in old software libraries that may not need to be there anymore. There are also lots of vulnerabilities that a fresh set of eyes and a different perspective can illuminate. A product that began its life being developed by a lone developer may have an entire team now. The rewrite is a good opportunity to explore how it works and why certain decisions were made.


> The rewrite is a good opportunity to explore how it works and why certain decisions were made.

... and a good opportunity to break existing and working features along the way.

If something works, why should you fix it?


What Mac-only apps that you use do you consider high quality?

I’m asking because outside of some big ones that I don’t use, like Final Cut Pro or Pro Tools - most of the Mac apps I’ve found are very low on features or “options” (flexibility). They all seem to try and follow Apple’s model of only serving the happy path and completely ignoring anyone who needs more than that - e.g. power users.

Even normal users get annoyed by this tendency, like recently when Apple removed the “mark all as read” option from Mail in iOS.


Some I personally use and would miss:

  Things 3,
  Scrivener (Mac only for 5-6 years, later had a Windows port),
  Acorn,
  Pixelmator Pro,
  Transmit,
  Capo,
  OmniGraffle,
  OmniPlan,
  BBEdit,
  Dash,
  Ulysses,
  Final Cut Pro X,
  Logic Pro X,
  Numbers,
  Keynote,
  Soulver,
  Quiver,
  Viscosity,
  nvALT,
Even with others that are ported, like MS Word, I prefer the look and operation of the Mac version...

>most of the Mac apps I’ve found are very low on features or “options” (flexibility). They all seem to try and follow Apple’s model of only serving the happy path and completely ignoring anyone who needs more than that - e.g. power users.

Or you know, "do a thing and do it well", the UNIX philosophy...

Not sure what power user needs aren't met though (especially with the full unix userland available as well)


Coda, though that's being replaced by something new any minute now.

Apple News, though it's still wet behind the ears and needs more features.

Sequel Pro, though if there's a Windows equivalent my IT department hasn't found it yet.

There are a bunch of tiny workhorse utilities like FileChute, EasyBatchPhoto, and Subler. Though there's probably a Windows version of Subler.

There are a number of Mac-only apps that attract people, but it's not the 1990s anymore. People don't use a platform for a "killer app." Most people choose a computer because they like the way it works. macOS works better for a lot of people than Windows or Linux. They may also be partial to the hardware for various reasons.

They all seem to try and follow Apple’s model of only serving the happy path

Apple, and its users focus on getting things done. Productivity is highly valued.

completely ignoring anyone who needs more than that - e.g. power users.

"Power users" is just nerd for "tweakers and hobbyists." Apple hasn't been interested in that demographic since 1984. If you're a latter-day "power user," good for you. Linux awaits.

The only reason I keep a Windows box around is for IE11 compatibility testing. And Linux I only use on servers. That's because the way macOS works makes sense to me. It didn't at first, coming from Windows. But now that I understand the workflow conventions, it makes sense, and I prefer it.


Well, I wouldn’t actually call something like having competent window management, something that macOS is sorely lacking, a “hobbyist” feature.

I don’t buy the argument that Apple wants to let you get things done when their stuff lacks things like that. How is removing a feature like “mark all as read” helping me do that?

And if you think Finder works better than Windows Explorer or other products that emulate Windows Explorer, you’re certainly in the minority.


>Well, I wouldn’t actually call something like having competent window management, something that macOS is sorely lacking, a “hobbyist” feature.

I've used Windows and Linux (several WMs); I wouldn't call their window management any more competent (if not less).

If you want a tiling WM for macOS, there are a few.

But managing windows is mostly bikeshedding, and for that something like Linux would serve better...


Cmd+Tab and Cmd+` are great. Maybe it's dumb to you, but that's what keeps me coming back to macOS.

Finder still feels less cluttered to me than Explorer. It isn't quite as simple as Finder vs Explorer, it's the built-in assumptions. For me, Finder + Unix filesystem vs Explorer + Windows filesystem is a no-brainer.


Can you elaborate on your comment "competent window management"?


Being able to maximize a window without having to use two hands (to hold down the Option key). Snapping. Switching directly back to another app's window. Being able to see a separate icon in my Dock for every open window.

If I have to install a third-party app to do this stuff, it doesn't really help me when I go over to some junior developer's machine to help them.

The problem is that macOS is what they call “app-centric” and they’re the only ones who do it. It’s not better, but Apple will try to convince everyone that it is, just like when they stuck with the single mouse button for a decade, when multiple-button mice were clearly the standard.


> “The problem is that macOS is what they call “app-centric” and they’re the only ones who do it.”

Actually, it’s document-centric - the Windows approach is “app”-centric. It has always been like that, 35 years and counting. It’s arguably a “truer” implementation of a windowed UI than Windows.


No, that is incorrect: macOS is app-centric, and it’s really not a truer form of anything in particular other than itself. That’s why you only see one icon in the Dock per app, and not per window or document. And you cannot switch directly from one document in one application to another document in another application; you have to switch apps first, then you can switch documents/windows.

Windows is window-centric. That’s why when you do alt tab, you see all of the windows. That’s also why you classically see a separate task bar item for each window.

I’d like to hear an argument in support of your view though.


> Sequel Pro, though if there's a Windows equivalent my IT department hasn't found it yet.

TablePlus [1] has a Windows port.

[1] https://tableplus.com


Here are some that I would say cater to power users and are flexible and high quality

* OmniFocus and anything from Omni Group

* Alfred

* Keyboard Maestro

* Drafts

* iStat Menus

* Soulver

* BBEdit

* Hazel

* Marked

* MindNode

and tons more that I have not personally used or found useful.


It's probably also worth noting that BBEdit has been around since the early 1990s and has weathered several transitions.


Scrivener used to be Mac-only, until demand was such that they made a Windows version. It's the Mac app for novel-writers.


I've bought Scrivener twice now and have never actually used it — but it's so beautifully made. I just want it to still be here when I finally get around to it.


* Screenflow


It's gonna be 100 times as fast as modern software, for one thing. And more debugged too.


>This is why, unfortunately, the Mac is dead to me. I would love to support it, but when OpenGL was deprecated it was just game over for me. A company that forces me to learn a completely different language to access their basic APIs, and keeps changing them, is just too much of a hassle. Every hour spent on maintaining a Mac port is an hour not serving the 90% of users who aren't on Mac.

I think their idea is: if you're not going to make a full-on Mac app -- leveraging the extra capabilities, the native APIs, the native look, etc., and updating with the platform -- don't bother.

Perhaps that's too much to ask given its small market share (2-10% depending on region, though with a much better share of the richer, actually-paying-for-software demographic), but if that weren't the case, what would really differentiate macOS from Windows?

Just the UI and under the hood stuff, when both are running the same identical apps and codebases?


There are well-supported ways to keep using OpenGL on macOS and iOS. This company offers a wrapper that makes OpenGL run faster on top of Metal than it did natively, and promises support for future OS versions indefinitely.

https://moltengl.com/


So I suppose if Apple cared about not being hostile to developers they'd have the option of buying that company and giving the product away for free.


A company is selling a product that provides a clear value to a specific audience; they're likely to be far more responsive and effective than Apple could be.


Note that this is only for OpenGL ES!


Use ANGLE and keep writing OpenGL. It has Metal and Vulkan backends and papers over all the various GL driver bugs.


There are workarounds for everything. The problem with the Mac is that I have to work around everything: no C API, Xcode, lack of extension support, driver bugs, poor documentation, weird manifest files, the OS switching the run directory randomly... You can't behave like that when you are a small platform.


Thankfully there are vendors out there willing to move beyond C, unlike Khronos.


A trillion dollars says you absolutely can behave like that.


The trillion dollars says you can behave like that on a large platform (iOS), but not much of that trillion dollars comes from the Mac, a comparatively small platform, does it?


Aren't those destined to converge as one?


> Every hour spent on maintaining a Mac port is an hour not serving the 90% of users who aren't on Mac

Oh come on, spare us the ethics lesson. You know perfectly well that OpenGL is an archaic stone around the neck for many developers, to the point that many (all) industry players are trying their hand at superseding it (of course also for lock-in, but that's not the only reason).

During transitions things come and go, adoption sways, and ideas change.

Deal with it; it's a fact of life.


It's also a fact of life that economics dictates where effort is spent. If I port all my OpenGL code to Metal, that's a big undertaking in both time and money. And what do I gain? Nothing. The program will render the same views both before and after the change. It's not directly adding value, so why would I bother?

Also bear in mind that I currently support the application on Windows, Linux, FreeBSD and macOS. OpenGL rendering works the same on all four platforms, and I have a unified cross-platform codebase which compiles without any trouble on all of them. Switching to Metal would mean either dropping all the other platforms which use OpenGL, or maintaining a separate macOS-specific implementation for Metal. Dropping the other platforms is unacceptable.

macOS is a niche. Windows is a bigger market; depending on the application, Linux might be as well. Consider that Windows, Linux, and even FreeBSD can support beefy modern GPUs, while macOS is limited to the on-board GPUs in its laptops and other systems. When push comes to shove, macOS is the platform I will drop first. It already has the oldest and most limited OpenGL implementation and the poorest hardware, which makes it the worst choice for running my application in the first place.

If Apple want to retain developers like me, then they can provide a first-class current OpenGL implementation based on top of Metal. And then they can also provide a high-quality Vulkan implementation also based on top of Metal (or the other way around). Because if I do upgrade, I'll be upgrading to Vulkan on all four platforms, not to Metal.


Unity does it for me; no need for Apple to lift a finger.


It's an example where some effort on Apple's part would save a multiplicative factor of developer effort and user frustration. I expect the cost to users of a couple hundred megabytes of flash storage is a fair trade-off for not having your apps break with every yearly iOS update.


The Carbon UI framework has been deprecated for 12 years. Cocoa has been available since the first Mac OS X version, ~20 years ago; it's still the main UI framework on macOS, and it will be for years.

You can't come now and whine that Apple changes UI frameworks too often.


On the other hand, I've been programming on Windows since about the same time.

On the GUI front I had to work with Win32, MFC, WinForms, Silverlight, XAML and WinRT.

I've been through two DirectX breaking upgrades where we had to rewrite pretty much everything that touched the API because of deprecations.

Sure, old things keep working for users, but if you stick with ancient technology it gets way too hard to hire developers.

I'm not complaining though: I love writing code and upgrading things is a breath of fresh air. Every Microsoft API was significantly better than the one preceding it, IMO.


I've been writing Windows desktop apps in WPF since December 2009 and I'm confident that I won't write them in anything else for at least the next 10 years (desktops are supposed to be dead by then, but I hope the 'visionaries' are wrong).


I think another issue is that on Apple's platforms the changes are quite abrupt, whereas most other platforms are gradual. E.g., if you wrote a Qt (the toolkit) application in 1996, you could have slowly transitioned your application to Qt 5 version by version. In the meantime, you would have gone from non-standardized C++ to C++98 to C++11. So, you would have an application with a modern language and toolkit, which was gradually developed over the last 23 years. The same goes for Windows applications.

For me, the truth is somewhere in the middle. Yes, there is more churn on Apple platforms. On the other hand, it also allows Apple and the ecosystem to move faster. It is amazing what they have been able to achieve with (just naming some random things): wide Touch ID support, Metal, or application sandboxing.


You could have done the same with Carbon to Cocoa: transition a window/view at a time. You can use Cocoa windows and views in Carbon.

Plus you can keep using your C++ core even if the UI is Cocoa. The same applies to Swift, you can write or rewrite parts of the app in Swift and keep part of it in C++/Rust/Objective-C or your preferred language.

The same happens in SwiftUI, you can mix it with NSView and UIViews.
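For what it's worth, the mechanism for that on the AppKit side is the `NSViewRepresentable` protocol. A minimal sketch, assuming a hypothetical existing `LegacyChartView` NSView subclass you don't want to rewrite yet (requires macOS 10.15+):

```swift
import SwiftUI
import AppKit

// Hypothetical existing AppKit view with years of custom drawing code.
final class LegacyChartView: NSView { /* existing drawing code */ }

// Wrapper that lets the old NSView participate in a SwiftUI hierarchy.
struct LegacyChartHost: NSViewRepresentable {
    func makeNSView(context: Context) -> LegacyChartView {
        LegacyChartView(frame: .zero)
    }
    func updateNSView(_ nsView: LegacyChartView, context: Context) {
        // Push SwiftUI state changes down into the AppKit view here.
    }
}

// Usage inside a SwiftUI body: LegacyChartHost().frame(minWidth: 400)
```

On iOS the equivalent is `UIViewRepresentable`, which is what makes the screen-at-a-time migration strategy workable.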


AFAIK Carbon was a C++ API and Cocoa is Objective-C(++). In which case the UI is a rewrite, not a transition.


You rewrite one screen at a time.


It's almost like the people complaining the loudest don't have a profitable software product that justifies regular ongoing maintenance or something.


Carbon is a C API.


Carbon continued to work for 13 years after it was officially deprecated. Waiting over a decade to get started on a transition doesn't mean that the transition was abrupt; it just means that you procrastinated until you got a hard deadline and so caused scheduling problems for yourself.


I think that is precisely what they are doing. I can run apps that worked on Windows 95 today in 2019 unmodified. Cocoa or not, I can’t do that with OS X apps that are even 3-5 years old sometimes.

Microsoft and Linus have a specific fanatical attitude about the installed base of existing apps. Apple simply does not.


I actually like that. Some software dies and makes room for more modern replacements. All Mac software that I have is modern and consistent.

Contrast this with Windows, where browsing through system settings is like going back in time from Windows 10/8 to XP to 95 all the way to 3.11, if you click "Advanced" enough times.

Some Windows applications (especially installers) pop up dialogs that don't support anti-aliased fonts. It gives an impression of an OS with wildly inconsistent GUI and legacy garbage all over the place.


I'll take an ugly installer over a crashing installer any day of the week.


I'll take a good looking, non-crashing installer over an ugly installer or a crashing installer any day of the week.


> Contrast this with Windows, where browsing through system settings is like going back in time from Windows 10/8 to XP to 95 all the way to 3.11, if you click "Advanced" enough times.

Note why we know this. We know this because those legacy dialogs are useful and no one could be bothered to rewrite them with new GUI toolkits. Probably someone would write new dialogs if the old ones were forced to die. Probably the replacement would even be more convenient. Or probably not (on both counts).


The dialogs are extensible by third party drivers and applications. Microsoft has api’d itself into a corner, that’s why they can’t bring many of the dialogs up-to-date.


What about games? I like to play some old games out of nostalgia.


Virtualization? Why is it necessary to eternally keep shims and compatibility hacks in the host/main OS when nowadays we can just trivially run the old OS?


How do I trivially get a copy of Mac OS 10.5 Tiger?

This isn't FLOSS.


Next suggestion will be to keep old computers around if you want "old" software.


Repairing Macs is an issue though.

You should keep a few replacements in reserve in case some minor component fails.


10.5 was Leopard actually, 10.4 was Tiger. And both were PPC and thus would require emulation not just virtualization. For something that ran on those what you'd actually probably want to run would be 10.6, because that would give you an Intel based OS that still included Rosetta for PPC compatibility. I can't think of any applications that ran on 10.5 but wouldn't work under Rosetta off the top of my head, though perhaps there were a few. And I vaguely recall there was a security update in 2012 (about 3 years after 10.6 launch) that caused issues with Rosetta, but in a VM you could just not apply it since a VM that old should be isolated anyway, or maybe they later fixed it.

But at any rate I'd assumed that someone who had "nostalgia" for old games would therefore have played those old games on that system, and thus could just have kept it around as a disk image. I have Mac OS disk images going back to the beginning; it was by far the best way to install anyway because it was so much faster than actually going off the disc (the ease and customizability of network installs was also wonderful, I miss that). For really old stuff you can just emulate it via something like SheepShaver, no need for a VM at all. I actually find that's the same as Windows: for all its vaunted backwards compatibility I had the damnedest time getting some old games working even under W7, and it was much, much easier to just keep some VMs around to cover 98, 2000, and XP. Those are old enough that there is full virtual 3D support, even without hardware passthrough.

If you really wanted an old copy of 10.5 or 10.4 or whatever directly despite having not used/kept them, and you didn't want to get it off the net, you could just ask around a Mac site or buy a DVD. A quick look on Ebay shows tons for sale for $10-20, it's not as if they are some rare collector's item. And if you went on some Mac forum and just asked there'd be people with images like me or old discs just sitting around in closets collecting dust you could have for a stamp. It'd be a one-time issue, because then you'd immediate image it and keep it forever since they aren't big (looks like my Mac OS X Server 10.6.dmg is about 6.95GB).


Both 10.4 and 10.5 could run on x86 platforms.

10.5 supported x86 out of the gate.


Right you are! I wonder why 10.6 stuck in my head, I even got the first Xeon MP (as a free replacement from Apple for my G5 PowerMac liquid cooling blowing out and destroying my system utterly) in 2006 and it was Tiger. What a time since then. I guess 10.6 Server was the first officially virtualizable version so maybe that's why, but yes you're absolutely right, 10.4(.4) and 10.5 should both be runnable in a modified VM in principle.


Though I'd suggest it's probably easier to run the PPC versions at this point.


You can download it (and every other old version of macOS) from the developer portal: https://download.developer.apple.com/Mac_OS_X/mac_os_x_v10.5...


Those links don’t work for me anymore, and besides, I’m fairly sure they only had Leopard and Snow Leopard. It is surprisingly difficult to get access to old versions of macOS, even ones that you paid for.


> paid for

I think you mean "obtained a license to use" ;)

While you did purchase the license, (IANAL) I don't think Apple is legally obligated to produce the original software for which you obtained a license, at least per the license agreement[0].

0: https://images.apple.com/legal/sla/docs/osx_snow_leopard_sec...


Every version of macOS is listed and the downloads work for me. You'll need to log into an Apple account first.


I have a valid Apple Developer account logged in and it doesn't seem to work for me :(


eBay?


It's not like there's a limit on space. Also this is a choice, and having choices is pretty good IMHO.


Linux is a very different case. Linus is talking about the kernel breaking compatibility with userspace.

The kernel not breaking userspace is quite a different thing, particularly when comparing the macOS kernel and Linux, because Linux specifically has a stable kernel ABI and macOS doesn't. On macOS it's libSystem that provides the stable ABI, and kernel interfaces change underneath.

Beyond the kernel APIs and libSystem you have the actual platform interfaces. Carbon is a deprecated framework and has been for a long time. If apps are using Carbon then they must get updated; Apple has been warning about this since 2017 (beyond just the deprecation notice in 2012 and everything in between).

And Windows 95 apps almost certainly do not run unmodified... it's been a while and things may have changed, but as I recall in XP (SP2, even?) they introduced the option to run an app in compatibility mode, where you actually select which OS version to be compatible with. This works in some cases and not in others. Not to mention Win32 support has held back Windows stability and progress for a long time.
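The stable-ABI contrast is concrete enough to demonstrate: on Linux the raw syscall numbers are themselves a stable userspace contract, so you can bypass the libc wrappers entirely. A sketch (assuming an x86-64 or aarch64 Linux box; the numbers are architecture-specific, and this is an illustration, not how you'd normally get a PID):

```python
import ctypes
import os
import platform

# On Linux, raw syscall numbers are a stable userspace ABI, so calling the
# kernel directly (skipping the libc getpid() wrapper) keeps working across
# releases. On macOS the stable boundary is libSystem instead, and the raw
# syscall numbers underneath it may change between OS versions.
SYS_GETPID = 39 if platform.machine() == "x86_64" else 172  # x86-64 vs aarch64

# CDLL(None) gives access to the symbols of the already-loaded libc.
libc = ctypes.CDLL(None, use_errno=True)

def raw_getpid() -> int:
    """Ask the kernel for our PID via the raw syscall interface."""
    return libc.syscall(SYS_GETPID)
```

On Linux, `raw_getpid()` returns the same value as `os.getpid()`; on macOS nothing below libSystem carries that guarantee, which is exactly the point being made above.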


Linus personally has really strong opinions about breaking userspace, but the Gnu/Linux distributions do not. Heck, when I installed Inkscape from the Debian 10 repositories, copy and paste wasn't working because they decided to remove python-uniconverter due to bugs.


I was recently greeted by a Homebrew message that my Mac OS X is too old and not supported :) It was simply a warning; the installation (GNU tar) ran OK and gtar worked. The OS is El Capitan and it just turned 4 years old.


Under 1% of homebrew users are on 10.11 (https://formulae.brew.sh/analytics/os-version/30d/), so continuing to test on that OS version would not be a good use of the maintainers' time.


Interesting that there are five times as many Brew events from Catalina machines than El Capitan machines, and Catalina isn't even out yet.


El Capitan machines are already set up, Catalina machines are being set up?


Since macOS updates are free, is there a reason you haven't moved off of El Capitan?

Do you just like the way it looks and feels? Is there an essential program that you use that needs El Cap? Are you using a 2008 MacBook?

I'm not being critical. I have to run Jaguar on a dedicated machine I keep in the closet for a niche reason, so I'm curious what problem you face.


It's old hardware, it cannot run anything newer: Mac Mini 2009. Pretty snappy otherwise (with maxed memory). It just sits on the desk; I mostly use it via SSH.


Apple dropped security updates for El Capitan with the release of Mojave. El Capitan is supported on MacBooks from 2009. I think it's time to upgrade.


Homebrew in general is very aggressive about dropping support for older versions.


Carbon UI framework has been deprecated for 12 years

Interestingly, Carbon still pops up here and there, even in programs from large companies with tons of resources like Adobe.

The giveaway is when you ask it to do something hard and instead of getting the SPOD, you get the watch icon.


The NSMenu system is still Carbon actually, as far as I know (last I dug deep into it).


As a kid who learned computers on Mac OS 7, that's beautiful


Furthermore, from the start Carbon was intended as a transitional solution and Apple told devs this many, many times. Carbon apps were supposed to be the bubblegum-and-duct-tape answer for holding users over while you worked on your Cocoa version. It’s not Win32 or MFC and Apple never claimed it was.


This is true in hindsight and seemed obvious at the time, but in fact Apple insisted that Carbon was a long-term solution and treated it as an open-ended commitment. They dogfooded it internally, using it to write the Finder and iTunes, two first-party apps so central to the user experience that novices confused them for the OS itself. There were working betas of Carbon 64, and Apple's decision to reverse course and deprecate Carbon on 64-bit platforms in the middle of the release cycle caused significant developer whiplash.


Why in hindsight? It was essentially a compat layer for adapting OS 8/9 software to OS X.

It was deprecated in 2012. In 2017 with High Sierra they announced that carbon apps would require changes to work at all.


Nobody with any sense really believed Apple's protestations during the 1999-2007 time period that Carbon was going to be a long-term solution, but Apple really tried to sell folks on the idea that it would be, until they suddenly low-key ghosted the Carbon developer community at WWDC 2007, when 64-bit Carbon was quietly removed from the keynote deck a year after it had been prominently featured.

https://www.macrumors.com/2007/06/13/leopard-drops-carbon-64...

Archive.org records the wailing and gnashing of teeth which accompanied the relatively sudden announcement:

https://web.archive.org/web/20140115143937/http://lists.appl...

https://web.archive.org/web/20140115112428/http://lists.appl...

You can still find developer documents in Apple's archives referring to a 64-bit upgrade path for Carbon apps.

https://developer.apple.com/library/archive/documentation/Ca...


Apple didn't present Carbon as merely a compatibility layer at the time. It was positioned as the procedural C paradigm alongside Cocoa's and Java's object-oriented paradigms. Carbon was supposed to be one of the three development frameworks for OS X alongside Cocoa and Java, which is why Cocoa and Carbon have drink-themed names to fit with Java. Porting a classic app to Carbon was "Carbonization."


Carbon wasn't presented that way in the beginning. They were supposed to be the three equal pillars of OS X development. Carbon was the C-based procedural framework, Objective-C was the object-oriented framework, and Java was, well, the Java framework.

That's why they have drink-themed names to fit with Java. Porting a classic Mac app to Carbon was called "Carbonization."


I coded in Objective-C back in the NeXT days, did various (small) Cocoa and iOS apps and still use Macs daily - but work at Microsoft now.

Despite my own personal plights with small desktop utilities I write now and then, I don't think Apple has broken Cocoa _that much_ or that it is untenable for a small company to ship cross-platform desktop apps these days (never mind Electron, I'm talking about toolkits like Qt, Xamarin, etc.). _Especially_ if their core functionality can be cleanly detached from the UI.

I poked around a bit to figure out the level of detail to which the software does estimations, and it does seem like there is a very complex UI:

https://www.turtlesoft.com/Images/Goldenseal%201024x768.jpg

...but I keep wondering how much of it could be abstracted away by a cross-platform toolkit, and what kind of separation there is between the estimation/modeling code and the UI itself.

We'll never know without a good overview of the internals, but my guess (based on looking at many internal corporate apps over the years, from the pre-web days) is that this evolved organically over time and built up technical debt by literally following UI abstractions rather than isolating the core code from them.


> Goldenseal was written in C++, using the CodeWarrior Pro development environment. We used the PowerPlant application framework (part of CodeWarrior). The layout for some windows was done using Constructor (also part of CodeWarrior). CodeWarrior is a great program. Thanks, Metrowerks!

> The basic database features in Goldenseal are based on NeoAccess, licensed from NeoLogic Systems. Unfortunately we ended up re-writing a large percentage of the code there, which explains 6 or 8 months of the delay in our shipping date.

> We have used OOP (object oriented programming) throughout Goldenseal. It's a very good programming model which makes it easy for us to add new features, without getting lost in the 5+ million lines of code that the program now contains. A lot of that is comments and spacer rows, but it still represents many programmer-years!

> We estimate that about 50% of the code is for the basic interface—screen display, file management, mouse behavior. About 25% handles data storage for the many object classes, and 25% handles posting, action dialogs, basic accounting and estimating features.

https://www.turtlesoft.com/Goldenseal-Software-Reference/Sof...


That is a bunch of Buttons, Textfields, Dropdowns and Listviews? How hard can that be in Cocoa?

In their blog post they said they worked for three years on the Mac version... maybe the company is just one person without much time or coding experience. Maybe I am underestimating the problem, but in my time they would have just hired a student or the nephew of their neighbor to hack together something useful.


They also mention in another post that four different contractors gave up on migrating it, three targeting Cocoa and one Qt. Maybe the codebase is brittle and hard to work with and not really worth the effort.


Without using their software, and just going based off of screenshots, I don't see anything that couldn't be ported to Qt fairly easily.

Now if there is a lot of intertwined code that makes the UI less portable, that's another issue, but I currently don't see how their UI layer could be so complex as to not be easily shifted to Qt.

The cross platform nature alone would be worth the cost in my opinion.


Or even electron (gasp). The application looks like simple webpages.

It might even work better as a web site, so the customers don't have to worry about backups, sharing data, etc.


Or just go the whole way and make a virtual machine

http://fabiensanglard.net/anotherWorld_code_review/


>...but I keep wondering how much of it could be abstracted away by a cross-platform toolkit, and what kind of separation there is between the estimation/modeling code and the UI itself.

Apparently they tried Qt in 2015: https://turtlesoft.com/wp/?p=198


Apple started selling 64 bit machines in 2005 and supported it in the operating system with Leopard: http://theocacao.com/document.page/343

There weren’t many 32 bit only Intel based Macs.

This complaint says far more about the developer than Apple.


> There weren’t many 32 bit only Intel based Macs.

I don't understand why Apple even created a 32-bit x86 OS X ABI.

When they introduced the first x86 Macs, the writing was already on the wall; in the same way that there are claims that Carbon was deprecated for years, 32-bit x86 was already on its way out. You would not create a new Carbon-based app today, yet Apple introduced that ABI in similar circumstances. Well, maintaining it for the decades that follow comes with the territory.

Yes, as a user, I do mind removing it. For example, Apple broke Preview scanning on Samsung MFPs for the entire Mojave lifecycle ([1], [2]), there's no indication that they are going to fix it, and as a workaround users are using Samsung Easy Document Creator, which talks directly to the scanner, avoiding Apple's scanner libs. Yes, it's a 32-bit app.

[1] https://discussions.apple.com/thread/8552818 [2] https://h30434.www3.hp.com/t5/Samsung/Scanning-problems-with...


When Apple started their Intel transition, Intel's consumer CPUs were primarily 32-bit. The first Intel Macs (Core Duo) and Intel dev kits were 32-bit; the Core 2 Duos onward were 64-bit. It's really a small number of Macs.


The previous gen, P4 and Pentium D, were already 64-bit; 64-bit XP and Server 2003 were out, and Linux had been 64-bit for years. Core Duo not being 64-bit was a stop-gap and everybody knew that. Creating a new ABI on that was exactly like launching a new Carbon project today; that's why I compared these two in the first place.


>P4 and P-D were already 64-bit

Ah yes, I had forgotten that (the Apple Intel dev boxes were P4s).


Weren’t they Xeons?


Nah. P4s, a gig of SDRAM, either a GMA 800 or GMA 900 for graphics, 1x and 16x PCIe slots, a 160GB disk and a DVD drive, all running on the first Tiger service pack (10.4.1). Yours for 18 months for only $999! (‘Twas a rental.)


A late Pentium 4 on a pretty much random standard PC motherboard; I think it didn't even have EFI (even the broken 2001-vintage one that was common for years on x86 Macs).

Xeons arrived with Mac Pro.


The first response sums it up nicely:

> The people I hear complaining about this are those who, like you, didn't move to Cocoa. Carbon was a _temporary_ transition API. It was necessary when Mac OS X shipped in March 2001, but even though it wasn't yet formally deprecated, it was clear it would be.

- https://lists.apple.com/archives/cocoa-dev/2019/Oct/msg00021...


The fact that they had a long time to transition doesn't negate the fact that they have to spend the time doing the transition.

At market-rate developer salary, it doesn't take a lot of man months for the port not to be worth the investment for small shops.


Actually, considering the long-term average growth of “developer market-rate salaries” it makes sense to transition as early as possible (as soon as the new alternative is announced/becomes available) to minimise overall cost and capture the market share of those who shan’t transition when deprecation is announced and cost of developers is too high to make it worthwhile.


Except if the new thing flops and the old thing doesn’t die - which is much more common in the Windows world. MFC is mostly dead. So are ATL and WPF. Win32 is still going strong though.


It's a bit judgmental, though the original complaint is also a bit whiny.

The reality is that Turtle couldn't make the business case to port to Cocoa at any point in 18 years, probably because most of their customers were on Windows.

That's not them being lazy, that's just them directing their resources according to market demand and engineering constraints.

And Apple is not being greedy or uncaring, they also have to direct resources according to market demand and engineering constraints.

If Turtle had more Mac customers, they'd have either ported their app, or they'd purchase a Carbon compatibility library. And if more developers were actively using Carbon, Apple would put more resources into it.

They aren't, and so the two are going to part ways.


> But that's part of what lost them their lead after the '90s

I think that guy lives in a bubble; Microsoft is still by a huge margin the leader in desktop OS market share :-P.

And really the main reason is that they try their hardest to not break people's applications. If Windows suddenly couldn't run the applications people wanted, everyone would migrate to Linux (and some to Mac, but Linux is free so the majority would go for the free stuff).


I've seen quite a few legacy Windows app needing to be run as administrator, compatibility mode, or both. Just because you can do something doesn't mean you should. It's glaringly irresponsible that applications should be given read/write access to the C:\Windows folder because that was acceptable in the 90s.

Now when I get an exe that needs to run in compatibility mode I don't even bother with it. I'm not compromising my computer because a developer has abandoned their software.


Good for you, but other people want their computers to do work for them with the applications they want to use.


Jens is way oversimplifying the developer story around Carbon, FWIW. It existed because Adobe and Microsoft wouldn’t port their apps without it, and that wasn’t going to change in the foreseeable future. It wasn’t deprecated in the technical sense until 2007.

Yeah, the arc was that Cocoa was the future, but as late as 2007, Carbon was still widely considered a viable target for new apps.


By the way, it wasn't just Adobe and Microsoft; practically no developers wanted to write all of their UI code twice.


> This complaint says far more about the developer than Apple.

Yes, it says that they didn't bother to waste resources doing unnecessary changes.

But it also says about Apple that they do not respect the time and resources of the developers that bother to support their platform.

There is no way to spin Apple breaking APIs and programs that people have worked on (for developers) and bought (for customers) in a positive way.


> Yes, it says that they didn't bother to waste resources doing unnecessary changes.

Keeping up with API changes from a decade ago doesn't qualify as unnecessary.

> There is no way to spin Apple breaking APIs and programs that people have worked on (for developers)

Here's the point. Apple isn't doing that. Apple doesn't visit customers with old Macs to push breaking patches.

What happened is the developers made something that worked on old macs, didn't put in the effort to keep it up to date and are now complaining that Apple won't put in the efforts to support the old APIs on new systems anymore.

> and bought (for customers) in a positive way.

This is the developers' fault I guess.


> Keeping up with API changes from a decade ago doesn't qualify as unnecessary.

The API didn't change, it worked until Apple decided to break it in a subsequent version of macOS.

> Here's the point. Apple isn't doing that. Apple don't visit customers with old macs to push breaking patches.

They broke the API in new macOS.

> What happened is the developers made something that worked on old macs, didn't put in the effort to keep it up to date and are now complaining that Apple won't put in the efforts to support the old APIs on new systems anymore.

Yes, that is exactly the issue: Apple shouldn't have broken the API in the new macOS. The developers are perfectly in their right to complain about Apple breaking their applications and forcing them to waste time rewriting code that already worked to do the exact same thing only now in a different way because Apple doesn't care about the developers' time.

And yes, it is all on Apple - the breakage wasn't forced on Apple, it was something Apple decided to do.


It’s been deprecated with warnings for years, this isn’t the same as “breaking”.


If in OS version N a program works and in version N+1 the program doesn't work, then the OS version N+1 broke the program. That is all there is to it, anything else is just excuses.


Addendum: the kernel remained 32-bit until Snow Leopard.


And 64-bit Macs came out in 2003; 64-bit Intel Macs didn't come out until 2006.


> my current windows app STILL WORKS ON VISTA, i don't have to do ANYTHING to "stay up to date" with Windows, cuz they support backward compatibility, and don't force changes on developers.

> Meanwhile, our Windows version hasn't needed any work since 2000.

Microsoft's impressive backcompat is a blessing as much as it is a curse, and is also the cause of the [subjective opinion incoming] awful User Experience and complete lack of UI and UX consistency, and it remains the number one reason I don't wish to go back to Windows.


Win32, WinForms and WPF desktop applications are all able to follow the OS theme. With Metro there's been a departure to a visually incompatible paradigm. This is to say that the lack of UI consistency is a political problem, not a technical side effect of keeping old UI technologies running.


I wouldn't be so sure of this.

I made the Windows -> Mac transition in 2008 and recently played around with a modern Windows machine. I was pretty stunned to find the Control Panel experience... not unchanged, but still eerily similar to Windows XP.

Windows' Control Panel is a trainwreck of UX compared to OSX System Preferences. Being such a core part of the OS, I really expected to see, well, something new and better. But then I remembered how countless programs embed themselves in the control panel via DLLs, so Microsoft probably can't make major changes to the UX without breaking binary compatibility with ancient software packages.


macOS has third-party preference panes too.


Yeah, but they're all separate from any other preference pane. Everybody gets the same framework to create completely independent prefpanes, not hijack Apple's existing ones.

There are no hooks, for example, to modify the Displays prefpane; if Apple decides they want to update it, they're free to do so. On the other hand, a lot of graphics card vendors still needlessly add their own little tab to the display adaptor control panel and the desktop's context menu.


...Political?


Decisional or organizational, if you prefer. This is about teams at Microsoft fighting, top levels failing to come up with an agreed upon way forward and quality control departments lacking any standard for design coherence.


Their UI inconsistencies (I find the UX perfectly fine myself) mainly come from their insistence on creating new toolkits left and right due to their internal power struggles. They have improved existing stuff several times, but they most likely have a Google-like situation where new stuff is rewarded more than keeping old stuff running (though unlike Google they also manage to keep old stuff running, so perhaps things aren't as bad).


I would have much the same sour grapes attitude as many in this thread, except for one point: Apple cannot even keep code examples working. Most of the code samples on Apple's dev site don't even compile a year or two after they are written, and Apple doesn't bother keeping them updated. This is a rather large problem and a demonstration of how hard it is to keep up.


I’d you’re talking about swift yes they changed a lot since the early versions and many code examples are out of date. I believe the language has stabilized now.


Most, but the Objective-C examples also have breaks.


Only very rarely.


Uhm, the OpenPanel changes basically crapped on a whole group. Apple does not keep its examples updated. Heck, now it's hard to find examples at all.


I still write ObjC more often than most, and I’m pretty familiar with their stuff - I’m generally the first to claim examples are like a needle in a haystack, but they almost always do work when you find them.

They definitely might be outdated, tho.


This is entirely by design.

Apple doesn't want you to build an app 20 years ago, make no updates to it and continue to sell it as though nothing has changed in the time since.

Their website is mostly broken links, the design is dated, and it shows reviews and awards from 10 years ago.

Clear to me that they couldn't be bothered to put any effort in at any point in the buyer's journey so I say good riddance to them.


Is this the reason that everything on the iOS App Store is transitioning to subscriptions, where before an app cost $3 and now it costs $3 a month: developers worked out that the cost of keeping up with Apple's constant platform changes is too much?


I think that’s more that developers want to make a decent living. The vast majority of subscription apps have a server side maintenance cost. As the platform matured and developers realized those users were going to stick to their apps for years without paying another penny, then it becomes untenable.


That's simply not true, many offline apps are transitioning to subscriptions and maybe add some lame cloud sync as an excuse/afterthought.

Happened to me recently with an app which is the interactive version of a book. From one day to the next they switched from buying chapters/the book to some stupid subscription.

If they can't make a living they need to charge more and if they can't charge more they need to find a real job/business.


The problem is that so many companies burning VC cash and/or ad dollars have trained customers to think software costs less than a cup of coffee. If you charge what you need for a sustainable business up front, your sales will be enormously less and you’re going to get tons of negative reviews from people who think a $20 app would need to cure cancer to justify that price. (And, of course, even $3 deserves lifetime free updates and new features)

It’s especially bizarre that you’re ranting about them needing to find a real business when that is exactly what they’re doing by finding a billing model which is viable long-term.


Very few developers have proven that they're capable of building a sustainable business (like e.g. OmniGroup).

It's impossible to say if a particular app will in a week:

a) still be available

b) if it will have switched to subscriptions or free + IAP or free with ads and IAP

c) what the IAPs will be and if your old (if any) IAPs will still work.

d) if they'll decide to sell your info to the next available bidder

This is why almost all apps are worthless and why I've essentially stopped purchasing or downloading apps. It's simply not worth the trouble to invest time to learn to use an app and investigate whether you can trust the developer.


> If they can't make a living they need to charge more and if they can't charge more they need to find a real job/business.

That’s exactly what they’re doing: charging more.


They're also changing the nature of the commercial relationship from buying to renting. Stopping the subscription stops access to many essential features, no matter how much one has already paid.

The only kind of reasonable subscription is what's used for IntelliJ & co: if you stop paying after a minimum of X months, you get to keep what you paid for.


Users are trained that an app costs a few bucks. Charging a fair price, e.g. $200, just won't work. So developers are trying to charge that price over years.


What app is $200 really a fair price for, though? Most apps do not provide nearly that much value to users, regardless of their development costs.


That's an interesting question.

$200 is about 10 to 20-ish decent meals in most US cities (not high class, but not bottom barrel fast food either).

Do _most_ apps really not provide as much utility over a lifetime of use as ten meals?


A business is about covering development costs.

If the users see no value in paying for it, then there are better business opportunities to spend development efforts on.


I don't know how I would calculate value for non-trivial apps. What's the value of the FaceApp app? Of a navigation app?


Essentially yes. Apps on Apple platforms require constant maintenance between OS versions, as proven by the quantity of compatibility updates on every new iOS version ever.

A particularly funny case was one otherwise rock-solid app which started SIGBUS'ing on iOS 13.


If that is the case, it doesn’t seem like a good value proposition for the user considering that’s a 4800% price increase assuming you use the app for 4 years. The improvements in iOS are nice, but not that nice...


Not disagreeing but it’s worth considering that the app on a subscription is more likely to be maintained.


Regular security updates is worth it. Device feature support is a bonus.


From iOS version to version the API typically does not change that much. The big changes are often new APIs, and those are also the ones most often in flux. Adding support for the Apple Watch or Shortcuts for example.

I never found iOS API changes any more extreme than Android or the web.


Well, they'd only have to do one major transition, to Obj-C/Cocoa, for which the writing was on the wall for 20 years (and official for 15 years), and everything would have been much smoother.

Had they used Cocoa, the rest (to x86, to 64-bit, etc.) would have been trivial.

Or they could use whatever they like (C++, Pascal, what have you) and have their own UI/compatibility layer between OSes, like Adobe for example does (and several others, big and small: Sublime Text is a one-man shop, and they make their own UI just fine).

The first response in the thread is not far off:

The people I hear complaining about this are those who, like you, didn't move to Cocoa. Carbon was a _temporary_ transition API. It was necessary when Mac OS X shipped in March 2001, but even though it wasn't yet formally deprecated, it was clear it would be. The Carbon UI frameworks were deprecated circa, um, 2006(?). QuickTime has been deprecated nearly as long. 64-bit systems shipped in the mid-2000s, even before the x86 transition, and it was obvious then that 32-bit would eventually go away.

Eighteen years is _forever_ in the tech industry. At the time Cocoa was introduced, the Mac itself hadn't even been around that long!

It sounds like keeping an app limping along on 30-year-old APIs, and then suddenly trying to move it forwards all at once, is a bad idea. By comparison, keeping a Cocoa app up to date isn't that big a deal. I was maintaining Cocoa apps during the 64-bit, x86 and ARC transitions and had to make very few code changes. I've been out of the UI world for about 8 years, and there have definitely been significant changes in areas like view layout and document handling, but adapting to those isn't rocket science.

Yes, Microsoft is rather fanatical about compatibility. But that's part of what lost them their lead after the '90s: the amount of development resources needed to keep everything working exactly the same, and the difficulty of making forward progress without breaking any apps.

—Jens


In another post they mention that four different contractors tried to update their software and gave up, on top of the four years they spent themselves.

I think the problem is that their software is just a complete mess but they won't admit it. This whole thing has "sunk cost fallacy" written all over it.


While I agree with you relating to Carbon, Microsoft is still pretty much the only game in town for desktop computing (which also includes laptops and 2-in-1 hybrids) in worldwide market share.


Yes, but macOS has enough native apps for everything, and the advent of web apps and mobile apps made the whole point moot.

You can do everything and more on a Mac.


Go look around anyone doing office work on their phones.

Tablets, sure, when converted into pseudo-laptops; and unless we are talking about iPads here, the European shops are increasingly replacing the Android tablets on sale with Windows 10 laptops with detachable keyboards and touch screens.

As someone that does native/web development, the only area where the Web wins is typical CRUD applications; anything more resource-intensive just brings the browser to a halt, and for stuff like WebGL it's still hit and miss.

As for doing everything and more on a Mac, as much as I like Metal Compute Shaders, they aren't a match to CUDA tooling.

Finally, as much as I like Apple's platforms, they are out of reach for a large segment of the world population, no matter what.


>Go look around anyone doing office work on their phones.

Depends on the office work. A lot of stuff is doable on a phone even, as many common place apps are available, if it wasn't for the ergonomics (small screen, no full keyboard, etc).

>As someone that does native/web development, the only area where the Web wins is typical CRUD applications; anything more resource-intensive just brings the browser to a halt, and for stuff like WebGL it's still hit and miss.

As someone who is a heavy user of the other apps (NLEs, DAWs, drawing/bitmap editing) where the web is a non-starter (and I don't care for all the half-arsed attempts at web-DAWs and such), I agree.

But for business, CRUD apps are 90% of their needs, plus Word/Excel etc, for which Google Docs is a lot of the way there (and even if not, they exist in good shape natively for both Windows and Mac).

>As for doing everything and more on a Mac, as much as I like Metal Compute Shaders, they aren't a match to CUDA tooling.

Perhaps, I don't use CUDA or do 3D at all.

>Finally, as much as I like Apple's platforms, they are out of reach for a large segment of the world population, no matter what.

Sure, but that's also true for workstation-like PCs, and commercial compilers/IDEs, which you're in favor of, no? :-)


Workstation like PCs and commercial compilers/IDEs can be had for cheaper prices than Apple hardware.


Depends on the "workstation like PCs" and "commercial compilers/IDEs".

Anytime I put together a decent PC with best-of-breed parts, it comes to $3-4K. And commercial offerings from Dell with similar specs also go there, same for laptops, e.g. Lenovo, and the like.

And I've seen commercial compilers/IDEs priced in the $1K/$2K range, with which you can surely buy a Macbook Air or similar...


No need for a Ferrari when a Fiat does the job.

Naturally there are those that feel entitled to a Ferrari for going down to the grocery store, but that is their problem.

If one is buying at enterprise-class prices, then it is always going to be more expensive with Apple's hardware, because those compilers and IDEs are not part of Apple's offering, adding to the already expensive hardware price.

And if by workstation hardware you want a really beefy machine, Apple's only alternative is their top hardware.

Thus at the end of the day, when one does the math of what one is getting per buck/dollar/yen/..., it is still way over the usual budget on the PC side.


Except decent gaming. One of the largest industries in the world.

But yeah, if I can't work on a Linux machine I'd settle for Mac


True, was focusing on work stuff.


Come on, there certainly are reasons to complain about Apple deprecating stuff and strongarming developers into their walled garden: OpenGL, the fast move to Swift and between Swift versions, SIP/mandatory signed apps, Xcode being a moving target with OS updates, EFI firmware updates for e.g. booting from APFS only downloadable as part of 6 GB Mojave and soon Catalina OS updates, deprecation of semi-official Mac OS ports without equivalents on brew, stone-age Unix userland tools (bash and awk from the early 2000s), Java (and now Python) unbundling, and probably others I'm not aware of, since right now I'm not very much into developing on macOS and iOS.

But Carbon isn't one of them. I knew Carbon was about to be deprecated in 2005 when I was coding a crappy UI for a crappy OCR solution. Carbon was just a forward-compat GUI lib on Mac OS classic (1998?) for apps to run on Mac OS X (2000).


Apple may initially have intended Carbon as temporary, but changed their stance when they started adding new APIs that never existed in Mac OS 9: HIView/HIWindow was intended to unify Carbon and Cocoa, there was even a 64-bit version of Carbon that shipped in Mac OS X betas, and when Retina display hardware shipped, Carbon received new APIs to deal with that.

Likewise, Cocoa in 10.0 was buggy and incomplete. It took until about 10.4 or 10.5 for Cocoa to reach feature parity with Carbon, and many Cocoa applications were using Carbon for certain features (in fact, last time I checked, Cocoa menus were implemented in Carbon).


HIView and HIWindow were added, if I recall correctly, because they allowed other toolkits (Tk, Qt (which only recently moved off this), etc.) to draw the native look “properly”. It was never actually up to date or well documented, and they kept it around just so the OS didn’t look so totally bizarre as you moved through it (like you see on Windows).

Bolting on Retina support to that is such a no-brainer that I hesitate to call it adding features - that was table stakes in making sure the switch to retina didn’t look like total ass.

None of this stuff ever changed how Carbon was deprecated. People just sat around not listening to Apple, and then the last two years the bigger GUI toolkits finally did the work to transition properly.


HIView was up to date and documented, and there were WWDC sessions teaching developers why they should move to HIView and how to do it. Likewise, Apple provided documentation about how to port your Carbon application to 64-bit Carbon. Maxon reportedly even had a 64-bit Carbon version of Cinema 4D ready to ship when Apple suddenly announced that they would abandon 64-bit Carbon - telling developers to disregard the 64-bit Carbon they were still shipping in betas.


Yeah, no, this really wasn't the case.

Toolkits that used HIView/HIWindow looked very out of date compared to proper Cocoa implementations (Qt, contrary to popular belief, wasn't really "native" for the longest time since they did this - it's part of why it always looked off).

There are very valid reasons to be frustrated with Apple, but the writing has been on the wall for this stuff for years now. Nobody should be complaining at this point.


Exactly. In the days of WWDC 2003/2004 there were numerous separate Cocoa/Carbon sessions, and the Carbon message was to go fully to HIToolbox and Carbon Events.

Carbon NIBs even existed, although the format (XML) and concept (definitely not “freeze-dried objects”) were totally different from anything Cocoa. Xcode 4 and up couldn’t edit or view Carbon NIBs, which made that tech deprecation pretty clear.


> deprecation of semi-official Mac OS ports without equivalents on brew

Apple doesn’t deprecate ports, since they don’t run them.


In other saner platforms (Win32 or even X11, if you ignore Gtk and Qt) you can pretty much assume that if something is deprecated it will still remain around even if it wont get updates as doing otherwise will break all the applications that rely on that tech that people bought and rely on. Especially when that deprecated something is still used by developers despite it being deprecated.

But apparently Apple doesn't care about their users and developers.

Saying "told you so" doesn't help much; the only thing that helps is keeping things working. Anything else doesn't matter.

Though if there is anything that can be "told you so"'d, it would be bothering with Apple's platforms when Apple doesn't care about you.


> X11, if you ignore Gtk and Qt

You don't need to ignore those. GTK+2 was released ~17 years ago and Qt4 was ~12 years ago. Apps written with them still run perfectly fine on modern Linux distros.


Qt4 applications will run perfectly fine on modern Linux distros assuming those distros have Qt4 available (let alone installed, since Qt isn't common enough, unlike Gtk, to be considered something you can rely on being present on any desktop installation). Qt4 is dead from its developers' perspective, and because of that it will be removed, like Qt3, Qt2 and Qt1 before it; Debian, Arch Linux and others have plans underway to do that. After that is done, Qt4 applications will not work unless the user decides to manually install it (which most likely implies compiling from source - and good luck solving any incompatibilities between Qt's massive codebase and any of its numerous dependencies; after all, it isn't just Qt that breaks backward compatibility, some of its dependencies can and will break it too).

Same with Gtk+2. Once Gtk+2 itself is removed - like Gtk+1 and Qt4 before it - from the repositories, Gtk+2 applications will also stop working. Similar case with Gtk+3 now that the Gtk developers are working on a yet another incompatible major version, Gtk+4.

Also keep in mind that even though the major versions of Gtk+2 and Qt4 were released years ago, applications are still being written and released using those versions (e.g. Lazarus applications to this day are released with Gtk+2, as the Gtk+2 backend is by far the most stable - the Gtk+3 backend is still in alpha state due to the low manpower the project has. Note that this manpower wouldn't need to be wasted if Gtk+3 were backwards compatible with Gtk+2; all that time could instead be spent stabilizing and improving Lazarus, which shows how much developer time is wasted when libraries many others rely on break backwards compatibility).

Now sometimes someone will suggest that applications should bundle their dependencies with them, but this introduces other problems - like the bundled libraries missing features or fixes, or having configuration mismatches with the libraries already installed on the system. This is a worse situation, since instead of a potential forward-compatibility problem (you can never be sure that the developers of the systems you rely on won't break your application) you are breaking current compatibility.


It sounds like you are just airing your general grievances about Gtk2 and Qt4. The fact is, you claimed X11 with Gtk/Qt wasn't "sane", because sane platforms like win32 "will still remain around even if it wont get updates as doing otherwise". Gtk2 and Qt4 are still around and up-to-date packages are available for all major distros.

And your last paragraph sounds like you don't really know what you want. Do you want old applications to run exactly as is (which is solved by bundled dependencies), or do you want their developers to update them forever for decades and decades (which is solved by abandoning old toolkits)?

Finally, none of this even touches the fact that your "sane" win32 platform has incompatibilities too and there are many, many old win32 apps that don't run in modern versions of Windows.


> The fact is, you claimed X11 with Gtk/Qt wasn't "sane", because sane platforms like win32 "will still remain around even if it wont get updates as doing otherwise".

"...as doing otherwise will break all the applications that rely on that tech that people bought and rely on". Do not remove the important bit.

Also, I wasn't only referring to Win32, as Win32 does get updates, though they are minimal.

> Gtk2 and Qt4 are still around and up-to-date packages are available for all major distros.

For now. But as I already wrote, several distros like Debian (and thus any that depend on it) and Arch are planning on removing them (just like they did Qt3, Qt2, etc.). Why are you responding as if I hadn't already addressed the issue?

> And your last paragraph sounds like you don't really know what you want. Do you want old applications to run exactly as is (which is solved by bundled dependencies), or do you want their developers to update them forever for decades and decades (which is solved by abandoning old toolkits)?

It only sounds like I don't know what I want because you see "which is solved by bundled dependencies" and "which is solved by abandoning old toolkits" as the only possible solutions. I didn't bring those up because they are the only possible solutions; I brought them up to explain why they are bad solutions (something I'm not going to repeat, as I already wrote it).

Another solution, which I have repeated multiple times, is for the libraries to not break backwards compatibility. If the libraries do not break backwards compatibility then you can simply link against them dynamically, rely on them being there and provided by the OS (or at least ask for them and expect the OS to be able to provide them), and you won't need to bundle anything (the library is provided by the OS) nor worry about breakage (the application will keep working because the library won't break).

I mean, it isn't rocket science, it isn't dreamland, it is something already happening in both Windows and Linux to an extent. On Windows it's USER32.DLL et al.; on Linux it's the C library, libX11, libGL, etc. What I'm saying is to extend this to the libraries that also provide a full UI, not just the lowest levels.
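To make the stable-ABI point concrete, here's a minimal Python sketch (using ctypes purely as illustration): loading the C library by its soname and calling a symbol works precisely because that library promises not to break backwards compatibility, which is exactly what the comment asks of the UI toolkits too.

```python
import ctypes
import ctypes.util

# Locate the C library by its stable soname. Because the C library does
# not break backwards compatibility, an application can link against it
# dynamically and simply expect the OS to provide it - no bundling needed.
libc_name = ctypes.util.find_library("c") or "libc.so.6"
libc = ctypes.CDLL(libc_name)

# abs() has had the same ABI for decades; major versions of UI toolkits
# offer no such guarantee, which is the complaint above.
print(libc.abs(-42))  # 42
```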

> Finally, none of this even touches the fact that your "sane" win32 platform has incompatibilities too and there are many, many old win32 apps that don't run in modern versions of Windows.

Yes, there are incompatibilities, but Windows is still trying to be backwards compatible and for the most part it succeeds in doing so. I have a lot of older software that works perfectly fine under Windows 10, either by itself or via minor tweaks. Any incompatibilities that exist aren't because Windows developers didn't try; they exist despite their efforts.

On the other hand incompatibilities with Gtk, Qt and now macOS are there because their developers explicitly and intentionally decided to break their APIs. They do not even try (Qt would have a hard time due to it being a C++ API, but Gtk has no excuse).


Lazarus was also amusing for shipping a Carbon backend for macOS until very recently - and then telling people to just use Qt5 while waiting for a proper Cocoa port.


Yeah AFAIK there is only a single programmer who primarily worked on macOS support and he had to work on both the Carbon and Cocoa backends.


As an open source developer, I have mixed feelings about this.

Yes, Microsoft seems very keen on keeping Windows compatible even with ancient versions of the OS. New stuff usually is optional and APIs that behaved strangely in Windows 95 still behave the same way in Windows 10.

In Apple land, APIs may change their behavior whenever Apple deems it necessary. I ran into issues because of this with almost every macOS update since 10.8. And I see that even big players like Adobe keep running into compatibility issues all the time.

On the other hand, I'm spending just a few hours per week working on my project [0] and I manage to support an app that now runs on 10.5 through 10.14 and on three different CPU architectures with a single package. So no, I don't think you need to "throw 100 programmers at it" to get a working macOS version.

[0] https://github.com/enzo1982/freac


Not all applications are the same. Your application may not have the same requirements from the OS as some other application. As an (extreme) example, an application written in C that only needs a top-level window and basic input events from the host OS, while doing everything else with custom code, will be much more portable and easier to keep up to date with changes than an application written in a language with its own runtime that creates a native UI for each platform, uses native APIs and tries to integrate with the native OS.


Although, it seems like you’re reimplementing platform controls? This seems like a mini-Qt: https://github.com/enzo1982/smooth


Yes, that's my own custom UI framework. Most of the platform dependent stuff is implemented in the smooth library.

In hindsight, it would have been easier to just use Qt, GTK or wxWidgets. But I learned a lot by doing this myself and wouldn't want to miss that experience.


I think it's more of a challenge when you have someone who is more skilled as a subject matter expert than as a programmer, which may be the case when you're the tech person or even founder of a small business.

That person may not be doing good unit testing, they might not use the best tools, and they may find that supporting different configurations requires a lot more manual work than maintaining some carefully crafted #ifdefs.

And maybe one person could do it, but one salary they can't justify based on the demand may as well be 100 programmers.


Context: The author's software website: https://www.TurtleSoft.com

A screenshot: https://www.turtlesoft.com/Accounting-Software.html#Chart_Of...

I want to be respectful to an indie developer, but think it's worth considering the kind of niche he works in (guessing Windows-centric) and that he probably does more high-touch sales.

I also want to guess that many of the people on that thread are from an older generation of developers; it might be worth considering the tradeoffs of the language improvements that have attracted more people to writing software, compared to the authors in that thread saying C++ is all they need.


His users are surely customers from 20 years ago who never wanted to switch to anything new.


This is exactly how software should be: it should just work reliably for years and decades on end.


Sure that's fine. You can stick with your decades old computer running a decades old OS and decades old software.

What you can't do is expect the rest of the technology world to standstill or forever maintain backwards compatibility.


I think it is a shame that longevity of software is a thing of the past.


A myth of the past: the reason why everyone switched to online update cycles was recognizing how common security updates and other bugfixes were, and how much you’d save by planning to do them regularly.


You are being mean. Picking on a guy's website because it's not shiny and new for you.


Wasn't trying to be mean, but to understand his perspective when he stated that "For anyone smaller, it's hard to justify the constant need to rewrite code just to stay in the same place. Return on investment is just not there."

His reply post goes a bit more into his story of trying to update the app: https://lists.apple.com/archives/cocoa-dev/2019/Oct/msg00027... . While I get the author's gone through some rough experiences, I wonder whether he could have sought outside investment/advising to get a solid rewrite done and grow the business.


The screenshot is from classic Mac OS, which means it’s at least 15 years old. I love the classic Mac OS, but a product being actively sold for much newer OSes needs up-to-date screenshots.


I hear and also feel the pain. I've been doing multi-platform app development since the 90's, and keeping up with Apple is starting to feel like a fool's game - the work one has to do just to stay on the platform and current with the vendor's changes to the OS is very frustrating.

Which is why I'm just going to use an engine-only approach from now on. I can, fortunately, eschew native UI's .. since I work on creative tools and my users prefer to have the same pixel-equivalent interface on each platform rather than shifting paradigms.

I think that game engines are the future for all app development. There's not much I can't do in Unreal Engine, for example .. with the benefit that the same app truly runs everywhere.

If Apple want to continue to subvert developer minds to keep them on the platform, fine by me. The engines see this as damage and easily allow a lot of us to route around the problem.


Normally I would've said this comment was ridiculous.

But Google did exactly this with Flutter, i.e. using a game engine, and I found the experience to be significantly better than native development. Not only are you guaranteed the same behaviour across platforms, but you end up with a smoother, more polished app faster.

Definitely not suitable for every use case but for many it was impressive.


From where I stand, Game Engines are the New OS™. UE even has its own compiler system built-in, ffs...


Good luck running 20 instances of some productivity apps.


Depending on the design and the particular engine/rendering library used, there could be no problems at all. If we are talking, say, a business UI on top of a GPU-accelerated graphics lib (no heavy 3D assets stuffed into regular and GPU RAM), you could run multiple instances with no problems.


Good luck running 20 Electron apps...


Who runs 20 instances of an app? That just sounds like poor design.

Anyway, I have no problems running multiple instances of a small and light UE-based app. Most I've had running on one machine is 5 .. but I'm not seeing the limitation you're indicating.


Just open multiple windows in electron and you're there!


Electron is just poorly engineered software.


I am going multiplatform for my new GUI apps and I am actually considering game engines / rendering libs of which there are plenty.


You could consider taking a look at Flutter's approach (it's basically a UI SDK built up from scratch around Skia). It's entirely open source and likely contains some decent tips (and you can piggyback off the engine layer and bolt on a different language for scripting).


The only Cocoa API that has been deprecated in recent years is drawers. Apple is not going to suddenly rewrite their entire desktop apps and utilities in UIKit/Marzipan anytime soon (if ever - hell, they won't even rewrite them in Swift anytime soon), so Cocoa is well established and here to stay for a long time. So unless your app is entirely built out of drawers, I don't see how the mere presence of Marzipan can affect your long term business.


QTKit was deprecated, and support for Objective-C garbage collection, once praised as the future, was removed altogether.


To the best of my knowledge, there wasn’t much of a difference in programming style going from GC to ARC, aside from a few exceptions like all those pesky CFMakeCollectable calls that ended up being no-ops anyway.


You have to add all kinds of reference qualifiers (__strong, __weak, ...).


You don't have to explicitly mark things as strong, as it's the default. Having to manually break reference cycles with weak pointers does make the migration not as simple as just changing compiler flags, but I've never heard of it being all that difficult. Apple managed to migrate Xcode in a single version.
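The cycle-breaking work described here is easiest to see outside Objective-C. A sketch of the same idea using Python's weakref (an analogy to ARC's __weak qualifier, not Apple's API): a weak back-reference lets the child observe the parent without keeping it alive.

```python
import weakref

class Parent:
    def __init__(self):
        self.children = []

    def add_child(self, child):
        self.children.append(child)
        # A strong back-reference here would form a retain cycle that
        # keeps both objects alive forever; a weak reference breaks it,
        # which is the same job __weak does under ARC.
        child.parent = weakref.ref(self)

class Child:
    def __init__(self):
        self.parent = None

p = Parent()
c = Child()
p.add_child(c)
print(c.parent() is p)     # True: the weak ref resolves while p is alive

del p                      # drop the last strong reference
print(c.parent() is None)  # True: the weak ref now returns None
```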


Garbage collection was deprecated almost immediately after it was released and has never been supported on iOS (which, let’s be honest, is the target for 95% of all Cocoa developers).


Be careful: Cocoa is Mac, UIKit is iOS. Most iOS developers have minimal Cocoa experience.


Cocoa is more fundamental and exists in both (although branded as “CocoaTouch” on iOS originally).

The UIKit analog on macOS is AppKit.

Also, GC/ARC aren’t features of Cocoa, but of Objective-C itself.


ARC was added to get ready for Swift.


Well, to be technical, the OpenPanel API changed pretty drastically too.

I would expect that a second round of problem will come up because Swift is going to get access to APIs that Objective-C will not get.


This was a developer that hadn't even bothered moving to Cocoa yet.


Not specifically Cocoa - but audio in Catalina is a city block-sized dumpster fire.

A good set of big audio developers - Steinberg, Ableton, Avid, NI, Presonus - and not a few of the smaller developers have had to send out emails saying "Whatever you do, don't upgrade yet!"

I'm sure there's a shiny happy Cupertino reason for this, but is it ever an annoying waste of time and resources for everyone involved.


Worth pointing out that Catalina hasn’t been released yet. There’s a good chance the problem(s) won’t be fixed by then, but it’s a little OTT to gripe about beta software.


I started using macOS in 2007 and I've learned the hard way it's not a good idea to update to a major version right away. These days I usually wait about a year before installing a new major version. I barely installed Mojave the other day.


I do the same with Windows. I upgraded to Vista 1.5 years after its initial release and never had a complaint about the OS, which was much better than XP for me.


I skipped Vista completely (except one machine I used to test my software with) and went to Win 7. Same thing with Win 8.


Sadly not an option for anything internet-connected. Each macOS version has dozens of critical vulnerabilities.


The older major releases for macOS get updates for some years even after being superseded, unlike iOS, so it's an option.


I think you're factually correct, but what I've seen is that you have macOS N, and when N+1 comes out it's full of vulnerability fixes, and there is no vN.M coming out at the same time which fixes the same bugs.

So one has the option of waiting for N.M, but I've never done that. How long does it take usually?


Supplemental updates are shipped outside of the minor version bumps for quick(er) security fixes. Those do not change the build number and don't take the heavyweight update path.


Apple does issue security updates for older iOS versions, at least for major vulnerabilities.


This constant iteration is exactly how software development works now.

By contrast to this, Quickbooks for Mac went through a ground-up rewrite in Cocoa about 10 years ago and has been keeping up with the transitions. It's really beautiful software that's a delight to use. They even used an SQLite data format so that you could easily access your own data.

Apple is at its best in how it gently breaks stuff to move everyone forward.


I can't even fathom what would poor Quickbooks ever do without Apple "gently breaking stuff".


It's not like Microsoft is any better. UWP, WPF, WinForms, and GDI are all in various states of disrepair and neglect.


I would say that the main difference is that many applications written for Windows 95 will work just fine on a Windows 10 system. The Mac ecosystem has nowhere near the same amount of longevity.


And even well-written games, like Deus Ex. Not its successor, ironically, though.


With some tweaks, Deus Ex: Invisible War runs fine here on Windows 10. In general I haven't had any game or program I couldn't get to run on Windows 10, and I have a lot of older software.


Microsoft may shit out a new toolkit every other year (which is a shame; they should be fixing their existing stuff instead) but at least they maintain them basically for eternity. You can pick up anything and it'll be there for years to come (though not everything may be bundled with the OS forever - IIRC some less-used ones are downloaded on demand - but the applications still do not break).

(though personally I mainly trust core Win32, as everything else is built on it and because of that it is the most likely to remain stable)


But you can still use any of them. Especially GDI/Win32 which just won't die.


GDI.

If FillRect(hdc, rect) is the only thing you need from it, then it will work for you, yes.

But GDI does not support antialiasing, so, let's say, Ellipse(hdc) will just draw an illustration of Bresenham's algorithm, not something you would want to present to the user.

Same thing with the alpha channel - no transparency in GDI.

So forget GDI for UI purposes. You may try that GDI+ horse... But it is a pure CPU rasterizer, so forget it too on modern high-DPI monitors.

Yes, GDI is still with us but the chorus is here already to sing "Sic transit gloria mundi".


I would say it's worse. At least Apple uses the same frameworks they want you to use.

Of the ones you listed, Office (Word, Excel, etc.) doesn't use them; it has its own framework.

Why should I use UWP if it's not good enough for the Office team? I want to use what they are using.


Why is Office your only reference here? Office also runs on Android. It needs to be more portable.

Other references: Visual Studio runs on WPF and many Windows apps are UWP.


I use Office because that's their biggest money-making client application suite. You would think it gets the most UI usability focus of any of their products, so it must mean something when they don't use any of the UI frameworks they push on everybody else.

Visual Studio is an interesting one: a huge effort to get it working on WPF, which needed improvements to WPF for them to use it. Surprise: dogfooding your UI framework causes improvements pretty quickly.

Imagine what WPF would be like if the Office team used it, might be cross platform.


Luckily for GUI apps I've had enough "wisdom" and am only using Win32. Delphi (another shining example of backwards compatibility and being TRUE RAD) has shielded me from writing too much code for functional GUIs. So I mostly work on improving functionality and feature set rather than chasing whatever latest tech/framework is in vogue at the moment. Most of them come and go. Especially in web development.


However, there is effectively a 0% chance that an app written using those technologies will stop working arbitrarily.


All of these are still maintained and get new OS features.


I'm not familiar with the architecture of the app in question, but are the UI parts and the business logic/processing parts not properly separated, such that a full app rewrite is needed? Someone referred to the backward compatibility of Windows apps as a curse: this is especially true when companies finally decide to update a VB6 or early-2000s WinForms app to something modern, the first step of which is to extricate all of the business logic from click handlers. By the time Cocoa came out, the concept of separating your application into logical layers, so that things like the UI could be swapped out for newer tech, was well known, so this shouldn't have been such a big deal if the app was constructed correctly.


They also have a blog post with a more detailed explanation than the linked mailing list post:

https://turtlesoft.com/wp/


The version numbers are a bit off in that.


> Meanwhile, our Windows version hasn't needed any work since 2000.

I see they haven't updated their website since 2000 as well. It's hard to have sympathy for software companies that don't invest in their business.


I, for one, rather enjoy the 2000s aesthetic. Different tastes for different folks.


The paradigm programmers deal with today is demoralizing: it’s lousy to know your work will disintegrate in five or ten years.

For a while Mac OS X had support to run Mac OS 9 via nicely integrated emulation (the Classic environment). Then Apple removed it.

I’d just love it if the current OS supported all previous versions, via emulation.


Eh, it isn't that bad. Apple is such a case, sure, but other platforms are better. Win32 is probably the king of backwards compatibility - software you wrote or bought 25 years ago will work just fine on modern Windows 10 (assuming you didn't make too many mistakes :-P). But Linux in general should have decent backwards compatibility too, at least for stuff not relying on Gtk and Qt (or C++ if you go far back). Some time ago I compiled some examples from a GUI toolkit I was working on on a Red Hat from 1998, and they worked on a Debian from 2018 just fine.

In source code form you may even surpass Win32, as code written for X11/Xlib and Motif (to some extent, since that wasn't 100% compatible between different Unices) will still compile with little to no modification.

And of course anything you write to run in a web browser has good chances to work in the future (client side only). Well, assuming Google doesn't completely take over and decide that they know better than anyone else if it is a good idea to remove stuff or not.

Also, while not exactly OSes, several platforms exist that provide isolation from the underlying OS madness. E.g. Java/JVM is an example of a platform that almost never breaks stuff. Languages like Smalltalk and Common Lisp also tend to be very stable. Free Pascal and Lazarus developers also try hard to avoid breaking code (there is still breakage but it is very rare, and in almost all cases it is about bugs in the compiler - personally I had only a couple of cases where I had to fix code over the last 15 years, and that took me only a few minutes).


Great point. I made some assumptions without taking a mental inventory. The issue is disproportionately an Apple issue (and even then, an issue with programs that rely on Apple APIs).


> it’s lousy to know your work will disintegrate in five or ten years.

I think it has always been this way. Software does rot if not maintained. If it's not through API changes, then it's security issues.


The operating system is a foundation all developers need to build upon, and you want this foundation to be solid in many ways.

Being secure and offering the latest technologies to developers is one aspect, but being stable (in terms of ending support) and supporting those previous technologies for a long time is another. I feel like Apple isn't doing the latter.


Apple's app stores are stuffed with quality apps. They don't need to give developers a quality experience because the supply is there anyway. So they don't. Small development shops have similar negotiating power and position to struggling unsigned music bands.


For me the last straw was notarization. I am not going to beg for permission to release software for Apple's platform. Considering the dire state of the Mac software scene Apple should be the ones on their hands and knees begging developers, the arrogance is incredible.


Apps aren't required to be notarized, and it's trivial for any vaguely competent person to dismiss the one-time warnings about unsigned apps. So it's only a big deal if you choose to make it one.

To be perfectly honest, if you can't even jump through that absurdly minimal hoop, I wouldn't want your software running on my mother's computer anyway.


Apps are required to be notarised. You can turn the requirement off, but that doesn't negate its existence.

I hope you and your mother are happy living without open source software. No good deed goes unpunished.


Signed apps are required to be notarized. Unsigned app will run like before (right click -> open in Finder).


> Apps are required to be notarized. You can turn the requirement off, but that doesn't negate its existence.

I don't understand what you're saying here. Users can run apps that aren't notarized. Therefore, developers are not required to notarize their apps.

If you want users who refuse to run non-notarized apps to be able to run your app, you'll have to notarize your app...


Notarization is an automated process, it's not an app review, there is no "begging". Anyway, you can just run unsigned apps on macOS.


Yes, but for how long is this going to last?


According to a slide at WWDC this year, there’s always going to be a way to do this.


> the arrogance is incredible

Every time I see a developer completely fail to understand what notarisation is and how it works then proceed to say that their faulty understanding was the last straw and that they'll stop developing for macOS, I can't say I feel anything but delight. If basic reading comprehension is beyond so many developers, I can't imagine wanting their code on my machine, even in a sandbox.

The arrogance is incredible, indeed.


Please don't be a jerk on HN, even if another comment was provocative.

https://news.ycombinator.com/newsguidelines.html


I used to write a lot of apps for Mac. Including free and open source software. Now I have to pay $$$ and jump through hoops just to keep on doing it. It is amazingly arrogant for them to expect me to pay for doing them a free service. You can have as much autoschadenfreude as you want.


It's a power grab.

University students and people from countries on the US trade sanctions list will (I predict) not be able to share binaries in a future macOS, for example. It also gives apple a way to ban certain categories of program at the behest of the legal apparatus they have to operate in.

You can turn it off now, but you can't tell me you will always be able to turn it off.

I will abandon macOS when it is financially convenient - I hope you are delighted.


Not sure if they're complaining about runtime compatibility on macOS, but does anyone know: if I wrote a Cocoa app a few years ago on Snow Leopard or whatever, would it still run as-is on the macOS version of today?

If not, then I can sympathize with their complaint that keeping up with macOS updates will be more effort than Windows. The one-time Carbon to Cocoa update will be worth doing only if my Cocoa app is not broken by the next 10 OS updates requiring constant code updates.


Forget Snow Leopard - Apple can't even keep compatibility with the version they released 1 year ago.

When Apple announces a new OS update at WWDC you have approximately 4 months to install a beta on an external drive and fix any incompatibilities. Oh and it's a moving target, they only freeze it just before release. This process repeats every single year, sometimes they are so excited to break shit that they do it in a patch release but that's much rarer.

If there's a silver lining it's that Apple has gotten so bored with the Mac that they don't change as much as they used to, it's certainly not as bad as it used to be, but not because Apple has suddenly started caring about backwards compatibility.


Depends on what you used, I guess. I bought an iMac in late 2009, which I think came with Snow Leopard, and I upgraded to new macOS versions whenever they came out, up until El Capitan, which was the last version to support it (I've heard that you can run Sierra with an unsupported patch too, but didn't try it). I've also got a few applications (some bought, others free) from their developers' sites - that was before the Mac App Store.

After every single macOS upgrade some application would stop working. By the time I installed El Capitan, no application that I had installed since the Snow Leopard days worked.

The Mac App Store didn't help either, though. I've only bought a couple of applications from there, but one game - NOVA 3 - broke after an OS upgrade (I do not remember which) and then was removed from the store instead of being updated. Of course this is partly the developer's fault, but on Windows this would never have been an issue, as I have games and applications more than a decade older than NOVA 3 that still work on the latest Windows 10 with little to no fixes.

I also had a similar case with iOS: I got an iPod Touch some time ago, bought a bunch of things, and several versions later nothing worked. Eventually the device itself stopped being supported, and developers were both forced to drop support for it and required later versions of iOS. A typical case of forced obsolescence, since otherwise it was a fine device.

This experience alongside Apple's attitude towards (ignoring) backwards compatibility is what makes me stay away from their systems.


Simple ten-year-old 64-bit Cocoa apps will generally run on modern macOS with no changes. The more complex your app is the more likely it is to have been broken by an OS change since macOS is not especially concerned with backwards compatibility, but the actual APIs you're programming against won't have changed much in that time.


Having updated OS X 10.0 era Cocoa apps to run on modern macOS, it’s really not that bad. Actually, it’s way easier than I’d expect given the age of Cocoa and how its first few years were full of rapid evolution.


When you consider investing in user-facing software through specialized APIs and tools, it's safe to assume only a 5-10 year useful life. The consequence is that investment in user-facing apps and developers has limits, because ROI is bounded by the current and near-future user base.

Separating all the program functionality from all the OS/UI stuff pays off over longer term.


Apple makes my preferred platforms, but macOS and iOS have poor backward compatibility, and this is particularly bad for games.

Yearly ABI changes and removals impose a recurring maintenance burden on developers to revise their apps just to keep them working. I think it's one of the factors driving the move to subscription pricing.


I wish they would just open source Cocoa (AppKit) if they're going to just abandon it. There were some really good ideas in there, and I think it could possibly develop an active community to continue experimenting with them.

IMO the problem this time around isn't the pivot, but the absolute half-heartedness of the pivot: two half-finished solutions, SwiftUI and Catalyst, neither of which works very well yet, and this after about 5 years of pitiful development on Cocoa proper while simultaneously asking people to kind of switch to Swift. So it really feels like 3 half pivots in those 5 years, all to end up with something less polished than Cocoa in its heyday.

I really respect anyone who put up with working on Cocoa apps the last 5 years. The documentation has been sparse to non-existent on new features (often times for a year or longer your best bet was to wade through a WWDC video to get any information at all), coupled with a really miserable Mac App Store experience that was forced on you. It's really sad that when you would find some bug in Cocoa, unlike in the past, you could basically determine that this was it, as everything feels end of the line and thus not likely to be fixed.


I'm looking at doing desktop apps for my latest project. It's not simple.

It used to be simpler, although fragmented. There wasn't one tech for all targets, but there was at least one tech for each platform that did a good job.

How come writing a desktop application is harder now than it was 20 years ago?


> there was at least one tech for each platform that did a good job

I think the problem is exactly the opposite: there used to be "at most one tech for each platform", but now there's too many choices. On top of that there's also Electron and many multi-platform widget libraries. This gives a lot of "analysis paralysis" to developers. There just isn't a silver bullet.

...I mean, except for macOS, ironically, where only Cocoa matters (at least until Marzipan). Carbon has been a second-class citizen from the start, and was deprecated in 2007.

On Windows there's UWP, WPF and WinForms, which are very productive, and are open source now, on top of that. The only problem IMO is that they're only "great" if you use C#. Writing windows apps with C++ is way too verbose.

On Linux it's more complicated (and I'm not familiar with it) but at least users are much more forgiving when it comes to desktop consistency.


To be fair C# is probably the best choice for the majority of desktop developers.


Cross-platform toolkits have much nicer and sane APIs compared to native SDKs. Even if I wanted to target the single specific platform, I'd still pick wxWidgets or Qt Widgets over Win32 or Cocoa.


I would completely agree. It's a bit like html vs native though, you end up trading flexibility for compatibility and ease of development.


Depends. On Windows and Linux, ~nobody complains about Qt applications. On Mac, users do complain because some details aren't exactly right.


> On Windows and Linux, ~nobody complains about Qt applications. On Mac, users do complain because some details aren't exactly right.

Windows users are used to UIs from the 1990s. Linux users are used to 100 different types of UIs and bask in the glory of diversity. Mac users are trained to care.


FWIW I find Qt applications "off" on Windows too, at least when they try to look native, because there is always some little detail they get wrong :-P.

(also often cross-platform toolkits tend to have weird layouts, like having multiple numeric input fields span an entire dialog box width instead of making a more compact arrangement)


I disagree about Windows and Linux. Windows users aren't particularly used to "90s" UIs. Windows users are used to every application looking different, including those from Microsoft. On Linux you "only" have GTK+, Qt, and the odd wxWidgets application. Oh, and Electron, sadly.


And Motif, FLTK, and others (though personally I've mainly encountered these two in addition to Gtk and Qt), as well as bespoke toolkits. Also Gtk isn't just "Gtk": Gtk2 and Gtk3 use separate theming systems, and Gtk4 seems to introduce something new. Similarly Qt isn't just "Qt": Qt4 and Qt5 also use separate theming systems.

If you only use a desktop-focused distribution like Mint or Ubuntu and stick to applications for the desktop environment you use, then yes, you are only going to see Gtk or Qt apps. But there are way more toolkits than those two - check your distribution's repositories (and even Debian doesn't contain everything).


Motif and FLTK apps exist, but they are very, very rare these days. If you count Motif, you can as well count Xt and X Athena Widgets. FLTK isn't as old, but was AFAICT never widely used in any case.

Motif used to be the standard toolkit for commercial Unix applications in the 90s I guess, and you might still see it when using some expensive but minimally maintained software originating from that time.


Debian includes a lot of applications using Motif (e.g. Xpdf, NEdit, DDD, etc.). I think you underestimate how many such applications exist out there. If you silo yourself in the GNOME or KDE (or derivatives) world, yeah, chances are you won't see them.

FLTK is used for many smaller applications. Again, chances are if you are into GNOME/KDE you won't see those, but some smaller environments and distributions use FLTK for their utilities.

Though yes, Xaw applications exist too (and they come preinstalled with most Xorg installations - Xedit, Xman, etc).

Also i forgot Tk applications - Tk is the default toolkit for both Tcl and Python and a lot of such applications exist (gitk would be a very common one).


What's odd with wx? It mostly wraps native APIs - on Linux it's just a C++ wrapper for GTK+.


I meant rare


"Mac users are trained to care." Wrong set of priorities, I think. And how are they being "trained"? Do they go to special courses, or do they just listen to big ads telling them what they should like?


It's probably fair to say that they expect a consistent look and feel. But there is a bit of "it's good because it's Mac" too, IMO, because it takes faith to believe that the Dock isn't the worst launcher/task manager in wide use today.


> For anyone smaller, it's hard to justify the constant need to rewrite code just to stay in the same place.

This sort of says it all about modern progress, or indeed modern capitalism - you need to run all the time just to stay in place. This is also known as the Red Queen's Race [0].

[0] https://en.wikipedia.org/wiki/Red_Queen%27s_race


If they haven't had the resources to update their homepage in the last 20 years, I guess, how can they update the app?

Their website still shows how long it takes to download files on a dial-up connection.


Seems like they call themselves "Turtlesoft" for a reason.


“Meanwhile, our Windows version hasn't needed any work since 2000”

I honestly don’t see that as a good thing. What are these people doing if they haven’t touched their code in almost 2 decades?


Meanwhile, our Windows version hasn't needed any work since 2000

While I don't disagree that it's hard to keep up with Apple, if you haven't updated your Windows product in 19 years, then it's probably not a product I would want to use anyway.

Imagine if web developers thought the same way. Very few web sites exist today as they did in the year 2000, and those that do are ultra niche, or abandoned.


You're reading him too literally, and in the process creating a strawman. It's obvious he updates his software; business software must be updated to match changing legal requirements, and besides, a non-subscription business has nothing to sell without updates.

He's saying that Windows per se has not required extra work for compatibility since he got it done way back when, while the Mac needs adjustments and rewrites every once in a while, and the last time was too much for him. I guess he'd rather spend the time on features.


Curiously, the website of the software in question has been using the same design since at least early 2003

https://www.turtlesoft.com

https://web.archive.org/web/20030420115300/http://www.turtle...


I disagree completely. If software genuinely hasn't needed to be updated since 2000, it's solid, dependable. It's a hammer. The hammer I use today is the same one I used 20 years ago.

It’s a tool, not a gimmick. Think the UNIX tools used billions of times a day we don’t even think about.


Hammers don't have security issues, which is where the analogy breaks down. Some of these UNIX tools haven't changed much on the outside, but they are still getting updates.

If you look closer though, the BSD and GNU variants of common tools have diverged. And while I like BSD's conservative approach, some of the GNU features are damn useful.


Hacker News looks like it was from 1995. Works fine for me.


But HN is updated minute-by-minute. It's up to date. The guy moaning on the Apple listserv said he hasn't touched his Windows app since 2000.


Does anyone know if the current macOS still uses elements of Carbon under the hood? What about Catalina?


For me it just redirects to apple.com


It's a thread that starts with this:

Subject: Thoughts on Cocoa

From: Turtle Creek Software via Cocoa-dev <email@hidden>

Date: Wed, 2 Oct 2019 13:14:44 -0400

Sadly, we just decided to abandon the Cocoa update for our app. It's not easy to walk away from 3 years of work, but better 3 years lost than 5. Time will be better spent on our Windows version.

TurtleSoft started Mac-only with Excel templates in 1987. The first prototype of our current stand-alone accounting app was in the early 90s. Since then, programming for Mac has gone through four primary programming languages (Pascal, C++, Objective C, Swift). Three, soon to be four chip architectures (680x0, PPC, Intel, ARM). Four frameworks (MacApp or Think Class Library, PowerPlant, Carbon, Cocoa).

Microsoft and Adobe are big enough that they've survived the many pivots. They can just throw 100 programmers at it. Intuit has barely kept up. For anyone smaller, it's hard to justify the constant need to rewrite code just to stay in the same place. Return on investment is just not there. Seems like each new update is more difficult.

Many good apps for Mac have died in one pivot or another. We managed to lurch through most of the changes, but not this one. Thinking ahead to the consequences of Marzipan was the last straw.

Meanwhile, our Windows version hasn't needed any work since 2000. It probably will take less than a year to get it updated to 64-bit and a better interface.

Casey McDermott TurtleSoft.com


It did for me 2-3 times, and I was puzzled. I tried a 3rd time and it worked.


I wish this post focused on the core issue: they don’t make enough money from Mac sales to be willing to replace a framework (Carbon) deprecated by Apple 15 years ago, and are instead terminating Mac support.

I’m surprised they didn’t do that sooner, but blaming Apple for ending Carbon support is beside the point of “not enough revenue”.


They can't update their website (https://www.turtlesoft.com), and they charge $495 for a non-accounting licence. Seriously, the site looks like it was made around the year 2000.


Should they? Maybe it is not that pretty, but it is functional. Many modern sites, on the other hand, show a big nothing picture that one has to scroll past and then hunt for info. And I love that light grey text on a white background, which is nigh impossible to read for many not-so-young eyes.

And since they have a pretty specific product, I am pretty sure their customers do not give a flying hoot about how the site looks. They need specific functionality from the product.


This 100 times. Form over function is apparently more important these days.


Qt seems like a good solution to multi-platform requirements.


That happens all the time. With any OS.

On Windows, if you have an application that draws stuff using GDI+ (20+ year old technology) then you will discover that it doesn't keep up today on high-DPI monitors. GDI+ is pure CPU rendering, and Moore's law is over. GPU rendering is the only option these days.

Yes, GDI+ still works. You see it render something in your applications on Windows 10. But you cannot use the application on your new cool monitor.

Same with CoreGraphics on Mac. CG is a GDI+-like thing by nature - a pure CPU rasterizer. Something tells me that it will die pretty soon too.

A good abstraction/isolation layer is definitely needed for applications that want to survive in the long run.

As an example, Norton Antivirus started using such a UI layer (Sciter Engine - https://sciter.com) 12 years ago. The first versions used a GDI rendering backend. Then we added a Direct2D GPU-accelerated backend, with the same API. The application code and architecture did not change in all these 12 years.

Same thing on macOS. Initially the engine used CoreGraphics. Then an OpenGL gfx backend was added. And now we are working on Metal. And Vulkan, for that matter.

But the API of the engine is the same as it was initially.

TL;DR: Good application architecture is still the thing.


I think the future is game engines. You can write once and run everywhere. As soon as there’s one with first-class support for Rust (as the world is clearly and rapidly moving to Rust) the rest will be moot.


FYI Flutter basically works like a game engine.


They probably should’ve started porting their app to web tech 10-15 years ago rather than ranting about progress in native frameworks.



