Hacker News | c0nsumer's comments

Do tell, how does one avoid this problem on macOS when needing to use colored fonts and lines? (That is, flipping to greyscale isn't an option.)

The reason fringing exists is that the fonts are rendered as multi-colored pixels. Instead of one sample per pixel, each glyph is rendered to a 3x1 (or 1x3 for vertical) greyscale grid, and those samples are then striped across the R, G, and B subpixels, using each subpixel for its spatial location.

With DirectWrite and FreeType, subpixel rendering isn't applied when rendering colored glyphs. On OSX it isn't applied at all, since Core Text no longer does subpixel rendering.
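The striping described above can be sketched in a few lines. This is a hedged illustration of the technique, not any real rasterizer's code; `stripe_subpixels` is a hypothetical helper:

```python
# Glyph coverage is sampled at 3x horizontal resolution in greyscale,
# then each triple of samples is assigned to the R, G, B subpixels of
# a single output pixel.

def stripe_subpixels(coverage_3x):
    """coverage_3x: 0-255 greyscale coverage samples, length a multiple of 3.
    Returns one (r, g, b) tuple per output pixel."""
    assert len(coverage_3x) % 3 == 0
    return [tuple(coverage_3x[i:i + 3]) for i in range(0, len(coverage_3x), 3)]

# A hard vertical edge that falls one third of the way into a pixel:
# the edge pixel comes out (255, 0, 0) -- only its red subpixel lit,
# which is exactly the color fringing under discussion.
print(stripe_subpixels([255, 255, 255, 255, 0, 0, 0, 0, 0]))
# [(255, 255, 255), (255, 0, 0), (0, 0, 0)]
```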

I suspect what you're really asking is "why does high color contrast look weird at the edges?" The answer: some monitors are "exceptionally clear and sharp", and people have been selecting monitors for this trait for over a decade.

On LCDs, "good" polarizers make it hard to make out individual subpixels (which also makes subpixel rendering kinda moot on them; you'll see the fringing but the text won't look any sharper than greyscale, and noticeably less sharp than aliased); instead of "clear and sharp" they're more "natural".

OLEDs and MicroLEDs do not have polarizers, and they're the sharpest monitors I've ever seen. However, good news (at least for me): while I can see subpixels on a 1080p 24" during high color contrast (i.e., fringing fonts), I _cannot_ see them on a 4K 24".

Even if I integer-scaled 1080p to 4k, I would be using an array that looks like...

R G B R G B

R G B R G B

...to represent each pixel. I can't see subpixels in situations like this. So, the only way to avoid the problem is to use HiDPI monitors. 32" 4K seems to be a very common size for Mac users; it causes macOS to do Retina scaling correctly, while also being about 150% of the DPI of a 24" 1080p, or about 125% of the DPI of a 27" 1440p (the two most common sizes).
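Those DPI figures check out; here's a quick back-of-the-envelope sketch (`ppi` is just a hypothetical helper dividing the diagonal pixel count by the diagonal size):

```python
import math

def ppi(diag_inches, w_px, h_px):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(w_px, h_px) / diag_inches

ppi_32_4k = ppi(32, 3840, 2160)     # ~137.7 PPI
ppi_24_1080 = ppi(24, 1920, 1080)   # ~91.8 PPI
ppi_27_1440 = ppi(27, 2560, 1440)   # ~108.8 PPI

print(round(ppi_32_4k / ppi_24_1080, 2))  # 1.5  (the "about 150%")
print(round(ppi_32_4k / ppi_27_1440, 2))  # 1.27 (the "about 125%")
```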

My recommendation also is: never use below 4K on OSX. Its handling of sub-4K monitors is broken, and usually leads to in-compositor scaling instead of letting apps render natively at LoDPI.


...yet at 4K native on macOS (OS X) I could see fringing. And it was worse than using a slightly lower resolution, scaled up by the OS.

And it's particularly bad on solid color lines and high contrast borders (not fonts). So... that doesn't work for me. Which was the point of the post; I don't like how this particular subpixel pattern OLED monitor looks and it's not for me.


Which applications? Are they ones using non-native text rendering?

Chrome, for example, does not use Core Text. Chrome (and all Electron apps) are habitual offenders for bad text rendering on all OSes, not just OSX.


Numbers.app, Autodesk Fusion, Adobe Illustrator, and Terminal.app were the first places I noticed it. And in Fusion and Illustrator it's not text that's the issue but lines/graphics.

And high contrast edges in photos in Apple Photos looked wonky.


Oof, at least two of those apps should not do that. I wonder how Fusion and Illustrator do lines, because last time I touched Illustrator (CS6 era), its line drawing was pretty good.

I'd like to see screenshots of these showing off the weirdness, if you don't mind.


I would send them, but I've already returned the OLED to Costco. Sorry. :\

On the upside, I should have a shiny new U3225QE IPS LCD later this week.

(I just sidestepped the problem.)

EDIT: I should add that the screenshot in my post of a cell from a spreadsheet was Numbers.app.


I did mean screenshot, not photo.

I wanted to see what it looked like on my 24" 1080p IPS monitors (two Dell U2414H IPS, and a rebranded LG FastIPS from Monoprice). I don't own a Mac, so I can't replicate it.



Huh.

All of those share similar traits: the lines are excessively soft in many cases. They're rendered in linear space and then baked to the target gamma ramp, instead of being rendered in sigmoidal space (or with some other pseudo-sharpening/pixel-aligning-without-sharpening methodology).
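To illustrate the linear-space point with a hedged sketch ("sigmoidal space" above refers to perceptually reshaped blending, which this doesn't attempt; `srgb_encode` is the standard sRGB transfer function):

```python
# A 50%-coverage edge pixel blended in linear light, then encoded to
# sRGB for display, comes out well above the perceptual midpoint --
# one reason linear-space antialiasing can read as light and soft.

def srgb_encode(linear):
    """Linear-light value in 0..1 -> sRGB-encoded value in 0..1."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

edge = srgb_encode(0.5)   # black/white blended 50/50 in linear light
print(round(edge * 255))  # 188, vs the 128 a naive gamma-space blend gives
```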

The font in Numbers is extremely misrendered; that's soft even by Core Text standards. It's as if it had absolutely no hinting applied, instead of the kind Core Text uses that approximates FreeType's "light" hinting.

The Terminal.app one is okay, but not amazing; you can see slightly misshapen stems, such as in the M of "Makefile".

So, I'm not sure the monitor was at fault, but given the "clear and sharp" nature of OLEDs, it certainly magnified the effect.


Yeah, and I'm not terribly interested in getting into the details of how everything renders... I just want a display that works and doesn't make my eyes feel funny.

The PA27JCV (which I don't expect to have back from warranty repair for 3+ weeks) looked fine, and I'm now at day 5 of using the U3223QE and it's fine. So this is my solution to the problem I guess.


Unfortunately, that might be your only solution.

From what I can tell from photos of your new monitor's pixels, it has a polarizer of similar character to my Dell U2414Hs', just much newer. It aims for natural reproduction, which means pixels aren't sharply defined from their neighbors, and subpixels blend together.

I prefer monitors like these, so I can't really argue with your choice. Sadly, because a lot of younger kids were raised on phones (which have exceptionally sharp screens), modern high-end screens keep being pushed towards being sharper and clearer, to a fault.

Apple refuses to adjust rendering, since Apple's own taste in screens prefers natural over sharp. Even their OLEDs clearly have a film on them to hide the subpixel misalignment; the side effect of this is their brightness and contrast is lower, but eye fatigue is also lower.

Unfortunately, this is why I won't take OSX seriously: I bought a MBPr many years ago and really tried to like OSX, tried to understand why people like it, but it ultimately is death by a thousand cuts, and entirely Apple's fault.

That MBPr ran OSX for a year, then Windows 10 for a bit, and then Linux until it died. The text rendering was only fatigue-free on Windows and Linux; OSX had always been too fuzzy, especially with dark themes.

If you're not willing to break up with Apple, yeah, you're stuck just buying Apple-friendly monitors. A lot of OLEDs are just too clear and sharp and I don't disagree with you on sending it back.


(Author here.)

I'd say that because the article documents my experience at this point in time, the only poor timing is when my old-ish monitor died and I went looking for a replacement. And this article documents my experience with that.


(Author here.)

This is akin to how I've (technically?) stepped back from a 5K 27" to a 4K 32". Likely due to scaling and how far I sit from the screen (about 24" -- average I think) things look the same? At least, I don't notice that the 4K is any worse.

Me being me, I can't help but think I should have a 5K or 6K or whatever, but the price is... high. So I figured I'd try a 4K 32" since the OLED was cheap and the result was this post because the subpixel pattern messed with me. But now for the replacement I'm looking at a simple (but nice color / high end) 4K 32" IPS LCD.

And having been using one for the last day, I'm pretty content with it. It's like everything I wanted from the OLED without the eye strain.


If you're keen on a weekend project, consider converting a 5k iMac to work as an external display. Glossy display, and bang for buck!

I actually had a 5K iMac that I sold when I got the Mac mini. As I was deciding on the display I looked at doing that, but I wasn't super keen on the unfinished look. And IIRC it was going to cost about $250 in parts at the time. I was able to get the ASUS for about $700 and sell the iMac for ~$300. So it was really only about $150 more to not DIY it and have a more finished final package.

It is a really neat looking project, I just determined it wasn't for me.


It has gotten a bit cheaper, a lot easier and somewhat better lately. But I agree that it does not make that much sense because you end up with a product that has many flaws and is a bit annoying to use.

The main factor is being able to sell the iMac for that relatively high price. I can't figure out why they are still so expensive, because most of the early 5K models are kind of useless nowadays (on the low-end versions, the compute just cannot cope with modern media/files at such a resolution). But maybe it's people converting them to displays driving the market...


The person who bought mine was a family friend who wanted a large display for her kid to do 3D printing stuff. Since he was just going to be running a slicer and some basic modeling stuff, it seemed perfect. I got a bit of cash, he got a computer with a good display, and it was a general win all around.

Ah yes for those use cases it makes perfect sense.

Apple's excuse for stopping the big iMacs was that when you bundle the display with the compute, it becomes hard to upgrade and the whole thing becomes useless. But in reality it just looks like bad "reasoning" to force people to spend more on a less elegant solution that probably won't get upgraded that much anyway.

At least an old-school 5K iMac can have some secondary use case, like you demonstrated, or even just watching movies, doing some light document editing, and such. And you can convert them to displays if you really want, but that should have been built-in functionality in the first place.

I guess this is why they still command a relatively high price; a good large display still has many uses even if the compute is weak.

The Studio Display is kinda useless outside of Mac use, so even though it's great quality, it's really not a good deal.

I really hope Apple finally releases another big iMac, because I won't get another Mac Mini or a Mac Studio. I like macOS (less so nowadays), but the whole point of the Mac was integrated hardware for base/mid-range power. Their small desktop boxes that cannot take any upgrades are really pointless as a desktop, because you end up with cable galore and not much space saved. This is so stupidly inelegant, I can feel Jobs rolling in his grave. They added front-accessible ports, so that's something I guess, but come on, they stink of greed and profit maximisation at all costs...


(Author here.)

I do have astigmatism. You do make me wonder if this plays a part as well...


In my experience, it seems to. My astigmatism (or other eye stuff) seems to shift different colours by different amounts, leading to wider RGB pixels and making things like ClearType so much worse. So people were enjoying ClearType while I was hating the obvious colour changes and fringes that somehow they weren't seeing. I assume some people are lucky enough to have aberrations that actually make ClearType more pleasant.

I do too. Combined with progressive lenses, I have significant chromatic aberration issues. Blue and red pixels require different focus, which is sometimes an issue when solid blues and reds are on screen in close proximity. I turn off pure blue colors in my terminal emulator, for example.

That sounds familiar. I also have ever so slight green-brown color blindness. It's only really noticeable in low light (like in the woods in evenings), but that could well all stack up to be a problem.

I also have significant problems with blue LEDs around the house, to the point where I've removed, replaced, or covered almost all of them. They really, really bother me because it feels like my eyes never focus on them and they leave me feeling slightly disoriented.


(Author here.)

There's no ClearType in use here; as the first paragraph says, it's macOS.

And I'm using the default font sizes because they work well for me on an LCD. The point of this post is to document my experience with trying a current-gen generally-available OLED and how it did not work out well because of the subpixel arrangement.

It's also not just an issue on text, it affects any high contrast edges, especially perfectly vertical or horizontal ones. This meant that CAD stuff, spreadsheets (the grid), and large colored sections in graphic design software looked off as well.


(Author here.)

The other issue is that it's not just a text problem. It affects any high color contrast edges, especially directly vertical or horizontal ones. So subpixel rendering tweaks for text rendering (eg: Cleartype) don't solve the whole problem.


(Author here.)

That's an interesting point you mention about not seeing it, because prior to buying an OLED I'd read a bunch about fringing and in many articles I just... couldn't see it. I couldn't tell what was being illustrated in the images.

It wasn't until I sat in front of one for a few hours, in my room and lighting and with my apps and had funny-feeling eyes and a this-seems-off feeling that I decided to investigate. And yes, those macro photos show fringing, but it /is/ hard to understand how the subpixel pattern translates to on-screen weirdness until you've seen it for yourself.


Yep -- you've got it exactly. macOS and no Cleartype.

Had I been using this on Windows I would have started to solve it by trying to tune that.


(Author here.)

I actually am using it, but I didn't want to go down the rabbithole of an all-encompassing article on displays, PPI, scaling, etc. Using it to scale the display really helps, and for the size of things I like, I use 3008x1692 (on a native 3840x2160 panel), which looks fine on an LCD. It's better than native res on the OLED, but still not great; it still bugged my eyes.
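For context, a "looks like 3008x1692" scaled mode means macOS renders the desktop at 2x that logical size and then resamples to the panel's native resolution. The arithmetic below is a back-of-the-envelope sketch of that, not Apple API usage:

```python
logical = (3008, 1692)                       # chosen "looks like" size
backing = (logical[0] * 2, logical[1] * 2)   # 2x framebuffer macOS renders to
panel = (3840, 2160)                         # native 4K panel

print(backing)                          # (6016, 3384)
print(round(panel[0] / backing[0], 3))  # 0.638: every frame is resampled,
                                        # which slightly softens fine lines
```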

I just went with native res for demoing things because it's the worst case for the fringing problem, which affects all strong-contrast edges, not just text. It was also really noticeable on thin/narrow lines, such as when doing CAD or between cells in spreadsheets.


Those do show the best example, yes, and are the easiest to photograph. But it's also noticeable on any high-contrast edge or fine line, like a drawing in Autodesk Fusion (CAD software) or just the lines between spreadsheet cells.

And no, no Cleartype here because (as mentioned in the first paragraph) it's a Mac running macOS.

