"Switching to the new mode might take some getting used to though, so if you think your fonts are suddenly fat, fuzzy or weird, give your brain some time to adjust."
This is true, and IMO it's the reason font rendering is much less important than people think. When I got a new pair of glasses I disabled font anti-aliasing to make the edges as sharp as possible. I did this so I could more quickly learn to focus my eyes correctly with the new glasses. By the time I had mastered the glasses I was used to the font rendering, and I have not switched it back. I now prefer fonts with no anti-aliasing. If you'd told me in the past that I'd end up preferring that, I might not have believed it.
The most legible font is the one you're used to. People can read Blackletter or Spencerian like it's a normal font if they're used to it. I think vector fonts were a mistake. It's a lot of software complexity for very little value.
To be fair, the main reason for algorithmic fonts is so that non-Latin characters scale up nicely. Storing Chinese characters at a high bitmap resolution gets expensive fast, and you'd need several bitmaps for each character to keep them legible at smaller sizes.
Trying to keep Chinese characters legible at "smaller sizes" is a hopeless cause; they contain much more information than Latin letters do and consequently need more space. Check out the rendering of the very common character 疆, which HN would like to slot in at basically the same size as the rest of the text. As it renders on my computer, it's legible -- but not correct. (Most egregiously, the horizontal stroke at the bottom of 土 (bottom-left quadrant) is missing.) I've seen denser characters show up as blocks of pure black.
You might want to give a hidpi display a try (where vector fonts are pretty much essential). All standard ~96 DPI displays look blurry to me after using one.
Though beware of Windows' ClearType, which seems to do a bad job on hidpi: it renders very thin glyphs, and the glyph width jumps discontinuously from 1px to 2px when you change the font size. It tries way too hard to snap to pixels. OS X and FreeType (tested: 2.6) do a much better job on hidpi.
The problem with bitmap fonts is that they assume both a pixel density (usually about 96ppi) and a font size. With Unicode and smartphones (with up to 6" screens at 4K resolution), it would be impossible to store crisp glyphs for even a fraction of the Basic Multilingual Plane; phone makers still like producing 16GB SKUs! Sure, this is one extreme, but the benefits of vector fonts can be felt at any ppi with adjustable font sizes.
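For a rough sense of the scale involved, here is a back-of-envelope sketch; every number in it (glyph count, pixel sizes, 8-bit coverage) is an illustrative assumption, not a figure from the comment above:

    # Rough estimate: uncompressed antialiased bitmaps for every BMP code point
    # at a handful of pixel sizes, for a single face and weight.
    bmp_glyphs = 65536                      # upper bound on BMP code points
    sizes_px = [16, 24, 32, 48, 64, 96]     # assumed per-face pixel sizes
    bits_per_pixel = 8                      # 8-bit coverage, no subpixel data

    total_bytes = sum(bmp_glyphs * s * s * bits_per_pixel for s in sizes_px) // 8
    print("one face, one weight: %.1f GiB uncompressed" % (total_bytes / 2**30))
    # ~1.1 GiB before compression; extra weights, styles and densities multiply it.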
I'm not saying bitmap fonts don't have their place, but that place is typically old computers or small embedded devices where the complexity of rendering vector fonts is a real, unnecessary performance bottleneck. Vector fonts are easy to use and quick to render, and they produce crisp glyphs at virtually any resolution and pixel density (besides extremely low ones).
I have a 14" 1080p laptop, and so that I'm not constantly squinting I use 125% Windows scaling and usually about 125% scaling in Google Chrome, although it varies from website to website. With vector fonts this just works; using bitmap fonts would be a lot of ~~software~~ complexity for very little value.
> The most legible font is the one you're used to.
Yes, and in my case on Linux I switched off hinting a long time ago (but retained anti-aliasing).
We even did it on our network at work by default, as users liked the fonts when they were more "Mac-like".
Personally I have never understood the whole font hinting thing. It seems rather unscientific and the result is that fonts look fundamentally different at slight size differences; seems hinting is part of the problem, not the solution.
> Personally I have never understood the whole font hinting thing. It seems rather unscientific and the result is that fonts look fundamentally different at slight size differences; seems hinting is part of the problem, not the solution.
Hinting exists because displays don't have infinite resolution. Mac-style rendering has always acted like displays have paper-like resolutions, and respects a font's letter shapes more, favoring the original designer's intended font appearance on paper. Windows-style rendering has always acknowledged that displays have pixels, and that smearing fonts across pixels can decrease readability on average-DPI displays. Linux font rendering has historically offered a choice between the two, but typically defaulted to Windows-style rendering.
Yes, the most legible font is the one you're used to, but do we want to condition the next generation into thinking low dpi bitmapped fonts are the best? The thing is, we need to set the software defaults to be most pleasing to a population that has never had any prior experience. And IMHO, the print industry figured that out centuries ago.
> but do we want to condition the next generation into thinking low dpi bitmapped fonts are the best?
If it means the software for displaying them can be simpler, maybe yes? Personally I think the X fixed fonts (with a slashed zero) are pretty good... I can read text in that font all day in a terminal and not get tired.
Trying to read ClearType for any length of time beyond casual glances actually makes me dizzy and my eyes start watering because of the lack of sharp edges to focus on. Non-subpixel-AA still looks slightly blurry, but I don't get the same "WTF is happening to my eyes" feeling. I prefer non-AA as well, and in the example screenshots posted in the other comment here, it's the font to the left, describing the current settings, that I find most readable.
I wonder if part of the reason for those studies indicating that ClearType "improves reading speed" is because of that - slowing down to focus on the letters feels unpleasant, so people try to scan through the text as quickly as possible.
Just look at any recent Android phone. Font rendering is basically solved; all it took was making displays dense enough. What you are talking about is legacy software that doesn't scale properly.
If you have a sufficiently high DPI screen this is definitely the best option.
Font hinting and anti-aliasing are only useful because people have giant ~70-100 DPI screens, which look awful without them. On mobile, where you have 300-400 DPI, it's no issue. Even with 4K desktop monitors at ~200 DPI it seems much less necessary.
I've had the same results as you, though without the glasses. I really don't understand why people prefer such blurry fonts. The biggest practical problem I've had with turning antialiasing off though is badly hinted web fonts. Sometimes bars or stems just outright disappear.
> The most legible font is the one you're used to.
Maybe for you, but don't assume that is true for everybody. It took many months of trying various font rendering configurations (Infinality in ClearType-ish and OS X-ish modes as well as custom modes) before I settled on my current[1] configuration. I'm very familiar with having to spend some time adjusting to different modes, but in my opinion the differences in rendering greatly overshadowed the differences that come from simply seeing an unusual/new setup.
Perception of text on a screen is affected - at a minimum - by the rendering technique (AA, subpixel AA, hinting), the contrast of the font, the contrast of the display, the eyesight of the viewer, and the ambient lighting in the room. Maybe in your situation the subtleties of font rendering are not that important. On my monitor, I find [1] to be a lot easier to read than any of the examples (both before and after) that were linked in the announcement.
> very little value
I'm sure most people are fine with sensible defaults, but please, try to remember that "value" is often subjective. People even disagree about which criteria represent high value.
(No, you don't want to see my messy custom fontconfig configuration that overrides a LOT of font choices and forces them into the handful of fonts that I find easier to read. Yes, that took another few months of experimentation.)
edit:
> vector fonts were a mistake
That depends a lot on what you are doing with them. While I was talking about vector fonts, the amazing font
is still my favorite. Fortunately, it's easy to use both old-style bitmap fonts and modern fontconfig/freetype rendering in X. Each type of font can be used where it is appropriate. Honorable mention: "Dina".
If you doubt the benefit of a high-quality rendering of an (appropriately hinted) vector font, try loading [2] and [3] (source: a very enlightening article[4]) in separate tabs and switching between them (so the images are in the same position). Image [2] uses modern techniques to render each line 0.1px further right, accumulating 3px over 30 lines. The other image [3] rounds the x offset to the pixel grid. Modern rasterizing techniques can align font rendering to 1/256th of a pixel!
Oh no, why would they do this? You could already achieve something similar (and superior IMO) with the old FreeType by enabling "loose hinting" (ignoring hinting in the x-direction) and using RGBA subpixel rendering. Ignoring horizontal hinting only really works when you have subpixel rendering, because of the ~tripled horizontal resolution.
I don't understand why anybody (MS, the FreeType devs, ...) would choose DirectWrite-style rendering. It just looks blurry to me. Even on a hidpi display, where it shouldn't make a difference, it looks weird compared to GDI rendering. It's one major reason I don't like Edge, and also find Metro and XAML apps somehow yucky (just a gut reaction, I'm not judging them technically).
What? As someone who hates Windows' font rendering, why would you want this?
The OS X font rendering is awesome, on Linux, Infinality can be tweaked to look even better, but windows? It's hideous, inconsistent and terrible to change...
I don't want to defend Windows' font rendering, because I hate it too, but (as far as I understand) the core idea behind these changes is good. AGG's article on text rasterization[1] also recommends that renderers ignore horizontal hinting and it has some demo images that look quite good. Also, a lot of fonts are designed for Windows' style of hinting.
I agree with that assessment (although I think OS X font rendering has problems as well). I'm a bit worried because Infinality is now removed from the code base by default, so I suspect I'm going to have to stick to custom builds to get the font rendering that I prefer.
I suffer from ocular migraines and as hard as it is to believe (even for me), the way fonts are rendered makes a difference. I will literally go blind looking at badly rendered fonts. I'm going to take a wait and see attitude, though. Not all fonts will be affected, so perhaps it will be OK.
Bad font rendering gives me an instant headache as well, especially blurry rendering. From the zoomed-in sample pictures, it looks like this new renderer does less snapping of character stems to pixels, which sounds like the problem that has caused blurry rendering in the past. (I'm also concerned about the extensive mentions of specific hinted fonts; hopefully this doesn't break rendering with fonts like DejaVu, for instance.) However, I'm going to wait and see what this new version looks like, hopefully when it shows up in Debian experimental.
Yeah. I have the most bizarre setup. I have a 13.3" display (with about 165 DPI), so not quite retina, but still relatively high resolution. Anything bigger than this seems to trigger my migraines. I'm really waiting for the day when I can get 30 fps on an e-ink display; something about staring into the light seems to be a problem. But low contrast is also a huge problem, so I usually have my backlight set at around 10% (!!) and my colour themes are set so that you would think I subscribe to the clown school of programming. Everything is done in a 16-colour console with a 25 point font. You have no idea how many hours I've spent comparing the contrast ratios of various 16-colour schemes. All I can say is that it's a good thing I come from a time when 80x25 column terminals were state of the art ;-)
I'm considering moving to very large wall mounted TV and sitting far enough away so that the display angle of the screen is about the same as my 13.3" laptop. Unfortunately, I don't have the ability to wall mount a TV in my current apartment, so it may have to wait until I buy a house (I bet the e-ink display comes first ;-) )
This preference is subjective. There are valid reasons to prefer OS X rendering (scaling is proportional), but there are also valid reasons to prefer Windows rendering (stems are less blurry).
Totally agree. I have to work on Windows, and if I could get it to render fonts the same as FreeType (freetype-freeworld-2.4.1 on Fedora 23, light hinting, subpixel smoothing), I would be a happy camper.
Meanwhile Microsoft has basically given up on sub-pixel smoothing: it's missing from Windows Store/Modern/Metro apps (including Edge) and from IE10 and newer, and it's mostly absent from Office 2013 and newer when running on Windows 8 and above†. More's the pity - not all of us have high-DPI screens yet.
I believe part of the reason is that it was never fully compatible with screen rotation, often still assuming RGB order from text-left to text-right. Another part would be that the world is moving towards much higher resolution displays, where the effect of subpixel rendering is negligible.
FreeType < 2.6.2, hint=slight: use the auto-hinter for vertical-only hinting.
FreeType 2.6.2+, hint=slight: use native "vertical-grid-only" snapping for OpenType/CFF, falling back to the auto-hinter's vertical-only mode.
FreeType 2.7.0, v40 default: support TrueType too for native "vertical-grid-only" snapping, and enable this by default (regardless of whether hinting is set to slight?).
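For reference, a minimal fontconfig sketch that requests the hint=slight behaviour summarized above; this is stock fontconfig syntax, but which of the code paths listed it actually ends up on depends on your FreeType version. A user-level file such as ~/.config/fontconfig/fonts.conf would typically contain:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- ask for "slight" hinting: vertical grid fitting only -->
      <match target="font">
        <edit name="hinting" mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
      </match>
    </fontconfig>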
Finally. To be honest, I switched back to using Windows on my private machine because font rendering on Linux (using Chromium and Firefox) was more of a lottery than something readable without a good amount of alcohol.
A pity it will take some time to ship to Debian Testing; I'd love to try it without having to recompile the whole system.
What Linux distribution did you use? In my experience the font rendering in the common distributions, like Ubuntu, is rather good these days. Personally I use Debian stable with a simple window manager (Blackbox) and no X display manager. In this case I need to set a few variables in ~/.Xresources to get good results:
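The exact values aren't shown here, but a typical set of Xft resources looks something like this (illustrative values, not the commenter's actual configuration; adjust to taste):

    ! ~/.Xresources -- sample Xft settings for FreeType-based rendering
    Xft.antialias: true
    Xft.hinting:   true
    Xft.hintstyle: hintslight
    Xft.rgba:      rgb
    Xft.lcdfilter: lcddefault
    Xft.dpi:       96

Reload them with "xrdb -merge ~/.Xresources" and restart the application.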
This is fantastic news. FreeType rendering was already pretty good for common FOSS fonts in the recent versions unless the distribution crippled it, but a pet peeve of mine was that Microsoft fonts have never seemed quite right (as demonstrated by the OP in https://s31.postimg.org/t5trkfd2x/freetype_264_consolas_v35_...).
KUDOS to all contributors, especially for the last few releases of FreeType, which really rocked!
Autohinting is hard, especially for ideographs.
My hinter even used differential evolution to calculate how strokes should be placed. (link: github.com/be5invis/sfdhanautohint)
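For readers who haven't met the term, the core differential-evolution loop is small. This is the generic DE/rand/1/bin scheme as a sketch, not code from sfdhanautohint, and the toy cost function at the end is only a stand-in for whatever score a hinter might assign to a stroke placement:

    import random

    def differential_evolution(cost, dim, pop_size=20, F=0.8, CR=0.9, iters=200):
        """Minimize cost() over dim-dimensional real vectors (initialized in [0, 1])."""
        pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(iters):
            for i in range(pop_size):
                a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                jrand = random.randrange(dim)  # force at least one mutated coordinate
                trial = [a[k] + F * (b[k] - c[k])
                         if (random.random() < CR or k == jrand) else pop[i][k]
                         for k in range(dim)]
                if cost(trial) <= cost(pop[i]):  # greedy selection
                    pop[i] = trial
        return min(pop, key=cost)

    # Toy usage: the best "placement" here is simply the vector of all 0.5s.
    best = differential_evolution(lambda v: sum((x - 0.5) ** 2 for x in v), dim=4)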
A randomly selected unix system has a 99.999% chance of being a Linux system. Nobody cares about the other variants except for their own developers and a few historians and hobbyists.
For that matter, I have no idea why you think FreeType is even Unix software. It builds and runs on numerous non-Unix platforms.
It does? I remember when I was using SUA I had to build without FreeType (its build process seemed to rely heavily on building programs and then running them to produce things to build, in some way that didn't work).
In the announcement, they explicitly state that the performance changes will make Arial and other very common fonts render less well. Is this a sound decision in an age where these fonts are still very widely and explicitly used?
Can somebody explain this to me? At low resolutions you can easily, with current memory sizes and bandwidths, store all the fonts in pixel format (compressed, of course). At high resolutions you don't need any hinting, so you can just render the font as SVG paths, for example.
If you use a particular version of each letter at each size, you start to accumulate positioning error as you go along a line (or end up with odd gaps if you try to avoid it). Nowadays, we tend to aim for precise reproduction of the font's sizing, even where the size ends up not being a round number of pixels, for consistency. You'd then need at least 16 versions of each letter to approximately cover each subpixel position, and it starts not really being a win.
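A quick numeric illustration of that accumulation; the advance width and glyph count below are made up, and this isn't how any particular rasterizer is implemented:

    # Policy A: classic bitmap approach -- round each glyph's advance to whole pixels.
    ADVANCE = 7.3   # hypothetical advance width in pixels
    N = 40          # glyphs on the line

    exact, pen = 0.0, 0.0
    for _ in range(N):
        exact += ADVANCE
        pen += round(ADVANCE)          # every glyph contributes ~0.3 px of drift
    print("whole-pixel advances: %.1f px of drift after %d glyphs" % (exact - pen, N))

    # Policy B: keep the exact pen position and draw each glyph at the nearest of
    # 16 subpixel positions (i.e. 16 pre-rendered variants per glyph).
    exact, worst = 0.0, 0.0
    for _ in range(N):
        exact += ADVANCE
        drawn_at = round(exact * 16) / 16
        worst = max(worst, abs(exact - drawn_at))
    print("1/16 px positions: worst per-glyph error %.4f px, no cumulative drift" % worst)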
No. No, stop. I install MacType, which hacks Windows' font rendering and replaces it with FreeType, SPECIFICALLY so I can /avoid/ ClearType's ugly rendering on Windows. I don't want it to invade my Linux/Mac as well!
Nikolaus Waxweiler writes in the announcement that he was led "to strip the v38 Infinality code to the bare minimum and remove all configurability in the name of speed and simplicity".
There were 2 interpreters before. Now there are 3. You're misunderstanding the announcement. Regardless, the Infinality interpreter is the closest to Windows rendering so it isn't what the OP wants. In all likelihood they want hinting disabled if they want it to be like OS X... so this stuff isn't relevant at all.