iPhone 6 Screens Demystified (paintcodeapp.com)
253 points by melancton on Sept 11, 2014 | 85 comments


This fuss about the iPhone screens is, to me at least, hilarious.

A good part of my career has been spent as the "Android guy" in organisations where, for better or worse, the products would de facto lead on iOS. The number one headache was designers pushing "pixel perfect design", which is doable on Android but a pointless headache when your tiny screen is 720p or higher.

The fact that Apple have a phone coming out for which it is actually impossible is going to send a lot of these people into a confused fit. If this were how Android worked it would be criticised to the hilt, but because it's Apple it won't be.


Since the points are rendered in pixel-perfect form and then downsampled on the iPhone 6+, the only potential loss of fidelity is in the downsampling step. Since we're in an age of pixels too small to see individually, there should be no visible fidelity loss. The image will be rendered at a higher resolution than the display, and thus every pixel on the final display will be "perfect". Back when pixels were big enough to see, I was one of those "pixel perfect" people too. You had to get it right, or the error was noticeable. With retina displays, this is much less of an issue. (to me at least)
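
To make the numbers concrete, here's a rough Swift sketch of the arithmetic (figures from the article; the code itself is just illustrative):

    // iPhone 6+ pipeline per the article:
    // UIKit points -> rendered at 3x -> downsampled to the physical panel.
    let logicalWidth = 414.0, logicalHeight = 736.0     // points
    let renderScale = 3.0                               // @3x raster
    let renderedWidth = logicalWidth * renderScale      // 1242
    let renderedHeight = logicalHeight * renderScale    // 2208
    let panelWidth = 1080.0                             // physical pixels
    let downsampleFactor = panelWidth / renderedWidth   // ~0.8696
    let effectiveScale = renderScale * downsampleFactor // ~2.608 px per point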

I have seen Android devices advertised with 4K displays at a 4-5" diagonal form factor. That doesn't make sense to me, because beyond 300 DPI there isn't much point in higher pixel density. Yes, it makes sense for Apple to choose this size of display (as 1080p is a commodity size).

The real issue is, if you want to compare the platforms, which platform has better resolution independence support.

I won't criticize Android on this, but I will say that the iOS reliance on PNGs has gone on way too long. 2x was a decent solution for photographs... but we should have moved to SVG or some vector format as a native format in iOS several years ago.


"there should be no fidelity loss"

Heh. Yes there will be, most notably on webpages, for example those with box borders defined as "1px solid" lines. The lines will look fuzzy and unequal. Check the last image at http://www.paintcodeapp.com/content/news/2014-09-11_iphone-6... : it shows a "1 CSS pixel" line rendered as 2 black lines + 2 grey lines (one of the greys is very, very light). Some lines will not be interpolated the same way: some will be 2 black lines + 1 grey line, some will be 1 black line + 2 grey lines, and the shade of the 2 greys will be unequal and will vary depending on their precise alignment on the 1080p physical grid. The end result is that some lines will look slightly thinner/lighter or slightly thicker/darker than others. Heck, on my Nexus 5 which has an even higher PPI (445 PPI) than the iPhone 6+ (401 PPI) I can still clearly notice this effect, when holding the phone at arm's length (and I don't even have 20/20 vision), when simulating the downsampling that the 6+ is going to have to do. I can do it by zooming out[1] to simulate an effective pixel ratio of ~2.61, which is equivalent to what the iPhone 6+ is really doing graphically (1080 physical pixels wide / 414 logical pixels wide = ~2.61).

Contrast this with most Android devices, where it is possible to get pixel-perfect rendering because most (but not all) of them use a non-fractional pixel ratio (1.0, 2.0, 3.0, or 4.0), and to my knowledge none do downsampling (though with thousands of devices on the market, I am sure 1 or 2 obscure models do it).

[1] If you have a 1080p Android device with a pixel ratio of 3.0 (eg. Nexus 5, LG G2, Galaxy S4, HTC One M7, etc), you can simulate the downsampling of the iPhone 6+ by zooming out in your browser by a factor of 360/414 = 0.8696

Edit: I did NOT downvote you. I agree that for many apps it won't be noticeable, but it will be in some apps. Many users spend a lot of time in the browser, and many sites use 1px lines, which will be a distracting artifact (unless Apple implemented a special hack to interpolate 1px lines in the browser - maybe Safari bypasses downsampling and forcibly rounds them up to 3 physical pixels, but then what about other browsers?)


Why wouldn't a 1px CSS line, rendered with 3 pixels on the iPhone 6+ and then downsampled to HD (per the article), still look OK?


FWIW, here's how a 1 pixel line looks on my Nexus 5 (4.95" 1080p display): http://i.imgur.com/RmRHpit.png

Here's the web page with the 1 pixel line: http://jsfiddle.net/k0xep8pL/embedded/result/

On my PC, the line is just a single pixel. On my Nexus 5, as you can see, it's actually three lines, which are (from top to bottom): grey (RGB=134), black (RGB=0), grey (RGB=156).

The impression I get, when looking at it on my phone, is a solid black line. Bear in mind that the width of the display is 61 millimeters, which means these three pixels take up only 0.17 millimeters on the screen.
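
If you want to see where the grey fringes upthread come from, here's a hypothetical Swift simulation of what the 6+ is described as doing: a 1-point hairline rendered as 3 black rows in the 2208-row @3x buffer, then resampled to 1920 physical rows. The box filter is my assumption; whatever filter Apple actually uses is unknown.

    // 1-D toy model: 3 black source rows (a 1-point hairline) in a white
    // 2208-row buffer, area-averaged down to 1920 physical rows.
    let srcRows = 2208, dstRows = 1920
    let lineStart = 900                  // arbitrary position of the line
    func srcValue(_ row: Int) -> Double {
        (row >= lineStart && row < lineStart + 3) ? 0.0 : 255.0
    }
    let ratio = Double(srcRows) / Double(dstRows)  // 1.15 src rows per dst row
    for dst in 0..<dstRows {
        let start = Double(dst) * ratio
        let end = start + ratio
        var sum = 0.0
        var s = start
        while s < end {                  // integrate coverage over source rows
            let next = min(Double(Int(s) + 1), end)
            sum += srcValue(Int(s)) * (next - s)
            s = next
        }
        let value = sum / ratio
        if value < 254.5 { print("physical row \(dst): grey \(Int(value))") }
    }

With the line at this position it prints two near-black rows plus grey fringes around 155 and 200, matching the kind of pattern described above; shift lineStart and the shades change.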


I think the reason raster icons have stuck around so long is that the proportions of an icon need to change as the scale gets really small to keep it recognizable (I think formats like SVG have support for this now). Any vector icon also needs to be rasterized before drawing to the screen, which can take up memory and make things less speedy.

Vector icons were the new hotness well before the original iPhone was released. Here's an article from 2004 about OS X 10.4 Tiger, 3 years before the first iPhone[1].

[1] http://forums.appleinsider.com/t/45544/mac-os-x-tiger-to-sup...


> Any vector icons also need to be rasterized before drawing to the screen which can take up memory and make things less speedy.

Well...any raster format needs to be read from disk, decoded and drawn, though there are special optimizations for the drawing step.


...any vector format needs to be read from disk, decoded and "rendered" (bezier, gradients, transparency, etc calculated), too.


You can use PDF assets in iOS 8 instead of bitmaps but the tools haven't completely caught up with this yet.


> Since the points are rendered in pixel-perfect form and then down sampled on the iPhone 6+, the only potential loss of fidelity is in the downsampling step.

Are we sure that this is how it's implemented? They could just as well have the frame buffer be 1080p and adjust the device matrix of the rendering engine(s).

> but we should have moved to SVG or some vector format as a native format in iOS several years ago.

Like PDF?


I wonder if Apple's iOS 7/Yosemite icon designs are a shift in this direction to vectors, e.g. the icons look like they were authored as vectors. But I wonder if the platform APIs are not yet ready to handle vectors everywhere, seamlessly across pixel/vector formats.


Almost all of these are done as vectors in Illustrator and pre-rasterized into pixels (with touch-ups in PS if necessary).


Absolutely. No designer worth their salt is starting with PS for these app icons.


OTOH, iOS 7 did a fine job of almost entirely removing bitmaps from iOS in advance of these new screens. So developers are in much better shape now than if they had to redraw all their skeuomorphic assets at 3x.


Since there was never an iOS update that did NOT break my app in some way or another, I don't expect resolution hell to work out of the box. Android had to deal with this for years and matured. On iOS, for the last release of my iPhone app I was asked to upload a gazillion different pixel-perfect icon sizes.

I have a small online business. I have neither the time nor the patience to do this shit myself, but it'd probably take one person, dedicated to the app alone, to maintain it properly through all these nasty upgrades. Oh, and this person now needs at least 4 different iPhones to test that stuff properly.

Good for you that you were the Android guy. I'm the guy required to have an Android guy and an iOS guy. Which sucks ;)


Just out of curiosity... you need four different iPhones to test?

How many different OS and model variants do you need to feel safe on Android? Or does your app not do anything complex with the graphics?


Well, the iOS simulator so far hasn't really been that good. There were some minor parts that didn't work in the simulator but worked on the iPhone. I used to test with two different iPhones: the 4 and the 5.

My Android app is a lot less complex. The visually complex part (a custom chart that cannot be done with a default charting library) wasn't implemented in the Android app.

We're working on a new version. I plan to test it on probably 3-4 devices as well, but with the assumption that Android is currently far better than iOS at handling different screen sizes (because of maturity).

So yes, in the end, I'll have to test on 4 different devices for both platforms. With Android, I can mostly borrow devices from friends. Not so with iPhones. I might skip the large iPhone 6 though. It's just too ugly to care about ;)


Android developers have to take whatever comfort they can get; it certainly won't be the money.


I wonder if it will be possible to opt in to 1:1 pixel rendering on the iPhone 6+ after all, particularly for OpenGL/Metal/video playback. It seems the downscale hack is mostly required for UIKit backwards compatibility and to maintain approximately equal physical sizes for UI elements.

Very nice visualization in the OP, by the way!


The downsampling is representative of how the OS renders your app.

Especially in Metal, but probably also in OpenGL, I'm sure you get access to a buffer at the actual display resolution to render into.


Have to admit I'd be surprised if a full-screen OpenGL or Metal mode weren't available; for gaming, the overhead of writing to a much larger offscreen buffer and then to the screen is not going to be a good idea. Quite often the reverse is done: scaling up a smaller buffer.
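
FWIW, iOS 8 exposes the panel's true geometry through UIScreen's nativeBounds/nativeScale (on the 6+ the nativeScale works out to ~2.608 versus the logical scale of 3.0). Here's a sketch of sizing a Metal layer to render 1:1 at 1080x1920; that the compositor then skips the downsample entirely is my assumption:

    import UIKit
    import Metal

    // Size a CAMetalLayer's drawable to the physical 1080x1920 panel.
    let screen = UIScreen.main
    let metalLayer = CAMetalLayer()
    metalLayer.device = MTLCreateSystemDefaultDevice()
    metalLayer.frame = screen.bounds                 // in points (414x736)
    metalLayer.contentsScale = screen.nativeScale    // ~2.608, not 3.0
    metalLayer.drawableSize = CGSize(
        width: screen.bounds.width * screen.nativeScale,    // ~1080
        height: screen.bounds.height * screen.nativeScale)  // ~1920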


This isn't even always possible with Apple's software on a retina MacBook with OS X, and vexes me to this day. QuickTime won't play video back at 1:1 without setting the entire display to 1:1.


At least there is a way to set the mode to 1:1, though. There are several tools that can switch the retina MacBook into 2880x1800.


There must be a way. Otherwise the samples they showed in the keynote (the landscape Mail split view, or the "enhanced" CNN sample app) would never work: in the default mode outlined in this infographic, the screen can't possibly offer any more real estate for UI than the smaller screens, yet the keynote samples clearly showed more UI content.


The samples with the wide split screen are rendered at 1242x2208 (3x 414x736, which is larger than any other iPhone, all of which are either 320x568 or 375x667) and then downscaled. My question was really about whether it is possible for software to render framebuffers at 1080x1920 (3x 360x640), perhaps bypassing UIKit completely.


Hunh? Developers work in terms of the "points" described in the first image.

The 6+ has an extra 69 points in height and 39 in width over the 6; about 10% more real estate in each direction. Sure, it's not that much, but it's not nothing. And it's an even bigger jump over the screens of the iPhone 5 and 4.


This is looking really nice. The "No Image" images on http://www.paintcodeapp.com/content/news/2014-09-11_iphone-6... look unintentional, though.

Edit: The 1x version looks fine: http://www.paintcodeapp.com/content/news/2014-09-11_iphone-6... So this post only looks broken on retina screens, which is a bit ironic :-)


An infographic that conveys complicated information in an easy-to-understand form. More of this, please!


I know it's dangerous to armchair-speculate about Apple's moves, but it really seems like they should have held out for a true 3x screen instead of compromising on 1080p. I certainly would never buy the 6+ after reading about their cheesy resolution hack. It feels rushed and short-term.


Keep in mind that it's resampling a 3x image. Display scaling already looks acceptable on retina MacBook Pros [1], and the 6+'s screen has a much higher pixel density than those. I agree that this sounds like a terrible idea on paper, but it might not be so bad in practice.

1: http://www.anandtech.com/show/5996/how-the-retina-display-ma...


I think it's actually pretty clever. Developers clearly have trouble writing apps to support arbitrary scale factors. Resolution independence was supposed to be a feature of Windows since Vista was codenamed "Longhorn" (before 2004), and Windows is still a mess on higher-DPI screens 10 years later. Making it so developers only have to think about 1x and 2x and 3x makes the task easier.

Also, look at it this way: with the diminishing returns of resolutions above 300 dpi, is it really a big deal that you won't get the absolute best use of the iPhone 6+'s 400 dpi? Seems like a pretty reasonable trade-off to use that extra resolution in a way that makes resolution-independence easier for developers.
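
And in practice 1x/2x/3x is just a naming convention on image assets that UIKit resolves for you; a minimal sketch (the "icon" asset name is hypothetical):

    import UIKit

    // Ship one logical image at three raster sizes:
    //   icon.png      44x44   (1x)
    //   icon@2x.png   88x88   (2x)
    //   icon@3x.png   132x132 (3x, chosen on the iPhone 6+)
    let icon = UIImage(named: "icon")  // UIKit picks the matching variant
    // icon?.size is always 44x44 points; icon?.scale is 1, 2 or 3.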


I'm imagining the commentariat's reaction to such a "pretty reasonable trade-off" on a Nexus 5 or a Galaxy S4.


Aren't the commentariat always asking "what's the point of going to 1080p or 4k on a 5" display?"


We were until Android phones got higher DPI than retina iPhones ;)


1x/2x/3x is great as a developer, but downscaling the result to save a few bucks on the display panel? Not so much.


It's not the cost, it's the power usage. The 1080p LG G2 had incredible battery life, while the G3 (according to all the reviews), with its 2K screen, was a big step backward in battery life.


Normally I'd be right there with you, but at this DPI there shouldn't be any visible scaling artifacts. Normally scaling looks terrible because you're upscaling low res components to a higher res version, but since their source is higher resolution and they're downscaling to an already extremely high DPI screen, you'll probably be hard pressed to find aliasing artifacts without a magnifying glass.

The other reason I think this might be a good thing is that 1080p is more of a standard than the other resolutions they're playing with, which might help unify things moving forward.


I'll withhold judgment until I see it live, but they did make it work with the MBP Retina (e.g. the 13" will fake 1680x1050 by rendering to a 3360x2100 framebuffer and downscaling to the 2560x1600 display).


Do you realize that every Retina Macbook Pro out there uses this "cheesy" resolution "hack"?

Tying the size and density of your display to software needs, especially points that are arbitrary to begin with (they are based on the original iPhone), seems silly and outdated.


Actually no. Or maybe: yes, but...

My first MBP was one of the "high resolution" ones that ran at 1680x1050, and I loved it. So when I first got my 15" retina MBP, I was miffed that it now felt like I was running 1440x900, just with double (well, quadruple) pixel density. So I opened the scaling preferences and set the retina screen to "emulate" a 1680x1050 display. It was terrible. Blurry and SLOW. Good lord it was slow. And impossible to do any design work on. I went back to the normal 2x scale.

So sure, this kind of scaling has been available for a long time, but the reality is that most people don't scale their MBP retina displays to anything other than 2x.


Eh? I've run my rMBP at 1680x1050 for over two years. It's not blurry or (appreciably) slow, though I'll admit I'm using it more for development work than heavy design work.


I guess YMMV... I also had to get my screen replaced because it was one of the small number of screens that had severe burn-in issues.


I'm not a UX designer by any stretch, but as a consumer I can say that if Apple can pull off the downsampling/downscaling correctly, good on them. I have a Kindle Fire HDX 7", and recently upgraded to a Fire HDX 8.9". The former is 1920x1200 at 323 PPI, the latter 2560x1600 at 339 PPI. That's not a huge difference in PPI; in fact, it's far less than the difference between the two new iPhones. Yet so many apps (mostly games) render incorrectly on the larger Kindle, to the point that small text becomes unreadable.

If Apple can release two new phones with greater disparity and seemingly get the app experience right, why can't Amazon? I don't use an iPhone but seeing these explanations that break down how they pull off such a feat really impresses me.


I haven't worked in the Kindle division, but there is a huge cultural gulf between Apple and Amazon. Apple is very much a software company (well, they're a design company that encompasses both hardware and software). They care a great deal about getting software right. At WWDC each year they have a session on how to do great software that illuminates a process involving a lot of design and upfront work before you begin coding. A process that virtually nobody follows these days. (I say these days because back in the 80s this is how CS students were taught: "write your program out on paper before writing it in code". Nowadays hardly anybody thinks about their work before they start writing code, for the most part.)

This is the polar opposite of Amazon, which I worked for in the past.

Amazon is perceived as a high-tech company, but it is a retailer and its management is the management of a retailer. Amazon's engineering is overseen by non-engineers, and the company culture is highly political and focused on making a show more than making a product. Quality is the last thing they actually care about (when it comes to software). You get ahead at Amazon with press releases; it never matters if your product makes no sense. (E.g. movie listings on amazon.com; mail-order catalogs literally scanned and posted to amazon.com; innumerable initiatives and features that were put up to get someone a promotion, only for the team to disappear in the next quarterly company-wide reorg and the result left to rot.)

This is why I would never buy a tech product from Amazon. They really don't give a damn as a culture, and any engineer who gives a damn will not survive long there (and will be punished).

Giving a damn about design and quality doesn't play well with stack ranking.

What plays into stack ranking is kissing your boss's peers' butts, playing political games and making splashy new features.


Huh. Does any/all of that apply to AWS specifically as well, or is it an exception?


>> Nowadays hardly anybody thinks about their work before they start writing code, for the most part.

I am old enough to remember where this came from...

I used to do C for quite a while, then I made my hobby (Perl and scripting) my day job.

If given a certain size of problem while working alone, I used to sit down and think for half a day or so, then develop (coding and thinking more) for a week or two.

After I moved to scripting, this stopped working.

Since I could, in an afternoon, throw together a solution to the same size of problem that took a week in C, it was just not realistic to think first!

The best solution was to throw something together, look at it and then rewrite it if I didn't like it. The sheer development speed felt like I had gotten a sports car.


Absolutely agreed... prototype-morph-prod is a faster cycle. But along the same lines, it means automated unit tests become more important as projects get more complex. That's why modular code and testing work so well in scripting environments, and computing is fast enough now.

With a modular approach, it also becomes easier to scale horizontally, either within the same server/system or on a separate one. Whether via HTTP, TCP, 0mq or another abstraction, if the interfaces are the same, the modular layers can be swapped out. This gets easier with async-by-default environments (node.js, golang, etc.), and I find it much harder with classic N-tier (.Net, Java). I'd rather use 0mq with node than WCF with .Net any day of the week.


Interesting point about scaling, thanks.


Apple documents the crap out of it; pages on the dev site, and summaries like this one, are just built off that (usually aimed at a less technical audience than the dev portal). I think that's what helps the most: I've been able to tell every designer I've worked with what resolutions I need and how I need the files suffixed, and since they tend to use vectors anyway, it all just works out. Amazon must not put the same amount of effort into documentation, but admittedly I don't know about that.


I think you've hit on a big part of it. It seems Apple works more closely with their iOS developers than Amazon does with their Android developers. Fire OS itself is very nice, fluid, responsive, and makes sense when you treat it as a purely consumption-oriented device. That's why it's such a shame that third-party apps and games can look awkward on the larger tablet; the user experience should matter more, but it seems Amazon doesn't want to put in the same effort to support their third-party developers.


Have they gotten the experience correct? Serious question, I haven't seen them.


I haven't either, but all signs point to "yes". I have a friend eyeing the 6 Plus as an upgrade from his Note 3, so I may get a hands-on soon enough.


So if you want to watch a 1080p video on this device, the player will upscale it to 1242 and then the hardware will downscale it to 1080 again? Or will there be some way to bypass all this?


The downscaling in the infographic is representative of how a developer's chrome is handled, not of a "full screen" experience...

Or put another way, when you play a video, you let the OS take over the full screen and there's no need for downsampling.

It's not like the downsampling is a permanent stage of some rendering pipeline to the display.


This article concerns "rasterization": the process of converting a description of an image into actual pixels.

A 1080p video stream doesn't need rasterizing; it already consists of pixels. Any sane video player would simply show the original 1080p video mapped onto the 1080p screen in full-screen mode.
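
In AVFoundation terms, playback would look something like the sketch below. Whether the system really maps the decoded 1080p frames 1:1 onto the panel, rather than routing them through the scaled UIKit surface, is an assumption on my part; the URL is a placeholder.

    import UIKit
    import AVFoundation

    class PlayerViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Placeholder URL for a 1080p stream.
            let url = URL(string: "https://example.com/movie-1080p.mp4")!
            let player = AVPlayer(url: url)
            let layer = AVPlayerLayer(player: player)
            layer.frame = view.bounds           // full screen, in points
            layer.videoGravity = .resizeAspect
            view.layer.addSublayer(layer)
            player.play()
        }
    }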


(1) Is it really a problem that apps can't be pixel perfect on the 6+? How unsharp are we talking about? Are there workarounds such as device detection and an additional set of image assets?

(2) Will Apple aim to re-enable pixel-perfection in next year's iPhone release? Was this just a stopgap measure due to yield challenges inherent to display manufacturing?


1- We'll have to see a device to know for sure, but I expect this will not be a problem at all. For my apps, I'm going to do nothing but make sure we support the additional point dimensions.

2- I think this is doubtful. The first time I saw a WWDC presentation about resolution-independent graphics was in 2003 (maybe 2002). Apple has been preparing for this for over a decade. The iPhone was just a bit too skeuomorphic in its early versions, resulting in a focus on raster assets rather than vector.

Either we're all going the PaintCode way (the authors of this post), or Apple is going to have to start accepting CSS or SVG or some other vector format for its assets.

I think this is more likely than tying displays to a rigid multiple of the original iPhone screen size.


(1) The iPhone 6+ has 401 dots per inch, which is in the lower ballpark of printing resolutions, where you don't have pixel perfection either. It's not going to be a problem.


My Nexus 5 (4.95", 1080p, 445 ppi) does the same, and to me it looks fine.

See my other post in this thread: https://news.ycombinator.com/item?id=8308194


It would be great to add the 4" size of the iPhone 5/5s/5c to the illustration.


Substitute its resolution and dimensions for the regular 6, then: it's got the same DPI and the same 2x scaling, on a smaller screen.


Stupid question, but why isn't the point resolution for the iPhone 6+ a perfect divisor of the pixel resolution, like it is for the other iPhones?

Since there is more than one point resolution anyway.

And also, how is it different for Android devices?


Not a stupid question as you've hit on the major strategic bit of news here that most people, I think, are missing.

Displays are cut from larger pieces of display material at a given PPI. So Apple focused on perfecting a 163 PPI display material and then tended to use it on multiple products. They did the same thing for the retina display material (I think at one point the iPhone and iPad used the same display material, just cut to different sizes).

Apple could do this because they work so closely with their display manufacturers and they care so much about fidelity. It just happened that doing rigid multiples of the original display size was convenient for software as well.

However, displays have become more commoditized, especially retina-resolution ones (when it was introduced, Apple was the only one shipping these kinds of displays in such volume), and Apple's volume has only increased over the years.

So we're seeing a shift to more commodity display production, I believe, because the commodity market has caught up with Apple's standards. Beyond 300 pixels per inch there isn't much advantage to higher density.

But there is a huge advantage to unit volumes of a display also used in portable TVs or whatever.

The software cost of downscaling isn't significant at this point so it is an economic change.


That explains the pixel size, but not the point size. Why not make the logical resolution 360x640 with perfect 3x upscale?

My best guess is that it would then be smaller than the iPhone 6 size (in points) at a larger physical size, so everything would look "too big"...


Doing 360x640 would indeed create a rather awkward situation where the iPhone 6 Plus has less usable screen space than the iPhone 6 (which is 375x667). My guess is that they want to keep the physical height of elements roughly the same across the whole iPhone family.

This way, all existing buttons designed with a 44px tappable height in mind, and any existing text, can show more detail while not appearing larger or smaller than on any other iPhone, and there is more space to display content. I believe the non-mini iPad is the only product in the iOS family where the recommended 44px is physically larger than on the rest.


I believe this is because Apple wants the logical size to correspond (roughly) with physical size. This is particularly important for user interfaces. A button needs to be sized for the human finger that is tapping it and thus should be the same physical size on each device, which in turn means it should take up a smaller percentage of the logical size of the bigger devices. The only case I know of where they broke this was the iPad mini, which has the same logical size as the larger iPad, and you may notice that everything on the iPad mini is just smaller.
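
Some back-of-the-envelope arithmetic (mine, using the PPI figures quoted in this thread) showing why keeping the 6+ at 414x736 points makes a standard 44-point tap target come out at nearly the same physical size everywhere:

    // Physical height of a 44-point button: points * effective px/pt / PPI.
    let devices = [
        ("iPhone 5", 2.0, 326.0),
        ("iPhone 6", 2.0, 326.0),
        ("iPhone 6+", 2.608, 401.0),   // 3x render, downsampled
    ]
    for (name, scale, ppi) in devices {
        let inches = 44.0 * scale / ppi
        print("\(name): \(inches) in")   // all land around 0.27-0.29 in
    }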


To be fair, iPad Mini has the same PPI as the original iPhone (163 PPI) and iPad Mini Retina has the same PPI as the Retina iPhone (326 PPI) so it could be argued that iPad is the only device where everything appears larger than the rest.


Do you actually believe that LCD manufacturers screen TFT transistors onto massive pieces of glass (like TV-sized) and then slice them down to phone-sized pieces for assembly?


It's not quite that simple, but essentially "yes". https://www.youtube.com/watch?v=TDoxReEkHe8


In that video, the larger parent substrate glass is still partitioned into smaller segments for lithography and processing.

I was referring more to the concept that the entire substrate is built as a huge 163 dpi display and then it's a matter of customer preference how the glass is cut.


There were rumours of display manufacturing problems holding up the "bigger" iPhone 6. So I wouldn't be surprised to see a minor bump up to an actual 3x pixel size in the next revision.

(This would also give them back a minor marketing advantage, which they briefly had when they introduced Retina, as I'd guess most mainstream Androids will stick at 1080p.)


Not stupid at all, I still don't understand the reasoning behind it.


They did it! This nice page explaining the resolution differences etc. made me want to click the "paintcode" link at the bottom of the page. This is hard to pull off well, and I think they did it.


Exactly this: the article succeeds in making this seem so incredibly backwards that you NEED their particular commercial third-party tool to survive as a developer ;) (hey, maybe it's even true, I haven't the faintest)


Cool diagram - helps to understand what can be a confusing topic for people who don't actually do any dev on the platform.

I believe the 326 PPI for the original iPhone is incorrect. Should be 163, I think.


Note: on the latest Chrome, I'm getting some clipping on the left text column. See http://imgur.com/jEAuB4A,YtVYC4T

Nice diagram!


Looks like the images themselves are incorrectly cropped, so everyone should get the same clipping, whatever browser they use:

http://www.paintcodeapp.com/content/news/2014-09-11_iphone-6...


TLDR: The iPhone 6 Plus can never show a sharp picture, unless the application UI is adapted to be unsharp on all other displays.


Not the case at all. Anyway, anything rendered into a retina resolution is going to be quite sharp.


If this article is true, then "the iPhone 6 Plus can never show a sharp picture" seems correct (your 3x assets will be scaled to ~2.6x).

What part of this do you think breaks other displays, though?


Microsoft made a somewhat similar mistake(?) many years ago with WPF, a vector UI framework supposed to handle a wide range of pixel densities. You could frequently end up with lines that were 1.5 pixels thick or located on a fraction of a pixel, which made them look very blurry. To fix this they added a special flag, SnapsToDevicePixels, which forces certain lines to land on exact device pixels.
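
iOS code has the same fractional-pixel hazard, and the usual hand-rolled fix is the same idea: round coordinates to the device grid. A generic sketch, not any specific Apple API:

    import UIKit

    // Snap a coordinate (in points) to the nearest physical-pixel boundary
    // so a hairline sits on the device grid instead of straddling it.
    func snapToPixel(_ value: CGFloat, scale: CGFloat) -> CGFloat {
        (value * scale).rounded() / scale
    }

    let scale = UIScreen.main.scale
    let hairline = 1.0 / scale                // 0.5 pt on a 2x screen
    let y = snapToPixel(10.3, scale: scale)   // 10.5 on a 2x screen

Of course, on the 6+ even a perfectly snapped 3x line still goes through the downsample afterwards, which is the whole point of this thread.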


For the 6+, why couldn't they use a 1600x900 screen at 2x and be done with it? That'd be 800x450 points and about 330 DPI, nice and easy. Rendering at 3x and then downsampling to 1920x1080 seems like a lot of useless busywork for absolutely no gain.
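
The arithmetic roughly checks out (my own back-of-the-envelope, in Swift):

    // PPI for a hypothetical 1600x900 panel at 5.5 inches diagonal.
    let (w, h, diagonal) = (1600.0, 900.0, 5.5)
    let ppi = (w * w + h * h).squareRoot() / diagonal   // ~334
    // At 2x that'd be 800x450 points, with no downsampling step needed.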


I still don't understand how the iPhone 6 can run existing apps while having the same DPI as the iPhone 5 but a larger screen.


The screen is larger, but the pixels are the same distance apart on both phones.


Can this be real? There's mandatory downsampling in every case, even if you try to ship something pixel-perfect?



