I implemented the same behavior in a different Google product.
I remember the PM working on this feature showing us their research on how iPhones rendered bars across different versions.
They had different spectrum ranges, one for each of maybe the last 3 iPhone versions at the time. And overlaid on them were lines indicating the "breakpoints" where iPhones would show more bars.
And you could clearly see that with every release, iPhones were shifting all the breakpoints further to the left, rendering more bars with less signal strength.
We tried to implement something that matched the most recent iPhone version.
To be sure, is it possible that, on each subsequent iPhone release, the hardware got better at handling weak signals, and thus a mediocre signal for iPhone N was decent for iPhone N+2 and would give great throughput on iPhone N+4?
Possible, sure, but wouldn't it be better marketing for the iPhone to have better performance at lower bars? Phones are judged on their performance, but network providers are judged on the number of bars they show on the screen.
Bars are supposed to be an indicator of actually achievable quality of service, in my view. I don't care why I can use my network where I am, I just want to know whether I can.
The comment you’re replying to is incredibly concerning. Is he saying people at Google are purposefully misrepresenting signal strength so they can “compete” with Apple?
Except you have no idea why Apple is changing the signal display. They could be lying, or they could have a standardised test in which newer hardware performs better. This guy, on the other hand, is clearly saying that Google has no such thing and they're blindly copying Apple regardless of the performance of their hardware.
Bars really don’t matter. You can have full bars and slow to no internet. You can have one bar but relatively decent internet. Honestly kind of wish the signal display would go away and instead show me when I lose internet.
When you lose internet, you get a ! next to the bars (at least I have on my last few androids). Usually I also have no bars when I lose internet, but sometimes I've got coverage without data flow.
And while we're at it: Just surface the fact that connectivity sucks to applications (maybe even at the socket layer, by just closing them if there's not been any forward progress for a certain time), rather than showing me loading screens that'll never go anywhere for minutes.
This would give apps that do have some offline caching the chance of falling back to that (looking at a certain green music streaming service here).
It is surfaced to apps, but the "just detect that connectivity sucks" heuristic turns out to be not all that easy to implement. There doesn't seem to be a better heuristic than "try, and let the app decide if it has waited too long".
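To illustrate that "try, and let the app decide" fallback, here's a minimal sketch (the helper name and timeout value are mine, not from any comment above) of a deadline wrapper an app could put around any network call:

```typescript
// Hypothetical helper: reject a pending network operation if it hasn't
// completed within `ms` milliseconds, so the app can fall back to cached
// data instead of spinning on a loading screen forever.
function withDeadline<T>(op: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`no progress within ${ms}ms`)),
      ms,
    );
    op.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}

// Usage sketch: try the network, fall back to a local cache on timeout.
// withDeadline(fetch("/api/playlists"), 5_000).catch(() => readLocalCache());
```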
That is literally what I am observing lately with my provider: I have 2 bars and yet I do not have internet, whereas my gf, using the same iPhone model with a different provider and also showing 2 bars, has perfect data connectivity.
I build apps at the moment. In addition to the phone's network indicators, you really should give your users visible, live feedback on whether your servers are reachable, because there are so many things that can break down in between. Programming your app offline-first is also good, unless it's critically important that the information is either live or absent. We allow offline access by using React Query and storing its caches in user storage.
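For the curious, the React Query setup for this is mostly wiring. A sketch of what it could look like, assuming React Query v5 with its official persist plugins installed (the cache lifetime is an arbitrary choice of mine):

```typescript
import { QueryClient } from "@tanstack/react-query";
import { persistQueryClient } from "@tanstack/react-query-persist-client";
import { createSyncStoragePersister } from "@tanstack/query-sync-storage-persister";

const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      // Keep cached data around long enough to be useful offline.
      gcTime: 1000 * 60 * 60 * 24, // 24h, an arbitrary example value
    },
  },
});

// Persist the query cache to localStorage so cached responses survive
// restarts and can be served while the network is down.
persistQueryClient({
  queryClient,
  persister: createSyncStoragePersister({ storage: window.localStorage }),
  maxAge: 1000 * 60 * 60 * 24,
});
```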
> And you could clearly see that with every release, iPhones were shifting all the breakpoints further to the left, rendering more bars with less signal strength.
One thing that might explain this is that advancements in antenna design, RF component selection (including the actual circuit board), and especially (digital) signal processing allow a baseband to get a useful signal out of signal strengths that would have been just noise for older technology.
In ham radio in particular, the progress is amazing. You can do FT8 worldwide (!) communication on less than 5 watts of power, that's absolutely insane.