The most headache-inducing issues by far are going to be file formats and network protocols that embed 32-bit Unix timestamps, and those are numerous.
NTP is one such example.
This is a much worse and much more fundamental issue than the Y2K bug was. I hope people aren't writing off the severity; the time to start dealing with it is now.
Agreed, it's pretty scary. On the other hand, I'm sure similar problems were tackled for Y2K with the 2-digit year fields everywhere. If we built it, we can rebuild it. Although a lot of software and hardware will need replacing, upgrading, or switching over to new protocols.
They were. However, given that a lot of things have a life in the field of several decades, now is the time to start fixing it so it doesn't become a mad rush. Yes, it means less money for devs 23 years from now as they madly rush to update software, but I'd rather that money be spent on devs making future programs awesome than on worrying about our ghosts.
Indeed. I get worried when I see people deploying new embedded ARM systems these days to replace systems that are 30 years old, when there are only 24 years to 2038...
If you install a 32-bit Linux, then time_t is still 32-bit; only on 64-bit Linux do you get a 64-bit time_t.
Although by 2038 I hope we won't have many 32-bit systems around...
Embedded devices are likely to be 32-bit for a very long time yet. I'd consider 100+ years to be a reasonable number!
64-bit devices are much more complicated (bus size, peripherals, part count) and therefore more expensive. If your entire task fits in a 32-bit space there is little motivation to use a 64-bit core.
It won't on full SoC devices, but most embedded ARM devices of that sort have all sorts of bus-connected peripherals, so the bus will need to be external.
If it ain't broke... There are still 8-bit and 16-bit cores being produced today. Hoping products will become obsolete is one of the big reasons Y2K was a problem.