Probably not. One reason is that this is a competitive advantage for Netflix, but I suspect the larger reason is that they created it to optimize their primary business; operating it for other productions would be a secondary business and not mission critical. Another reason is that productions not on Netflix have other commercial options for camera-to-cloud production workflows (Frame.io, for example).
This workflow and tool set is tuned for Netflix and is probably opinionated in a variety of ways to conform to Netflix production standards and requirements. If your production is being funded by Netflix, you're incentivized to learn and use their provided tools.
On the consumer side: not until people can actually afford to casually dedicate the bandwidth needed for multiple 4k streams. Either you need a truly monumentally liberal internet plan to send uncompressed 4k at a normal frame rate, or you need a hardware encoder (baked into your motherboard or discrete graphics card, or bought separately) to make sure you can send compressed 4k. And that only covers the sending side; you also need to receive the stream and have hardware capable of smoothly rendering the 4k video coming in.
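To put a rough number on that (my assumptions, not the parent's: 3840x2160, 8-bit 4:2:0, 30 fps), a quick back-of-the-envelope shows why uncompressed 4k doesn't fit on a typical home uplink:

    # Rough bitrate estimate for uncompressed 4k video (illustrative assumptions).
    width, height = 3840, 2160
    bits_per_pixel = 12   # 8-bit 4:2:0 chroma subsampling; 24 for full 8-bit RGB
    fps = 30
    bits_per_second = width * height * bits_per_pixel * fps
    print(f"{bits_per_second / 1e9:.1f} Gbit/s")  # ~3.0 Gbit/s (~6 Gbit/s for RGB)

Even the chroma-subsampled case is around 3 Gbit/s sustained per stream, which is why hardware-encoded (compressed) 4k is the only practical option on most connections.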
But more importantly, on the industry side: given that live TV broadcasts are still 1080p, the answer here is almost certainly "not until the broadcasting world decides live 4k by default is even remotely worth it."
What's the financial incentive? Everyone who needs 4k streams is making money off them, and spends money on fancier camera gear. Most people just don't need it, so there's no money to be had.
It's hard enough finding a 60fps 1080p webcam with a deep in-focus region (to minimize focus hunting) that doesn't have bad image quality, and that's way more useful.
They are. I was part of the test rollout of IPv6 when I lived in Richmond, VA last year. It worked flawlessly so I'd assume they should be set for a greater rollout sooner rather than later.
When was that? It's been "coming soon" on Verizon's own FAQ[1] for almost half a decade now. I'm in one of the major MSAs along the eastern megalopolis, and there's no IPv6 here.
There’s an excellent book on this surveillance technology and its ramifications called Eyes in the Sky. It covers the military’s development of Gorgon Stare and many of the domestic startups using similar technology to watch us.
That's exactly what they're doing. They're using an MCU architecture, whereas most others use an SFU architecture. I believe some solutions use a hybrid of both.
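For anyone unfamiliar with the distinction, here's an illustrative sketch (made-up helper names, not any vendor's actual API): an SFU forwards each participant's already-encoded stream to everyone else without transcoding, while an MCU decodes every stream, composites them into one layout, and re-encodes a single stream per receiver.

    from dataclasses import dataclass

    @dataclass
    class EncodedFrame:
        sender: str
        payload: bytes  # compressed video as received from the sender

    def sfu_forward(frame, participants):
        # SFU: relay the encoded frame untouched; each client decodes N-1 streams.
        return {p: frame for p in participants if p != frame.sender}

    def mcu_mix(frames, participants):
        # MCU: decode everything, tile into one composite, encode once per receiver.
        composite = tile([decode(f.payload) for f in frames])
        return {p: encode(composite) for p in participants}

    # Placeholders standing in for a real codec pipeline (e.g. ffmpeg/libvpx).
    def decode(payload): return payload
    def tile(images): return b"".join(images)
    def encode(image): return image

The trade-off is roughly: an SFU keeps server CPU low but pushes bandwidth and decode work onto clients, while an MCU keeps clients simple (one stream in) at the cost of server-side transcoding.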
Is there any way I could use a USB-C to DB9 console cable on an iPad Pro with this? I’m thinking this could be a great setup for working in a cramped data center.