Then you parse elements one by one. While your particular library may not support it, it's not a very hard thing to implement. Say the stream so far contains:
["a", 1, "b
A parser could at this point give us the "a" and the 1, but not the "b", since that string might still have more content coming.
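A minimal sketch of that in Python, using only the standard library's json.JSONDecoder.raw_decode (which decodes one value from the front of a string and reports where it stopped). The parse_elements generator, the chunk-iterable input, and the top-level-array assumption are illustrative, not any particular library's API:

```python
# Minimal sketch (assumed, not a real library's API): parse a streamed
# top-level JSON array element by element as chunks of text arrive.
import json

_decoder = json.JSONDecoder()

def parse_elements(chunks):
    """Yield each top-level array element as soon as it is fully received."""
    buf = ""
    opened = False
    for chunk in chunks:
        buf += chunk
        if not opened:
            buf = buf.lstrip()
            if not buf.startswith("["):
                continue  # still waiting for the opening bracket
            buf, opened = buf[1:], True
        while True:
            buf = buf.lstrip()
            if buf.startswith(","):
                buf = buf[1:].lstrip()
            if not buf or buf.startswith("]"):
                break  # end of array, or nothing buffered yet
            try:
                value, end = _decoder.raw_decode(buf)
            except json.JSONDecodeError:
                # Element incomplete (e.g. an unterminated string);
                # keep buffering until the next chunk arrives.
                break
            yield value
            buf = buf[end:]

# With the partial input above, this yields "a" and 1 but holds back "b":
print(list(parse_elements(['["a", 1, "b'])))  # -> ['a', 1]
```

One caveat: a bare number at the very end of the buffer is ambiguous (is 12 complete, or a prefix of 123?), so a real implementation would wait for a trailing delimiter before emitting it.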
You probably want the parser to give you an object that behaves as a collection you can for-each loop over, and that blocks when there is no more data available.
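That blocking-collection behavior can be sketched with a queue; StreamingArray and the sentinel are illustrative names, not an existing API:

```python
# Minimal sketch of a for-each-able collection whose iteration blocks
# while no element is available yet.
import queue

_DONE = object()  # sentinel marking the end of the stream

class StreamingArray:
    def __init__(self):
        self._q = queue.Queue()

    def push(self, element):
        self._q.put(element)

    def close(self):
        self._q.put(_DONE)

    def __iter__(self):
        while True:
            item = self._q.get()  # blocks until the producer supplies data
            if item is _DONE:
                return
            yield item
```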
Presumably similar to how you would stream XML. I guess my question would be: isn't this already possible? You just need a reader that knows the JSON source is a stream and blocks reads until data is available, then a concurrency mechanism to coordinate between the stream reader and the consumer of the stream data to keep things running smoothly (or, if you don't mind blocking, just use the blocking reader on the same thread/process as the consumer).
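A minimal sketch of that coordination, reusing parse_elements and StreamingArray from the sketches above: a reader thread pushes each element the moment it is complete, while the consumer for-each loops on the main thread. The simulated chunk source is a stand-in for a real socket or file:

```python
import threading
import time

def chunks_from_network():
    # Stand-in for a real blocking source; dribbles the array out in pieces.
    for piece in ['["a", 1, ', '"b', 'cd", 2]']:
        time.sleep(0.1)
        yield piece

def pump(out):
    # Reader side: push each element as soon as the parser completes it.
    for element in parse_elements(chunks_from_network()):
        out.push(element)
    out.close()

stream = StreamingArray()
threading.Thread(target=pump, args=(stream,), daemon=True).start()

for element in stream:      # each iteration blocks until an element is ready
    print("got:", element)  # -> a, 1, bcd, 2, in arrival order
```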
If I get multiple items, they're in an array. If such a reader blocks reads of an element until it's finished, I've won nothing if I have to wait until that whole array has finished.
As I said, you'd need some measure of concurrency, or perhaps a non-blocking reader: try to get new stuff, and if it's not there yet, move on. I just don't like getting "null" responses like that, as it's too close to a busy-loop, which is, well, poor form. I generally write code that interacts with the world in its own thread/process/whatever so it doesn't block the rest of the system unless appropriate.
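For contrast, the non-blocking poll being described looks something like this against a queue-backed collection like the one above; try_next, next_or_wait, and NOTHING are illustrative names. The timeout variant is the usual middle ground between blocking outright and spinning:

```python
import queue

NOTHING = object()  # distinguishable from a legitimate None element

def try_next(q: queue.Queue):
    """Non-blocking poll: returns NOTHING immediately when nothing is ready."""
    try:
        return q.get_nowait()
    except queue.Empty:
        return NOTHING

def next_or_wait(q: queue.Queue, seconds: float):
    """Compromise: wait briefly, then yield control instead of spinning."""
    try:
        return q.get(timeout=seconds)
    except queue.Empty:
        return NOTHING
```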
The utility here is in being able to process each element one at a time. If you need everything before you can get a meaningful result, it offers no benefit to you.
Or do you want to send different parts of e.g. a map in separate streams?