by beau on 2/19/18, 4:28 PM with 84 comments
by pkulak on 2/19/18, 4:47 PM
With this newline-delimited JSON format all your clients HAVE to know about your new protocol. They have to stream the response bytes, split on newlines, unescape newlines in the payload (how are we doing that, btw?), etc. If a client doesn't care about streaming, it can't just sit on the response and parse it when it's done coming in. Or what if, later on, you upgrade the system so that the response is instant and streaming is no longer necessary? Then you move on to a new API and have to keep supporting this old streaming-but-not-really endpoint forever.
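The client-side work described above can be sketched roughly as follows. This is a minimal, hypothetical example, not any particular library's API; note that `JSON.stringify` already escapes literal newlines inside strings as `\n`, so in well-formed single-line JSON a raw newline byte can only ever be a record delimiter:

```javascript
// Minimal incremental NDJSON parser: feed it text chunks as they
// arrive, and it invokes the callback once per complete line.
function ndjsonParser(onObject) {
  let buffer = "";
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the trailing partial line for later
    for (const line of lines) {
      if (line.trim() !== "") onObject(JSON.parse(line));
    }
  };
}

// Possible wiring to a streaming fetch response:
// const feed = ndjsonParser(obj => console.log(obj));
// const reader = response.body.getReader();
// const decoder = new TextDecoder();
// for (;;) {
//   const { done, value } = await reader.read();
//   if (done) break;
//   feed(decoder.decode(value, { stream: true }));
// }
```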
by rhacker on 2/19/18, 4:36 PM
by slig on 2/19/18, 5:00 PM
For instance, their iOS app weighs 888.8 KB! When it's common for simple apps to be 50 MB monsters, it's very refreshing to use something that has been developed with proper care.
by zeger on 2/19/18, 5:06 PM
by Figs on 2/19/18, 6:36 PM
Right now, it takes over 5 seconds(!!) for this page to load because of all the freaking JavaScript it has to download! With JS off, the page loads almost immediately. With a keep-alive connection, subsequent loads over HTTPS are not particularly long, unlike what this article seems to think. (Hacker News is one of the FASTEST sites I can access, for example. Even on my crappy connection, pages load nearly instantly.)
Simply letting me type, press enter, and wait 0.1~0.3 seconds for a new page response would not be a significantly worse experience -- however, due to the way the site is written, search doesn't work AT ALL with JS disabled.
So, lots of engineering effort (compared to just serving up a new page) for little to no actual speed improvement, and a more brittle website that breaks completely on unusual configurations... Yeah. Please don't do this!
by erikrothoff on 2/19/18, 4:57 PM
by fenwick67 on 2/19/18, 5:45 PM
The use-case was we had a slow database query for basically map pins. The first pins came back in milliseconds, but the last ones would take seconds. The UI was vastly improved by streaming the data instead of waiting for it all to finish, and the server code was easy to implement.
A different delimiter would have worked, but newlines are easy to see in a debugger.
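The server side described here is indeed small. A rough sketch under assumed names (a Node-style response object and a hypothetical async iterator of query results — neither is from the comment):

```javascript
// Write each pin as its own JSON line the moment the database yields
// it, instead of buffering the whole result set before responding.
function writePinStream(res, pins) {
  res.writeHead(200, { "Content-Type": "application/x-ndjson" });
  return (async () => {
    for await (const pin of pins) {
      res.write(JSON.stringify(pin) + "\n"); // one record per line
    }
    res.end();
  })();
}
```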
by tuukkah on 2/19/18, 4:54 PM
by MonkeyDan on 2/19/18, 5:42 PM
by iamd3vil on 2/19/18, 7:22 PM
by bshacklett on 2/19/18, 8:34 PM
by Osiris on 2/19/18, 9:11 PM
At one job I had several years ago, we came up with the same idea and used \n-separated JSON elements as a streaming response. We also tossed around the idea of using WebSockets to stream large responses between services.
by JensRantil on 2/19/18, 10:48 PM
IMHO the most reliable way to get data from point A to point B is to have a client actively polling for data, using a strict socket timeout. Data should be delivered at least once. If JSONS is to be called anything remotely as "reliable" as periodic polling, it should at least have a strict timeout (not mentioned in the article) for receiving the next newline, and it should handle replaying of non-acked messages. Otherwise I would call it far from "reliable".
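The poll/timeout/ack loop described above could look something like this sketch. Every name here (`transport`, `fetchBatch`, `ack`) is hypothetical; the point is only that a batch is acked strictly after it has been handled, so an unacked batch gets redelivered on the next poll — at-least-once, with possible duplicates:

```javascript
// Poll with a strict timeout; ack only after processing, so a crash
// between fetch and ack means the same messages are delivered again.
async function pollLoop(transport, handle, { timeoutMs = 5000, rounds = Infinity } = {}) {
  for (let i = 0; i < rounds; i++) {
    let batch;
    try {
      batch = await transport.fetchBatch(timeoutMs);
    } catch (err) {
      continue; // timeout or network error: just poll again
    }
    for (const msg of batch) handle(msg); // handler must tolerate duplicates
    await transport.ack(batch.map(m => m.id));
  }
}
```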
by jannes on 2/19/18, 4:47 PM
JSON.parse() only accepts strings.
The library that the article recommends also uses XMLHttpRequest with strings. [2]
The reason I'm asking is the maximum string length in 32-bit Chrome.
[1]: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequ...
[2]: https://github.com/eBay/jsonpipe/blob/master/lib/net/xhr.js
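One way around both issues raised here (JSON.parse wanting strings, and the per-string length cap) is to decode the byte stream incrementally so that only the current partial line is ever held as a string, never the whole response. A hedged sketch, not what jsonpipe actually does:

```javascript
// Feed raw byte chunks; TextDecoder handles multi-byte characters
// split across chunks, and only one partial line is buffered at a time.
function byteFeeder(onLine) {
  const decoder = new TextDecoder();
  let buffer = "";
  return function feed(bytes, done = false) {
    buffer += decoder.decode(bytes, { stream: !done });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // unfinished trailing line
    for (const line of lines) if (line) onLine(JSON.parse(line));
    if (done && buffer) { onLine(JSON.parse(buffer)); buffer = ""; }
  };
}
```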
by sajal83 on 2/20/18, 6:11 AM
I think streaming would be useful only if the responses are stateful and hard to share across requests.
by delaaxe on 2/20/18, 2:40 AM
by wybiral on 2/19/18, 8:03 PM
by gumby on 2/19/18, 4:58 PM
by cwt137 on 2/20/18, 8:01 PM
by nategri on 2/19/18, 6:16 PM
by osrec on 2/19/18, 8:52 PM
by stringham on 2/19/18, 11:41 PM