by brynary on 6/5/14, 7:00 PM with 57 comments
by falcolas on 6/5/14, 7:46 PM
No reason you can't implement schemas over JSON. In fact, you typically do so implicitly: your code expects certain fields to be present in the data structures deserialized from JSON.
> Backward Compatibility For Free
JSON is unversioned, so you can add and remove fields as you wish.
> Less Boilerplate Code
How much boilerplate is there in parsing JSON? I know in Python, it's:
structure = json.loads(json_string)
Now then, if you want to implement all kinds of type checking and field checking on the front end, you're always welcome to, but allowing "get attribute" exceptions to bubble up and signal a bad data structure has always appealed to me more. Most of the time I'm writing in Python/Ruby/JavaScript precisely to avoid rigid data structures and boilerplate in the first place.
[EDIT] And for languages where type safety is in place, the JSON libraries frequently allow you to pre-define the data structure the JSON will be parsed into, giving type safety and a well-defined schema for very little additional overhead as well.
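The "let the exception bubble up" style described above might look like this in Python (`load_order` and its fields are invented for illustration):

```python
import json

def load_order(json_string):
    # No up-front schema: deserialize, then touch exactly the fields
    # the code needs. A missing field raises KeyError, and letting that
    # bubble up is itself the "bad data structure" signal.
    order = json.loads(json_string)
    return order["id"], order["items"]

load_order('{"id": 1, "items": ["book"]}')  # parses fine
try:
    load_order('{"id": 2}')                 # no "items" field
except KeyError as missing:
    print("rejected payload, missing field:", missing)
```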
> Validations and Extensibility
Same as previous comment about type checking, etc.
> Easy Language Interoperability
Even easier: JSON!
And you don't have to learn yet another DSL, and compile those down into lots of boilerplate!
I'm not trying to say that you shouldn't use Protocol Buffers if they're a good fit for your software, but this list is a bit anemic on real reasons to use them, particularly for dynamically typed languages.
by wunki on 6/5/14, 7:39 PM
by Arkadir on 6/5/14, 7:45 PM
Free backwards compatibility? No. Numbered fields are a good thing, but they only help in the narrow situation where your "breaking change" consists of adding a new, optional piece of data (a situation that JSON handles as well). New required fields? New representations of old data? You'll need to write code to handle those cases anyway.
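For instance, the "new required field" case needs explicit handling in either format; a minimal Python sketch (field names invented for illustration):

```python
import json

def parse_user(json_string):
    user = json.loads(json_string)
    # Old writers never sent "locale"; supply a default explicitly
    # instead of expecting compatibility "for free".
    user.setdefault("locale", "en")
    return user

old_msg = parse_user('{"name": "old client"}')         # locale filled in
new_msg = parse_user('{"name": "x", "locale": "fr"}')  # locale respected
```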
As for the other points, they are a matter of libraries (things that the Protobuf gems support and the JSON gems don't) instead of protocol --- the OCaml-JSON parser I use certainly has benefits #1 (schemas), #3 (less boilerplate) and #4 (validation) from the article.
There is, of course, the matter of bandwidth. I personally believe there are few cases where it is worth sacrificing human-readability over, especially for HTTP-based APIs, and especially for those that are accessed from a browser.
I would recommend gzipped msgpack as an alternative to JSON if reducing the memory footprint is what you want: encoding JSON as msgpack is trivial by design.
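A sketch of the size trade-off (the msgpack step would need the third-party msgpack package, so gzip over the plain JSON encoding stands in for it here, using only the stdlib):

```python
import gzip
import json

payload = {"user_ids": list(range(1000)), "active": True}

raw = json.dumps(payload).encode("utf-8")
packed = gzip.compress(raw)

# Compression claws back much of the size advantage of a binary encoding,
# at the cost of CPU on both ends; the round trip stays lossless.
assert json.loads(gzip.decompress(packed)) == payload
print(len(raw), "bytes raw vs", len(packed), "bytes gzipped")
```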
by CJefferson on 6/5/14, 7:48 PM
1) Doesn't support Visual Studio 2013.
2) Doesn't support Mac OS X Mavericks.
3) No "nice" support for C++11 (e.g. move constructors)
(These can be at least partly solved by running off svn head, but that doesn't seem like a good idea for a product one wants to be stable.)
With JSON I can be sure there will be many libraries which will work on whatever system I use.
by ardit33 on 6/5/14, 9:26 PM
by AYBABTME on 6/5/14, 8:14 PM
- network bandwidth/latency: smaller RPCs consume less space, and are received and responded to faster.
- memory usage: less data is read and processed while encoding or decoding protobuf.
- time: haven't actually benchmarked this one, but I assume CPU time spent decoding/encoding will be smaller since you don't need to go from ASCII to binary.
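The JSON side of that encode/decode cost is easy to measure with the stdlib (a protobuf comparison would need the protobuf package, so only the JSON baseline is sketched here):

```python
import json
import timeit

doc = {"ids": list(range(100)), "name": "example", "active": True}
blob = json.dumps(doc)

# Time 10,000 round trips through the stdlib encoder and decoder.
encode_s = timeit.timeit(lambda: json.dumps(doc), number=10000)
decode_s = timeit.timeit(lambda: json.loads(blob), number=10000)
print(f"encode: {encode_s:.4f}s  decode: {decode_s:.4f}s over 10k runs")
```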
Which means, all performance improvements. They come, as usual, at the cost of simplicity and ease of debugging.
by gldalmaso on 6/5/14, 8:18 PM
This seems to me like a key issue: you need to really know beforehand that this won't ever be the case; otherwise you'll need to make your application polyglot afterwards. A risky bet for any business data service.
Maybe if it's strictly infrastructure glue type internal service. But even then, maybe someone will come along wanting to monitor this thing on the browser.
by znt on 6/5/14, 7:43 PM
From Protocol buffer python doc: https://developers.google.com/protocol-buffers/docs/pythontu...
"Required Is Forever: You should be very careful about marking fields as required. If at some point you wish to stop writing or sending a required field, it will be problematic to change the field to an optional field – old readers will consider messages without this field to be incomplete and may reject or drop them unintentionally. You should consider writing application-specific custom validation routines for your buffers instead. Some engineers at Google have come to the conclusion that using required does more harm than good; they prefer to use only optional and repeated. However, this view is not universal."
So basically I will be in trouble if I decide to get rid of some fields which are not necessary, but somehow were defined as "required" in the past.
This will potentially result in bloated protobuf definitions that have a bunch of legacy fields.
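In schema terms, the trap the quoted docs describe looks roughly like this (an illustrative proto2 message, not taken from the article):

```proto
// proto2 syntax; "required" is the construct the docs warn about.
message User {
  required string name  = 1; // can never safely be relaxed to optional
  optional string email = 2; // safe to add later, or to stop sending
  repeated string tags  = 3;
}
```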
I will stick to JSON, thanks.
by pling on 6/5/14, 8:20 PM
[ 4738723874747487387273747838727347383827238734.00 ]
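In Python, for instance, the stdlib silently rounds that literal to a double unless you opt into Decimal (a quick sketch):

```python
import json
from decimal import Decimal

literal = '[4738723874747487387273747838727347383827238734.00]'

as_float = json.loads(literal)[0]                       # default: IEEE-754 double
as_decimal = json.loads(literal, parse_float=Decimal)[0]

# A double holds ~16 significant digits; this literal has 46,
# so the float value silently changes, while Decimal keeps every digit.
print(int(as_float) == int(as_decimal))  # False: the double rounded
```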
Parsing that universally is a shit.
by KaiserPro on 6/5/14, 9:00 PM
however if schemas scare you (shame on you if they do) then msgpack might be a better choice.
by jacob019 on 6/5/14, 9:16 PM
by tieTYT on 6/5/14, 10:31 PM
> * You need or want data to be human readable
When things "don't work", don't you always want this feature? Over a long lifetime, it could really reduce your debugging costs. Perhaps Protocol Buffers has a "human readable mode"; if not, it seems like a risk to use it.
by redthrowaway on 6/5/14, 8:55 PM
With Node, I'd have to see a very good argument for why I should give up all of JSON's great features for the vast majority of services. Unless the data itself needs to be binary, I see no reason why I shouldn't use the easy, standard, well-supported, nice-to-work-with JSON.
by jweir on 6/5/14, 8:20 PM
by jayvanguard on 6/6/14, 12:36 AM
by nly on 6/5/14, 8:33 PM
by pkandathil on 6/5/14, 8:08 PM
by mrinterweb on 6/5/14, 8:55 PM
by don_draper on 6/5/14, 7:51 PM
Anyone who has used Avro in Java knows that this is not true.
by AnimalMuppet on 6/5/14, 8:43 PM
(Answer: Trivial.)