XML or JSON?

I’m at a meeting of the OpenAjax Alliance, hosted by Microsoft in Mountain View, CA. A presentation from Fidelity raised an interesting point regarding the use of XML and JSON to convey large data sets from the server to an Ajax client. In the demonstration, a moderately large set of 2,000 data items took 17 seconds to transfer as XML, but only 3 seconds as JSON. The JSON approach is known to have security problems (notably that the response text is often turned into objects with eval(), which executes whatever the server sent), but the difference in performance is compelling. I raised the possibility of using Efficient XML. They haven’t looked at this, and it is still early days for E-XML, so it’s likely that they will put their effort into getting the community to improve JSON’s security rather than into XML’s efficiency.
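To make the trade-off concrete, here’s a minimal sketch of the two client-side paths, assuming hypothetical endpoints (/data/items.xml and /data/items.json) that return the same data set. The eval() call is exactly where JSON gets both its speed and its security problem.

```javascript
var xhr = new XMLHttpRequest();

// XML path: the browser parses the response into a DOM tree (the
// server must send an XML content type), and the script then walks
// the tree node by node to build its own objects.
xhr.open("GET", "/data/items.xml", false); // synchronous, for brevity
xhr.send(null);
var items = xhr.responseXML.getElementsByTagName("item");
for (var i = 0; i < items.length; i++) {
  var name = items[i].getElementsByTagName("name")[0].firstChild.nodeValue;
  // ... copy fields out of DOM nodes into application objects ...
}

// JSON path: one eval() turns the response text directly into
// JavaScript objects. Fast, but it executes whatever the server
// sent; that is the security problem mentioned above.
xhr.open("GET", "/data/items.json", false);
xhr.send(null);
var data = eval("(" + xhr.responseText + ")"); // trusts the response completely
for (var j = 0; j < data.items.length; j++) {
  var itemName = data.items[j].name; // objects are already in final form
}
```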

Other issues raised included the (in)efficiency of JavaScript engines, incompatibilities across implementations, and the lack of support for creating secure mash-ups. On that last issue, I recently came across SMash (Secure Mash-up) from IBM, which uses multiple frames from related domains to work around the sandboxing imposed by the browser, so that independent components can talk across the boundaries.
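I haven’t dug into SMash’s exact protocol, but the underlying trick for talking across frame boundaries can be sketched with URL fragment identifiers: one frame navigates another frame’s fragment (which doesn’t reload the page), and the receiver polls its own location.hash. The names handleMessage and frameBaseUrl, and the polling interval, are my own illustrative choices.

```javascript
// Hypothetical component URL on a related (but distinct) domain.
var frameBaseUrl = "http://component.example.net/gadget.html";

// Receiver side (runs inside the component's frame): poll the
// fragment identifier for messages. The same-origin policy stops a
// frame from *reading* another domain's content, not from
// *navigating* a frame it holds a reference to.
var lastHash = "";
function handleMessage(message) {
  // Application-defined: react to the decoded message.
}
setInterval(function () {
  var hash = window.location.hash;
  if (hash !== lastHash) {
    lastHash = hash;
    handleMessage(decodeURIComponent(hash.substring(1)));
  }
}, 100);

// Sender side (runs in the host page that embeds the frame): deliver
// a message by rewriting the frame's URL fragment.
function sendToFrame(frame, message) {
  frame.src = frameBaseUrl + "#" + encodeURIComponent(message);
}
```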

All very interesting questions. I’m not hearing many answers, though.

In the case of diversity across JS implementations, there are two obvious ways to address the problem:

  • Impose compatibility through interoperability tests and certification.
  • Record the differences in a repository and adapt accordingly at run-time.

The former requires a lot of industry resources and cooperation; I just can’t see that happening any time soon. The latter has potential, and I will touch upon it in my presentation to the Mobile Ajax workshop tomorrow.
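To give a flavour of that second option, here’s a minimal sketch of run-time adaptation driven by a repository of recorded differences. The repository here is just an in-page object with invented quirk entries; a real one would be populated from interoperability test results and probably fetched from a server.

```javascript
// The repository of recorded differences: invented entries, for
// illustration only. One is a feature test, one a UA sniff.
var quirksRepository = {
  needsActiveXhr: typeof XMLHttpRequest === "undefined",
  slowStringConcat: /MSIE/.test(navigator.userAgent)
};

// Adapt to a recorded difference instead of assuming one API.
function createXhr() {
  if (quirksRepository.needsActiveXhr) {
    return new ActiveXObject("Microsoft.XMLHTTP"); // IE 6 and earlier
  }
  return new XMLHttpRequest();
}

// Pick a string-building strategy recorded as fast on this engine.
function joinParts(parts) {
  if (quirksRepository.slowStringConcat) {
    return parts.join(""); // avoids repeated reallocation on old engines
  }
  var s = "";
  for (var i = 0; i < parts.length; i++) {
    s += parts[i];
  }
  return s;
}
```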
