
What is JSON, JSON-RPC and JSON-RPC-Java?

Michael Clark writes "Seen those funky remote scripting techniques employed by Orkut, Gmail and Google Suggests that avoid that oh so 80's page reloading (think IBM 3270 only slower). A fledgling standard is developing to allow this new breed of fast and highly dynamic web applications to flourish. JSON (JavaScript Object Notation) is a lightweight data-interchange format with language bindings for C, C++, C#, Java, JavaScript, Perl, TCL and others. It is derived from JavaScript and has expressive capabilities similar to XML. It's perfect for the web, as it doesn't suffer from XML's bloat and is custom made for our de facto browser language. JSON-RPC is a simple remote procedure call protocol similar to XML-RPC, although it uses the lightweight JSON format instead of XML (so it is much faster). The XMLHttpRequest object (or the MSXML ActiveX object in the case of Internet Explorer) is used in the browser to call remote methods on the server without the need to reload the page. JSON-RPC-Java is a Java implementation of the JSON-RPC protocol. It combines all of these to create an amazingly simple way of developing this highly interactive type of enterprise Java application with JavaScript/DHTML web front-ends."
"Now is the turning point. Forget that horid wait while 100K of HTML downloads when the application just wanted to update one field on the page. The XMLHttpRequest object has made it's way into all the main browsers with it's recent introduction into Opera and Konqueror (sans the Konqueror bug). This new form of web development now works on Internet Explorer 5, 5.5, 6, Mozilla, Firefox, Safari 1.2, Opera 8 Beta and Konqueror 3.3 (with a much needed patch). Appeal to Konqueror users - please log into the KDE bugzilla and vote on this bug so you to can experience this wonderful thing. More details here: http://oss.metaparadigm.com/jsonrpc/ "
  • O...k..... (Score:3, Funny)

    by Anonymous Coward on Monday January 24, 2005 @09:35AM (#11454631)

    *finishes reading summary*

    ok... so... huh?

    • Re:O...k..... (Score:4, Informative)

      by rdc_uk ( 792215 ) on Monday January 24, 2005 @09:39AM (#11454664)
      So...

      repopulate your product page for a new product WITHOUT reloading the whole page.

      Put a timer in, and have rotating feature products WITHOUT reloading the whole page on a timer.

      Update your totals in your checkout / shopping cart WITHOUT reloading the whole page.

      Write an RSS news ticker in html rather than flash...

      Basically anything that you might have used flash or an IFrame for, you could do with this, javascript and a DIV tag... Pretty important news (if you write commercial websites)
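      A minimal sketch of the pattern being described, for anyone who hasn't seen it: fetch a small piece of text from the server with XMLHttpRequest (falling back to the MSXML ActiveX object on older IE, as the summary notes) and drop it into one element of the page. The URL, element ID and one-minute interval here are purely illustrative.

        // Create the request object (ActiveX fallback for IE 5/5.5/6).
        function createRequest() {
            return window.XMLHttpRequest ? new XMLHttpRequest()
                                         : new ActiveXObject("Microsoft.XMLHTTP");
        }

        // Fetch a small plain-text value and write it into one element,
        // without reloading anything else on the page.
        function refreshSpecial() {
            var req = createRequest();
            req.onreadystatechange = function () {
                if (req.readyState == 4 && req.status == 200) {
                    document.getElementById("special").innerHTML = req.responseText;
                }
            };
            req.open("GET", "/special-offer", true);  // illustrative URL
            req.send(null);
        }

        // The "rotating specials on a timer" idea from the comment above.
        setInterval(refreshSpecial, 60000);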
      • Re:O...k..... (Score:4, Insightful)

        by dorward ( 129628 ) on Monday January 24, 2005 @09:59AM (#11454793) Homepage Journal

        repopulate your product page for a new product WITHOUT reloading the whole page.

        So now people can't bookmark specific products

        Put a timer in, and have rotating feature products WITHOUT reloading the whole page on a timer.

        Useful from a commercial point of view. Really rather distracting from a visitor point of view. If I can't block it, I'm likely to find another vendor.

        Update your totals in your checkout / shopping cart WITHOUT reloading the whole page.

        This sounds practical, but at some stage you need to send the user to a new page anyway, and you can calculate new totals without having to make server calls - so you might as well leave telling the server about it until they go to the next stage of the checkout process.

        Write an RSS news ticker in html rather than flash...

        Ummm... why would you want an RSS news ticker on a webpage in the first place?

        Basically anything that you might have used flash or an IFrame for, you could do with this, javascript and a DIV tag... Pretty important news (if you write commercial websites)

        Yes, let's just create something with no practical advantages over Flash/Iframe, but which requires a more recent browser to access.

        • Re:O...k..... (Score:5, Interesting)

          by rdc_uk ( 792215 ) on Monday January 24, 2005 @10:32AM (#11455048)
          "So now people can't bookmark specific products"

          Fair comment, though I can think of places where similar features are desirable -> changing product images is a better usage.

          "Useful from a commercial point of view. Really rather distracting from a visitor point of view. If I can't block it, I'm likely to find another vendor."

          Good luck; sites designed to sell tend to ADVERTISE their wares. You seem to be thinking 3-second rotation; I'm thinking more 1-minute rotation of current "specials". Put bluntly: I have clients that will pay me to do this if it's possible.

          "This sounds practical, but at some stage you need to send the user to a new page anyway, and you can calculate new totals without having to make server calls."

          1; I don't want the mechanism for calculations at the client side, so the calculation would require a reload, therefore this will make the page more responsive; i.e. better. The reason for non-client side? First: javascript is a shit language for anything other than DOM manipulation. Second: clients don't want their business logic exposed in javascript. Third: we don't want to download all the data required for those calculations to the client (prices in a code-manipulatable form should NEVER get to the client side, or be sent via post/get, prices just get displayed to the client, not manipulated by them)

          Think about a checkout that calculates shipping costs globally; you need the location from the client. Depending on location the methods available for shipping will change. Depending on weight of goods (changes with quantity change) and location, the costs for each method change.

          That's a LOT of information to download to the client's machine to make the totals update without a server call. It's also a ton of information (including prices) that I don't want the client side to have access to, and that I don't want javascript responsible for calculations on.

          If I can pull down just the new value, rather than the whole page -> better!

          "Ummm... why would you want an RSS new ticker on a webpage in the first place?"

          Again; clients will pay for it. Just because YOU don't want it (or, in fact, that my client's clients don't want it), doesn't mean it has no value.

          "Yes, lets just create something with no practical advantages over Flash/Iframe, but which requires a more recent browser to access."

          There are plenty of advantages to not using flash. (reduce number of languages required to display 1 page and reduce number of external plugins required to display 1 page to name just 2 advantages)

          Iframes are never a good idea.

          And did you read the list of supported browsers? Only notable omission in the real world was safari...
          2;

          • Re:O...k..... (Score:3, Interesting)

            by dorward ( 129628 )

            So (some) Vendors want it, and (some) Web Developers want it because the Vendors (their clients) want it - but does it bring any serious benefit to the end user?

            Changing product images is a reasonable thing to do - but all that needs is a change in the src of the <img> (Or you can take the thinkgeek approach and have nice, large images that show lots of detail, so that a full page reload doesn't add any significant bandwidth)

            1-minute rotation? How long do people spend on a single page t

            • "Changing product images is a reasonable thing to do - but all that needs is a change in the src of the "

              And the VALUE to put in the SRC - which is what this would allow you to pull down as/if required.
              • How many images of each product are you going to have that you can't just include them in a <script> block on the product page? Or in the "Change image" link?

                <a href="product-page-with-image-2"
                onclick="return changeProductImage('product', 'image-2.jpeg');">Action shot</a>
                • Re:O...k..... (Score:4, Insightful)

                  by ryepup ( 522994 ) on Monday January 24, 2005 @11:27AM (#11455643) Homepage
                  So would you rather download <1K to get the new cost in your cart, or download 100K of HTML and images and have the screen flash to get the new cost? As a web host, would you rather serve 1K or 100K?

                  The other side is responsiveness of the application. A lot of places use the web for more than shopping carts, and this is the sort of technique that makes a web app seem multithreaded. Don't you like the multithreaded applications you use? Wouldn't it be cool to offer similar speed easily via a web browser? Right now my main option is to have a lengthy process performed in a pop-up window, or post back and start a new thread on the server, both of which involve more work than they should.
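                  As a rough sketch of that "seems multithreaded" behaviour: the third argument to open() makes the call asynchronous, so the page stays responsive while the server grinds away and the callback fires whenever the result arrives. The endpoint, parameter and element ID are made up for illustration.

                    // Kick off a long-running server job without blocking the page.
                    function startLongJob() {
                        var req = window.XMLHttpRequest ? new XMLHttpRequest()
                                                        : new ActiveXObject("Microsoft.XMLHTTP");
                        req.onreadystatechange = function () {
                            if (req.readyState == 4) {
                                // Runs later, whenever the server finishes.
                                document.getElementById("status").innerHTML =
                                    (req.status == 200) ? req.responseText : "Job failed";
                            }
                        };
                        req.open("POST", "/start-report", true);  // true = asynchronous
                        req.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
                        req.send("reportId=42");  // illustrative parameter
                    }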
                  • So would you rather download <1K to get the new cost in your cart, or download 100K of HTML and images and have the screen flash to get the new cost? As a web host, would you rather serve 1K or 100K?

                    If your HTML document containing little more than a list of products is 100 kilobytes, then it's very, very badly written. You should solve the problem by writing better HTML rather than trying to avoid downloading it.

                    • I said "HTML and images".

                      Ok, change that number to 4K and reread the post. Or are you the type who likes to program in assembly, and prefers human labor over machine labor?
                • Oddly; enough that the idea seems like a good one!
        • Actually I don't see anything there that I, as the object of the design process, wouldn't live happily without. Pages that crawl, buzz, flash, or otherwise do stuff other than lie there quietly for me to read are an anti-feature.

          If I hadn't already decided to buy your stuff, I wouldn't be reading your page. You don't need to attract my attention; my presence proves that you already have it.
      • It might also be useful for saving application settings/user preferences.

        As having to update a complete set of file reading/writing/update routines for every application became rather tedious, we found it more convenient to load and save the files as XML documents, read them into memory and access data settings through getValue/setValue calls.

        The downside is that the configuration files are rather bloated. If this data exchange format can help reduce the size of these files, then that can only be a good
      • Re:O...k..... (Score:3, Insightful)

        Uhmmm... Actually I've been doing this for a while now using iframes and javascript. It's not that hard and also avoids the xml bloat. This just gives a "standard" api to use when doing it. It still uses iframes; you just don't have to create them yourself. Most people who have been doing this already have a set of custom tools to help them do it. The functionality has been there for a while. This will help boost its use though, and for that I'm grateful. Gmail makes excellent use of it in their UI making it
  • Pros and cons? (Score:5, Interesting)

    by Sierpinski ( 266120 ) on Monday January 24, 2005 @09:35AM (#11454634)
    Two major issues that come to mind with this type of technology:

    1) How easy is it to learn for the average programmer

    2) What kind of security precautions can we expect to see?

    Otherwise it sounds like a great technology to use for web developers who wish to have dynamic content on their sites.
    • Re:Pros and cons? (Score:4, Informative)

      by metaparadigm ( 568438 ) on Monday January 24, 2005 @09:45AM (#11454702)
      A1. The idea is to make it transparent to the programmer. You can practically just call a Java method from your JavaScript web application. One line of code is required to export or allow access to a server-side object.

      A2. Yes, security is an interesting topic. The Java implementation referred to works on a deny-all-by-default basis - you allow specific objects to specific clients. It does require the programmer to think about what methods they are exposing. I have been using it over HTTPS with selective objects exported to authorized clients (using JAAS, the existing Java Authentication and Authorization framework), so I believe it can be used in a very secure way.
    • Are you referring to "HTML Programmers" (hahaha) or programmers? The thing's bloody easy to use.
    • Re:Pros and cons? (Score:5, Informative)

      by Jetifi ( 188285 ) on Monday January 24, 2005 @09:53AM (#11454757) Homepage

      Well, JSON is a subset of JavaScript object notation, so people who know JavaScript already know this. It's basically a way of transferring structured data between browser and server that is less verbose than XML, and can be eval()ed straight into JavaScript itself.

      Of course, any server receiving this stuff via POST should do the same validity checks it does on anything else it gets from the wire. On the client, IIRC you can only use XMLHttpRequest with the server the document originated from, and neither should you be able to execute script across domains, even within iframes, so the existing browser security model should be sufficient to prevent additional security problems, bugs and exploits notwithstanding...
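      To make the mechanics concrete, here is a hand-rolled version of the kind of call a library like JSON-RPC-Java wraps up for you. The request/response shapes follow the JSON-RPC 1.0 spec; the endpoint ("/JSON-RPC"), the exported object name and its method are assumptions invented for this sketch.

        // Call a (hypothetical) exported server object over JSON-RPC.
        function sayHello(name) {
            var req = window.XMLHttpRequest ? new XMLHttpRequest()
                                            : new ActiveXObject("Microsoft.XMLHTTP");
            // Same-origin rule applies: this only works against the server
            // the page itself was loaded from.
            req.open("POST", "/JSON-RPC", false);  // synchronous, for brevity
            // Naive quoting of the argument; a real library escapes it properly.
            req.send('{"method": "hello.sayHello", "params": ["' + name + '"], "id": 1}');

            // The response is JSON too, e.g.
            //   {"result": "Hello Fred", "error": null, "id": 1}
            // Wrapping it in parentheses lets eval() treat it as an expression.
            var rpc = eval("(" + req.responseText + ")");
            if (rpc.error != null) { throw rpc.error; }
            return rpc.result;
        }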

      • It makes it sound like the expected way to use it is for the client to be Javascript. If you're going to do that, why not just write an html frontend? There are a ton of ways to do that for all the languages listed. You don't need to use RPC of any kind. In fact, doing so is stupid. How do you expect to get the data into the RPC in the first place? A java client? You're going to be using html forms.

        And if you're not doing that, then you're writing some other kind of clients that can do more than ht
        • Re:Pros and cons? (Score:3, Informative)

          by Jetifi ( 188285 )

          You're right, the expected client is JavaScript, inside a browser. The frontend UI is typically in HTML (or SVG), but the client-side logic is written in JavaScript. In this case, the request/response happens within a web-page, between the browser and the web server.

          HTML forms are one mechanism for client-server data transfer in a browser, but posting them necessitates reloading the entire web-page, or an iframe, which is not seamless for interfaces like Google Suggest, and may be overkill if you're only

    • 3) What are the benefits of this compared to using a hidden iframe and using that to talk back and forth to the server?
  • You know, guys, there's a reason that we have separate application programs instead of doing everything through Internet Explorer. Believe it or not, it's not necessarily the best interface for a lot of things.
    • by Anonymous Coward
      XUL and XAMAL are being developed so that all applications WILL run through the intarweb...

      Only computationally expensive programs will be relegated to the desktop, everything data orientated will live on the web
    • That is true, but, as far as I can see, it has experienced some impressive results, like Gmail (I believe that is what we are talking about, right?). The fact is that GUI programming, and even programming in general, can seem quite daunting to the beginner. Although I am an average-to-good C/C++ programmer, GUI still confuses me for the most part (I always have to refer back to the API, is that normal?). Combining JavaScript and its merits (easy-to-learn, basically a subset of Java and C), with (D)HTML a
    • by hummassa ( 157160 )
      It may not even be the best interface for some things, but it *is* the better way to deploy them. It is especially better if you have to deploy thousands of copies.
      • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Monday January 24, 2005 @10:28AM (#11455010) Homepage
        Eh. Not really. Auto-updating isn't especially difficult, especially in the closed environments most web applications are written for. Java Web Start, for example, is a cinch. It's not too hard to roll your own mechanism either. Web applications are trendy now, though, despite there being no objective advantage in most circumstances.

        Refresh-less updating isn't new, either - I've been doing it for at least 3 years, without the XML stuff. Even with it there's only so much you can do on the client, by design. The web is a decent platform for reporting. It's a good place for universal access (see gmail, for example). It's a lousy place to put your data-entry heavy business applications.

    • Believe it or not, it's not necessarily the best interface for a lot of things.
      Yes, but how many applications make it so easy to give you access to such a wide variety of viruses? I kid, I kid.

      Actually, I agree with the parent on this one. I prefer applications to have their own gui. This can help where speed of use is an issue. Also, if you buy software, throw the cd in, and all that is on the cd are a bunch of .html files, you are prolly gunna be pretty pissy.
    • by MancDiceman ( 776332 ) on Monday January 24, 2005 @09:48AM (#11454718)
      Imagine yourself in 5 years time. The web browser has all this stuff on it which means it is as good an interface as any other GUI widget stack. Firefox or Safari or IE or whatever effectively is the window manager with tabbed browsing and links to your favourite 'applications'.

      The interface is fluid, keyboard shortcuts working fine and everything is as responsive as it is right now in your current desktop. Your applications are used over the web - you don't have to worry about software upgrades or fixing your parent's computer after some installation as everything is done by your ISP.

      Can you see that future? What is stopping it from happening?

      You're right, the browser is a crap interface. If you actually understood the technology being described, you would realise that it is an improvement to the interface to make all those things happen.

      The browser is a bad interface right now. JSON helps to make it a more suitable interface. Go figure.
      • by uradu ( 10768 ) on Monday January 24, 2005 @10:11AM (#11454881)
        > You're right, the browser is a crap interface. If you actually
        > understood the technology being described, you would realise
        > that it is an improvement to the interface to make all those
        > things happen.

        No, a real improvement to the interface would be to move away from any technologies that mix (D)HTML and executable code. It's a recipe for unmaintainability and for driving self-respecting desktop developers to despair. True advances in distributed apps are approaches such as Mozilla's XUL. Alas, they're a step away from the quasi-declarative "programming" of (D)HTML back to the procedural programming of C and its descendants, not something artsy web "developers" like to hear.
      • Your applications are used over the web - you don't have to worry about software upgrades or fixing your parent's computer after some installation as everything is done by your ISP.

        Exactly that is something to worry about! Imagine you are writing something important with a web-based word processor. You are close to deadline, and all you have to do now is print out that stuff (or convert it to PDF and mail it). Nobody would be so insane as to update his word processor at exactly this point. But with web

      • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Monday January 24, 2005 @10:37AM (#11455082) Homepage
        What's stopping it from happening is that the features that make a good browser for hypertext are not the same features that make a good client for, say, a business or data entry application. As a quick test, go hit the back button in any web application that uses this sort of technology. Does it do what you expect? Does the "back button" even make sense in the context of what you're doing?

        Hypertext is a lousy way of writing applications - in fact, most "web apps" have roughly zero relationship with hypertext. Network-transparent thin clients are interesting, but HTML/DHTML/current browsers are the wrong way to implement these things. Part of the problem is the issue of control - client applications need to be able to control the user interface to a degree that a general purpose browser simply can't allow. Something as simple as "Save changes at exit" is impossible in a browser - and you wouldn't want it to be. Same thing with control of the back button, or spawning new windows (or even dialogs, which you can do with IE).

        In short - the browser is a fundamentally poor platform for most applications. More to the point, we have and have had the technology for network-based application suites for years. ASP (application service providers, not the MS web platform) is gaining some mindshare, but it's not taking off like gangbusters.

        • Actually, you're wrong about the "save at exit" thing. That would actually be fairly straightforward to implement in javascript.

          But there are plenty of things that aren't:
          -It is nearly impossible to make a more advanced text widget than what is available in html already.
          -Rendering images takes a long time in interpreted languages by nature (of course, javascript could become JIT and this problem would be gone).
          -You're sandboxed into the browser. You can't interact with other desktop apps. To do so woul
      • The distributed app model (i.e. passive connection, declarative programming, flow-oriented MMI) is very very good. That's a good concept. But the implementation really sucks. Developing distributed applications via this model is very immature, at the very best. It is difficult since there are tens of domain-specific programming languages to learn, the binding between them leaves a lot to be desired, and debugging is almost non-existent. It's very clear to those who have developed (or tried to develop) distr
      • Google is currently one of the masters of Javascript.

        Look at what they have done, and what they have not done - GMail has a very good interface. But even Google has released some real applications, like desktop search and Picasa.

        I really believe the browser model can only be taken so far. As someone else noted, your browser becomes your window manager and pretty soon you develop a little cosmos in there. But that cosmos will always be a subset of the richer cosmos the OS itself offers, and so web apps
    • by ceeam ( 39911 ) on Monday January 24, 2005 @09:48AM (#11454721)
      IE aside - "Web app" is a darn appropriate interface for a lot of things and definitely the single most widely spread and portable one. The inability to request a "callback" value from the server from the JS code is a huge PITA if you've ever tried programming those.
    • by Malc ( 1751 ) on Monday January 24, 2005 @09:50AM (#11454731)
      Web interfaces have two massive advantages: there's no need to install anything, and they work just about anywhere.

      You're right: a web page for a complicated application will rarely match the UI of a dedicated application. Take Outlook's Web Access UI: it's pretty amazing, especially if you're using IE. It can be used almost anywhere without having the latest version of Office installed. It's still damned clunky compared with real Outlook, but sometimes it's better than nothing.
    • by hey! ( 33014 ) on Monday January 24, 2005 @10:06AM (#11454834) Homepage Journal
      Well, I think you're missing the point.

      These applications aren't going to use a browser interface. They are going to use the browser as a platform on which non-browserish presentation layers can be constructed.

      This is exactly the future that caused Microsoft to go berserk over Netscape all those years ago.
    • Maybe not, but web apps can usually be developed, deployed and maintained far faster and more easily than writing custom apps. That can often outweigh having the best interface, as long as the interface is good enough.

  • by Threni ( 635302 ) on Monday January 24, 2005 @09:36AM (#11454642)
    With all that stuff going on it's a wonder it didn't click itself!
  • Michael Clark usually runs corporate IT meetings, right? I mean - you can't just come with this buzzfest without due training.
  • Fix HTML instead? (Score:3, Insightful)

    by Spiked_Three ( 626260 ) on Monday January 24, 2005 @09:38AM (#11454660)
    I've had to resort to all sorts of tricks to avoid postbacks in my current (aspx) development efforts. First we used a third-party SOAP/XML RPC thing like this one. Then we switched to an IFrame for the postback portion. Then I noticed that MS is including their own new and improved SOAP/XML RPC thing in .Net 2. Now I read about this.
    Seems it is a problem a lot of people need/want to solve - but to be honest, I am tired of having so many different solutions to a problem I should not have to begin with. Isn't there something that can be done with the HTML standard to eliminate the need? Life would be so much better down that path.
    • by Malc ( 1751 )
      HTML was designed as a markup language for text. All these things you're complaining about result from trying to shoe-horn HTML in to an application it was never designed for. It still works pretty well though, all things considered.
    • Re:Fix HTML instead? (Score:4, Interesting)

      by TheRaven64 ( 641858 ) on Monday January 24, 2005 @10:01AM (#11454806) Journal
      I don't think HTML is to blame so much as HTTP. The integration of something like XMPP into the browser would be a huge improvement, since it would allow arbitrary XML to be pushed to the client without the need for polling (which is ugly, and no less ugly if it's done in the background without full page refreshes).
  • And what about php? (Score:2, Interesting)

    by Anonymous Coward

    If you want to interoperate with PHP, I'd suggest Harry Fuecks' JPSPAN [sourceforge.net], as it is quite nice at hooking JavaScript up with server-side PHP.

    As for xmlhttprequest, it's rather easy to make neato web applications with it. Here's something I coded up the other night (only seems to work in firefox at the moment though): http://www.james-carr.org/index.php?p=8

    Cheers,
    James Carr [james-carr.org]

  • by Anonymous Cowherd X ( 850136 ) on Monday January 24, 2005 @09:44AM (#11454695) Journal

    Example in JSON:

    {"menu": {
    "id": "file",
    "value": "File:",
    "popup": {
    "menuitem": [
    {"value": "New", "onclick": "CreateNewDoc()"},
    {"value": "Open", "onclick": "OpenDoc()"},
    {"value": "Close", "onclick": "CloseDoc()"}]}}}

    The same thing in XML:

    <menu id="file" value="File">
      <popup>
        <menuitem value="New" onclick="CreateNewDoc()" />
        <menuitem value="Open" onclick="OpenDoc()" />
        <menuitem value="Close" onclick="CloseDoc()" />
      </popup>
    </menu>

    It's perfect for the web, as it doesn't suffer from XML's bloat and is custom made for our de facto browser language.

    Take a look at those examples and try to explain how JSON is free from bloat when in fact it is even more bloated and slightly more difficult for humans to read and write. It's just another notation with no obvious advantages.

    • by metaparadigm ( 568438 ) on Monday January 24, 2005 @09:56AM (#11454776)
      Yes, although this is really an argument about the particular XML DTD. Most DTDs, including the XML-RPC and SOAP DTDs, don't encode using attribute values but instead use child elements with character data (apparently this is the XML best practice). Much, much bigger.

      Also, the JSON takes one line of code to parse and access natively in our de facto web browser language 'JavaScript'.

      The second requires a bloated JavaScript XML parser (as this is not built in to many browsers) and CPU intensive processing and a cumbersome API to get the data out. Also try doing 100 RPC calls a second with SOAP in a browser (this can be done with JSON-RPC on a local network - 10ms round trip on simple methods).
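      To see the difference in client-side code, compare how you might pull the first menu item's label out of the two example documents further up the thread (req is assumed to be a completed XMLHttpRequest whose response matches those examples):

        // XML response: walk the DOM tree that responseXML gives you.
        function firstLabelFromXml(req) {
            var items = req.responseXML.getElementsByTagName("menuitem");
            return items[0].getAttribute("value");  // "New"
        }

        // JSON response: one eval(), then ordinary property access.
        function firstLabelFromJson(req) {
            var data = eval("(" + req.responseText + ")");
            return data.menu.popup.menuitem[0].value;  // "New"
        }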
      • Also, the JSON takes one line of code to parse and access natively in our de facto web browser language 'JavaScript'.

        The second requires a bloated JavaScript XML parser (as this is not built in to many browsers) and CPU intensive processing and a cumbersome API to get the data out.

        Give us a CPU speed, amount of memory available, amount of memory used, number of cycles used and code for each parser. I want to know the features of these parsers, and why they were chosen (SAX vs XML-Pull vs DOM). I'd li

      • It strikes me the JSON version would be *much* larger for non-western languages. It can only include non-ascii characters in the data via the use of unicode escapes, which are 6 bytes long (\uXXXX), as compared to 2 in XML using an appropriate charset. It also lacks object references, so can't be made as compact as an arbitrary JS program.

        JSON-RPC doesn't seem to be intended for use in interchange, but for websites, since it relies on browser security for the 'efficiency' of being able to use eval() in its
    • Because XML requires a parser, and this JSON thing looks (at least to my very rusty eyes, it's been ages and ages since I touched front-end stuff like javascript) like it could be evaled into a jscript array, which is a *much* quicker operation and requires no external libraries to operate. I've done something like this before, working at a startup back in 2000, with an invisible iframe (we were targeting IE only) that was running jscript which would poll the server api for various things and eval the jscr
      • by Anonymous Coward

        Because XML requires a parser, and this JSON thing looks like it could be evaled into a jscript array,

        Which magical browser do you use that doesn't need to parse the code that it eval()s?

        which is a *much* quicker operation

        I think that you have forgotten that eval() needs to parse too, I'm not convinced that it is much quicker. Even if it was, it doesn't follow the Principle of Least Power [w3.org]. XML doesn't execute. Javascript does. There's a reason why JSSS was rejected by the W3C and CSS wasn't.

    • I think you're missing the point. It's not the fact that there are those little simple CreateNewDoc() functions, it's the binding to the Java objects on the backend that's simple to create, and that's the part that's 'free from bloat'.
    • Importing the data structure into the language is very easy. In JS you can basically go "eval(datastring)". No need to either write a custom XML walker, invoke some ActiveX, or create a document/fragment.

      Another obvious thing is that the "menuitem" notation in JSON is actually an array, which is not the case in the XML version. There we just have siblings with coincidentally the same names. So JSON carries richer information (and avoids a lot of duplication: one "menuitem" key instead of three).
    • The previous example does choose a best-case format for XML, relying on attributes instead of elements whenever possible. To be honest, what would the actual SOAP encoding for the equivalent JavaScript data structure be?

      Now what would you think of (dropping quotes and spaces when unnecessary for parsing):

      {menu:{id:file;value:"File:";
        popup:{menuitem:({value:New;onclick:"CreateNewDoc()"}
          {value:Open;onclick:"OpenDoc()"}
          {value:Close;onclick:"CloseDoc()"})}}}

      Now what if built-in filter allowed you to generat

    • As other replies have noted, you can just eval the JSON.

      Well, you can also just use an XSLT to translate the meaningful XML into JSON, or whatever other standard your heart desires.

      So, you're both right.
  • and slow, torturing pain for developers.

    The user benefit will come from more usable dynamic web applications when this is applied well. The users will suffer when everybody decides their pages need this even when they don't. Kiss that CPU goodbye. The users will also get to suffer when they decide to use a platform that didn't rank high enough for the site's QA team to bother checking.

    When used and tested well, this can provide some awesome benefits. Hopefully, we'll see more than simple ad/news/stock tickers. Im
    • Yeah, god forbid we actually work for a living, and not whine about it.

      Seriously, this is what it means to be a real developer, solving hard problems. It's what good developers do, and what great developers do well.

      The next step will be to tie this to a backend framework for easy site building.
      • Sorry if my comment sounded like whining. I was commenting on the appropriate USE of a new idea. If web dev wasn't pure hell, I would probably do something else. It's the challenge that makes coming to work worthwhile.

        Actually, since reading the article a while back about the guy who reverse-engineered Google's thing, I've been thinking about building a library to make this two-way communication between server APIs and a dynamic page.
        This package looks like it will be very interesting and might fill the r
        • That's an interesting idea. I'm looking forward to building something that I can tie back to an EJB3 backend (POJOs) and just massage through session beans. This might make some of the more tedious aspects of EJB->Web development a lot less painful.

          Anyway, this is going to make its way into my playtime toolbox.
          • Ok, I've just downloaded their demo code and played with it a bit. I've also read most of the source code. So basically, we can register java objects, and call their methods from the client in JS. The (un)marshalling basically translates our objects between Java/text/JavaScript. It even appears to work well. :)
            I now think this is officially pretty cool.
            One mod though, since I tend to think scripting in JSP's is evil, I create and register the objects in my Struts Action class. I then store it in the session
            • I forgot to mention in my previous POST.
              Using the JSON-RPC package, you don't actually have to touch any JSON. Just plain old Java/JavaScript. The package does the conversion for you.
              In the (un)marshalling stage the text representation of the objects, is actually the JSON. But, since the package does this step for you (each way), you don't have to mess with it.
              So, now we have a new toy but, don't *really* have to learn anything new. Now, if someone wanted to do a PHP,ASP, or insert favorite scripting lang h
    • Thank God I've been doing this on my own for quite some time, and watching it work in every browser, though I've never referred to it as JSON because I'd never heard of JSON.

  • JSON, JSON-RPC, Java, JavaScript, Perl, TCL, XML, XMLHttpRequest, MSXML, DHTML :BOOM!:

    :head explodes from acronym overload:

  • When will the average programmer be able to keep up? I am sure in India they are already teaching classes on this ;-)
    • ha, but the way you eat someone's lunch, the way the Indians are doing to the rest of the world, is to get good at the crappy stuff that's out there and then stay on top of it, regardless of the pain.

      The scary part is, I happen to know (from working with Indians) they ARE teaching classes on this stuff, or at least similar stuff. I could use some classes myself. I just wrote a few pages of jscript and I am in severe browser standards pain just like I was the last time I did this, 2 years ago. There's still
  • We've written a client for the Remedy Action Request System using JS and the XMLHttpRequest object, with a Java based back end. The client is faster than we ever imagined, and is twice as fast as Remedy's own client. So if you fancy seeing some Shockwave movies of an overly complex web client, which demonstrates exactly what can be achieved with the XMLHttpRequest object, visit: http://www.symbiontsolutions.com/tour Stan
  • by Homology ( 639438 ) on Monday January 24, 2005 @10:05AM (#11454826)
    "Seen those funky remote scripting techniques employed by Orkut, Gmail and Google Suggests that avoid that oh so 80's page reloading (think IBM 3270 only slower)....."

    I hope I wasn't the only one that shuddered when I read "remote scripting techniques".

  • ... Yet another standard that can confuse just about everyone. "You have a problem on the server? Wait, that's written in JSON, we only do XML. The JSON developer is on vacation."
  • Pushlets (Score:4, Informative)

    by tezza ( 539307 ) on Monday January 24, 2005 @10:07AM (#11454855)
    http://www.pushlets.com [pushlets.com]

    This is a server side push framework based on the same idea. It preceded GMail et alia.

  • Malarky (Score:2, Flamebait)

    by sporty ( 27564 )
    This is a lot of junk except in exceptional cases.

    We have machines that are now in the gigahertz. Even a 0.5 ghz machine can process XML with some speed.

    We have machines that have tons of memory, so even an inefficient DOM parser, which loves using memory, can handle a large XML packet (use SAX for that).

    We have parsers that are a bit smarter than the slow ones, namely XML-Pull parsers and SAX.

    When implementing someone else's protocol, something that is readable is awesome, because documentation isn

    • The problem we are solving is that XML handling in JavaScript is cumbersome. JSON offers a very simple solution that provides client-side scripts with a normal JavaScript object without the need to use any DOM parser. This is the second easiest solution, right after sending HTML text blocks to replace the HTML embedded in the original page. And the responses generated by this approach are much shorter than HTML text blocks.
  • Or instead... (Score:3, Informative)

    by koehn ( 575405 ) * on Monday January 24, 2005 @10:10AM (#11454876)
    Why not just replace some of your HTML [ibm.com] instead?

    All JSON does is make it easier to have your JavaScript call in to your application and parse the results. If you're just interested in presentation, just have your JS call up, get some HTML, and replace the affected HTML. This decreases the amount of JS and increases your re-use (since you don't need to build your UI twice: once in (PHP|Java|.Net|Ruby|.*), and once in JS). You just call your (\1) code on the server from the JS and have it generate the HTML.

    I understand that sometimes there are advantages to the programmatic approach that JSON (and XML-RPC, which the browsers support) extols, but I don't think many developers even realize the UI-based alternative exists.
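    For comparison, a sketch of the "just replace the HTML" approach described above; the URL and element ID are illustrative, and the fragment is assumed to be rendered by the same server-side templates as the full page:

      // Ask the server for a ready-rendered fragment and drop it into a
      // container, so the markup isn't duplicated in JavaScript.
      function refreshCart() {
          var req = window.XMLHttpRequest ? new XMLHttpRequest()
                                          : new ActiveXObject("Microsoft.XMLHTTP");
          req.onreadystatechange = function () {
              if (req.readyState == 4 && req.status == 200) {
                  document.getElementById("cart").innerHTML = req.responseText;
              }
          };
          req.open("GET", "/cart-fragment", true);
          req.send(null);
      }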

  • XML-RPC [xmlrpc.org] did this years ago:

    What is XML-RPC?

    It's a spec and a set of implementations that allow software running on disparate operating systems, running in different environments to make procedure calls over the Internet.

    It's remote procedure calling using HTTP as the transport and XML as the encoding. XML-RPC is designed to be as simple as possible, while allowing complex data structures to be transmitted, processed and returned.

    It works quite well too. Next, please!

  • 3270s (Score:3, Informative)

    by Alrescha ( 50745 ) on Monday January 24, 2005 @10:20AM (#11454952)
    "think IBM 3270 only slower"

    Hey, 3270s were coax-connected to a channel-attached controller with a 4.5MB/sec path to the CPU. You could do video on them (if you didn't mind the fact that your pixels were the size of a tic-tac).

    A.
    (who lusts for the feel of a 3270 keyboard under his fingers)
    • by Cato ( 8296 )
      3270s were (and are) frequently connected over WANs, so it's unlikely they normally had 4.5 Mbps to the mainframe, even shared with the cluster controller, except in a few cases where the terminals were just about in the data centre.
    • Yes, mainframes are really, really good at I/O, which is a concept that many people didn't get (DEC for one, when they fell flat on their faces trying to leverage the VAX into mainframe land) and still don't get. The CDC 7600 was surrounded by 6600's to spoon feed it, just as IBM mainframes have channel controllers (real processors) separate from the CPU to do the same thing.

      However, your memory of 3270's is a lot different than mine. How about when that nifty wifty 3270 cluster controller went south, as
    • (who lusts for the feel of a 3270 keyboard under his fingers)

      Dammit, you had to bring up the best keyboard ever made. Now I've got an erection. I'm such a nerd. :)
  • Although I haven't called it JSON this is kind of exactly what I called "javascript views" in my web application toolkit [sourceforge.net]. One small part of my project uses serverside java to generate dynamic javascript objects which are a javascript representation of nested java object graphs. The syntax of the generated js object source is a bit different because I did not know that the language constructs used by JSON exist (I think I'll switch to JSON very soon).

    Online Demo showing the javascript view feature. [logotopia.net]

    Project [sourceforge.net]

  • And, no doubt, JSON etc. will work exactly the same across browsers/platforms and any combination of client/server platform. There will be no surprises. Yeah right.

    And all those crappy Javascript 'programs' are going to be magically transformed into high-quality type-safe rock-solid modules by the same people who wrote the crap JS in the first place?

    I won't hold my breath on this one...
  • by nomadic ( 141991 )
    Seen those funky remote scripting techniques employed by Orkut, Gmail and Google Suggests that avoid that oh so 80's page reloading (think IBM 3270 only slower).

    Mr. Grammar has left the building.
  • by imbaczek ( 690596 ) <(mf.atzcop) (ta) (kezcabmi)> on Monday January 24, 2005 @11:50AM (#11455913) Journal
    It looks like the technology is finally converging towards Lisp. Maybe 40 years isn't THAT much, after all...

    (If you think about it, it started quite a time ago, since xml is isomorphic to sexps.)
  • by dioscaido ( 541037 ) on Monday January 24, 2005 @12:13PM (#11456189)
    Is this different from script callbacks in ASP.NET? It allows you to hit the server on an already loaded page and selectively update its contents. While the full abstracted implementation will be available in ASP.NET 2.0, you can easily implement it in the current ASP.NET 1.x.

    http://msdn.microsoft.com/msdnmag/issues/04/08/CuttingEdge/ [microsoft.com]
  • I noticed that there was no Python binding listed. Then I looked at the examples and they looked a lot like Python dictionaries and lists so I fired up the python interpreter and fed it the following from one of the example pages:

    { "glossary": {
    "title": "example glossary",
    "GlossDiv": {
    "title": "S",
    "GlossList": [{
    "ID": "SGML",
    "SortAs": "SGML",
    "GlossTerm": "Standard Generalized Markup Language",
    "Acronym": "SGML",
    "Abbrev": "ISO 8879:1986",
    "GlossDef":
    "A meta-markup language, used to c

  • I was converted to using JSON about 2 years ago to enable my rather complex DHTML applications to go to the next level. The transition was so successful that I now use JSON at many levels, such as:

    1) Server to client hashmaps:
    JSON is great for passing hashmaps of data and for storing specifications meta data for applications.
    eg:
    ServerReport={serverFailed: true, errorMessage: "you used XML dummy"}

    2) Application meta data:
    Storing application configuration using JSON is not only efficient but very easy to ma
