The Future of XML
An anonymous reader writes "How will you use XML in years to come? The wheels of progress turn slowly, but turn they do. The outline of XML's future is becoming clear. The exact timeline is a tad uncertain, but where XML is going isn't. XML's future lies with the Web, and more specifically with Web publishing. 'Word processors, spreadsheets, games, diagramming tools, and more are all migrating into the browser. This trend will only accelerate in the coming year as local storage in Web browsers makes it increasingly possible to work offline. But XML is still firmly grounded in Web 1.0 publishing, and that's still very important.'"
Re:Why is XML so popular (Score:2, Informative)
YAML (Score:3, Informative)
If only there was a standardized format that combined these advantages, without all that XML bloat. There is! Try YAML [yaml.org].
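For a concrete sense of what the "bloat" comparison looks like, here is the same toy record written once as XML and once as YAML. The sketch below parses the XML with Python's standard library; parsing the YAML would require the third-party PyYAML package, so it is shown only as text. The record itself is invented for illustration.

```python
import xml.etree.ElementTree as ET

# The same record, once as XML and once as YAML.
xml_doc = """
<person>
  <name>Alice</name>
  <age>30</age>
</person>
"""

yaml_doc = """
person:
  name: Alice
  age: 30
"""  # would be parseable with PyYAML: yaml.safe_load(yaml_doc)

root = ET.fromstring(xml_doc.strip())
record = {child.tag: child.text for child in root}
print(record)  # {'name': 'Alice', 'age': '30'}
```

Note that both carry the same tree; the difference is purely how much markup surrounds the data.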
XML's big win is supposed to be its semantics: it tells you not only what data you have, but what sort of data it is. This allows you to create all sorts of dreamy scenarios about computers being able to understand each other and act super intelligently. In reality, it leads to massively bloated XML specifications and protracted fights over what's the best way to describe one's data, but not to any of the magic.
As my all time favorite Slashdot sig said: "XML is like violence: if it doesn't solve your problem, you aren't using enough of it."
Re:"How will you use XML in years to come?" (Score:5, Informative)
No, it really doesn't, but if "JavaScript" in the name bothers you, you might feel better with YAML.
And there are JSON and/or YAML libraries for quite a lot of them. So what?
Re:Why is XML so popular (Score:3, Informative)
Why not store it as a tree in a format computers can parse efficiently? Invent a binary format with parent and child offsets, and binary tags for the names and values. It's smaller in memory and faster. Basically better. You don't need to parse them if machines are going to read them. And decent human programmers can read them with a debugger or from a hexdump of a file, or write a tool to dump them as human-friendly ASCII during development.
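A toy version of that idea can be sketched in a few lines of Python. The layout here (a 2-byte tag id, a value length, a child count, then the value and the children inline, depth-first) is invented purely for illustration; a real format would also carry the offset table the parent suggests so a reader could jump straight to a child without walking the tree.

```python
import struct

def pack_node(tag_id, value, children=()):
    """Serialize a node depth-first: tag id, value length, child count,
    then the value bytes, then the already-packed children inline."""
    blob = struct.pack(">HHH", tag_id, len(value), len(children)) + value
    for child in children:
        blob += child
    return blob

def unpack_node(blob, pos=0):
    """Parse one node; returns ((tag_id, value, children), next_pos)."""
    tag_id, vlen, nchildren = struct.unpack_from(">HHH", blob, pos)
    pos += 6
    value = blob[pos:pos + vlen]
    pos += vlen
    children = []
    for _ in range(nchildren):
        child, pos = unpack_node(blob, pos)
        children.append(child)
    return (tag_id, value, children), pos

# <person><name>Alice</name></person>, with tags mapped to small ints
name = pack_node(2, b"Alice")
person = pack_node(1, b"", [name])
tree, _ = unpack_node(person)
print(tree)  # (1, b'', [(2, b'Alice', [])])
```

The whole two-element document is 17 bytes, and the fixed-width headers make it legible in a hexdump, which is the parent's point.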
So parsing in general is actually quite easy.
You end up doing a bunch of string operations, and those aren't quick. Most likely you drag in some library written by a Computer Science-damaged 'engineer' who doesn't understand assembler or how to read a hexdump, so it will be a lot less efficient than that.
Don't get blindsided by big stuff you can't see (Score:4, Informative)
WHATWG's HTML 5 and JSON will have no effect on these other uses. It's just that nobody in hangouts like this sees it.
For example, the entire international banking industry runs on XML Schemas. Here's one such standard: IFX. Look at a few links: http://www.csc.com/industries/banking/news/11490.shtml [csc.com] , http://www.ifxforum.org/home [ifxforum.org]
But there are other XML standards in use in banking.
The petroleum industry is a heavy user of XML. Example: the Well Information Transfer Standard Markup Language, WITSML (http://www.knowsys.com/ and others).
The list goes on and on, literally, in major, world-wide industry after industry. XML has become like SQL -- it was new, it still has plenty of stuff going on and smart people are working on it, but a new generation of programmers has graduated from high school, and reacts against it. But it's pure folly to think it's going to go away in favor of JSON or tag soup markup.
So yes, success in Facebook applications can make a few grad students drop out of school to market their "stuff," and Google can throw spitballs at Microsoft with a free spreadsheet written in JavaScript, but when you get right down to it, do you really think the banking industry, the petroleum industry, and countless others are going to roll over tomorrow and start hacking JSON?
Errrm, folks, what's the big fat hairy deal? (Score:5, Informative)
And for those of you out there who haven't yet noticed: XML sucks because data structure serialisation sucks. It always will. You can't cut open, unravel and string out an n-dimensional net of relations into a 1-dimensional string of bits and bytes without it sucking in one way or the other. It's a classic hard problem in IT, if not THE classic hard problem. Get over it. With XML we've finally agreed upon the way in which it's supposed to suck. Halle-flippin'-luja! XML is the unified successor to the late-sixties way of badly delimited literals, indifference between variables and values, and flatfile constructs of obscure standards nobody wants. And which are so arcane by today's standards that they're beyond useless (check out AICC if you don't know what I mean). Crappy PLs and config schemas from the dawn of computing.
That's all there is to XML: a universal n-to-1 serialisation standard. Nothing more and nothing less. Calm down.
And as for the headline: Of-f*cking-course it's here to stay. What do you want to change about it (much less 'enhance')? Do you want to start color-coding your data? Talking about the future of XML is almost like talking about the future of the wheel ("Scientists ask: Will it ever get any rounder?"). Give me a break. I'm glad we've got it, and I'm actually, for once, grateful to the academic IT community for doing something useful and pushing it. It's universal, can be handled by any class and style of data processing, and when things get rough it's even human-readable. What more do you want?
Now if only someone could come up with a replacement for SQL and enforce universal UTF-8 everywhere, we could finally leave the 1960s behind us and shed the last pieces of vintage computing we have to deal with on a daily basis. That's what discussions like these should actually be about.
Re:I don't understand... (Score:3, Informative)
Re:Why not S-expressions? (Score:3, Informative)
Sure, you can build a different text representation for XML as sexps. But if it represents the same thing, it doesn't much matter.
Imagine that you do so, and you can write a function P that takes xml into sexps and a function Q that takes it back. If Q(P(xml-stuff)) == xml-stuff and P(Q(sexps)) == sexps, then they both do the same thing and you can effectively use either syntax. So you use the syntax you want and convert when you need to. Of course, if either equality doesn't work, then one syntax makes it possible to express something that the other does not - and then the semantics differ. So now it becomes, not a question of using XML, but using a representation that is closer to the semantics your application needs.
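That round trip is easy to sketch for a simplified subset of XML (elements and leaf text only, attributes ignored). This is a toy mapping invented for illustration, not a full XML-to-sexp translation, but it shows P and Q composing to the identity on that subset:

```python
import xml.etree.ElementTree as ET

def P(elem):
    """XML element -> sexp-style nested tuple: (tag, text?, children...)."""
    parts = [elem.tag]
    if elem.text and elem.text.strip():
        parts.append(elem.text.strip())
    parts.extend(P(child) for child in elem)
    return tuple(parts)

def Q(sexp):
    """Nested tuple back to an XML element."""
    elem = ET.Element(sexp[0])
    for part in sexp[1:]:
        if isinstance(part, tuple):
            elem.append(Q(part))
        else:
            elem.text = part
    return elem

xml = ET.fromstring("<a><b>hi</b><c/></a>")
sexp = P(xml)
print(sexp)            # ('a', ('b', 'hi'), ('c',))
assert P(Q(sexp)) == sexp   # the round trip holds on this subset
```

The interesting cases are exactly the ones this subset leaves out (attributes, mixed content, namespaces); that is where the two syntaxes stop being trivially interchangeable and the semantics argument begins.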
And that is one of the abiding lessons of computing - trying to mash semantics into a language/representation that doesn't fit well is a pain and generally a good way to waste time and effort.
Re:I don't understand... (Score:2, Informative)
If it doesn't get bumped back to me in UAT then it's getting the job done.
Re:I don't understand... (Score:3, Informative)
Oh please. It's bad enough having this bloated standard in data files, but please don't start quadrupling the number of bits that need to be sent down a pipe to send the same amount of data just so it can be XML. XML is an extremely poor format to use for any kind of streamed data, because you have to read a large chunk of it to find suitable boundaries to process. Not good for efficiency or code simplicity. And if you say "so what" to that, then you've obviously never done real network coding where throughput is THE number one priority. I've written data links to stock exchanges and clustered systems, amongst other things, and believe me, speed is where it's at. Anyone suggesting using XML would be laughed out of the building.
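The boundary problem described above is why binary wire protocols usually length-prefix their frames: the reader learns exactly how many bytes to take before it ever touches the payload, with no scanning for delimiters or closing tags. A minimal sketch of that pattern (the 4-byte big-endian length header is a common convention, not any particular exchange's wire format, and the payloads are made up):

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix the payload with its length as a 4-byte big-endian integer."""
    return struct.pack(">I", len(payload)) + payload

def deframe(stream: bytes):
    """Split a byte stream back into payloads. Each header says exactly
    where the next message boundary is, so no bytes are scanned twice."""
    payloads, pos = [], 0
    while pos < len(stream):
        (length,) = struct.unpack_from(">I", stream, pos)
        pos += 4
        payloads.append(stream[pos:pos + length])
        pos += length
    return payloads

stream = frame(b"ORDER:buy 100") + frame(b"ORDER:sell 50")
print(deframe(stream))  # [b'ORDER:buy 100', b'ORDER:sell 50']
```

With XML on the same pipe, the receiver would instead have to parse forward through the text until it finds a matching close tag before it could hand off even one message.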
"People have been trying to shove everything into an HTML pipe"
I think that about says it all. God help us if people like you ever get into a decision-making position. Oh, wait...
Do yourself a favour, pal: go learn about network systems programming from the ground up. Start with Ethernet frames and work up from there to IP/TCP, sockets, memory-mapped packets, etc. Then you may just get a clue.