
Varnish Author Suggests SPDY Should Be Viewed As a Prototype

An anonymous reader writes "The author of Varnish, Poul-Henning Kamp, has written an interesting critique of SPDY and the other draft protocols trying to become HTTP 2.0. He suggests none of the candidates make the cut. Quoting: 'Overall, I find the design approach taken in SPDY deeply flawed. For instance identifying the standardized HTTP headers, by a 4-byte length and textual name, and then applying a deflate compressor to save bandwidth is totally at odds with the job of HTTP routers which need to quickly extract the Host: header in order to route the traffic, preferably without committing extensive resources to each request. ... It is still unclear for me if or how SPDY can be used on TCP port 80 or if it will need a WKS allocation of its own, which would open a ton of issues with firewalling, filtering and proxying during deployment. (This is one of the things which makes it hard to avoid the feeling that SPDY really wants to do away with all the "middle-men") With my security-analyst hat on, I see a lot of DoS potential in the SPDY protocol, many ways in which the client can make the server expend resources, and foresee a lot of complexity in implementing the server side to mitigate and deflect malicious traffic.'"
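Kamp's routing objection can be illustrated with a small sketch (Python, hypothetical data; the SPDY header-block layout below is deliberately simplified from the draft's length-prefixed name/value format): a plaintext HTTP/1.1 request lets a router find `Host:` with a cheap byte scan, while a deflate-compressed block forces the router to allocate a decompression context and inflate everything before it can route.

```python
import struct
import zlib

# Plaintext HTTP/1.1: a router finds Host: with a cheap scan,
# committing almost no per-request resources.
plain = b"GET / HTTP/1.1\r\nHost: example.com\r\nAccept: */*\r\n\r\n"

def host_from_plaintext(request: bytes) -> bytes:
    for line in request.split(b"\r\n"):
        if line.lower().startswith(b"host:"):
            return line.split(b":", 1)[1].strip()
    return b""

# Simplified SPDY-style block: 4-byte length + name, 4-byte length
# + value, with the whole block run through deflate.
def spdy_block(headers: dict) -> bytes:
    raw = b""
    for name, value in headers.items():
        raw += struct.pack(">I", len(name)) + name
        raw += struct.pack(">I", len(value)) + value
    return zlib.compress(raw)

def host_from_spdy(block: bytes) -> bytes:
    # The router must inflate the *entire* header block before it
    # can even look for the host header -- Kamp's complaint.
    raw = zlib.decompress(block)
    i = 0
    while i < len(raw):
        (nlen,) = struct.unpack_from(">I", raw, i); i += 4
        name = raw[i:i + nlen]; i += nlen
        (vlen,) = struct.unpack_from(">I", raw, i); i += 4
        value = raw[i:i + vlen]; i += vlen
        if name == b"host":
            return value
    return b""

block = spdy_block({b"host": b"example.com", b"accept": b"*/*"})
print(host_from_plaintext(plain))  # b'example.com'
print(host_from_spdy(block))       # b'example.com'
```

The per-request decompression state is also where the DoS potential he mentions comes from: each malicious client request costs the server real memory and CPU before any routing decision is made.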
  • by Skapare ( 16644 ) on Friday July 13, 2012 @09:53AM (#40638383) Homepage

    If you substitute JSON (or something like it with equal or better simplicity) for XML, then I might go along with it.

  • by spike2131 ( 468840 ) on Friday July 13, 2012 @10:11AM (#40638549) Homepage

    I love JSON, but XML has the advantage of being something you can validate against a defined schema.

  • Delenda est. (Score:3, Insightful)

    by Anonymous Coward on Friday July 13, 2012 @10:13AM (#40638569)

    Then it cannot replace HTTP and should be withdrawn, or it's been wrongfully sorted in under "HTTP/2.0 Proposals [ietf.org]"

    The IETF HTTPbis Working Group has been chartered to consider new work around HTTP; specifically, a new wire-level protocol for the semantics of HTTP (i.e., what will become HTTP/2.0), and new HTTP authentication schemes.

  • by Short Circuit ( 52384 ) <mikemol@gmail.com> on Friday July 13, 2012 @10:24AM (#40638681) Homepage Journal

    By the time an HTTP 2.0 replacement is standardized, XP will be fully out of support. I get flamed whenever I say this, but it will be time to let XP die. I'm considering replacing my grandmother's box with an ASUS Transformer, as that'll handle all of her needs. (*And* the rest of my family won't say 'we don't know how to reboot the router because we don't know how to use the Linux netbook you set her up with.') Quickbooks runs on Vista and Win7. Tools and other things which require Windows XP are becoming scarcer, and workarounds and alternatives are becoming cheaper.

    Eventually, XP will be like that DOS box that sits in some shops...used only for some specific, very limited purposes. Any shop cheaping out and still using it in lab environments (such as call centers) can work around it by installing a global self-signed cert and using a proxy server to rewrap SSL and TLS connections. Yes, this is bad behavior. So is continuing to use XP. At some point, the rest of the Internet needs to move on.

  • by Viol8 ( 599362 ) on Friday July 13, 2012 @10:33AM (#40638777) Homepage

    As a static data format it's just about passable, but as a low overhead network protocol??

    Wtf have you been smoking??

  • by jimmifett ( 2434568 ) on Friday July 13, 2012 @11:00AM (#40639055)

    Ideally, you give the schema to the other side and they can validate the message before sending it to you, catching possible errors there. You validate against the same schema on your side as a safety net to weed out junk data and messages from users that don't validate. It also allows you to enforce types and limitations on values in a consistent manner.

    JSON is good for quick and dirty communications when you are both the sender and the consumer of messages and can be lazy and not care too much about junk data.

    Both have their uses, but you have to know when to use which.
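The both-sides-validate idea above can be sketched without XML machinery: here a tiny hand-rolled schema (hypothetical; a real deployment would use XSD for XML or JSON Schema for JSON) is applied to parsed JSON, with sender and receiver running the same check.

```python
import json

# A tiny hand-rolled schema: field name -> (required type, validator).
# Hypothetical stand-in for what XSD gives XML (or JSON Schema gives JSON).
SCHEMA = {
    "user": (str, lambda v: len(v) > 0),
    "age": (int, lambda v: 0 <= v <= 150),
}

def validate(message: str) -> bool:
    """Both sender and receiver run this against the same schema,
    catching junk data before (and after) it crosses the wire."""
    try:
        data = json.loads(message)
    except ValueError:
        return False
    if set(data) != set(SCHEMA):
        return False
    return all(
        isinstance(data[k], t) and check(data[k])
        for k, (t, check) in SCHEMA.items()
    )

print(validate('{"user": "alice", "age": 30}'))   # True
print(validate('{"user": "", "age": "thirty"}'))  # False
```

The point of a standardized schema language over an ad hoc check like this is exactly the consistency jimmifett describes: both endpoints enforce identical types and limits from one shared document.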

  • by luis_a_espinal ( 1810296 ) on Friday July 13, 2012 @11:01AM (#40639079)

    I'd suggest basing it on XML with a header section and header-element to get the transfer started, then accept any kind of structured data including additional header elements.

    Haven't we learned enough already from industrial pain to stay away from XML? JSON, BSON, YAML, compact RELAX NG, ASN.1, extended Backus-Naur Form. Any one of them, or something inspired by any (or all) of them, that is compact, unambiguous (there should be only one canonical form to encode a type), not necessarily readable, possibly binary, but efficiently easy to dump into an equally compact readable form. Compact and easy to parse/encode, with the lowest overhead possible. That's what one should look for.

    But XML, no, no, no, for Christ's sake, no. XML was cool when we didn't know any better and we wanted to express everything as a document... oh, and the more verbose and readable, the better!!(10+1). We really didn't think it through that much back then. Let's not commit the same folly again, please.
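The criteria above (compact, one canonical wire form per value, trivially dumpable to a readable form) can be sketched with a minimal length-prefixed binary encoding. This is hypothetical, loosely ASN.1/BSON-inspired, not any proposed HTTP/2.0 format:

```python
import struct

# Minimal canonical binary encoding (hypothetical): one tag byte per
# type, fixed-width big-endian lengths, so every value has exactly one
# wire form -- the "unambiguous" property asked for above.
TAG_INT, TAG_STR = 0x01, 0x02

def encode(value) -> bytes:
    if isinstance(value, bool):
        raise TypeError("unsupported type")
    if isinstance(value, int):
        return bytes([TAG_INT]) + struct.pack(">q", value)
    if isinstance(value, str):
        raw = value.encode("utf-8")
        return bytes([TAG_STR]) + struct.pack(">I", len(raw)) + raw
    raise TypeError("unsupported type")

def decode(buf: bytes):
    tag = buf[0]
    if tag == TAG_INT:
        return struct.unpack(">q", buf[1:9])[0]
    if tag == TAG_STR:
        (n,) = struct.unpack(">I", buf[1:5])
        return buf[5:5 + n].decode("utf-8")
    raise ValueError("bad tag")

# Canonical: equal values always produce identical bytes, and the
# binary form dumps trivially back to a readable one.
assert encode(42) == encode(42)
print(decode(encode("Host: example.com")))  # Host: example.com
```

Contrast with XML, where whitespace, attribute ordering, and entity escaping give the same logical value many legal serializations.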

  • by Anonymous Coward on Friday July 13, 2012 @02:14PM (#40641059)

    Conspiracy minded folks would think that SPDY is mainly about Google being able to ensure that advertisements are served before the content. Putting it inside of SSL also ensures that any intermediate carriers won't be stripping Google's adverts.

    It also improves users' privacy by preventing personal content from being read by ISPs, proxies, and other men-in-the-middle. If any other web site turned on SSL, we would thank them for choosing to improve users' privacy. But this is Google, so it must be a bad thing.

    Google turned on SSL for search a month before they launched personalized search, where the search results can include things only the logged-in user has permission to see (if the user logs in and enables it). If they had not enabled SSL, people would (rightly) be upset that any man in the middle could see photos, documents, and G+ posts shared only with you.

    If you punish companies for doing the right thing, expect them to stop. Every company has people for and against any idea. When you punish good behavior, the people who fight for it will not win the argument next time.
