Security

XML Web Services & Security 118

Handy writes "Web Services (SOAP, .NET, WSDL, UDDI) create an even greater need for robust security. Exposed interfaces and fragmented administration, coupled with a need for app-level security, point to a greater need for a centralized managed security services model."
This discussion has been archived. No new comments can be posted.

  • by newt_sd ( 443682 ) on Wednesday May 15, 2002 @04:36PM (#3525686) Homepage
    Not only is this article not saying a single new thing about web application security, the site at the end of the link only has 4 articles on it. This smells of advertising for a new site? Now I am not one to wear a tinfoil hat but I smell a conspiracy going on with news that isn't really news!!
    • by Target Drone ( 546651 ) on Wednesday May 15, 2002 @04:57PM (#3525834)
      This smells of advertising for a new site?
You may be on to something. I tried doing a search on Google for "Westbridge Technology" (the people who wrote the article) to find out more about them. I only got 2 hits, plus a sponsored link to the Westbridge Technology home page [westbridgetech.com]. Westbridge Technology must be very new for the page to not show up in Google yet.

      A whois search also reveals that xwss.org [netsol.com] and westbridgetech.com [netsol.com] belong to the same people.

      And to top it all off Westbridge sells an XML message server. Just what you need to implement all the good stuff talked about in the article.

  • Duh... (Score:3, Interesting)

    by stoolpigeon ( 454276 ) <bittercode@gmail> on Wednesday May 15, 2002 @04:38PM (#3525699) Homepage Journal
This was an interesting read and I'm sure it is good info for tech managers - maybe if we keep hammering at them they will get it. But if you write code and you realize that we are connecting systems deeper and deeper, security becomes more and more of an issue. That seems to be a bit of a no-brainer.

And with all this talk of "the network is the computer", the future of tech and all this stuff - security is the linchpin to making it viable.

    I think stability runs a very close second- especially as more critical systems become a part of this big electronic gestalt everyone dreams of- but if it is insecure, I know I wouldn't touch it w/a 10 ft. pole.

  • by floppy ears ( 470810 ) on Wednesday May 15, 2002 @04:42PM (#3525726) Homepage
    The drive to get business advantage from XML Web Services will cause turbulent times for IT managers. To successfully navigate these new issues, managers must change their mind set from "fragmented security systems focused on using network perimeter to shield closed business systems" to "consistent managed security systems focused on managing application level security for inherently distributed business systems".

    This article was written by Kerry Champion, president and Andy Yang, Senior Director of Product Management at Westbridge Technology, Inc., a provider of security and reliability infrastructure software for XML Web Services networks.

    I'm not saying I disagree with their conclusion, but you always have to be suspicious when somebody comes out with an article that concludes that to be successful you have to use their product/service or something like it.
    • He's not saying you have to use HIS company to be successful, he's saying if you want to be successful when you create a Web Service, you should consider the security of the Web Service you are creating.
    • I'll bet you a donut that they didn't even write the article. This kind of placement piece is fairly standard PR. I got a byline on an 'Advanced Imaging' article once. I wrote nary a word of it.
    • Hmmm...

      "consistent managed security systems focused on managing application level security for inherently distributed business systems"

Am I the only one here who suspects that a solution that takes so many words, or should I say buzzwords, to explain is a good metric of the usefulness of the article? Pardon me, but it sounds like marketing speak rather than technical, knowledgeable advice...

      --
      Arkan - sorry for the english, next time I'll be born elsewhere, I promise
  • Confusing (Score:2, Interesting)

    by Dick Click ( 166230 )
Good luck to anyone actually trying to implement a secure SOAP-based app, what with the moving targets of XML Encryption, different ways to use XML Signatures, and the need to incorporate WS-Routing (and possibly WS-Security). I know these specs are likely to change soon.
I was just looking into XML-RPC and SOAP the other day, and for the most part Transport and even Serialization are separate components - fully replaceable...

To build something that interoperates well, you don't need to use things that are 100% standard. Especially in a component world. Worst-case scenario, a new transport protocol needs implementing in a different language - for the most part that should be very simple.
  • "Paradigm" shift? (Score:3, Interesting)

    by sisukapalli1 ( 471175 ) on Wednesday May 15, 2002 @04:47PM (#3525762)

    The drive to get business advantage from XML Web Services will cause turbulent times for IT managers. To successfully navigate these new issues, managers must change their mind set from "fragmented security systems focused on using network perimeter to shield closed business systems" to "consistent managed security systems focused on managing application level security for inherently distributed business systems".

Hmm... I know of a manager (very high up) who, when asked about the security implications of some assumptions in the design of a product (for web services), very confidently responded, "They [customers] can always configure their firewall". *That* was the solution!

    S

  • This article was written by Kerry Champion, president and Andy Yang ...


A la... The presentation was made by Jerry Yang, Chairman and Chief Yahoo, and XYZ, VP and Junior Yahoo...

    S
  • an important issue (Score:4, Insightful)

    by tps12 ( 105590 ) on Wednesday May 15, 2002 @04:51PM (#3525787) Homepage Journal
    I can't stress security enough. Too often we see the methodology of "write first, secure second."

    No no no no. I'm sorry, that just won't cut it in today's world of scam artists [ebay.com]. We need to be building in security on the server side from the ground up.

    I am loath to resort to buzzwords, but "proactive" really describes just how I feel.

    At my company we have met this challenge head-on by deploying a full server force of Mandrake Linux coupled with Apache 2. Apache 2 picks up where the original left off, with the added features of clones referring to Stormtroopers (as opposed to the original modular system). I find that our server compromises have decreased ~70% since making the switch from an IIS server farm.

    I have also heard good things about BSD in regards to security and web apps. Great to see this finally getting the press it deserves.
  • SOAP Security Issues (Score:5, Informative)

    by smallpaul ( 65919 ) <paul@@@prescod...net> on Wednesday May 15, 2002 @04:54PM (#3525808)
Here is my take [prescod.net]. And here is Bruce Schneier's [counterpane.com].
  • A problem with the new Web Services paradigm is that there is no place for a proprietary protocol anymore. We used to have proprietary encoding schemes over closed transports (IPX). Now we have XML over HTTP over IP, all of which are public standards. Systems that were relying only on obscurity for their security are now fully exposed because data is transmitted "in the clear".

That, along with the multiplication of software layers (Browser -> Plugin -> Applet -> TCP -> Server -> Servlet -> AppServer -> 10 other middle layers) makes for very complicated systems with slower performance and bigger security holes. All this for no good reason other than getting through firewalls by riding over HTTP.

I've yet to have someone explain to me the true advantages of Web Services. To me they are the biggest fad we've seen in corporate computing in recent years. Everybody's doing it, so it must be good.
    • They have their flaws, but any standard way to connect systems in a platform-agnostic way without worrying about the n firewalls that may or may not block the way between them is bound to get some support from developers.
Speed is not always critical, and a lot of times you don't know shit about your users' infrastructure. In those cases web services help a lot.
Besides, the typical business case for a web service is two servers talking to each other (HTTP, FTP, SMTP or another protocol). So there is usually no human client involved. (Of course there might be a browser at the other end, but if you build a web service just for that you might as well just have an ordinary dynamic web page.)

      Seems this is the way it's gonna be, whatever we think about it. With the support of MS and Sun and an ever increasing mindshare Web Services will be hard to avoid soon...

There are problems as well; we might be creating a really big pile of shit that is bound to hit the fan in a few years.
      Time will tell, there really is no way to tell yet.

Let's just hope this won't be a new MS Outlook...

Anyway, your bashing of open protocols makes me think you just might be a troll. So I guess I've just been trolled.
damn!
    • While I tend to agree with your conclusion that actual Web Services (in their current formulation) are mostly a fad, I disagree with your suggestion that standardization yields weak security.

Security by obscurity can only secure the trivial. Lots of eyeballs checking and auditing the code, beating on it ruthlessly, does work. (That's a slight misquote of Linus's Law from "The Cathedral and the Bazaar".) Witness Linux or *BSD.

      Message-driven, discovery-based APIs for applications will be a good thing. For one thing, the actual APIs would fall more in line with O-O practice, where you classify objects by the messages they respond to (and not, instead, by the attributes they have). For another, you'd have a framework for service design-by-contract. But port 80 HTTP requests seem to be a lousy way to do this.

      • But port 80 HTTP requests seem to be a lousy way to do this.

        So do it on another protocol.
        Why is it everyone thinks web services are only meant to be HTTP?

        SOAP is not dependent on any protocol except maybe ASCII.

        Send it by SMTP, FTP or whatever you like. It's just an XML document following a certain schema agreed upon by all involved. This means that any way you can transmit plain text you can run a "web service".

        I blame MS for this fixation on HTTP.
...Though I have actually read a paper by some MS developer complaining about how HTTP is not well suited for all SOAP requests (being a stateless protocol, some things are hard to do...).
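The claim above, that a SOAP message is just an XML document any text-capable transport can carry, can be made concrete. In this sketch the method name, namespace, and value are all invented for illustration; the envelope is built as plain text and handled by a stock XML parser, with nothing tied to HTTP:

```python
import xml.etree.ElementTree as ET

# Hypothetical request: the method and namespace below are made up.
# Any transport that can move this text (HTTP, SMTP, a file drop)
# can carry the "web service" call.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="urn:example:stocks">
      <symbol>MSFT</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

# It parses as ordinary XML; the Body's first child is the method call.
root = ET.fromstring(envelope)
body = root.find("{http://schemas.xmlsoap.org/soap/envelope/}Body")
print(body[0].tag)  # e.g. "{urn:example:stocks}getQuote"
```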
  • For a lot of companies HTTP is an exposed interface (for others it's disconnected from core data). This sounds like the same old "security on the internet" thing we've been hearing for years now.

If you can't be part of the solution, there's plenty of money to be made prolonging the problem. NB
    • For a lot of companies HTTP is an exposed interface (for others it's disconnected from core data). This sounds like the same old "security on the internet" thing we've been hearing for years now.

      Ehmm... Not really the same thing.
      The good 'ol www is a machine <--> human interface.
      Web services on the other hand is primarily a machine <--> machine interface.

      It has some uses since HTML and the www is not well suited for automated reading. (If you ever wrote a screen-scrape program and tried to keep it working for more than a few months you'd agree with me...)

Problem is, what you are really doing is exposing an API for server-to-server communication to the general public.
Web services make this an easy task, and chances are people will forget to secure their apps.
      Any hole will be a remote hole...

      Scary thought, huh?
      Might be the next outlook.
      ...or not
  • Web services require two things that the Internet is not notorious for:

    1) Security
    2) Reliability

    While the Internet works most of the time, is "most of the time" acceptable for web services? Some routing screw up happens and suddenly your ability to charge your customers is hosed.
  • by kbonin ( 58917 ) on Wednesday May 15, 2002 @05:01PM (#3525852)
The only real problem with XML/SOAP-based web-service security is patents. There is a rapidly growing set of patents being issued that cover, or will attempt to cover, every possible variation of method needed to transport XML and related datatypes securely.

    This includes a broad [uspto.gov] patent on form signing which appears to cover most forms of hierarchical documents, such as XML.

  • It hangs on your handlebars!
  • by irritating environme ( 529534 ) on Wednesday May 15, 2002 @05:08PM (#3525891)
I fail to see why SOAP exists except to bypass firewalls, since firewalls exist to restrict what calls/ports/protocols can be made in TCP/IP. What will happen in two years will be a "firewall" system for SOAP calls, followed two years later by a new protocol to bypass that security layer, billed under an exciting acronym. Repeat ad infinitum.
SOAP is also much lighter and easier to use than most of the other solutions out there (e.g., CORBA, XML-RPC).

Yes, SOAP definitely tunnels through firewalls, but this is for a good reason - you don't need to open up any extra holes in your firewall. With every other method you need to poke holes in odd ports, and often that's not an option, or in the end it's less secure than letting SOAP ride over HTTP.

      • Comment removed based on user account deletion
      • SOAP is also much lighter and easier to use than most of the other solutions out there (e.i. CORBA, XML RPC).

I have often seen this claim of the latest hyped-up fad being "lighter" than a more mature existing technology. As usual, the lightness is not in the technology, but in the level of thought needed to take the first step. It certainly takes effort to use CORBA, but it is being used for real-time control systems in a way that "lightweight" SOAP cannot hope to match. If the real-world demands that have driven the CORBA specification to where it is now are applied to SOAP, it will end up heavier and harder to use, but I expect it to be displaced by the next hyped-up silver bullet with embedded philosopher's stone and free elixir of life before that happens. XML-RPC is really pre-hype SOAP, so not an interesting comparison. DCE would be more interesting: it had its problems, but also a security model that its successors have struggled to match.

        Yes, SOAP definitely tunnels through firewalls, but this is for a good reason - you don't need to open up any extra holes in your firewall. W/ every other method you need to poke holes in odd ports, and often that's not an option or in the end less secure than letting SOAP take HTTP.

This is exactly what is wrong with the whole SOAP approach. Poking odd holes in your firewall is just the wrong thing to do. Assuming that the mapping between ports and protocols is anything more than a convenience is a sign of just not understanding that attackers are not limited by what you intended. Failure to distinguish the roles of client-side and server-side firewalls, as the SOAP community seems to do, is also a sign of just not having the right attitude to security.

        Setting your objective as getting traffic through that irritating firewall lines you up nicely with all the people trying to break in to your system. This cannot be a good way to start if you want any useful security.

SOAP exists (at least partly) as a language-independent mechanism for performing remote procedure calls - allowing (for instance) a Visual Basic client to talk to Java middleware without having to go through some kind of custom-written munging layer in the middle.

      There are other methods of doing this (e.g., XML-RPC), but SOAP was designed as an attempt to standardise the language (or at least the alphabet) for this kind of communication.

      Most people seem to assume that SOAP (and Web Services in general) are all about exposing internal company functions to the outside world. The name Web Service is quite misleading, really, as the mechanisms that are used are extremely useful inside corporate networks where many different client applications need to talk to many different servers, and the use has absolutely nothing to do with firewall tunnelling.
  • Yes, that's right children, resist it all you can. Run away!
That's right, you don't want to care about SOAP/WS-Security and all of that nonsense. You just stick with PHP and Perl and you'll be fine. Let me worry about XML Encryption Standards and WS-Security and the demon known as .NET.

    "Shop Smart, Shop S-Mart!"
  • by MaxwellStreet ( 148915 ) on Wednesday May 15, 2002 @05:18PM (#3525937)

    I really don't know (flame gently if I'm being ignorant), but I'm hoping someone can explain this simply.

    If https is secure... and xml/soap is http-based... what's the giant technical leap preventing https transmission of soap/xml packets?

    Also, if you're doing business with say, a vendor of yours, what's stopping the both of you from encrypting the body of the soap messages on both sides by means of a PGP key or something?

    I'm just curious as to why the issue seems to be reasonably solved with http web traffic, but isn't with SOAP...
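For what it's worth, nothing stops you from sending SOAP over HTTPS; the envelope is just the POST body. The sketch below builds such a request (the service host, SOAPAction value, and method element are hypothetical); the caveat, discussed in the replies, is that HTTPS only protects each transport hop, not the message end to end:

```python
import http.client

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(method_xml):
    """Wrap a payload in a SOAP 1.1 envelope and return (headers, body).

    The SOAPAction value and payload are placeholders; a real service
    defines its own.
    """
    envelope = (
        '<?xml version="1.0"?>'
        f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
        f"<soap:Body>{method_xml}</soap:Body>"
        "</soap:Envelope>"
    )
    headers = {
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": '"urn:example:getBalance"',
    }
    return headers, envelope.encode("utf-8")

headers, body = build_soap_request("<getBalance/>")

# Sending it over HTTPS is the same as any other HTTPS POST -- the
# transport layer encrypts the whole message in transit (hop by hop):
# conn = http.client.HTTPSConnection("service.example.com")
# conn.request("POST", "/soap", body=body, headers=headers)
```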
    • I'll help you out a bit..
From the horse's ass:
Encrypting SOAP Messages - Rob Howard, Microsoft Corporation, September 27, 2001
SOAP over HTTPS is secure. The entire HTTP message, including both the headers and the body of the HTTP message, is encrypted using public asymmetric encryption algorithms. However, this type of encryption places a dependency on the transport protocol: HTTP, whereas SOAP is a transport-independent protocol. A SOAP message can be sent using HTTP, SMTP, TCP, UDP, and so on. A SOAP message can also be routed, meaning a server can receive a message, determine that it is not the final destination, and route the message elsewhere. During the routing, different transport protocols can be used, for example, HTTP -> UDP -> SMTP -> HTTP. Therefore a dependency on the transport protocol for security has an intrinsic problem: how much trust can be placed on the routing servers, as the routing server must be able to decrypt the message to read it, and then re-encrypt for another transport protocol? Other questions that arise are: Does the transport protocol support secure communication? What is the cost of encrypting all the data versus part of the data?
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnaspnet/html/asp09272001.asp?frame=true
      "Good. Bad. I'm the guy with the gun."
      • Thanks for your informative response - I really appreciate it.

        So the answer is (if I'm reading this right) . . .

        You -CAN- make http-based SOAP services secure.

        They won't be routable, and there might be transmission problems if they're going through certain messaging servers. And there's processing overhead involved in using https encryption that could make it sub-optimal for larger messages.

        But then, if you restrict it to http, it's not really fully SOAP compliant, I suppose.

        It seems to me that, even with these caveats, it's no less secure than doing business over an encrypted http connection or secured CGI stuff.

        Am I still wrong?
I stopped asking why 7 months ago when:

My Boss: "The company is aligning itself with M$ and .NET."
Silly Me: "Why? That's just silly talk."
My Boss: "It's the only way we'll get the M$/(NDA) deal."
Silly Me: "Why? That's just silly talk."
My Boss: "Learn .NET or I'll find someone who will."
Silly Me: "OK."

          1 month ago..
          My Boss: "Everything has to be done via WebServices and SOAP but you can't rely on SSL for encryption."
          Silly Me: "Why...OK"

I have spent the last 14 hours trying to read an XML file into memory, sign it, encrypt it and stick it in a SOAP envelope using C#, then decrypt the message and check the signature. I'm halfway there...
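The sign-and-verify flow being wrestled with above can be sketched with a shared-secret HMAC standing in for the real XML Signature machinery (which canonicalizes the XML first and is far more involved); the key and payload here are made up:

```python
import hashlib
import hmac

# Shared-secret signing as a simplified stand-in for XML-DSig: this
# only sketches the sign -> send -> verify flow, not the real spec.
SECRET = b"pre-shared key agreed out of band"  # illustrative key

def sign(payload):
    """Return a hex MAC over the message payload."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload, signature):
    """Constant-time check that the payload matches its signature."""
    return hmac.compare_digest(sign(payload), signature)

payload = b"<getBalance><account>12345</account></getBalance>"
sig = sign(payload)

assert verify(payload, sig)                     # intact message passes
assert not verify(payload + b" tampered", sig)  # any change is caught
```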
      • SSL secures the transmission from prying eyes. It does not prevent someone from using your service in a manner you did not intend.

        Security issues with Web services go way beyond the fact that it's transmitted in the clear.

        All of the issues that have been dealt with in Web forms will reappear. Field type/length validation needs to occur, authentication, etc...

        You can't just turn on SSL and expect it to solve all your security problems. Utilizing SSL brings up yet another problem - it disrupts security processes in place. IDS and anti-virus mechanisms at the edge of the network can't decrypt SSL traffic and therefore ignore it.

Unless you are using client certificates - which may be more feasible with B2B as opposed to B2C - SSL isn't going to solve the underlying problem, which is verification and validation of a sometimes complex dynamic document that may contain data that is dangerous.
        • All of the issues that have been dealt with in Web forms will reappear. Field type/length validation needs to occur, authentication, etc...


Your server-side system should be checking field type/length whatever source the data has come from. Relying on client-side forms to guarantee the length/type of a field leads to buffer overruns - what happens if some hacker just types the URL/params into the address bar of their browser?

          And as for authentication, if you are broadcasting your web service using http or https, then the web server's authentication kicks in, in exactly the same way as if you were going via a browser. The request needs to pass the authentication details in the http header, just like a browser does.
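The server-side checking described above can be as simple as a table of per-field patterns and length limits, applied no matter where the request came from - a browser form, a SOAP client, or someone typing a URL by hand. The field names and limits below are illustrative:

```python
import re

# Per-field rules: (pattern, max length). Names and limits are made up.
RULES = {
    "username": (re.compile(r"^[A-Za-z0-9_]{1,32}$"), 32),
    "amount": (re.compile(r"^\d{1,10}$"), 10),
}

def validate(fields):
    """Return the names of fields that fail type/length checks."""
    errors = []
    for name, value in fields.items():
        rule = RULES.get(name)
        if rule is None:
            errors.append(name)  # unexpected field: reject it
            continue
        pattern, max_len = rule
        if len(value) > max_len or not pattern.match(value):
            errors.append(name)
    return errors

assert validate({"username": "alice", "amount": "100"}) == []
assert validate({"username": "x" * 64}) == ["username"]
```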

Your server-side system should be, but probably isn't, doing any better a job than current Web-based forms.

            How many forms use a GET instead of a POST? How many check only with client-side scripting (VB/JavaScript) and assume all is well?

            Granted, that type of authentication will provide some measure of security, but most admins don't like doing that and neither do most developers. Think about that - how many sites do you visit that require Web server initiated authentication?

            Very few. Why? Because it's a drain on the admins. You'd have to hire a single person just to update the damn server authentication! Most systems use a database driven method and combine it with user self-service for registration/authentication.

The same goes for Web services - and probably more so because they're directed primarily at B2B right now. There needs to be integrated billing, audit trails, etc... basic authentication isn't going to cut it, and companies aren't going to buy "SSL solves everything" if they're smart.

Your server-side system should be, but probably isn't, doing any better a job than current Web-based forms.

It's not just a case of doing a better job than the web-based form. It's about checking the input in case someone has circumvented the form. This kind of server-side checking needs to be done whether you are working on a web site or not. I insist that my developers have this kind of checking in their server code, and have reviews of the stuff that they develop partly to check for this kind of thing. Never trust the client application any more than you absolutely have to.

Most systems use a database driven method and combine it with user self-service for registration/authentication.

Most professional web/application servers (e.g., WebLogic) provide pluggable security, where the server intercepts all calls and allows/prevents access to its resources (such as web pages), but delegates calls to an external module to actually do the checking of usernames/passwords/roles etc. These external modules can be linked into whatever security system you feel like using - lookup in database, LDAP, Kerberos token etc. Usually there is no developer involvement in setting this up. You purchase the plug-in that you require to your security system of choice.
    • I still don't get it, but there is one more thing that comes to my attention. Web Services could be used among different sites, potentially owned by different people. Perhaps the problem here is cross-site authentication, something that could be solved the way Kerberos works.
    • You're talking about the wrong kind of security. Think "bugs" instead of "encryption".

      The issue is that SOAP exposes backend functionality to the end-user (or end-developer).

      As an example, consider the server that offers the web service that lets you look up your bank balance. Compare it to a CGI program that does the same thing. They have almost the exact same security issues: privacy of the data (solved by https) and authentication (making sure you are who you say you are). Both the web service and the CGI are equally dangerous by themselves.

      The difference is that SOAP and other web services add the extra abstraction layer that offers one more place for the developer(s) to screw up. And it makes it one step harder for a sysadmin or developer to find problems.

As the recent SOAP::Lite vulnerability points out, this is a non-negligible risk.
      • I'm not sure what you're trying to say here. Are you saying that by creating components with clearly defined interfaces you are more likely to have errors than by building one big monolithic application?

Or are you claiming that it would be better for an external client application wishing to use the functionality provided to use screen-scraping from the CGI output?
I'm not saying that one is better. I'm saying that one has more layers and more encapsulation involved. This can be a good thing if written correctly, but as many have pointed out in comments and links, there have been problems with the services implementations at times.

          I agree that screen scraping is a pain for the end developer, but I posit that it is easier to audit a script using CGI.pm than one that uses SOAP::Lite.pm, given a not-very-large CGI application.
          • That's all well and good for creating the web client, but what about trying to integrate the service being offered into an external application?

For instance, I worked on a system a couple of years ago where we developed a sales/accounting system, and for one part we needed to go to an external vendor's web site and type in their details. Initially, this was done manually. We then automated it using screen scraping. This meant ages spent writing the screen-scraping code, and then altering it every time they redesigned the page. If this had been a web service, it would have simply been a case of calling it as a component within our overall system.

Web Services aren't designed with the intention of making it easier to build user-accessible web pages. They are designed to make it easier to call the business functionality of an external system from an application.
            • You're missing the point. This has absolutely nothing to do with whether or not web services are easier on the end developer. Clearly they are. The issue is SECURITY.
              • The post that I was responding to was the one with the statements

                You're talking about the wrong kind of security. Think "bugs" instead of "encryption".

                and

The difference is that SOAP and other web services add the extra abstraction layer that offers one more place for the developer(s) to screw up. And it makes it one step harder for a sysadmin or developer to find problems.

This was, as I understand it, claiming that additional security risks were being introduced because of the extra complexity of using web services. I don't get how a site offering a cgi-generated web page is in some way more secure than one offering the same thing via web services.

The statement The issue is that SOAP exposes backend functionality to the end-user (or end-developer). is disingenuous. All (useful) web sites offer backend functionality to the end user - what else are they there for? I'm currently making use of Slashdot's backend database by posting this message. A SOAP service does nothing more than offer the same service via a mechanism that is easy for another application, rather than another user, to talk to. Assuming that the interface is properly designed, it should expose precisely no more functionality, and create precisely no more security issues, than the equivalent cgi-generated page. What extra security holes would there be if Slashdot offered a "post your comments" web service?
                • The extra security holes are in the SOAP (or whatever web service protocol) layer. Yes, you are right that I'm doing a bit of apple/orange comparison when contrasting SOAP to plain CGI. But the point is that (at least in Perl) CGI is pretty safe if you know what you are doing, but recently there was a gaping hole in SOAP::Lite and XMLRPC::Lite. If Slashdot offered a post web service, they would perhaps have had their whole database hacked, since SOAP::Lite was allowing arbitrary remote function calls (if I have not misunderstood the bug report).
                  • Aha. You're talking specifically about SOAP::Lite. That's a particular perl implementation of a form of SOAP. That particular bug is nothing to do with SOAP itself, just the way that this person has implemented it.

                    Using a normal SOAP message, on a properly secured web server, there is no way that you can call arbitrary methods of the server any more than you can call arbitrary methods of the server code behind an html web site.

There are security problems in this area (e.g., SOAP doesn't currently provide native services to allow you to turn on/off functions on an individual service based on user permissions), but these problems exist in html forms as well, and the current solution is the same - if you have a particular service that can be used by more than one role of user, with more than one set of permissions, then you have two different SOAP services. You CAN control the access to a particular service, and no one gets access to a service that you didn't give them. You pass the name of the function in the SOAP header, and if it ain't available on the services that you have access to, then it ain't going to get executed.

                    Another method is to put a bit of security directly into the service (e.g., through checking ACLs in the code). Not ideal, but not difficult.

                    Assuming either of these methods was used, and the SOAP service only offered a post method (and they didn't use the SOAP::Lite implementation), a Slashdot post service would not allow us to do anything more than post.
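The access control described above amounts to dispatching only over an explicit table of exposed methods, rather than mapping a request name straight onto arbitrary server code - the class of hole in the SOAP::Lite bug discussed earlier. A minimal sketch, with invented method names:

```python
# Only methods listed in EXPOSED can ever be invoked; a request naming
# anything else is refused before any server code runs.
def post_comment(text):
    """The one operation this hypothetical service exposes."""
    return f"posted: {text}"

EXPOSED = {"post": post_comment}  # the entire callable surface

def dispatch(method, *args):
    handler = EXPOSED.get(method)
    if handler is None:
        raise PermissionError(f"method not exposed: {method}")
    return handler(*args)

assert dispatch("post", "hello") == "posted: hello"
try:
    dispatch("drop_database")  # not in the whitelist
except PermissionError:
    pass  # refused, as intended
```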
Well, here's the deal. Back in the day we had different types of packets going to different ports. (If you have a *NIX box, type cat /etc/services for a listing.) Port N is for function X. Port M is for Y. So if you send packets to my server on a certain port, then I can assume they are for a specific service and guard against specific things at the firewall level, not at the (less reliable) application level.
      Consider what requests had to go through before:
      Web browser (or some client used by evil folks) ==> Firewall==> Web Server==> Applications (server?)==>data

      Webservers operate on port 80 and so port 80 is mostly open through the firewall. Webservers also typically have quite a limited function set, which makes them easier to secure. Most of the information is handled in limited strings and it is mostly locked down these days.
      SOAP also uses the same port 80 (to sidestep firewalls) and *all* functions go through that port. With SOAP (and .Net), protecting specific services at the firewall level is moot. There will be a rich function set all on the same port 80. This means that in order to have the same level of security in the SOAP framework, applications have to have security comparable to that of your old webserver/firewall combination.
      Now that we skip the firewall and the web server:
      Web browser (or some client used by evil folks)==>(SOAP)Applications server==>data

      Most of the few functions in a webserver are pretty basic. Most of the functions in a SOAP server will be complex. Imagine all the functions used to manipulate your personal information on a .Net server that uses SOAP. This is a very rich function set to check and make secure. And now it is the web applications programmers who do much of the checking, not the web server software/firewall writers.

      All this is somewhat simplified but in general, applications are now the "weakest link" on the web. And the same people who write web applications will soon write object interfaces on SOAP servers, which are that much closer to the mischievous out there.

      I think you can call this 1 degree of separation. ; )

      Cheers,
      -b

      • by Anonymous Coward

        SOAP also uses the same port 80 (to sidestep firewalls) and *all* functions go through that port.

        SOAP is just the transport mechanism. You can assign SOAP servers to any port you want, not just 80. A lot of people focus on port 80 specifically because they're planning on using a web server to also handle their SOAP traffic, and because they want to eliminate firewall issues. This port probably should not be used except for non-secure, non-critical systems.

        One thing I haven't seen discussed much in this thread is that SOAP isn't just for "web services". It's also a very good mechanism for general client-server systems. You can use SOAP instead of building your own custom RPC protocol, and you can use the myriad tools available for developing SOAP apps in lots of different languages and operating systems. As to the security, it's no different than when you're setting up any other client-server system. SOAP is just the transport -- just like if you were setting up a custom client-server solution with your own call and return protocol you'd need to decide how best to set up security, you need to do the same thing with SOAP apps.

        As time goes on and the various security pieces made to work with SOAP get further developed, then the job of securing SOAP based client-server systems becomes that much easier.

        • SOAP is just the transport mechanism

          SOAP is only part of the transport mechanism - specifically the application-layer message protocol. It can use any transport binding that it wants (e.g., HTTP, NNTP, carrier pigeon).

          A lot of people focus on port 80 specifically because they're planning on using a web server to also handle their SOAP traffic, and because they want to eliminate firewall issues.

          That's their choice. They can do exactly the same thing with other services (e.g., WebLogic can have its t3 protocol broadcast on port 80, if you want to achieve this). It's not a feature of SOAP, it's a feature of the developer's implementation.
      • SOAP also uses the same port 80 (to sidestep firewalls)

        SOAP can use port 80 if it wants, but there is absolutely nothing stopping you from running your SOAP server on any port you want. There is nothing forcing you to even use HTTP/HTTPS for the transport layer.

        To quote from section 2.1 of the SOAP standards [w3.org]


        A SOAP message such as that in Example 1 may be transferred by different underlying protocols and used in a variety of message exchange patterns. For example, with a Web-based access to a travel service, it could be placed in the body of a HTTP POST request. In another transport binding, it can be sent in an email message (see section 3.2). Section 3 will describe how SOAP messages may be conveyed by a variety of transports.


        There are security issues with SOAP, but this whole port 80/firewall thing is a red herring.
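To make the point above concrete: nothing in SOAP ties it to port 80. Below is a toy endpoint that accepts SOAP-style POSTs, bound to an arbitrary high port using only the Python standard library. The port number, response envelope, and handler logic are all illustrative; a real deployment would use a proper SOAP stack and TLS, and pick whatever port its firewall policy dictates.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

class SoapHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        envelope = self.rfile.read(length)   # the incoming SOAP envelope
        # ... parse the envelope, dispatch, build a real response here ...
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(b"<Envelope><Body>ok</Body></Envelope>")

    def log_message(self, *args):            # keep the demo quiet
        pass

def serve(port=8888):
    """Bind the SOAP endpoint to any port you like and serve in a
    background thread -- there is no magic about 80 here."""
    server = HTTPServer(("127.0.0.1", port), SoapHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client then POSTs its envelope to `http://host:8888/` exactly as it would to port 80; whether that port is reachable is a firewall decision, not a SOAP one.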
  • by GOD_ALMIGHTY ( 17678 ) <curt.johnson@gmail.NETBSDcom minus bsd> on Wednesday May 15, 2002 @05:23PM (#3525969) Homepage
    It amazes me how much directory services are overlooked, even for this one simple use.
    LDAP is made for doing centralized management, be it user management or even configuration of services; support is built into every system and OpenLDAP is seriously robust. Just take the 10 minutes or whatever to figure out how to use LDAP and familiarize yourself with the most widely used schemas.

    Using LDAP schemas is like going to create a user table in a database and having the table definition laid out for you. And all applications should be able to follow the same structure. Voila: portable services for applications.

    Please, go familiarize yourself with LDAP. SASL (RFC 2222) is likewise meant as a system-independent way of handling authentication and authorization. OpenLDAP, Cyrus IMAP and a number of other server apps handle SASL quite well, and it's included in most distros.
    IIRC, the Java Authentication and Authorization APIs also deal with SASL quite well.

    The solutions to most of the problems that come up with 'Web Services' (a limited tool being forced on everything) have been solved by a simple trip to the IETF's RFC repository. You just need to use a language and environment that has libraries built for the RFCs. C or Java are your best bets; Perl comes in next, but I've found the libraries to be in various states of working, not something I'd bet my next project on.
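The kind of centralized check the parent describes can be illustrated with the standard groupOfNames schema, where a group entry carries a "member" attribute listing user DNs. The entries below are hard-coded stand-ins for what an LDAP search (e.g., against OpenLDAP via a client library) would return; the DNs and group names are invented for the sketch.

```python
# Stand-in for search results from a directory server. In production
# these dicts would come back from an LDAP query, not be inlined.
GROUPS = {
    "cn=admins,ou=groups,dc=example,dc=com": {
        "objectClass": ["groupOfNames"],
        "member": ["uid=alice,ou=people,dc=example,dc=com"],
    },
    "cn=users,ou=groups,dc=example,dc=com": {
        "objectClass": ["groupOfNames"],
        "member": ["uid=alice,ou=people,dc=example,dc=com",
                   "uid=bob,ou=people,dc=example,dc=com"],
    },
}

def is_member(user_dn, group_dn):
    """True if the user's DN appears in the group's member list --
    the same question every application can put to the one directory
    instead of keeping its own private user table."""
    group = GROUPS.get(group_dn, {})
    return user_dn in group.get("member", [])
```

Because every application asks the same question of the same directory, adding or revoking access is one edit to one entry, which is the whole appeal of centralized management.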
    • IIRC, the Java Authentication and Authorization APIs also deal with SASL quite well.

      yep [jcp.org]
    • The problem with LDAP is that it wants to be the authoritative source for all the information it makes available. This is a fundamental weakness of massively centralizing the storage of information that spans an entire organization, in LDAP, an RDBMS, or anything else. We should be very careful about our assumptions wrt "all applications."

      From a (slightly outdated) whitepaper [sf.net] describing macs [sf.net]: "What's required is distributed responsibility for the information and uniform, centralized access to it." (macs is the Modular Access Control System, a project I'm working on to let different APIs interconnect, with an emphasis on access control.)

      Don't get me wrong, I love LDAP -- but it's no panacea. There is no panacea. I may be biased but macs allows the different owners of different information to manage it their way, while making it available to others in another way. A sort of many-to-many cross-interfacing of APIs and storage mechanisms, which lets folks choose the best tool for the job, be it LDAP, /etc/passwd, SASL, or something else from the RFCs.

      Cheers!

      mds

  • by Jack William Bell ( 84469 ) on Wednesday May 15, 2002 @05:38PM (#3526070) Homepage Journal
    This issue is exactly why Microsoft thought they could put over Hailstorm. As a centralized model for Web Services with built-in security, user identification, preferences and certificate management Hailstorm looked like a damn good way for Microsoft to break into a new revenue space while consolidating control over the Internet.

    Luckily for us Web Services weren't anywhere close to ready, at least compared to the hype for them, and Microsoft fell for their own marketing by introducing Hailstorm too soon. If they had kept it under wraps until Web Services were actually being rolled out (and running into the need for centralized security) they might have been hailed as saviors. Instead they jumped into the fray too soon and, combined with the antitrust problems, found themselves in a world of shit.

    I don't know if Microsoft has abandoned Hailstorm for good -- I do know they don't have a problem walking away from anything that doesn't pan out. But there is a chance Hailstorm, or something very similar (perhaps funded by Microsoft, but not directly owned), will return when the time is right. I expect the best model for this would be for Microsoft (and/or their competitors) to partner with the big banks and credit firms. In this case you have the businesses with the largest need for such services (and who already have significant databases) opening up their systems as another revenue source. If my conjecture is valid I would expect to see announcements of such partnerships in the next six months or so.

    In any case what I would like to see is an open source 'Hailstorm'. I understand there are a couple of such projects out there now. It would be a very Good Thing (tm) if these projects would settle on a single wire format and data model soon. Why? Because the first such system in general use is going to set the standard for everyone that follows. I would like to see both the standard itself and at least one of the implementations of that standard be open and free (as in speech).

    A further extension of this concept would be to allow easy, trusted, collaboration between user identification systems. This kind of decentralization would help keep the biggies from controlling the entire dataspace. Unfortunately it may be difficult or impossible to do without compromising security.

    Perhaps the best way to start is small and simple: an identification server of some kind. This service would allow you to check with a trusted authority to make sure someone accessing your service is who they say they are. Such a server should also allow for anonymity by letting someone create an identity that cannot easily be traced back to the real person. Such an anonymous identity should be marked as such in some way, to let the service provider decide whether to accept it, but should be set up so that only the original creator of the identity can use it.

    I can go on, but then I already have. Haven't I?

    Jack William Bell
    • Someone moderated my original post as a 'Troll'. I am very curious why! I certainly didn't intend the post as a troll and I cannot see anything in it that would qualify it as such. Perhaps my mention of Microsoft and/or Hailstorm? Perhaps because I used a four-letter word in it?

      Can anyone illuminate the moderator's thinking for me on this?

      Attention moderators, feel free to moderate this post as 'Offtopic'. But know that I am asking out of real interest, not because I want to start a shitstorm. Perhaps I can use the information to write better posts in the future. This post is definitely not intended as a troll either...

      Jack William Bell
      • About Hailstorm: as I understand it, besides it not being critically needed at this point (as you mentioned), companies really want to house the information locally rather than pay MS to do it. So much so that the system was just not going to sell the way it was. It looks like the services package will probably be released after some rework, under a new name, as a local server package much like the "Back Office" server.

        BTW you got a "Troll" because you said "Microsoft" and didn't append the word "Sucks"! That's not just a joke, it's true unfortunately. Do a little metamoderation when you get the chance and metamoderate such moderators off of slashdot. It'll do the place a lot of good.

        • . . . which is still needed to provide decent security for collaborative web services. That aspect of Hailstorm will probably eventually lead to (probably several) centralized Hailstorm-like services. Especially where money is moving around (which is why I mentioned banks and credit companies).

          I certainly understand companies wanting to keep the information local (especially sales and preferences info that can be used to infer sales). This kind of thing is very important and I doubt they would want to share it with Microsoft or anyone else. I am sure that was one of the reasons Microsoft folded their hand, and I am sure you are right about it coming back as a package.

          Still I stand on my prediction for the need of central identification services, and the loss of personal control if someone doesn't provide an open source implementation of such.

          As to the "Troll" moderation, you might be right on that as well (although it seems a bit over the top). I do believe meta-moderation works because I know it makes me think before I moderate.

          Jack William Bell
  • by Anonymous Coward
    The point of doing web services in XML across HTTP is that it is easy and can use established technologies. If you don't want anyone intercepting the message (channel-based security), that is what SSL is for and works trivially with any web server and client, and is built into Java.

    Once you have a secure pipe, it doesn't take a genius to solve the additional security needs of 95% of the applications. Add a password here. Add a signature or message digest there. Do a calling card pattern. Most of what certain vendors are screaming for is huge overkill to highlight their own products that they would like to have people using instead of what is here today and works well.
    These are the same people that kept RSA under restrictive patent for so many years. Just say no.
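"Add a signature or message digest there" can indeed be as small as the sketch below: an HMAC over the message body with a shared secret, carried alongside the payload. The envelope text and the idea of a pre-shared secret provisioned out of band are illustrative assumptions, not any vendor's spec. Run inside an SSL pipe, this gives integrity and a cheap authenticity check without any heavyweight machinery.

```python
import hmac, hashlib

# Assumed to be provisioned out of band between the two parties.
SECRET = b"shared-secret-provisioned-out-of-band"

def sign(body: bytes) -> str:
    """Hex HMAC of the message body, sent alongside the payload
    (e.g., in a header of your choosing)."""
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, digest: str) -> bool:
    """Recompute and compare; compare_digest avoids leaking
    information through comparison timing."""
    return hmac.compare_digest(sign(body), digest)
```

Any tampering with the body in transit, or a sender without the secret, fails verification; that covers a large share of the "95% of applications" the parent mentions.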
    • by Anonymous Coward
      Not quite that simple. Security is as much about encrypted transmission as it is about making it difficult for intruders to forge their identity and pretend to be a trusted party. If a hacker gets into a trusted system, then proceeds to exploit all the 3rd parties said company works with, the vulnerability goes through the food chain.

      The thing about security is, when you think you're done, you've only begun.

  • WS Security (Score:3, Informative)

    by bburcham ( 131217 ) on Wednesday May 15, 2002 @11:04PM (#3527585) Homepage
    Well Microsoft [microsoft.com], IBM [ibm.com] and somebody else [verisign.com] have released the WS Security [microsoft.com] "spec" (whitepaper) to address some security issues with SOAP, namely message-level digital signature and encryption. It's technically clean, if a little light on detail.

    Things to note (strategic):

    None of SOAP [w3.org], WSDL, UDDI, and now WS Security are "Royalty Free".

    SOAP isn't a de jure standard -- it's a W3C "note".

    UDDI [uddi.org] was supposed to move into an open standards body in 2001 but still hasn't.

    By publishing WS Security on their websites rather than through any open standards body, Microsoft, IBM and that other company are abandoning even attempts to appear open.

    On the technical side -- if you want to see a little deeper into the security issues left unsolved by SOAP, I recommend you look at the OASIS technical committee specification, ebXML Message Service Specification version 2.0 rev C [oasis-open.org].

  • I've been working on a project: macs [sf.net], that provides (among other things) a protocol-neutral authorization mechanism for hierarchical sets of resources, featuring things like the delegated administration mentioned in the article. We have been using this to control user access to things like web sites and file servers, but it would be trivial to adapt it to protect APIs instead.

    Do people really leave their APIs dangling out there for all to call? Would this be a feature people would find useful?

  • by Shirotae ( 44882 ) on Thursday May 16, 2002 @06:09AM (#3528594)

    The article by Rich DeMillo [com.com] (CNet news.com May 15, 2002) is much better. He gets to the underlying issue that we are patching up problems as they arise rather than paying any attention to understanding what we are really trying to achieve. In particular he says "The headlong rush to Web services is going to make things worse."

    DeMillo has been around long enough to know what he is talking about, but I expect his wisdom to fall on deaf ears in today's instant gratification culture.

  • by Anonymous Coward
    Using insecure protocols to boot?

    Geez, isn't that the same thing everyone rants on .net for?

    At least with the current mishmash of crap that usually passes for any corporate network there's no "mother lode" to crack into to get the keys to the entire kingdom...
