May 05, 2002

The State of Web Services

As a result of the recent barrage from the SOAP and Google bashers (not to be confused with Googlewhackers), I was left with no choice but to deeply analyze the state of web services. First a word of warning—this is not a complete picture, nor is it for the technically faint of heart. Also, when I mention REST I am referring to it as it pertains to web services (as opposed to the web, where it has been successful). The pro-REST crowd is arguing its case in hopes of proselytizing the current followers of the open standards web services architecture, of which SOAP is an essential part. Although they might think that they have a strong case, the REST model, in which XML, URIs, and HTTP are the core technologies, will not be adopted as the standard for web services.

The argument for REST reminds me a lot of the Tanenbaum-Torvalds "Linux is Obsolete" debate of 1992. This time the argument isn't about operating system architecture, but rather web services architecture, where it has become SOAP vs. REST. Ten years later, every major enterprise has some kind of Linux strategy—IBM has a wide array of Linux products, Sun has recently announced a Linux initiative, and we can all agree Linux has a strong hold on the server market. If that debate taught us any lesson, it is that breakthroughs do not happen overnight and that we should not attack technology that is still evolving. Who knows why these kinds of debates begin—personal vendettas, fear of change, or maybe just a general lack of knowledge on the subject.

Speaking of 1992, last week was the anniversary of the L.A. riots and we are still faced with the famous question Rodney King stuttered 10 years ago—"Can we all just get along?" Can web services get along? Yes, if they continue with SOAP and do not get sidetracked by other proposed replacements. There has been an influx of articles on the web about the REST vs. SOAP issue but, for the sake of all the '>' that would be wasted with rebuttals, I will only highlight some key issues.

The argument for a REST-based web services model does not come from the belief in a better architecture, but rather from a lack of understanding of the current one. A lack of understanding of the evolving standards and a lack of vision towards the future also add fuel to the fire. To put this article in a nutshell—the current state of web services will survive because it is everything-independent. More importantly, because everything is XML-based, it will survive as long as everything speaks XML.

When you look at the XML/HTTP/URI-style service, the use of HTTP implies that everyone is satisfied with HTTP as a transport mechanism for web service message exchanges. However, this is not the case, as we have seen various proposals and solutions that plug the holes where HTTP is flawed, such as security (SSL/TLS) and reliability (HTTPr, BEEP, JMS, and callbacks). People do not want an application protocol that, like an OS, has to be patched up constantly.

The W3C has also received a bit of bashing for its slow progress with web services standardization. There have even been some suggestions that it should not have any involvement with web services. People forget that it was the W3C that led the web to its full potential. I doubt that anyone can compete with its experience in building a successful world wide web and its commitment to looking after the interests of the community instead of business. The reason for the hostility towards the W3C is clear—its objective is to move away from REST. This is obvious from its mission statement below:

"W3C is transforming the architecture of the initial Web (essentially HTML, URIs, and HTTP) into the architecture of tomorrow's Web, built atop the solid foundation provided by XML."

[Figure: Initial Web alongside the Web of tomorrow. Source: W3C]

A good analogy for using SOAP is that of online payment methods. Some sites accept a few credit cards, some accept PayPal, some e-gold, etc. By offering the most payment methods, a company can ensure that it doesn't turn away any potential customers. The various payment methods can be compared to SOAP bindings—you must advertise which protocol you are going to use and the client must use that protocol. The provider that supports the most protocols has a better chance of appealing to a wider audience. To assume that HTTP will always be used is a misconception. A large number of people are not satisfied with HTTP and have created various initiatives such as SOAP over BEEP, Jabber, HTTPr, and others.
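
To make the binding idea concrete, here is a toy sketch of a client picking from the protocols a service advertises. Everything here—the binding table, the endpoints, the function names—is made up for illustration; it is not any real SOAP stack's API:

```python
# Hypothetical sketch: a service advertises its SOAP bindings, and a
# client walks its own preference list until it finds a transport both
# sides support. All names and endpoints are invented.

# Bindings the (imaginary) service advertises.
SERVICE_BINDINGS = {
    "http":   "http://example.com/soap",
    "beep":   "beep://example.com/soap",
    "jabber": "jabber:soap@example.com",
}

def pick_binding(client_supports, service_bindings):
    """Return the first (transport, endpoint) pair both sides support."""
    for transport in client_supports:
        if transport in service_bindings:
            return transport, service_bindings[transport]
    raise ValueError("no common SOAP binding")

# A client that prefers BEEP but can fall back to HTTP:
transport, endpoint = pick_binding(["beep", "http"], SERVICE_BINDINGS)
```

The more rows in the service's table, the more clients it can serve—exactly the payment-methods argument above.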

The main attack on SOAP has been more about SOAP-RPC than anything else. Examples of SOAP-RPC are often shown using the SOAP Encoding rules. Specifying the datatypes in the request is considered optional by the W3C, but one can quickly notice its importance. For example, you can have a method that performs different operations according to the datatype received. This is similar to method overloading, in which a method has multiple signatures. REST argues parameters should be passed over HTTP as is done with URIs and CGI query parameters. This is easy to do over HTTP, but there are security issues that are *not* solved by SSL, such as encrypting only part of the data being exchanged and securing sessions between more than two parties. With SOAP you can encrypt the sensitive data without sacrificing performance, which suffers when SSL encrypts the entire session. Upcoming SOAP security specs and already available XML security technologies address these issues and more.
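
Here is a rough sketch, using only Python's standard library, of what a typed SOAP-RPC request looks like. The `getQuote` method and the `urn:Example` namespace are invented for illustration; the envelope shape follows SOAP 1.1 conventions:

```python
# Build a minimal SOAP-RPC envelope whose parameter carries an xsi:type
# attribute, so a server could dispatch on the datatype as described above.
# The method name and urn:Example namespace are hypothetical.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
XSI = "http://www.w3.org/2001/XMLSchema-instance"
XSD = "http://www.w3.org/2001/XMLSchema"

ET.register_namespace("soap", SOAP_ENV)
ET.register_namespace("xsi", XSI)

envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
envelope.set("xmlns:xsd", XSD)  # declare the xsd prefix used in xsi:type values
body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
call = ET.SubElement(body, "{urn:Example}getQuote")  # the invented RPC method
param = ET.SubElement(call, "symbol")
param.set(f"{{{XSI}}}type", "xsd:string")  # the type tag the server can dispatch on
param.text = "GOOG"

request = ET.tostring(envelope, encoding="unicode")
```

Drop the `xsi:type` attribute and the server has to guess what `symbol` is—which is exactly why the "optional" datatypes matter.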

As web services evolve we will see more providers moving away from HTTP as the protocol of choice to bind SOAP to. HTTP has been a "quick and dirty" way to get SOAP out there and play around with it. What makes SOAP nice is that it can bind to any application protocol. It can also hop from one protocol to another, which makes it optimal for interoperability. SOAP message routing will need to be addressed to make this happen, but the fact that developers are using SOAP with various application protocols and not just HTTP means that message routing *must* be addressed (it is a work in progress). The simple fact that SOAP can go protocol hopping should, in and of itself, wash away thoughts of a RESTful solution.

Regarding SOAP-RPC, I am an advocate of generating stubs from WSDL files and letting your compiler check for errors prior to invocation. This way you won't waste the resources of the web service by calling unknown methods or sending the wrong parameters. Although WSDL files rarely change (if they change at all), the option of dynamically grabbing the WSDL file and processing it every time you want to invoke a service has some performance costs. My recommendation is to generate stubs and update them if you have to. There is another debate involving WSDL which we will not focus on too much. The question is, "What if the WSDL file changes?" The answer is that web service providers will undoubtedly offer backwards compatibility and provide the necessary tools and examples for using the new interface.
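
To see what generated stubs buy you, here is a hand-written stand-in for what a WSDL-to-code generator might emit. Every name in it—the class, the method, the endpoint—is hypothetical:

```python
# Sketch of generator-style stub output: a class with a fixed method
# signature, so a bad call fails locally before any request ever reaches
# the service. All names are invented; the network call is faked.
class StockQuoteStub:
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def get_quote(self, symbol):
        if not isinstance(symbol, str):
            # A real stub rejects the call before serializing anything,
            # sparing the service a malformed request.
            raise TypeError("symbol must be a string")
        # ... here generated code would build the SOAP envelope and POST
        # it to self.endpoint; we return a canned value instead.
        return 42.0

stub = StockQuoteStub("http://example.com/quotes")
price = stub.get_quote("GOOG")
```

In a statically typed language the wrong-parameter case never even compiles, which is the whole point of generating stubs up front instead of parsing the WSDL on every call.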

REST can be successful and has been successful, the proof being the success of the Internet. However, that success is largely due to the Internet's public nature. REST is fine for performing simple tasks (such as *instant* transactions) and I see no problem with its use of XML and URIs. But as Don Box said, "If it takes three minutes for a response, it is not really HTTP anymore." The problem with REST lies in its use of HTTP and what that means for web services. HTTP can be extended to add the features necessary for web services, such as security and reliability, but do you really want to push a protocol to do things it wasn't intended to do?

Here is where the reality check comes in—SOAP is not so simple after all. I don't see a big market for free public web services (unless they point to products, e.g. price quotes). Most likely web services will be used in the context of B2B or B2C and will therefore require a very complex architecture. Complex in the sense that there will be a long list of technologies and requirements that will need to be integrated. If this were not the case, then REST would win for its simplicity. But "simple" often connotes insecure. Security and reliability will be at the top of the business web services requirements list, and therefore the demand for "simplicity" will give way to a demand for "security".

SOAP Security Extensions, WS-Security, and other specifications for web services security have been proposed. There is no rush to finalize them because, until everybody can agree on which XML-based languages are going to be used, there really is no hurry. We have all these technologies that guarantee interoperability *iff* people agree to use them and agree on how they are going to use them. For example, the idea of dynamic web service clients that find web services and bind automatically becomes harder when every service has different method names. This poses another problem—if everything has to be open, interoperable, and essentially the same, then what will make services different? The web services model has to keep this in mind and allow for competition and creativity. Competition and creativity often lead to proprietary solutions for technologies that are still in the standards process. We have seen this with the barrage of web services products hitting the market. Vendors are filling the holes left by unstandardized technologies with their own extensions. The push to be one of the first to support web services will lead these vendors to defeat the whole purpose of the open standards web services model.

There has been a lot of criticism about SOAP being able to "sneak" past the firewall. The argument goes on to say that because it piggybacks on HTTP, it is therefore a security threat. My response would be: don't use HTTP! Does a SOAP server have to sit on the same HTTP server? Run on the same port? I will stick to using HTTP for what it's for—transferring hypertext. I don't plan on using HTTP for mission-critical web services but will offer HTTP support because of its ubiquity. New web server vulnerabilities arise all the time, so server administrators have enough to worry about already. Viruses are an example of "sneaking" past firewalls that poses a *real* security problem. I don't consider SOAP to be "sneaking" anywhere because, if anything, SOAP is welcomed by web services. And for all the people claiming SOAP is a security risk: are there any examples of how such an attack would actually take place? For SOAP to be considered a security risk, I would think that a SOAP request would have to be accepted first. I am sure business web services will employ some kind of authentication instead of happily accepting this magical SOAP request that will compromise their system.

Also, since HTTP is not a requirement of web services, you can expect pure web services servers to be deployed. This way you can separate your website traffic from your web service traffic. They provide different functions and this distinction should be made.

The point being that you have a variety of ways of deploying web services. The "web" in "web services" is no misprint, but rather implies that they will follow a model similar to the web's. For example, when you check your email you have a log-in page, can change settings, and so on. You can also bypass all the log-in requirements by opting to save your passwords in your browser or through the use of cookies. Similarly, as web services mature, expect a comparable automated negotiation process in which authentication credentials are sent along with invocation requests.
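
A rough sketch of what credentials-with-invocation could look like: the `Credentials` header element and its namespace are invented here, and real specs such as WS-Security define their own vocabulary for this.

```python
# Sketch of a SOAP message that carries authentication in its Header,
# analogous to a browser replaying a saved password with each request.
# The header element, namespaces, and method are all hypothetical.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_ENV)

envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
header = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Header")
creds = ET.SubElement(header, "{urn:ExampleAuth}Credentials")
ET.SubElement(creds, "username").text = "alice"
ET.SubElement(creds, "token").text = "s3cr3t"  # would be signed/encrypted in practice
body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
ET.SubElement(body, "{urn:Example}checkMail")  # the invented service call

request = ET.tostring(envelope, encoding="unicode")
```

The design point is that the credentials ride in the Header, separate from the Body, so intermediaries and security layers can process them without touching the actual call.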

Google has made a strong statement by providing a SOAP interface. It was not the result of jumping on the bandwagon, but rather the implementation of a useful service that will turn hype into reality—what we have all been waiting for. It's not easy if you're writing SOAP requests from scratch, but with SOAP support in every major language, message assembly and communication are simple. We will see more companies follow in Google's footsteps.
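
For a feel of the "from scratch" case, this is roughly what a `doGoogleSearch` envelope looks like assembled by hand. The method name and `urn:GoogleSearch` namespace come from Google's published WSDL; the license key is a placeholder, and a real request carries more parameters than shown here (consult the WSDL for the authoritative signature):

```python
# Hand-assembled sketch of a Google Web APIs search request. Simplified:
# the real doGoogleSearch call takes additional parameters (start,
# maxResults, filter settings, etc.) omitted here for brevity.
def google_search_envelope(key, query):
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <ns1:doGoogleSearch xmlns:ns1="urn:GoogleSearch">
      <key>{key}</key>
      <q>{query}</q>
    </ns1:doGoogleSearch>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

envelope = google_search_envelope("YOUR-LICENSE-KEY", "web services")
```

Tedious by hand—which is exactly why language-level SOAP support, where a toolkit builds this for you, matters.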

If you are integrating different web services into one application, then you don't want to worry about one using REST and one using SOAP. The open standards web services crowd wants something that will always be extensible and promotes SOAP for this exact reason. Everyone uses the same standard and everyone is happy.

I am surprised that the REST crowd has not touched upon more areas that would make its case stronger. For example, if vendors plan to deploy J2ME on their devices, they are required to implement the HttpConnection interface. This guarantees that HTTP communication will be available on all MIDP devices, thereby guaranteeing access to RESTful web services. However, there is a JSR for J2ME Web Services, and implementations of SOAP messaging for J2ME are already available, such as kSOAP. Metrowerks CodeWarrior Wireless Studio includes a client library with a SOAP-JMS client, a SOAP-JMS bridge, a SOAP bridge to .NET, and queued invocation of web services.

One of the aspects of web services that I really find amusing is the story of dynamic smart agents that scour UDDI registries, automatically negotiate, and perform useful tasks with web services. This fairy tale goes on to say that all this happens transparently, reliably, and securely. A major factor in the success of the web was the ability to do research and weigh your options. Most of our findings on the web happen through search engines or referrals. I doubt this paradigm will change. We will find web services the way we have been finding things for years and then, once contracts are established, transparency is possible. I am not saying that the idea of dynamic web service clients is not possible, but rather that it will take some time to develop. Also, searching through a UDDI registry offers the advantage of singling out web services, but I'm not sure if I trust a registry hosted by IBM or Microsoft. I'll think about taking that route when Google starts a public UDDI registry. For now, I really cannot think of too many uses for UDDI until web services for a specific task all have similar WSDL files. That way, if I want quotes on airline tickets, I can search, bind, and then invoke them without having to reprogram the client. Another useful application of UDDI is redundancy within private networks. For example, if a client has access to a company's internal service and that service goes down, it can run a query for backup services and get back to business.
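
The backup-service scenario can be sketched as a simple failover loop. Both the registry contents and the health check below are faked stand-ins; a real client would issue a UDDI inquiry over SOAP instead of reading a dictionary:

```python
# Failover sketch for the private-network redundancy case: when the
# primary endpoint is down, consult a registry for alternatives.
# The registry, service name, and health check are all invented.
REGISTRY = {
    "order-service": ["http://internal/orders", "http://backup/orders"],
}

def is_up(endpoint, down=("http://internal/orders",)):
    # Stand-in for a real health check; here we pretend the primary is down.
    return endpoint not in down

def resolve(service_name):
    """Return the first live endpoint registered for service_name."""
    for candidate in REGISTRY[service_name]:
        if is_up(candidate):
            return candidate
    raise RuntimeError("no live endpoint for " + service_name)

endpoint = resolve("order-service")
```

This only works if the backup services expose interfaces similar enough that the client needn't be reprogrammed—the same "similar WSDL files" condition noted above.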

Another question that remains is *when* deploying web services should be the viable choice. When a company does choose to deploy a business web service, one of its most significant traits is going to be the contract and license control providers will have over their clients. The threat that piracy has posed to software companies will not be a factor for web services because of the amount of control providers will have over incoming traffic. The Google APIs allow 1,000 queries for a given license key. We will see similar solutions such as subscription-based services that have daily, weekly, monthly, or yearly plans, a certain number of requests per cycle, pay-per-request, etc. All aspects of security will be important factors for ensuring that these subscription-based plans become successful. Expect vendors to release various security solutions that authenticate, authorize, and ensure data privacy, transmission reliability, non-repudiation, and anything else that would fit into such a subscription-based model.
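
A minimal sketch of provider-side quota enforcement along these lines—entirely illustrative, not any vendor's actual implementation:

```python
# Per-key request quota, in the spirit of the Google API's query limit:
# the provider counts requests per license key and refuses calls over
# the limit for the current billing cycle. Purely a sketch.
class QuotaGate:
    def __init__(self, limit):
        self.limit = limit
        self.used = {}  # license key -> requests so far this cycle

    def allow(self, key):
        """Record one request for key; return False once over the limit."""
        count = self.used.get(key, 0)
        if count >= self.limit:
            return False
        self.used[key] = count + 1
        return True

gate = QuotaGate(limit=3)
results = [gate.allow("abc123") for _ in range(4)]  # the fourth call is refused
```

Because every request must pass a gate like this, the provider, not the client, holds the keys—which is why piracy in the traditional sense doesn't apply.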

The verbosity of XML and bloated DTDs/Schemas pose another problem. All of this XML processing also has performance costs, but the strength of today's servers seems up to the task. Performance was also the downside of SSL transactions, but we saw the rise of numerous SSL hardware and software accelerators. Expect the same kinds of solutions for XML optimization, as we already see with numerous XML accelerators hitting the market.
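
As a quick illustration of how much of that verbosity is redundancy—and why compression and accelerators have room to work—this measures a synthetic, repetitive envelope before and after zlib compression:

```python
# Measure how well a (synthetic) SOAP-style envelope compresses.
# The repeated <item> elements stand in for typical repetitive payloads.
import zlib

envelope = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body>"
    + "".join(f"<item><sku>{i}</sku><qty>1</qty></item>" for i in range(100))
    + "</soap:Body></soap:Envelope>"
).encode("utf-8")

compressed = zlib.compress(envelope)
ratio = len(compressed) / len(envelope)  # well under half the original size
```

The tag overhead that makes XML verbose is exactly what makes it compress so well, which is one reason the wire cost is less dire than it first appears.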

In conclusion, we all must realize that web services and all the related technologies are evolving. Criticism is necessary to bring about new ideas and issues that the W3C and other groups will be forced to address. Also, we need to have an understanding of how standards intend to solve problems associated with web services while maintaining a vision of interoperability. We cannot see the future, but we can plan for it. This implies that we build a strong foundation for web services that will last a long time. The only way we can assure this is by making everything XML-based because, as technologies change and emerge, when the dust settles XML will always be there.

Posted by Nasseam Elkarra at May 5, 2002 08:59 PM