
Whatever you make of the signatories to Don Box’s paycheck, he is correct in assessing that HTTP has reached the end of its usefulness as a protocol for delivering applications. Even Tim Berners-Lee thinks so. Of course, HTTP will always be with us for the simple access and retrieval of information, which is all it was designed for in the first place.

What HTTP has never been, and what many people (to their own detriment) believe HTTP to be, is a functional model for user interface (read: application) design. You only need to compare the ease of use of your favourite desktop email client to that of a web-based email service such as Yahoo! mail to see why this is true.

HTTP’s main design “flaws”, if you can call them that, are the lack of a stateful model and the absence of any rich exchange of contextual information with the server about what’s happening on the client.
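To make that concrete, here is a minimal Python sketch (the host name and cookie value are just placeholders) of what statelessness means in practice: each request stands alone, and any notion of a “session” exists only because the client hand-carries a token back to the server.

    # Each HTTP request/response pair is independent; the server only
    # "remembers" a client if the client echoes state back by hand.
    import http.client

    conn = http.client.HTTPConnection("example.com")

    # First request: the server has no idea who we are.
    conn.request("GET", "/")
    resp = conn.getresponse()
    resp.read()  # drain the body so the connection can be reused
    print(resp.status, resp.getheader("Set-Cookie"))

    # Second request: unless we resend a token ourselves, the server
    # cannot connect this exchange to the previous one.
    conn.request("GET", "/", headers={"Cookie": "session=abc123"})
    print(conn.getresponse().status)
    conn.close()

Everything an application would call “context” has to be squeezed through that one narrow channel, which is exactly the limitation described above.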

Sure, you can layer lots of Java, VBScript, JavaScript, ASP, style sheets, etc. on top of HTTP, but it becomes an uncomfortable fit, and most of the client-side standards are anything but standard. As a result, whenever you try to achieve a reasonable degree of interactivity in a web interface you’re kicking a certain percentage of your audience out the door because they don’t have the right OS or browser. This is the fundamental GUI Paradox the web/internet faces as we try to adapt the client-server model into more valuable applications.

MSN’s Hotmail is a good example: it uses lots of VBScript and looks absolutely fantastic in Microsoft IE. The trade-off is an obvious (and strategic) one: they decided that universal access (a foundational principle of HTTP) was less important than a quality user interface. It’s probably one of the most complex web-based user interfaces out there.

Anyway, lots of protocols purport to drive functional user interfaces to greater degrees of relevance in the client-server model, but the action that fundamentally matters will be in peer-to-peer, as Gnutella proves.

My point in forwarding this piece is to underline a theme that runs through my thinking, namely that people will always choose a quality user experience over compatibility. That theme was lost on all of those dot-coms that threw millions of dollars at trying to get us to use the web to do everything (including WORD PROCESSING) and ultimately got themselves twisted around trying to break free of the GUI Paradox.

Web pages should be cheap, lightweight, and simple. Applications should be heavyweight, highly interactive, and as complex as they need to be… We need an open protocol to deliver that over the network.

I can’t believe I’m agreeing with Microsoft.

-Ian.

——-

http://zdnet.com.com/2102-1105-845220.html

Microsoft guru: Stamp out HTTP
By Matt Loney, Special to ZDNet News
February 26, 2002, 7:40 AM PT
URL: http://zdnet.com.com/2100-1105-845220.html

LONDON–Now that IPv4 is slowly being replaced by version 6 as a way of increasing the Internet’s address space, it appears that another bedrock of the Internet, HTTP, is also reaching its limits.

Delivering the keynote at European DevWeek in London on Tuesday, Don Box, an architect for Microsoft’s .NET Developer Platform team, said HTTP presents a major challenge for Web services, for peer-to-peer applications and even for security. A replacement will eventually have to be found, he said, but it is not at all clear who will provide this replacement.

HTTP, or Hypertext Transfer Protocol, is used by virtually every Web page on the Internet. It is the mechanism by which a browser sends a request to a server on the Internet, and then receives the response.

“One of the big challenges facing Web services is our reliance on HTTP,” said Box. However, there is nothing wrong with HTTP per se, as its ubiquity and high dependability mean it is the only way to get a reliable end-to-end connection over the Internet, he added. “If people can’t search the Web they call the IT department, so the IT department makes sure HTTP is always working. We have engineered the hell out of it.” So much so, indeed, that Box likes to think of HTTP as the “cockroach of the Internet” because “after the holocaust it will be the only protocol left standing.”

But, he said, we can’t stay on HTTP forever, despite all the investment and engineering that have gone into it. Among the problems with HTTP, said Box, is the fact that it is a Remote Procedure Call (RPC) protocol: something that one program (such as a browser) uses to request a service from another program on another computer (the server) without having to understand the details of the network in between.
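As a rough, self-contained illustration of the RPC shape Box is describing, here is a Python sketch using XML-RPC, one of the RPC-over-HTTP protocols of that era (the add function and port are invented for the example): the caller fires one HTTP request and simply blocks until the response comes back.

    import threading
    import xmlrpc.client
    from xmlrpc.server import SimpleXMLRPCServer

    # A toy service standing in for "another program on another
    # computer"; here it just runs on a background thread.
    server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
    server.register_function(lambda a, b: a + b, "add")
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # The call reads like a local function call, but underneath it is
    # one HTTP POST and one blocking wait for the HTTP response.
    proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
    print(proxy.add(2, 3))  # -> 5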

This works for small transactions asking for Web pages, but when Web services start running transactions that take some time to complete over the protocol, the model fails. “If it takes three minutes for a response, it is not really HTTP any more,” Box said. The problem, said Box, is that the intermediaries–that is, the companies that own the routers and cables between the client and server–will not allow single transactions that take this long.

“We have to do something to make it (HTTP) less important,” said Box. “If we rely on HTTP we will melt the Internet. We at least have to raise the level of abstraction, so that we have an industry-wide way to do long-running requests–I need a way to send a request to a server and not get the result for five days.”
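One hedged sketch, in Python, of what “raising the level of abstraction” could look like (the /jobs endpoints and status codes are hypothetical, not any specific standard): the client submits the request, immediately receives a ticket, and collects the result through short, separate polls, so nothing along the way ever has to hold a minutes- or days-long connection open.

    import time
    import http.client

    def submit_and_poll(host, payload, interval=5.0):
        conn = http.client.HTTPConnection(host)
        # Submit the long-running job; the server answers at once with
        # a ticket instead of the final result.
        conn.request("POST", "/jobs", body=payload)
        ticket = conn.getresponse().read().decode().strip()
        # Each poll is a short, self-contained HTTP exchange -- no
        # intermediary ever sees one long transaction.
        while True:
            conn.request("GET", "/jobs/" + ticket)
            resp = conn.getresponse()
            body = resp.read()
            if resp.status == 200:   # finished: the body is the result
                return body
            time.sleep(interval)     # e.g. 202 Accepted: still running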

Adapting for P2P

Another problem with HTTP, said Box, is that it is asymmetric. “Only one entity can initiate an exchange over HTTP, the other entity is passive, and can only respond. For peer-to-peer applications this is not really suitable,” he said. The reason that peer-to-peer applications do work today, said Box, is that programmers create hacks to get around the limitations of the protocol, and this is not good. “It’s all hackery, it’s all ad-hoc and none of it is interoperable,” he added.
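The flavour of that hackery is easy to sketch in Python: because HTTP lets only one side initiate, a peer that wants to both send and receive has to run an HTTP server alongside its HTTP client (the port and path below are made up for the example).

    import threading
    import http.client
    import http.server

    class PeerHandler(http.server.BaseHTTPRequestHandler):
        # Inbound messages arrive only because we also act as a server.
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            print("peer says:", self.rfile.read(length).decode())
            self.send_response(204)
            self.end_headers()

    # Every peer must listen like a server just to be reachable...
    httpd = http.server.HTTPServer(("localhost", 8001), PeerHandler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()

    # ...and must initiate like a client to say anything. "Symmetric"
    # messaging is two asymmetric HTTP conversations glued together.
    conn = http.client.HTTPConnection("localhost", 8001)
    conn.request("POST", "/msg", body=b"hello from the other peer")
    conn.getresponse().read()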

There is work going on to address the shortcomings of HTTP, said Box. Several working groups are working on the problem at the W3C, the organisation responsible for Web standards. And even though Microsoft is working on the problem too, Box did say that Microsoft is unlikely to succeed alone.

“Microsoft has some ideas (on how to break the dependence on HTTP), IBM has some ideas, and others have ideas. We’ll see,” he said. But, he added, “if one vendor does it on their own, it will simply not be worth the trouble.”