You have found two old, faded photos in your attic and you want them revived digitally. You do not want to buy an entire photo-editing package for such a small task, nor do you want to take the photos to a digital photo laboratory and pay heavily for such a small job. Surely there is another solution?

John Darlington and his team at the London e-Science Centre, based at Imperial College’s Department of Computing, believe that there is – or at least soon will be. They have developed a prototype business model in which the developer of an application such as photo retouching or picture compression would offer it as a chargeable service over the internet.

Consumers would choose which service they want, upload their image to the service and pay – typically via PayPal – when the modified image is returned. In this way consumers get a direct and simple means to access precisely the application they require, without having to buy it.

The concept has been developed and funded over the past two years as part of the UK’s multidisciplinary e-science programme – Prof Darlington is director of the London end. His team has developed working pay-per-use mechanisms and demonstrated them with examples ranging from three-dimensional rendering – creating an image from 3D design data – to 2D and 3D design optimisation, and even access to large research telescopes.

The telescope image service, demonstrated at a UK e-science event in September, was based on a telescope in Hawaii. A live camera showed the telescope slewing round to the correct position before taking the specified image. “Our demonstrations have been very successful – at least when it wasn’t raining in Hawaii,” says Jeremy Cohen, a member of Prof Darlington’s team. Last week there was a demonstration of the system at SuperComputing 2005 in Seattle.

The team’s business model is a novel twist on a concept with a long history in computing and many different names and interpretations – service computing. This seeks to separate the users of a computer application from its originators and providers.

The Imperial team’s approach to the concept is fundamentally different. Underlying the provision of chargeable services would be a new approach to the internet, in which its existing structure is reformulated as a series of open markets.

This would comprise service providers, execution providers that would provide the computing power necessary to carry out the work, and brokers who would scour the net to find the best service for users and negotiate terms on their behalf (see panel). In the full model everything required to complete a task – software, execution environment or data – would be available as a use-on-demand, pay-per-use service.

This revives, in a different form, the Network Computer vision of the late 1990s, in which consumers were to use stripped-down terminals to access their documents and software over the internet, tapping into remote processing power. That concept foundered on lack of bandwidth and worries over security if data were to be held remotely.

Prof Darlington believes the time could now be right for service computing. “There are issues with the internet as it is now,” he says. “A lot of rubbish is being traded over it – this is a clear example of Gresham’s Law in action. Where buyers cannot reliably assess the value of goods being offered, prices and the quality of goods traded are forced down. The low-value stuff crowds out the higher-value goods.”

The fact that this does not happen when large, trusted organisations such as eBay or Amazon conduct the trading convinced Prof Darlington that similar organisations, with a high level of public confidence, could provide the foundations and necessary inter-relationships for a service computing market to flourish.

The Imperial team’s service computing concept, if widely adopted, could have big implications for the entire IT chain. Separating execution from application development would free software developers to exploit their work commercially, without worrying about losing control of their intellectual capital or investing in the infrastructure that would link them to potential customers and get the work done.

It would also provide a fillip to companies offering utility computing – processing power on demand as a service. Sun Microsystems, for example, let the Imperial College team use its Sun Grid service for execution in the demos. Conversely, the new approach might be bad news for Microsoft, as individual users’ computers could run with much simpler operating systems.

The challenge now is to get the service computing market started in earnest. “How much demand will there be?” asks Prof Darlington. “That’s the $64,000 question. But my bet is that it could be massive. We have our eyes on the mass market of the global internet to provide a market for both consumers and producers of services.”

He is tempted to start a commercial service before the end of the year with something “amusing and harmless”, and is encouraged by the success of trivial services, such as ringtones and football scores, in the mobile phone market. “Once you get something on to the internet, the dynamics of the web can take over and the throughput could be enormous,” he says.

How service computing works

The core technology that would enable the Imperial team’s vision of service computing to become reality is available and in use now.

It is web services, the set of standards based on XML (extensible mark-up language) that can be used for sharing applications over the internet without having to link incompatible IT systems.

In a nutshell, the chargeable services such as photo compression would be “wrapped” in a web service which would be used to move the work between the various brokers, service and execution providers. The main issue for the Imperial team has been to find mechanisms to provide this mobility and to ensure that everyone involved gets paid.
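The “wrapping” described above can be pictured as putting the job inside an XML envelope that any broker or provider in the chain can read. The sketch below is illustrative only – the element names and the price field are invented for the example, not the Imperial team’s actual schema.

```python
import base64
import xml.etree.ElementTree as ET

def wrap_request(service_name: str, payload: bytes, max_price: float) -> bytes:
    """Wrap a job for a chargeable service in a minimal XML envelope.

    Element names here are hypothetical stand-ins, not a real web
    services schema (which would use SOAP/WSDL in practice).
    """
    root = ET.Element("serviceRequest")
    ET.SubElement(root, "service").text = service_name
    ET.SubElement(root, "maxPrice").text = f"{max_price:.2f}"
    # Binary payloads such as an image are base64-encoded so they
    # survive transport inside an XML document.
    ET.SubElement(root, "payload").text = base64.b64encode(payload).decode("ascii")
    return ET.tostring(root, encoding="utf-8")

envelope = wrap_request("photo-compression", b"\x89PNG...", max_price=0.50)
```

Any party holding the envelope – broker, service provider or execution provider – can parse it with a standard XML library, which is what gives the job its mobility between systems.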

The user would be unaware of the series of interactions and verifications set in motion when he or she starts the process by sending a request to a broker. The request might include the maximum the user is prepared to pay, or how long he or she is willing to wait, for the service to be carried out.

The broker would consult a registry of available services, or search the web directly for what was on offer, then contact service providers and execution providers.

Having negotiated a price, the broker would send the work to the execution provider, which would access the service provider’s software online, carry out the work and send it back to the broker. The latter would send it on to the user.
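The broker’s role in the steps above – match the request against offers within the user’s price cap, pick the best, dispatch the job, and return the result at the agreed price – can be sketched as follows. All data structures and names here are hypothetical stand-ins, not the team’s actual interfaces.

```python
def run_broker(request, offers, execute):
    """Illustrative broker: choose the cheapest offer within the user's
    price cap, hand the job to an execution provider, and return the
    result with the agreed price."""
    affordable = [o for o in offers
                  if o["service"] == request["service"]
                  and o["price"] <= request["max_price"]]
    if not affordable:
        raise RuntimeError("no provider within the price cap")
    best = min(affordable, key=lambda o: o["price"])
    # The execution provider runs the service provider's software
    # against the job, then the result flows back via the broker.
    result = execute(best["provider"], request["job"])
    return result, best["price"]

# Toy stand-ins for a registry lookup and an execution provider.
offers = [
    {"service": "photo-retouch", "provider": "A", "price": 0.40},
    {"service": "photo-retouch", "provider": "B", "price": 0.25},
]
result, price = run_broker(
    {"service": "photo-retouch", "max_price": 0.50, "job": "faded.jpg"},
    offers,
    execute=lambda provider, job: f"retouched:{job}",
)
# result == "retouched:faded.jpg", price == 0.25
```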

For payment the broker would verify that the user’s account was sufficiently in credit and then request a payment token that would be used to pay the execution and service providers through their PayPal accounts.

In the final step the broker would take payment from the user’s account.
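The settlement sequence just described – check credit, issue a token, pay both providers, then debit the user – can be sketched as below. The 60/40 split between service and execution provider is invented for the example; the article does not say how revenue would actually be divided.

```python
class PaymentError(Exception):
    pass

def settle(user_balance: float, price: float, service_share: float = 0.6):
    """Illustrative settlement in the order the article describes.

    The service_share split is a hypothetical parameter, not part of
    the scheme described in the article.
    """
    # 1. Verify the user's account is sufficiently in credit.
    if user_balance < price:
        raise PaymentError("insufficient credit")
    # 2. Request a payment token (here just a stand-in dict).
    token = {"amount": price}
    # 3. Pay the service and execution providers from the token.
    payouts = {
        "service_provider": round(token["amount"] * service_share, 2),
        "execution_provider": round(token["amount"] * (1 - service_share), 2),
    }
    # 4. Final step: take payment from the user's account.
    new_balance = user_balance - price
    return payouts, new_balance

payouts, balance = settle(user_balance=2.00, price=0.25)
# payouts == {"service_provider": 0.15, "execution_provider": 0.10}
# balance == 1.75
```

Debiting the user only after the providers are paid mirrors the ordering in the article’s description, where the debit is explicitly the final step.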

Copyright The Financial Times Limited 2017. All rights reserved.