June 21, 2006 11:41 am

The internet: The engine ain’t broke but should we fix it?

One of the great things about the internet is its adaptability.

But with such modern problems as spam, denial-of-service attacks and worms plaguing users, some might think the internet is not adapting quickly enough. Can the net be “fixed” to resolve issues such as these, and how should it be done?

One person particularly interested in this question is Vinton Cerf, who helped develop the original internet architecture. Mr Cerf and his peers intentionally made it a “dumb” network.

All the internet does is route traffic, which meant it could support a wide range of applications from the outset. It is the application software itself – everything from e-commerce websites to instant messaging and Voice over IP – that constitutes the “smart” part of the network.

The internet was therefore designed to be as adaptable as possible, but one thing its creators could not anticipate was how commercial models would evolve. “In some sense, the net has not adapted well to its commercial manifestation,” says Mr Cerf.

Take spam e-mail, which Mr Cerf says arose because e-mail became both ubiquitous and free. The market adapted to try to make money out of it, and caused problems for the majority of users. Companies are now trying to introduce smarter technology into the e-mail servers connected to the internet.

In e-mail authentication systems, such as DomainKeys and SenderID, e-mail servers check that an incoming message really has come from where it says it has. This helps to stop “spoofing” – the practice of pretending that an e-mail has come from somewhere else.

Both spammers and phishers – internet crooks who send e-mails to try to obtain passwords and bank account details from unwitting users – use spoofing extensively, so stopping it would be one victory for online crime fighters.
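The idea behind sender authentication can be sketched in a few lines. Below is a deliberately simplified, hypothetical version of an SPF-style check: the receiving server compares the connecting IP address against a list of networks the domain has published as authorized senders. Real SPF/SenderID involves DNS TXT lookups and many more mechanisms; the record format and function here are illustrative only.

```python
from ipaddress import ip_address, ip_network

def spf_style_check(published_record: str, sender_ip: str) -> str:
    """Simplified sender check: 'published_record' mimics an SPF TXT
    record listing the networks allowed to send mail for a domain.
    (Real SPF adds DNS lookups, 'include:' chains and more.)"""
    for mechanism in published_record.split():
        if mechanism.startswith("ip4:"):
            if ip_address(sender_ip) in ip_network(mechanism[4:]):
                return "pass"   # IP is authorized for this domain
        elif mechanism == "-all":
            return "fail"       # anything not matched is rejected
    return "neutral"

# A message claiming to come from the domain but sent from an
# unlisted address is flagged as spoofed:
record = "v=spf1 ip4:192.0.2.0/24 -all"
print(spf_style_check(record, "192.0.2.10"))   # pass
print(spf_style_check(record, "203.0.113.5"))  # fail
```

This also shows why, as Mr Hypponen notes, authentication alone does not stop spam: a botnet machine sending through its own ISP's mail servers passes this check with a perfectly valid record.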

The technology is sound, says Mikko Hypponen, chief researcher at security software company F-Secure, but the flaw is that it does not stop abusive e-mail. It just verifies where it is coming from. And crooks can stay a step ahead by sending mail from networks of compromised PCs called “botnets” that they control remotely.

“Spammers have become early adopters of this technology. Most spam is now sent via botnets from infected machines with perfectly valid authentication records,” he says.

While some companies attempt to put new technologies on to the network to solve commercial problems, others are focusing on “fixing” the user experience. HTML, the mark-up language which tells a browser what a web page looks like, is now 17 years old, and it shows.

For example, doing anything complicated, such as booking airline tickets, often involves submitting multiple forms to a server somewhere and waiting for it to respond before purchasers can carry on, in a “click-and-wait” operation.

Companies such as Adobe and Microsoft want to change that with “rich internet applications” (RIAs) – web pages using new types of mark-up language to make browsers smarter.

Using RIAs, browsers will be able to do more work in a single web page, so there will be less submitting of different forms to the web server to complete a task. This new class of web page will also look more sophisticated than traditional HTML pages, with better animation, for example.

But not everyone is happy about such developments. Craig Labovitz, director of network architecture at Arbor Networks, who has directed several National Science Foundation research grants, says that many of today’s enhancements can cause their own problems.

“It’s almost more about whether the fixes will break the internet, rather than whether the internet is broken. And that is the trick: can you continue to adapt and layer things on, and which things can you layer on?” he asks.

Mark Quirk, head of technology at Microsoft UK, dismisses such concerns. “We don’t think the needs of consumers would be met by pursuing only the standards path,” he says.

Instead, customer demand should also drive what companies do. Mr Cerf identifies this as “a fundamental tension between innovation and an opportunity to try to grab market share and lock customers in, which is what non-interoperable protocols tend to do”.

Trying to fix some of the internet’s problems through new types of web page and enhanced e-mail will not change the fact that good information is hard to find. Today’s web contains millions of documents that can only be searched by their text content.

The Semantic Web initiative, spearheaded by the World Wide Web Consortium, aims to tag these documents with information about the concepts they describe, explains John Davies, head of next generation web research at BT. It is an attempt to create information that knows both what it means and how it relates to other things.

For example, search for information on a particular CEO, and today’s search engines bring up documents containing the CEO’s name. “A more sophisticated example of a semantic query would be: Find all documents referring to a Person that has Position ‘CEO’ within a Company, located In a Country with name ‘UK’,” Mr Davies explains.
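The difference between text matching and Mr Davies’s semantic query can be shown with the triple data model the Semantic Web builds on: facts stored as (subject, predicate, object) statements that can be joined together. The data and predicate names below are invented for illustration; real systems use RDF and a query language such as SPARQL.

```python
# Facts as (subject, predicate, object) triples -- hypothetical data.
triples = {
    ("alice", "hasPosition", "CEO"),
    ("alice", "worksFor", "acme"),
    ("acme", "locatedIn", "uk"),
    ("uk", "hasName", "UK"),
    ("bob", "hasPosition", "CEO"),
    ("bob", "worksFor", "globex"),
    ("globex", "locatedIn", "us"),
    ("us", "hasName", "US"),
}

def ceos_in(country_name: str) -> list[str]:
    """Find every Person with Position 'CEO' at a Company located in
    a Country with the given name -- a chain of joins over the
    triples, not a plain text match."""
    countries = {s for (s, p, o) in triples
                 if p == "hasName" and o == country_name}
    companies = {s for (s, p, o) in triples
                 if p == "locatedIn" and o in countries}
    return sorted(s for (s, p, o) in triples
                  if p == "worksFor" and o in companies
                  and (s, "hasPosition", "CEO") in triples)

print(ceos_in("UK"))  # ['alice']
```

A keyword search for “CEO UK” would also surface documents about Bob; the structured query returns only people who actually hold that position at a UK-located company.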

These technologies work to solve the internet’s problems at the application level, using new protocols and software to work more effectively.

Other initiatives attempt to solve problems by adapting the core of the network itself, changing the machines that route traffic rather than the e-mail servers, PCs, and search engines that use them.

Mr Cerf is to participate in an initiative organised by the US National Science Foundation to rethink the fundamentals of network communications.

It should not be taken as an attempt to replace today’s internet, he says, because it cannot be switched off overnight to be upgraded. “It’s an attempt to ask if it were starting from scratch now, what would it look like? And the next question to ask is: How can we get there?”

In the meantime, some companies are taking it on themselves to solve problems by re-engineering the core of the network. One such problem is the distributed denial-of-service (DDoS) attack. This occurs when thousands of compromised PCs are remotely instructed to target a company’s website at once.

Cybercrooks have been using DDoS for years to blackmail companies by threatening to bring down their websites at crucial times – such as gambling sites before a big event.

Cable & Wireless, which provides internet service to large companies and on a wholesale basis to other internet service providers, is trying to prevent DDoS attacks.

Its technology monitors packets of information flowing across its network, recognising traffic that looks like a DDoS attack and filtering it out. Now, it is selling the service to other ISPs.
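The core of that kind of filtering can be sketched very simply: count traffic per source over a time window and drop sources whose volume looks like an attack. This is a toy assumption-laden version; production systems such as Cable & Wireless’s work on sampled flow data and attack signatures rather than raw per-source counts.

```python
from collections import Counter

def filter_ddos(packets: list[str], threshold: int) -> list[str]:
    """Toy in-network DDoS filter: count packets per source address
    over a window and drop all traffic from any source exceeding
    the threshold. Real systems use flow sampling and signature
    matching, not simple counts."""
    counts = Counter(packets)
    flooders = {src for src, n in counts.items() if n > threshold}
    return [src for src in packets if src not in flooders]

# One compromised host flooding amid normal traffic:
window = ["10.0.0.1"] * 50 + ["10.0.0.2", "10.0.0.3"] * 3
clean = filter_ddos(window, threshold=10)
print(len(clean))  # 6 -- the flooder's 50 packets are dropped
```

The hard part in practice is exactly what a simple threshold misses: a botnet spreads the attack across thousands of sources, each sending at a modest rate, which is why recognising attack traffic patterns matters more than counting.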

Together, changes made to technology running at the core of the network and enhancements to applications running at its edge could fix some of the problems that occur as people adapt the network to suit their needs.

When people and technology collide, something generally needs fixing – and there is no reason the internet should be any different.

Copyright The Financial Times Limited 2017. You may share using our article tools.