It is easy these days to be overtaken by the future. Take e-mail, for example, the mainstay of modern business communication: “Younger people see e-mail as something older people do,” grimaces Don Rippert, chief technology officer for Accenture, the world’s largest consultancy.
“I thought it was a novel, next generation technology, but to young people it’s: ‘You should post to my site on MySpace [a social networking website], you should text message me, you should instant message me, why would you want to e-mail me? I don’t even check my inbox any more’.”
Indeed, the Gartner Group predicts that by 2011, instant messaging will be the de facto tool for voice, video and text communications in business. By then, of course, the young will have moved on to something else – refinements of the imaginary worlds of Second Life and other “virtual worlds”, or metaverses, perhaps.
Sometimes our technological future and our traditional present even collide, with risible results. “I will pay with my phone,” Tero Ojampera, Nokia’s chief technology officer, declared as he ordered coffee in a branch of McDonald’s, the fast food chain, expecting to be able to wave his handset over a wireless card reader. His waitress was unimpressed: “We don’t accept phones as payment,” she sniffed dismissively.
Such misunderstandings aside, near field communications (NFC), a short range (hand’s width) wireless technology, looks set to turn mobile handsets into electronic wallets and could revolutionise the way we pay for goods and services within the foreseeable future. Prototypes already exist and trials are under way, underlining a truism about forecasting technological futures: what we can reliably predict already exists, at least in the laboratory.
Beyond that, according to Andy Mulholland, chief technology officer at Capgemini, Europe’s largest computing services group, other factors cloud the issue: “I have a formula about where technology takes us. For a year ahead, you can be confident about where it is going because betas [near-final products in testing] and alphas [early prototypes] for the products which are going to come out already exist.
“For three years to five years ahead you have people collecting requirements definitions for the releases beyond that. Beyond five to 10 years, you might have an idea of what could come out of the laboratories. But from 10 to 20 years ahead, your only guidance is demographics: what people have grown up with and what they see as normal.”
So for this article, we will stick with the foreseeable and practical rather than the blue yonder. Inevitably, an expert’s view of the future is coloured by his or her area of expertise – Mr Ojampera looks forward to mobile handsets fabricated in new materials that would enable them to be bent or stretched, while Mr Mulholland anticipates technologies based on communications and interaction rather than today’s transaction-based approach.
But overall, this straw poll of some of the IT industry’s leading experts reflects a thoughtful, rather downbeat view of the future rather than the technocratic, gung-ho self-assurance of earlier years.
Many were concerned about personal privacy and security in a world where miniature video cameras would be ubiquitous and failure to safeguard sensitive data could open individuals and companies to legal action. “E-mail, voicemail and text messages used to be ethereal,” says Mike Lynch, chief executive of the UK company Autonomy. “But no longer. Now you can see them being brought up in court.”
Crispin O’Brien, chairman of the technology group at consultancy KPMG, questions whether fixed line telephones have a future and worries about the mobile alternative: “There are huge, huge risks in mobile working,” he says. “If you leave a laptop on the back seat of a car, you could be in breach of all sorts of confidentiality obligations.”
He argued that business was remiss to “just hunker down behind the firewall” – the defensive barrier at the perimeter of the business – rather than working out how to guard BlackBerrys – devices offering mobile e-mail and more – from eavesdroppers. “Mobile working and devices such as the BlackBerry require a new approach to security,” he says. The French security service’s recent decision to ban BlackBerrys from the offices of the president and prime minister underlines the point.
Mobility is clearly a challenge and opportunity. Mr Ojampera of Nokia thought the principal underlying trend would be the merging of mobility and Web 2.0 – shorthand for the latest iteration of the World Wide Web that emphasises interactivity and shared experiences: “Communities of people sharing experiences combined with a very intelligent mobile device which ‘knows’ your location will bring a completely new type of interactivity.”
He predicted the emergence, within three or four years, of mobile devices with visual recognition technology linked to intelligent databases. A tourist in London, for example, could point their mobile phone at the Houses of Parliament. The phone would recognise the image and provide a commentary – perhaps spelling the beginning of the end for the tour guide?
Mr Ojampera also argued that the next iteration of the internet would be created around mobile devices rather than fixed computers. Each individual’s mobile device would become a web server in its own right, with other internet users given specific rights to access its content.
According to Mike Lynch of Autonomy, the emergence of the first elements of Web 2.0 such as YouTube and MySpace and other social networking sites represents the biggest change in the IT industry since its formation.
Instead of simplifying the material world so that it can be handled by computers, people now expect computers to be able to cope with human-friendly information: “Rather than us being slaves to what the computer needs, the computer will have to follow what we like.”
The cause, of course, is the explosion of unstructured information and Mr Lynch, as head of a company that specialises in the management and retrieval of such information, is in no doubt about the complexity of dealing with it: “No one is going to be able to tag every piece of information and knowledge, even if they agreed on how the tags are defined. It’s a case of the hype getting ahead of the reality.”
Important information will, however, be tagged with metadata and made easily accessible. Andrew Herbert, head of Microsoft’s Cambridge laboratories, argues that computers will increasingly become a prosthesis for civilisation’s overburdened memory: “The computers will know who you work with and show you things before you need them.
“So if you usually interact with certain documents while you have the monthly accounts open, then the computer might go ahead and get those documents ready for you in case you need them. It will be a sort of intelligent pre-search function – in search, we’re moving on from searching on key words to searching for concepts and the technology behind this is machine learning.”
There is broad agreement that there will be fundamental changes to ways of working. Don Rippert of Accenture points to an increase in the use of contractors – people employed for specific projects – and ways of substituting for today’s centralised workforces: “Having people drive through traffic for an hour to get to work will become unacceptable.”
“Increasingly, we will have to find ways for a distributed workforce to collaborate across time and distance,” he says, arguing that the latest, highly realistic videoconferencing systems would play a part, as would elements of Web 2.0: “Look at how teenagers use computers and the internet. They seem to be able to collaborate with each other with no effort whatsoever.”
He argues that derivatives of consumer products such as MySpace could be used in an enterprise setting. It is already happening in an informal way: “If you put SOA into a keyword search in YouTube [a video-sharing website] you will get video after video about service oriented architecture” – hardly regarded as a popular topic on teenage-orientated websites.
Crispin O’Brien of KPMG agrees that the elements of Web 2.0 – something he calls Enterprise 2.0, or social networking in a business context – will be significant: “The more you can network people in an informal way, the more value a company can create,” he says.
He also argues that computers have to behave more like humans and that the next phase of workplace IT could be as influenced by social anthropology as by writers of computer code.
John Gage, chief researcher for Sun Microsystems, captures both the sinister and the social potential of technological progress when he points out that, in his view, the big issue is the conjunction of the identity of people, objects, programs and data with location.
“Couple Google’s plans to map the surface of the Earth to an accuracy of 20cm with global positioning systems – which means the position of the 2bn-3bn existing mobile devices could be located to within a metre – with IPv6 [the latest internet protocol that allows for a virtually unlimited number of internet addresses and therefore objects connected to the internet] and we have the makings of a police state as well as incredible logistical power. We would know the location of every package, truck, container vehicle and traffic jam.”
In fact, with the latest storage and processing technologies, the capability is almost here to record every incident in an individual’s life. But, the experts ask, is that something to which we should aspire?
Some corners of every life are best left dark. Indeed, bloggers are already being warned that their words may come back to haunt them later in life when applying for jobs or promotion.