First, the good news: the cost of computer storage continues to fall, with prices of raw storage dropping by about 30 per cent a year. At the same time, storage capacity continues to rise – in January this year EMC launched the first storage array capable of holding one petabyte of data.

So, while spending on storage was up by 10 per cent in 2005, storage capacity grew by 50 per cent, according to market analysts IDC.

The bad news is that the improvements may be of marginal benefit for companies as they strive to control and protect growing mountains of data, ensure it is in a form that will satisfy diverse regulatory requirements, and at the same time provide users with timely access.

In the past, storage was something you just kept on buying. Now data storage policy is a strategic issue in its own right.

“Compliance has propelled storage from back office obscurity to the top of the boardroom agenda,” says Ian Cook, CEO of Logicalis Group, a systems integration company, in a paper he is presenting at Storage Expo in London in October. “However, long before compliance was an issue, companies were beginning to realise that data… could deliver real business benefits if managed effectively.”

The appreciation of the strategic importance of storage is driving significant changes in the storage market. Two to three years ago, direct-attached storage – where each server has its own storage – was the predominant storage architecture. But security concerns, coupled with the proliferation of data, exposed the weakness of this approach: capacity cannot be shared between servers, and each storage system must be handled separately for routine maintenance.

This prompted the rise of networked storage, which allows capacity to be shared across networks, and variants of this now comprise more than half the market. There are two models: Network Attached Storage, where the device is directly attached to the network; and Storage Area Networks, where storage devices have their own network.

In parallel with new storage architectures, new software solutions are helping to increase capacity utilisation and reduce maintenance costs. Software is available to automate tasks ranging from back-up and archiving, to deleting data and file management. These applications draw on the technique of virtualisation, in which all storage capacity is presented as a logical whole no matter where it is on the network, allowing it to be managed centrally.
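In code terms, virtualisation amounts to wrapping scattered devices in one logical view. The sketch below is purely illustrative – the class and device names are invented, not any vendor's API – but it shows the idea of managing capacity centrally, wherever the disks actually live.

```python
# Illustrative sketch only: a pool that presents capacity on several
# networked devices as one logical whole. All names are hypothetical.

class StorageDevice:
    def __init__(self, name: str, capacity_gb: int):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb


class VirtualPool:
    """Presents many devices as a single logical capacity."""

    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self) -> int:
        # Administrators see one figure, wherever the disks live.
        return sum(d.free_gb() for d in self.devices)

    def allocate(self, size_gb: int) -> str:
        # Place the allocation on the emptiest device; the caller
        # never needs to know which one was chosen.
        target = max(self.devices, key=lambda d: d.free_gb())
        if target.free_gb() < size_gb:
            raise RuntimeError("pool exhausted")
        target.used_gb += size_gb
        return target.name


pool = VirtualPool([StorageDevice("nas-1", 500), StorageDevice("san-2", 2000)])
pool.allocate(100)
print(pool.total_free_gb())  # one number for the whole pool
```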

But whatever the storage architecture, it will never translate into reduced costs unless there is an enforced data management strategy. Not all information is born equal – some is more important, some needs to be kept for longer, some needs to be available at all times. This has prompted the development of Information Life Cycle Management, a range of techniques for tracking, indexing, archiving and restoring information from multiple locations across a company. Data is given an expiration date and then backed up to different repositories according to its importance and value.

From this, it follows that storage systems should be tiered to match their performance, availability and functionality to the category of data they store. Over its lifetime, information will be moved from one tier to another, based on its changing value.
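A minimal sketch of what such a life-cycle policy might look like in practice – with invented tier names, ages and retention periods, not those of any product – is below.

```python
# Illustrative ILM policy sketch: data migrates down the tiers as it
# ages, and is eligible for deletion once its retention period ends.

from datetime import date, timedelta

TIERS = [
    # (maximum age, tier) - young, high-value data stays on fast disk
    (timedelta(days=30), "tier-1-fast-disk"),
    (timedelta(days=365), "tier-2-cheap-disk"),
    (timedelta(days=365 * 7), "tier-3-archive"),
]


def place(record_date: date, today: date) -> str:
    """Return the tier a record belongs on, or 'expired' when its
    retention period has run out and it can be deleted."""
    age = today - record_date
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return "expired"  # past its expiration date


print(place(date(2006, 1, 10), date(2006, 2, 1)))  # tier-1-fast-disk
print(place(date(2004, 1, 10), date(2006, 2, 1)))  # tier-3-archive
```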

One tool in the information management armoury is data protection management. This moves on from the technical question of whether a back-up took place to ensuring a business has properly protected its data, says Jim McDonald, chief technical officer of WysDM Software, a provider of back-up analysis software.

“It is possible for a back-up to complete successfully but fail to contain the required data. Some data on the server being backed up may be unavailable, such as Microsoft Outlook files that are in use,” says Mr McDonald. WysDM’s software ensures that a back-up that succeeds but reports large numbers of unavailable files is flagged for further investigation.
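The kind of check Mr McDonald describes could be sketched roughly as follows; the report fields and the threshold are assumptions made for the example, not WysDM's actual rules.

```python
# Rough sketch of data protection management: a back-up can report
# success yet still warrant investigation. Field names and the
# threshold are assumptions, not WysDM's implementation.

def review_backup(report: dict, max_unavailable: int = 50) -> str:
    """Classify a back-up report rather than trusting its status flag."""
    if report["status"] != "success":
        return "FAILED: re-run the back-up"
    if report["unavailable_files"] > max_unavailable:
        # Completed, but too much data was skipped (e.g. open
        # Outlook files) - flag it for further investigation.
        return "FLAGGED: succeeded but skipped many files"
    return "OK"


print(review_backup({"status": "success", "unavailable_files": 3}))
print(review_backup({"status": "success", "unavailable_files": 4000}))
```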

There is no doubt that for smaller companies the biggest storage headache is e-mail. As well as being one of the fastest-growing areas of data accretion, it is also one of the most unstructured and dynamic. Volumes continue to grow dramatically, with predictions that this year 84bn e-mails will be sent each day, according to Michael Gaines, senior manager of Zantaz, an e-mail archiving specialist.

E-mail archiving automatically applies corporate storage policies, siphoning off messages from e-mail servers – based on any combination of parameters such as age, size, status, sender and location – to scalable, searchable archives that are optimised for the storage of e-mail.

The archiving process is transparent to users. Compressing both the message and its attachments, and saving only one copy of each with tags identifying every mailbox in which it was found, can reduce storage costs significantly.
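Taken together, the policy-driven selection and single-instance storage described above might look something like the sketch below; the policy values, field names and hashing scheme are illustrative assumptions rather than any product's design.

```python
# Illustrative e-mail archiving sketch: a siphoning policy plus
# single-instance, compressed storage. All values are invented.

import hashlib
import zlib
from datetime import date

ARCHIVE_AFTER_DAYS = 90
ARCHIVE_OVER_BYTES = 1_000_000

store = {}  # content hash -> compressed bytes (one copy only)
tags = {}   # content hash -> set of mailboxes it was found in


def should_archive(msg: dict, today: date) -> bool:
    # Siphon off messages that are old or large; real products expose
    # far richer parameters (status, sender, location, and so on).
    age = (today - msg["date"]).days
    return age > ARCHIVE_AFTER_DAYS or msg["size"] > ARCHIVE_OVER_BYTES


def archive(msg: dict, mailbox: str) -> None:
    digest = hashlib.sha256(msg["body"]).hexdigest()
    if digest not in store:
        # First copy seen anywhere: compress it and keep it once.
        store[digest] = zlib.compress(msg["body"])
    # Every further occurrence costs only a mailbox tag.
    tags.setdefault(digest, set()).add(mailbox)


msg = {"date": date(2006, 1, 1), "size": 200, "body": b"quarterly report" * 1000}
if should_archive(msg, date(2006, 6, 1)):
    archive(msg, "alice")
    archive(msg, "bob")  # the second copy is just a tag
```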

According to Mr Gaines, one customer, Northumberland County Council, had an employee whose e-mail took up 232 megabytes prior to archiving and 14 megabytes afterwards. Another, EDF Energy, recovered three terabytes in network storage space.

Now there is pressure to integrate discrete content management systems, which hold information such as engineering drawings, contracts or marketing resources, with e-mail management systems, according to Dave Gingell, vice-president of marketing at EMC.

“E-mail is, after all, just another type of content,” he says. “Both types of system can work together to produce related sets of information on demand, even if they’re wildly differing content types.”

For example, all the content related to a particular project can be handled as one item, even though the information consists of Word documents, PowerPoint presentations, e-mail messages and attachments.

One striking factor is how much of the action in the storage world is of a defensive nature – from business continuity to regulatory compliance and litigation risk. In the face of this, many companies don’t have the chance to consider archived data as an asset, says Lynda Black, product marketing manager at Copan Systems, which provides instant-access archiving systems.

“One of the primary ways to achieve business value is being able to access a particular file from your archive – and have it in your hands when you need it,” says Ms Black.

While traditional archiving systems copy information on to tapes that must be physically restored to disk when the information is required, Copan’s MAID technology (Massive Array of Idle Disks) holds the data on disks that are spun up only when required. This means information remains accessible to end users, but at a cost similar to archiving to tape.
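The trade-off behind MAID can be sketched in software terms: most disks sit idle, and a read pays a one-off spin-up cost before it is served. The names and delay below are illustrative assumptions, not Copan's implementation.

```python
# Conceptual sketch of MAID-style access: disks stay spun down until
# a read actually needs them, keeping running costs near tape levels.

import time


class IdleDisk:
    def __init__(self, name: str, data: dict):
        self.name = name
        self.data = data
        self.spinning = False

    def read(self, key: str):
        if not self.spinning:
            time.sleep(0.01)      # stand-in for the spin-up delay
            self.spinning = True  # disk now draws power
        return self.data[key]

    def spin_down(self):
        self.spinning = False     # back to near-idle running costs


disk = IdleDisk("shelf-7", {"2005-q3-ledger": b"..."})
print(disk.read("2005-q3-ledger"))  # first read pays the spin-up cost
disk.spin_down()
```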

Copyright The Financial Times Limited 2024. All rights reserved.