A few weeks ago, when I was chatting with the head of one of America’s largest food and drink companies, he made a revealing comment about data flows. Like most consumer groups, this particular company is currently spending a lot of money to monitor its customers with big data.
But it is not simply watching what they do or do not buy. These days it is increasingly scrutinising the micro-level details of pay and benefit cycles in every district in America. The reason? Before 2007, this executive said, consumer spending on food and drink was fairly stable during the month in most US cities. But since 2007, spending patterns have become extremely volatile. More and more consumers appear to be living hand-to-mouth, buying goods only when their pay cheques, food stamps or benefit money arrive. And this change has not simply occurred in the poorest areas: even middle-class districts are prone to these swings. Hence the need to study local pay and benefit cycles.
“We see a pronounced difference between how people are shopping today and before the recession,” the executive explained. “Consumers are living pay cheque by pay cheque, and they tend to spend accordingly. Then you have 50 million people on food stamps and that has cycles too. So for our business it has become critical to understand the cycle – when pay [and benefit] cheques are arriving.”
Sadly, it does not yet seem possible for outsiders (or journalists) to crunch the numbers across the entire economy. Large companies are very secretive about their big-data projects (this particular company, which produces many of America’s best-loved snacks, would not let me reveal its name). And though economists monitor macro trends in retail spending, they have not traditionally analysed micro spending swings.
Nevertheless, this story is not unique. Executives at Walmart, for example, have recently noted the rising impact of the “pay cheque cycle”; Kroger, another retailer, notes that the proportion of customers using food stamps has doubled, creating additional swings. And as these anecdotes mount up, they are interesting for at least two reasons. First, and most obviously, they should remind us of the silent, dark underbelly of economic pain that is stalking America’s current “recovery”. Most notably, it seems that the financial fragility of the poorer sections of US society has risen sharply in recent years, as unemployment remains high and real incomes and household wealth fall. (A revealing survey published last week, for example, suggested that the wealth of Hispanic and black families declined by 44 per cent and 31 per cent respectively between 2007 and 2010.)
Measuring this financial fragility – like measuring micro-level spending swings – is tough, since it is not an issue that economists have traditionally tracked. But one in seven Americans (about 50 million) are now thought to be living in poverty and a similar number in “food insecure” households. Meanwhile, six million are using food banks and 47 million are on food stamps. And when the Brookings Institution tried to look at this fragility issue a couple of years ago, by analysing how many households could find $2,000 in a hurry, it concluded that a quarter of families had no access to ready, rainy-day funds. “Although financial fragility is more severe among low-income households, a sizeable fraction of seemingly middle-class Americans are also at risk,” the study concluded.
The second reason I find this trend intriguing – if not tragic – is what it reveals about our attitude towards time. During most of the past century, it has often seemed as if a hallmark of modern “progress” is that our planning horizons, as a society, have expanded. Unlike peasants or herdsmen in the pre-modern age, who lacked the ability to measure the passage of time or calculate future risks with precision, 20th-century man appeared to have so much control over the environment that it was possible – and desirable – to take a long-term view. No longer were people destined to scramble in a reactive manner; they could plan ahead, mastering time. The fact that people were no longer foraging for food each day, but were able to visit a supermarket proactively at pre-planned intervals, was a good metaphor for a much bigger social and cognitive shift.
But, as the past five years have shown us, history does not go in a straight line, or proceed homogeneously. If you were to ask wealthy Americans to visualise the future, they might well describe it as a carefully calibrated road along which they expect to travel. But if you ask poorer Americans, who are scrambling from pay cheque to pay cheque, they are more likely to perceive the future as a chaotic series of short-term cycles. Economic polarisation, in other words, creates different cognitive maps, and also creates, of course, those subtle shifts in spending patterns that the big-data experts in consumer goods companies now want to track.
Let us just hope that historians, sociologists and psychologists will be able to get access to that big-data treasure trove in the not-too-distant future. It could be deeply revealing, and poignant, too.