It can trace its roots to Europe’s apothecaries in the Middle Ages and the 19th-century dye and bulk chemical producers of Germany’s Ruhr valley, but the pharmaceutical sector has become a significant global industry in its own right.
Most breakthroughs come from academic researchers and doctors, but the sector has supported the commercialisation of drugs that have significantly extended both the quality and the length of life, bringing benefits such as higher productivity.
Drugs to reduce cholesterol and hypertension have eased the threat of heart disease. Treatments for asthma, HIV and rheumatoid arthritis have turned debilitating illnesses into more manageable chronic diseases. An effective cure for malaria exists and one for hepatitis C is in sight.
The sector employs hundreds of thousands of skilled staff, pays large sums in taxes, supports a global ecosystem of suppliers and spends more than $40bn a year on research and development – typically 10-20 per cent of sales is reinvested, the highest proportion of any industry.
But big pharma companies are victims of their own success. New scientific breakthroughs have slowed, while rising regulatory and other barriers increase the costs of clinical development.
The reliance on selling high-priced, high-margin drugs has left a hole to be filled, with generic competition escalating and health services balking at the bill.
The sector is busy trimming costs, re-examining pricing policies and outsourcing research to universities and smaller biotech companies.
Treatments to offset the effects of ageing remain one of the biggest research challenges. But big pharma itself is in search of a remedy for its own ageing process.
Modern pharmaceuticals and biotechnology
When Greg Winter was a young scientist carrying out laboratory work in the 1970s, he saw the potential of using antibodies to tackle infection but had no idea of the extent to which his research would eventually be commercialised, writes Andrew Jack.
His efforts to “humanise” monoclonal antibodies (MABs) derived from mice, to tackle disease without excessive side-effects, paved the way for a new category of medicines that has helped build the modern biotechnology industry and resuscitate large pharmaceutical companies.
Initially, much of the focus was on applications in cancer, and drugs such as Avastin for breast and colon cancer are among the results.
MABs are now used in multiple sclerosis, asthma and beyond, but their most important impact has been in the treatment of rheumatoid arthritis, leading to blockbusters such as Humira, the first fully human MAB, which is now the world’s second-bestselling medicine, generating more than $9bn in revenues last year. Its success highlights how complex, global and uncertain the process of biotech drug development is, relying on academic and government-funded research and the vagaries of commercial support for development.
The UK Medical Research Council funded the original work in the 1970s of César Milstein and Georges Köhler to isolate MABs, for which they won the Nobel Prize. Winter humanised them, and his colleague Michael Neuberger developed a new technique to help. To date, the MRC has received some £600m in royalties as a result.
Winter says that traditional big pharma companies were initially sceptical, and it was smaller US biotech companies – notably Genentech and Centocor (now known as Janssen Biotech) – that worked to develop these pioneering biological treatments.
One of Winter’s own companies, Cambridge Antibody Technology, which developed Humira and licensed it to Abbott Laboratories, was acquired in 2006 by AstraZeneca. The drug’s success was so great that investors’ concerns that it would destabilise the rest of the business helped trigger the spin-off last year of Abbott’s pharmaceutical arm into AbbVie.
Today, large pharmaceutical companies are keener than ever to buy biotech companies or license their most promising products. MABs and other biological drugs have provided an important shot in the arm for an industry that, until just a few years ago, was dominated by chemical-based medicines.
In 1798 Thomas Malthus, the British theorist, postulated that the world’s population would eventually outstrip the planet’s ability to produce sufficient food for all, leading to widespread famine and death, writes Amy Kazmin.
Nowhere did this dismal prediction seem likelier than India in the 1950s and early 1960s, when increases in grain production failed to keep pace with population growth, forcing New Delhi to depend on imported food aid. Fears of imminent famine intensified after two droughts in the mid-1960s.
Today, however, India is self-sufficient in food grain. The turnround is the fruit of the Green Revolution, which brought high-yielding hybrid seeds and other high-tech, intensive farming techniques to millions of small farmers across Asia.
The Green Revolution was driven by philanthropic organisations (the Rockefeller and Ford foundations), international agricultural research institutes that developed the new high-yielding seed varieties, and governments that ploughed money into fertilisers, irrigation networks and pesticides.
The transformation of many Asian countries from subsistence to surplus food producers has created business opportunities. Commodity traders such as Glencore, Cargill, Archer Daniels Midland, Noble Group and Louis Dreyfus have made billions of dollars from the processing, storage, transportation and distribution of wheat, oilseeds, sugar and other agricultural goods.
Production of high-yielding seeds – dominated by Monsanto, DuPont and Syngenta – fertiliser and modern irrigation systems is also big business.
The Green Revolution has its critics. In India, many believe intensive cultivation has damaged land fertility and strained north India’s water table, while many small farmers have been ruined by investing in expensive seeds that fail if there is not enough rain. But the avoidance of Malthus’s dire prophecy is certainly reason to cheer.