Immigration control is at the top of the government’s list of priorities, but the latest official statistics, to be published on Thursday, are expected to show that net migration is still roughly three times the Conservatives’ “tens of thousands” target.
Ministers are divided over how to tackle the challenge: Philip Hammond, the chancellor, is seeking to exclude foreign students from the figures, on the basis that they are not the focus of public concern. Theresa May, the prime minister, has ruled out this suggestion.
Amber Rudd, the home secretary, has proposed new limits on overseas students, in response to official figures showing that large numbers are overstaying illegally after their studies have finished.
But underlying the debate is a growing concern that the net migration figures from the Office for National Statistics are unreliable. The statistics underpin government policy but there are five reasons why the numbers, calculated from a passenger survey, may not be completely accurate.
The International Passenger Survey, the main measure of migration, was not designed to do this job
The IPS was established in the 1960s as a travel and tourism poll. But because the UK does not require migrants to register after arrival it has become, by default, the measure of net migration, one of the most politicised and publicised statistics of current times.
Britain is one of only two EU countries to rely on passenger questionnaires for data collection — the other is Cyprus.
As the ONS admits, the survey has “inherent limitations” for measuring migration, not least because only a very small proportion of the people travelling to and from the UK qualify as “long-term international migrants” who are staying in the country for 12 months or more.
In a review of the survey carried out three years ago, the ONS said: “Of some 800,000 people interviewed by IPS staff at ports each year, under 5,000 are identified as migrants. Whilst this sample size does allow estimates of migration at the UK level to be made, these estimates are subject to relatively wide margins of uncertainty.”
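The scale of the sampling problem can be sketched with a rough calculation. The interview and migrant counts below come from the ONS review quoted above; the total-crossings figure is a hypothetical assumption used only to illustrate how the margin of error behaves, not an official number.

```python
import math

# Figures from the ONS review: roughly 800,000 IPS interviews a year,
# of which under 5,000 respondents qualify as long-term migrants.
# TOTAL_CROSSINGS is an invented figure for all passenger journeys.
INTERVIEWS = 800_000
MIGRANTS_IN_SAMPLE = 5_000
TOTAL_CROSSINGS = 250_000_000  # assumption, not an official number

p = MIGRANTS_IN_SAMPLE / INTERVIEWS       # sampled migrant proportion
se = math.sqrt(p * (1 - p) / INTERVIEWS)  # standard error of that proportion
estimate = TOTAL_CROSSINGS * p            # scaled-up migrant estimate
margin = 1.96 * se * TOTAL_CROSSINGS      # 95% confidence half-width

print(f"estimated migrants: {estimate:,.0f} ± {margin:,.0f}")
```

Even under these simplified assumptions, the absolute uncertainty runs to tens of thousands of people, which is the “relatively wide margin” the ONS describes.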
The ONS says that, despite these deficiencies, the IPS is still “the best available source to measure migration” and that the survey is “fit for this purpose”. MPs conducted an inquiry into the IPS three years ago and disagreed, calling it “little better than a best guess”.
The survey is hampered by practical constraints
Travellers are questioned for the IPS only between 6am and 10pm, so it takes no account of overnight flights.
Research by the Financial Times shows that a third of flight departures after 10pm from Heathrow, the UK’s busiest airport, were to countries within the top 10 origin nations for international students in the UK, including China, Nigeria, Hong Kong, Saudi Arabia and Singapore.
The ONS acknowledges the problem of missed flights and passengers.
“There is the potential for coverage error in the IPS due to the exclusion of certain ports/routes and time periods from the sampling frame . . . about 5 per cent of travellers entering or leaving the UK are not covered by the IPS, as they travel at night when interviewing is suspended, or routes are too small in volume or too expensive to be covered.”
The ONS says that it weights the survey to compensate for the missing data.
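How such weighting works can be sketched in miniature. This is a toy inverse-probability example, not the ONS’s actual weighting scheme, and every number in it is invented for illustration.

```python
# Toy inverse-probability weighting: respondents on routes the survey
# covers poorly (night flights, small ports) are scaled up to stand in
# for the travellers who were never interviewed. All numbers invented.
routes = [
    # (sampled migrants, fraction of the route's traffic that is covered)
    (120, 0.95),  # daytime route, well covered
    (8,   0.40),  # route with many night flights, poorly covered
]

weighted_total = sum(count / coverage for count, coverage in routes)
print(round(weighted_total, 1))  # 120/0.95 + 8/0.40
```

The weakness is visible even in the toy version: the poorly covered route’s eight respondents must stand in for many unobserved travellers, so any quirk in who was interviewed there is magnified.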
IPS staff are also not allowed to approach unaccompanied schoolchildren. This means the survey may undercount emigration among young-looking international students, as well as school-age pupils at UK boarding schools.
The data become less reliable the more detail you look for
Official statistics are published with confidence intervals that express the margin of error. In the case of the IPS, these intervals become proportionally larger for more detailed estimates — such as the numbers of workers, or students — because they are based on much smaller samples than the headline net migration figure.
David Cameron, the former prime minister, was concerned enough about the survey to ask the ONS for a briefing on how it was compiled, according to former Downing Street aides. Statisticians responded that they could not be very confident of any figure below the net migration estimate.
Looking in detail at one group, for example, the data on Asian emigrants returning after a period of study in the UK have a confidence interval of 10,000 for an estimated figure of 40,000.
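As a rough illustration of why subgroup estimates are proportionally less precise, the arithmetic below treats the quoted interval of 10,000 as a plus-or-minus half-width and pairs it with an illustrative headline figure; both are assumptions for comparison, not official statistics.

```python
# All figures illustrative: the headline pair is assumed, and the quoted
# "confidence interval of 10,000" is treated as a +/- half-width.
headline_estimate, headline_margin = 330_000, 40_000  # assumed headline net migration
subgroup_estimate, subgroup_margin = 40_000, 10_000   # Asian students returning home

headline_rel = headline_margin / headline_estimate  # relative margin, about 12%
subgroup_rel = subgroup_margin / subgroup_estimate  # relative margin, 25%
print(f"headline: +/-{headline_rel:.0%}  subgroup: +/-{subgroup_rel:.0%}")
```

On these assumptions the subgroup figure carries roughly twice the relative uncertainty of the headline number, because it rests on a far smaller slice of the sample.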
Emigration is especially hard to measure
The ONS review of IPS methodology states that estimating emigration — including overseas students returning home — is “problematic” and contributes to “substantial uncertainty” in the net migration figures.
One of the reasons emigration is difficult to measure is that, whereas those arriving in the country usually have a clear plan about why they have come, those leaving may express uncertainty over when and if they are coming back; for example, if they are job-hunting but do not yet have a firm offer of employment.
The other problem is that while immigration data can be cross-checked against Home Office visa numbers, there is no equivalent source against which to check emigration.
This problem was meant to be solved by the introduction of exit checks in April 2015. But the Home Office says there has “not yet been time” to record complete data on travel movements for longer-term migrants, those in the country for more than a year, on whom the net migration number is based.
However, a former government adviser who has seen preliminary exit check figures told the FT that they indicate that emigration is significantly higher than is recorded in the survey.
People’s plans change
This is especially relevant for data on overseas students, whose plans are likely to be fluid: for instance, those leaving the UK after their course has ended may say they expect to return shortly but never do, meaning they are not counted as emigrants. Similarly, someone who came in on a student visa may leave after a year of employment, telling survey staff that they were in the country to work rather than study.
Another glitch is that students on masters courses may arrive intending to stay for more than a year but then leave after just 11 months, meaning they are counted as “long-term” migrants when they arrive but not when they leave, inflating the apparent net inflow.
As a result, estimates of how many students overstay in the UK after their studies vary wildly.
According to the passenger survey, about 93,000 non-EU international students are overstaying in Britain after the end of their studies. This has prompted Home Office proposals for further curbs to student visas.
But the Annual Population Survey suggests that only 30,000-40,000 non-EU migrants who came as students are still in the UK population after five years. Government visa data suggest the figure is about 40,000.
Madeleine Sumption, director of the Oxford Migration Observatory, says it is “clear” that “we just don’t know what the net migration of non-EU students is.”
She added: “In order for the IPS to be correct, you would have to have non-EU students overstaying on a massive scale. The question is, is that plausible?
“I think if it was happening on this scale it would be evident in other ways . . . for instance we might see large numbers of Chinese students entering the labour market.”
Those whose businesses have been affected by visa restrictions are particularly frustrated by the unreliability of the data. James Pitman, from Study Group, which prepares international students for UK universities, points out that international students account for a third of the total net migration figure.
“This has fuelled concerns about immigration, which may even have affected the Brexit vote,” he said. Mr Pitman is now considering whether to call for a public inquiry into how the figures have been “misinterpreted”.
Additional reporting by Sarah O’Connor