• UAV at work in Afghanistan: the military technology is to protect EU and US borders in future (Photo: US Air Force)

UK counter-terrorism chief pledges respect for civil liberties

14.11.10 @ 11:37

By Andrew Rettman

LONDON - The junior minister responsible for counter-terrorism in the UK, the EU's most surveillance-heavy country, has promised to keep civil liberties high on her agenda. But some new developments in security technology are making their own creators feel "uncomfortable."

Speaking on Friday (12 November) at the Global Security Challenge (GSC), an industry event in London, Pauline Neville-Jones underlined the gravity of the threat from Islamic radicals, as well as growing threats in Northern Ireland and from cyber-attacks.

She said that "planes falling out of the sky" can be "election-determining events" for governments as well as causing loss of life. But she added that protecting free societies and free trade is at the heart of her work.

"We do regard the whole side of the civil liberties agenda ... as being absolutely crucial to what we are trying to do. There's absolutely no point in having a safe society which is not a free one. You know - we might as well be in prison," she said. "We cannot, ladies and gentlemen, jam up, jam up the workings of international trade because we have to inspect each and every package which goes through the system."

The GSC meeting saw mostly Australian, British, Canadian, Israeli, Singaporean and American small and medium-sized firms pitch for investments from the UK Home Office, the US Department of Defense and US intelligence services.

EU and US authorities, including the Warsaw-based Frontex agency, are exploring the use of Unmanned Aerial Vehicles (UAVs), or spy drones, to improve border security. The Observer, a British weekly, reported on Sunday that the UK plans to use drones to gather intelligence on future anti-austerity demonstrations in the country.

The technology is still in its infancy. But Nato countries are experimenting with UAVs on the Afghanistan-Pakistan border which can fly for long periods at low altitudes in harsh terrain and bad weather. They can be mounted with face-recognition kits which capture people's details even if they are on the move in low light, cross-check them with databases of suspects and send alerts to smartphones. They can also be armed, but UAV strikes tend to kill 10 civilians for every one militant they hit.
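A minimal sketch of the capture, cross-check and alert flow described above, assuming a simplified watchlist format, a toy similarity score and invented function names; it is illustrative only and not based on any real system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    face_template: tuple   # simplified stand-in for a biometric template
    latitude: float
    longitude: float

def similarity(a, b):
    """Toy similarity score between two templates (real systems use trained models)."""
    return sum(1 for x, y in zip(a, b) if x == y) / max(len(a), 1)

def check_against_watchlist(detection, watchlist, threshold=0.9):
    """Return the best watchlist match above the threshold, if any."""
    best = max(watchlist, key=lambda entry: similarity(detection.face_template, entry["template"]))
    score = similarity(detection.face_template, best["template"])
    return (best, score) if score >= threshold else (None, score)

def send_alert(entry, detection):
    """Placeholder for pushing a notification to an operator's smartphone."""
    print(f"ALERT: possible match for {entry['name']} "
          f"at {detection.latitude:.4f}, {detection.longitude:.4f}")

# Hypothetical watchlist entry and one detection from the drone's camera.
watchlist = [{"name": "suspect-001", "template": (1, 0, 1, 1, 0, 1)}]
detection = Detection(face_template=(1, 0, 1, 1, 0, 1), latitude=34.5, longitude=69.2)

match, score = check_against_watchlist(detection, watchlist)
if match:
    send_alert(match, detection)
```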

The US Office of Naval Research Global (ONRG) is meanwhile looking at how to sift massive amounts of online data, at the level of exabytes, to help make tactical decisions and to spot terrorist threats.

One idea is to make private internet users do the work without knowing what they are doing when they carry out simple online tasks. The model for this is the so-called Recaptcha project for digitising the archives of the New York Times (NYT). Part of Google, Recaptcha supplies other websites with squiggly images of letters that internet users have to decipher in order to confirm they are human. The squiggly images are in fact bits of New York Times archives that could not be read by automatic scanners, and the internet users' transcriptions are fed back to Recaptcha to fill in the gaps.
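As a rough illustration of this crowd-sourcing idea, the sketch below (with hypothetical names and thresholds, not details of the real Recaptcha service) pools many users' readings of a word the scanner could not decipher and accepts a reading once enough of them agree:

```python
from collections import Counter

class TranscriptionPool:
    """Collects users' readings of unreadable archive words until a consensus emerges."""

    def __init__(self, consensus_votes=3):
        self.consensus_votes = consensus_votes  # assumed threshold for this sketch
        self.answers = {}    # word_id -> list of user answers
        self.resolved = {}   # word_id -> agreed transcription

    def submit(self, word_id, user_answer):
        """Record one user's reading; return the agreed text once consensus is reached."""
        if word_id in self.resolved:
            return self.resolved[word_id]
        votes = self.answers.setdefault(word_id, [])
        votes.append(user_answer.strip().lower())
        answer, count = Counter(votes).most_common(1)[0]
        # Accept the reading once enough independent users give the same answer.
        if count >= self.consensus_votes:
            self.resolved[word_id] = answer
            return answer
        return None

# Hypothetical usage: three users decipher the same scanned word.
pool = TranscriptionPool()
pool.submit("nyt-1912-p3-w17", "tariff")
pool.submit("nyt-1912-p3-w17", "tariff")
print(pool.submit("nyt-1912-p3-w17", "tariff"))  # prints "tariff" once consensus is reached
```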

In contrast to an upcoming EU bill to help people delete online private data, ONRG associate director John Callahan wants maximum data availability. Asked at the GSC event in London if the internet can be made to "forget" people, Dr Callahan said: "No. And we shouldn't even try ... There is an appeal for privacy, but in order to process the data, to make big decisions, we need powerful analytics."

The GSC event was briefly interrupted by a small group of students in a sign of public unease about the security industry in general. "We don't want to disrupt your meeting. We just want to say that we're disgusted by this and we want you to stop and think what you're doing," the group's spokeswoman said.

Most delegates were politely amused. But some of them grapple with ethical problems in the course of their work.

Outpacing the law

One scientist at the GSC meeting who designed guidance systems for Tomahawk missiles in the run-up to the 2003 Iraq war told this website: "We had the Pentagon screaming blue murder for us to give them the go ahead. But I couldn't do it until the system was 100 percent right. These things have to fly over people's homes, over hospitals. These are people's lives we're talking about."

On the unmanned drone side, technology is outpacing the law. In some cases it is unclear who would be to blame if an automated vehicle accidentally caused financial or physical harm. It is also unclear whether it is legal to store and share visual data gathered on EU or US citizens.

Signe Redfield, who works on underwater spy robots for the ONRG, said that normally the operator who programmed the UAV is liable if it crashes into something. But some drones are clever enough to decide where to go by themselves. "What happens if all the operator did was to launch it?" she asked.

Another US contact indicated that the Pentagon is researching devices that can be pre-programmed to automatically shoot at certain types of target in certain areas. "I don't know of any government that is developing this kind of technology [UAVs] that isn't looking into this," the source said. "I'm not comfortable with it."

Noting that the next frontier in robotics is artificial intelligence - truly autonomous decision-making - the contact added: "What if the machine decides itself [who to shoot] - are you going to take it to court?"