Sex toys and smart robots: Who's liable?
By Peter Teffer
A wave of shock went through the audience of one of the panels at the RightsCon event in Brussels on Thursday (30 March), as technology consultant Ken Munro demonstrated that the camera on an internet-connected vibrator can be hacked.
The images of the camera feed, shown on a screen, were remarkably sharp.
Should autonomous robots have a legal status? (Photo: Paul Vera-Broadbent)
There were of course some questions about why a sex toy should have a camera, but that was not the point.
As we move toward a world with more and more devices connected to the internet, concerns arise about security, privacy, and human rights.
Take internet-connected sex toys, for example.
There is a market for remote-controlled pleasure devices, which could appeal to couples in long-distance relationships, or those in which one partner travels frequently.
“What happens when you think you are interacting with your partner, but someone has man-in-the-middled [hacked] your sex device?” said Amie Stepanovich, of Access Now, the main organiser of RightsCon, a conference on human rights and the internet.
“What happens when you find out that the person on the other end of that isn't somebody you know whatsoever? What are the legal implications, what are the security implications, what are the policy implications when you can start hacking into sex toys?”
According to Arthur Rizer, a former American prosecutor, the criminal ramifications depend on the country you live in.
“In Denmark, you have to have forced penetration. So hacking into a dildo that somebody is using, or sex toy that somebody is using, in Denmark would not be a crime under sexual assault,” he said.
In Belgium, the criminal law on rape contains the phrase “by whatever means without consent”, which seems to indicate that hacking a sex toy in Belgium could be considered rape.
In Ireland, it wouldn't be rape, but it could fall under the definition of sexual assault: “vaginal intercourse manipulated by any object by another person”.
“When they wrote the word manipulated, I highly doubt they were talking about this, but as a defence attorney, if I was a prosecutor I'd take that case to court any day of the week and I would probably win,” said Rizer.
In the above-mentioned examples, there is still a human involved who could be held liable for the malicious actions.
But what if the sex toy is self-learning? What if sex robots with artificial intelligence become predatory?
Liability
You can substitute connected fridges or self-driving cars for the sex toys: as we move towards a world built on the Internet of Things, liability questions become pressing.
“When software is in everything, where does the liability lie?” said technology lawyer Mishi Choudhary.
“I don't think anyone is ready. I also don't think the First World is ready in terms of regulation of these devices.”
The European Parliament is trying to put the issue of robotics and artificial intelligence (AI) on the political agenda.
In February, it adopted a text in which it asked the European Commission, which is formally the only EU institution with the power to propose laws, to submit “a proposal for a legislative instrument on legal questions related to the development and use of robotics and AI foreseeable in the next 10 to 15 years”.
Frankenstein and Asimov
The parliament resolution was unlike the texts MEPs usually discuss: it mentioned the “possibility that, in the long-term, AI could surpass human intellectual capacity”.
The text, supported by 396 MEPs and rejected by 123, included references to Frankenstein's monster in the work of Mary Shelley and science fiction author Isaac Asimov's laws of robotics.
MEPs said that “in the scenario where a robot can take autonomous decisions, the traditional rules will not suffice to give rise to legal liability for damage caused by a robot”, because it would not be possible to “identify the party responsible”.
One of the parliament's requests to the commission - the parliament having no legal right to demand action - was potentially far-reaching, and met with some scepticism at the conference.
MEPs asked the commission “to explore, analyse and consider the implications of all possible legal solutions”, including “creating a specific legal status for robots in the long run”.
Robotic personhood is 'dangerous'
The idea behind creating a legal status was that “at least the most sophisticated autonomous robots could be established as having the status of electronic persons responsible for making good any damage they may cause”.
But a former US ambassador to the United Nations Human Rights Council called the idea “very dangerous”.
“I understand the intention behind it, which is basically to ensure that somebody [is] accountable and liable for damages caused by robots,” said Eileen Donahoe at another RightsCon panel.
“But I don't think the consequences of giving personhood to robots have been thought through.”
Ben Wagner, researcher at the Centre for Internet and Human Rights, and an independent expert reporting to the Council of Europe, a human rights body, said the proposal had “the right idea, but perhaps not the perfect implementation”.
“We know how to make humans responsible, we don't know how to make algorithms responsible,” said Wagner.
He noted that the “danger” of giving personhood to robots or artificial intelligence programmes is that it can serve as a “get-out clause” for humans to shirk their responsibilities.
“Just saying robots are persons is a step too soon, and perhaps also a step too early, but the conversation needs to be had,” said Wagner.
Commission is evaluating
The European Commission is “carefully” looking at the parliament's suggestions, spokeswoman Nathalie Vandystadt told EUobserver on Thursday.
The report is “an important contribution to the public debate on this issue”, the spokeswoman said.
“We already have EU legislation applying to robots, for example related to safety and privacy. We are currently consulting on challenges related to liability and are evaluating legislation in this area,” she added.
The commission is currently gathering views from citizens and interest groups through a public consultation, asking for input on “emerging Internet of Things and robotics liability challenges”.
Document
- European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics