Artificial Intelligence: Robot rights and regulation
Fiona Moss, Associate at Mundays Solicitors, examines the complex legal issues surrounding Artificial Intelligence.
What is Artificial Intelligence?
Artificial Intelligence (AI for short) broadly refers to man-made computers and systems that can be made to act in a manner we would call intelligent, i.e. that can make decisions based on sentiment rather than pure logic and can essentially ‘think’.
Does Artificial Intelligence affect me?
Artificial intelligence promises to change our lives in a multitude of ways, from cleaning our homes and driving our cars to diagnosing disease before doctors can.
Is this not just science fiction?
No! A lot of the more ambitious AI projects are still some way off, but there are plenty of AI offerings already in use. Do you talk to Siri? Does your phone recognise faces in photos and tag them automatically? Many websites now offer customers the chance to chat with a customer support representative while they browse – but not every site has a live person on the other end of the line. These, along with computer games that respond to your actions, are all forms of AI that we use daily.
The reality is that AI machines make decisions based on pre-programmed code: they learn to do so from a set of rules fixed by a human. As the technology develops, the need for regulation is being considered in several countries.
Regulations
Earlier this year, the European Parliament’s legal affairs committee voted to begin drafting a set of regulations to govern the development and use of AI and robotics. The European Parliament’s report suggests that robots, bots, androids and other manifestations of AI are poised to “unleash a new industrial revolution, which is likely to leave no stratum of society untouched”. What is clear is that the European Parliament is taking AI seriously.
Will robots have legal status?
Included in the report is preliminary guidance on what the European Parliament calls “electronic personhood”. The granting of some form of legal status makes it easier to deal with concepts such as liability and ownership. We are familiar with companies being given the status of a corporate ‘person’ and this is similar to what has been suggested for AI.
Will robots be able to own things?
Yes. If AI is granted a separate legal status, a machine will be able to own physical property in the same way that a company can. When it comes to a machine’s ownership rights in the things it produces, i.e. reports, articles, creations and derivations of software or other ‘works’, we apply existing copyright laws.
In England and Wales, copyright arises without registration and belongs to the author (unless the work is made by an employee in the course of their employment, in which case the employer is the first owner). For computer-generated works, legislation specifically states that the author is the person who makes the arrangements necessary for the creation of the work.
For now, therefore, the law is clear: any text written by a machine will be owned by the author of the computer programme.
Will robots have other rights?
Whilst AI will be able to own things and enter into contracts if given legal status, the extension of human rights to machines was not considered in the European Parliament’s report. This may be a step too far at this stage of technological advancement.
What about liabilities?
There is no ‘one size fits all’ when it comes to robot regulation. If a machine is given legal personality it could take part in legal cases as both claimant and respondent – that is, it could sue and be sued – but at present there is no legal framework that applies specifically to AI.
In the absence of such a framework, robots are treated much like complicated pieces of machinery and, depending on who uses them and for what purpose, different regimes will apply. A drone delivering parcels, for example, will need to comply with civil aviation laws, and responsibility for non-compliance would likely fall on the person giving the instructions. Such laws would not be applicable to, say, a handwriting-recognition appliance.
Decisions still need to be taken as to who is the most appropriate person to pursue where AI has caused damage. Take the example of the self-driving, fully autonomous car: who should be liable in the event of an accident where there is no human ‘driver’? Should the manufacturer be responsible, or the owner, or perhaps the producer of any navigation system the car is following? The European Parliament takes the view that self-driving cars are “in most urgent need of European and global rules” and warns against fragmented regulatory approaches. It also calls for a new mandatory insurance scheme and a compensation fund to cover damage caused by robots.
How does AI impact on data privacy?
Much of the development of AI requires the gathering of personal information and, arguably, data protection legislation is lagging behind when it comes to AI.
Are we aware of how our personal information is being used to develop AI? Have we consented to its use? Face-recognition software, for example, by its nature involves the gathering of imagery, often without the knowledge of the person being tracked.
The Data Protection Act 1998 has been in force for 18 years and, whilst it has been flexible enough to adapt to developments in technology such as mobile apps and cloud computing, whether it can adapt suitably to AI remains to be seen. The difficulties may be heightened when the new EU General Data Protection Regulation takes effect in May 2018.
What happens next?
The European Parliament’s report is the first time the European legislature has considered AI and robots and their potential impact on society, and it paves the way for European legislation to be put in place. It is likely to be a couple of years before any new legislation is in effect, however. Of course, post-Brexit, our government will not be required to adopt such laws, but it may follow Europe’s example in implementing an AI framework.
essence info
Mundays LLP, Cedar House, 78 Portsmouth Road, Cobham KT11 1AN
Telephone: 01932 590500
Website: www.mundays.co.uk
Fiona Moss, an Associate at Mundays LLP, specialises in corporate and commercial law. She deals with acquisitions and disposals, joint venture/shareholder arrangements and investment, as well as general corporate governance, compliance and procedural issues.
On the commercial side, Fiona covers general commercial agreements, distribution, licensing and consultancy, and is a franchise specialist acting for franchisors and franchisees alike.
Fiona can be contacted by telephone on 01932 590611 or by email at fiona.moss@mundays.co.uk.