
What is LEGROB?

The Union is called to play an essential role in establishing legal rules and ethical principles on the liability of smart robots (embodied artificial intelligence, AI) that are consistent with the EU’s contribution to society, without stifling innovation – an aim set forth by the European Parliament Resolution of 16 February 2017, with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), Recitals U–Y. This task has yet to be accomplished.

Smart robots are no longer the stuff of science fiction. They are already at work in areas as diverse as transportation (e.g. driverless cars, which are expected to amount to 75% of the total by 2040), health care (e.g. nanotech robots or robotics assisting in oncology), financial services (robo-advisors) and the legal profession (e.g. ROSS Intelligence and similar tools used in discovery, due diligence and other tasks traditionally performed by junior lawyers). Robots are expected to take over one third of traditional professions by 2025. And there is still great room for their technological development, which means we are only witnessing the beginning of the so-called fourth Industrial Revolution.

Legal certainty in this field is expected to foster business and innovation. The EU must take action if it does not want to live by standards set by others (European Parliament Resolution of 16 February 2017, Recital S), such as the US and China, the forefront competitors in AI. In view of the current stage of development of robotics and AI, the European Parliament has considered it appropriate to start with the regulation of liability issues (ibid., Recital Y). This standpoint is shared by the Commission, which has stated that it will explore the need to adapt the current legislative framework to the new technological landscape posed by, among other things, AI, in particular as regards civil law liability (Commission Communication on the mid-term review of the implementation of the Digital Single Market Strategy, COM(2017) 228 final, p. 11). Indeed, the liability of smart robots requires urgent action, given both the inadequacy of the existing rules and the potential harm that robots can cause. An example illustrates the point. In an accident caused by a driverless car, the manufacturer might not be liable, as would be the case for traditional products. The accident could instead be attributable to the programmer of the software or result from incorrect training of the car. In terms of insurance, it is not clear that the owner of the car should cover the damage, since he or she is not driving (i.e. not causing the accident). Ethical issues also arise, for instance when the car must decide (based on previous training) whether to harm third parties or to avoid doing so and instead put the owner’s life at risk. The topic thus cuts across several disciplines.

The aim of this Module is to debate, teach and conduct research on how the liability of smart robots should be regulated by EU law. The Module assumes the positive effects of a common EU legal framework on robotics, in the understanding that it would foster both the internal market and the external competitiveness of European businesses. It aims to contribute to the effort of regulating robotics by training future experts (lawyers, policy makers, businesspersons and technology managers) and conducting research, with the goal of bridging the gap between policy makers and academics from a European perspective.

The Module is designed to modernize our degree in Law (including dual degrees, such as the dual degree in Law and International Relations, LLB + BIR) by introducing an assessment of the liability of smart robots across the curriculum. This methodology is innovative in taking a project-based approach, in which an issue is analyzed from a variety of legal disciplines and embedded in the traditional discourse of each of them. The innovation also arises from the fact that students will be trained to think about a future regulation – unlike customary teaching, which starts from existing hard law. The aim is to begin preparing future lawyers and policy makers for the regulation of a topic that will be of prime relevance during their professional lives, and to provide them with a European vision with which to approach it.

The Module will also target students of professions directly involved in robotics. They will be introduced to EU legal studies in the particular field of liability of robots, thus promoting EU studies beyond their current framework at the institution and beyond what is customary at national level. This will be achieved through the project-based approach, which is legal in nature but engages the students most directly affected, whether in law or not. Additionally, these students will enrich the Module by providing insights on business and technology – a particularly valuable asset for the members of the Module in a topic that is constantly evolving.

From the perspective of scholars, the Module will improve their teaching and research abilities by introducing them to a cutting-edge issue that will be addressed in a multidisciplinary manner. This multidisciplinarity is core to the research activities as well as to the audience the Module addresses.

If Europe is to remain a leader in robotics, future professionals need to receive modernized training with a Union vision. Equipping students with the capacity to think about the regulation of robotics (in addition to the technical skills they are already acquiring) will place them in the avant-garde of their respective fields of expertise while making them knowledgeable about the potential of EU law to address the most innovative of issues.

By: Professor Francisco de Elizalde
