This post is part of the New Technologies and the Law in War and Peace Symposium.
The machine itself makes no demands and holds out no promises: it is the human spirit that makes demands and keeps promises. In order to reconquer the machine and subdue it to human purposes, one must first understand it and assimilate it. —Mumford, Technics and Civilization (1934)
The idea for collaboration on this book sprang out of a conversation at a sunny sidewalk café table with Bill Boothby over a year ago. I had attended yet another conference where the regulation of technology was being discussed, and was frustrated (as usual) by three particular things. First, the lack of understanding of the individual technologies under discussion. Second, the lack of imagination about how these technologies will connect with each other. Third, it seemed obvious to me that a connection needed to be made between the laws in peace and the laws in war, from which many lessons could be learned in both circumstances.
This book addresses my third concern, providing an analysis of how laws in peacetime and in war operate, and where their similarities, parallels and gaps lie. It is a great start to approaching the law on similar topics, in completely different contexts, in a way that has great potential to move both areas of law forward. Connecting the law in peacetime and wartime seems obvious to me, but as these areas of law have developed separately and with very different cultures, I learned that the idea of connecting them was not obvious to all. The conversation with Bill Boothby was one that I had had with many others, but Bill was the one who went on to corral, coordinate, and collaborate, and the result is this timely and extremely useful book.
Some months later, I was lucky enough to attend a conference in Stockholm run by SIPRI where security and technology were discussed. At this conference, again in one of the side conversations, a colleague shared the work she was doing with the International Organization for Standardization (ISO) on autonomy, and explained how this work, in an internationally agreed forum, could potentially help to move conversations about autonomy forward in a defence context, particularly where cyber-physical systems (CPS) were becoming more prevalent. The term CPS was brought back into the limelight by the World Economic Forum (WEF) in 2015, when it also identified current technological developments as the ‘Fourth Industrial Revolution’.
What the WEF refers to as the Fourth Industrial Revolution is powered by data, but is not limited to the use of data in artificial intelligence (AI). It includes the use of AI in aspects as mundane as our shopping recommendations, and as discriminatory as our health insurance policy rates or whether we get selected for some jobs (for now). More importantly, it encompasses the embedding of digital systems in physical systems, which are then built to scale and communicate with each other. The implications of this revolution are widespread and profound. They are already changing everything about our everyday lives: from our global geopolitics down to the way that we perceive ourselves and move in the world (think about social media and our phones). Some of these technologies have the clear potential to change who we are, and existing law almost certainly does not provide adequate frameworks and protections. Beyond the existing law, creative approaches to the rapidly changing applications of these systems will be needed to ensure that values remain protected and aspirations are upheld.
The legal challenges arising from current CPS fall into two broad categories. The first comprises the shorter-term legal issues captured by such areas as privacy, monopoly, or antitrust law. These are all very important and require lawyers in these fields to consider deeply how these laws are being undermined, and to ensure respect for the existing law.
The second category concerns the broader, changing ways in which we are interacting with the world and with each other. In the longer term, CPS, or artificial intelligence embedded in physical systems that go to scale, will challenge our broader legal system itself. What does it mean to apply administrative law principles to transactions in which no humans are involved? What does trust look like where there are no humans to interface with? How is trust established, and where can decisions be appealed, or even understood? Who is responsible for the actions of algorithms embedded in systems? These issues arise from my first two concerns with legal conferences discussing technology: the lack of understanding of the individual technologies under discussion, and the lack of imagination about how these technologies will connect with each other.
A better understanding of basic technologies by more lawyers is required, but it is not sufficient. A corollary of the writing contained in this book is that it is not only up to the lawyers. Computer scientists, technologists, and engineers, to name just a few, need to have a better appreciation of the risks arising from the creation of their technologies, whether through dual use, lack of transparency, deliberate misuse, or even the enabling of values which we, as a society, cannot support. We need to have more robust conversations about what it means for the human to be at the centre of all technological decisions, rather than for those decisions to be in the hands of a few powerful companies. Technology should be used to improve lives, and by that we mean the lives of all of humanity, in all its gender, geographic, and other diversity. This book is an excellent primer for ensuring that these goals are met: by surveying the existing law in peacetime and wartime, it provides an overview of where the gaps may be.
As Mumford said, we as humans, but particularly as lawyers responsible for good governance, have a responsibility to ‘understand’ the technology. Lawyers (and others) must also ‘assimilate’ the technology: they need the imagination to consider how these technologies will connect with each other and what legal issues will arise, in both the short and the long term, for all. Linking conversations about the development and use of technology in peacetime and wartime will become more important as these technologies connect, are embedded in physical structures, and go to scale. Many thanks to Opinio Juris for creating the space to contribute to this conversation. I consciously call it a conversation because we don’t yet have all of the answers. We are just starting to frame new questions, and it is these new questions that will be key in shaping the technologies that will, in turn, shape us. It is all of us, as lawyers and as humans, from all different walks of life and educational backgrounds, who must make demands and ensure that promises are kept.