Ethics in AI Research Seminar (Wednesday - Week 6, HT22)

Visions of future technology from the early 19th century imagined many ways in which it would improve our lives: bringing more time for leisure, restoring calm, helping us make better decisions, and enabling us to live better, more economical, and more fulfilling lives. Instead, people today are tricked, cajoled, surveilled, harassed, and even blackmailed by apps and interactive systems on a daily basis, often in ways that do not serve their best interests. What happened? While researchers have decried the rise of so-called "dark patterns", we believe there is a far broader and more problematic lack of consideration of how systems treat people, a problem that grows as systems become more capable (e.g. mixed-initiative and social) as well as more pervasive. In this talk, we attempt a simple idea: to apply the philosophy of respect to the design of systems, demonstrating that respect is an incredibly versatile and useful construct for thinking about the many different ways individuals should, and indeed deserve, to be treated.

Max Van Kleek is Associate Professor of Human-Computer Interaction in the Department of Computer Science at the University of Oxford.  He works in the Software Engineering Programme, where he delivers course material on interaction design, the design of secure systems, and usability.  He also leads (as Co-Investigator) the EPSRC PETRAS project ReTIPS: Respectful Things in Private Spaces.  Until 2017, he led the interaction research theme in the EPSRC Project SOCIAM, heading several SOCIAM projects at the intersection of personal and social data systems and architectures.  His current project is designing new Web architectures to help people regain control of information held about them "in the cloud", from fitness data to medical records.  He received his Ph.D. from MIT CSAIL in 2011.

To receive joining instructions, please register here.
Ethics in AI Research Seminar Convenors: John Tasioulas and Carina Prunkl | Any queries should be directed to