We are pleased to invite submissions for our forthcoming Workshop on Intelligent Machines and Human Behaviour, to be held as part of the AISB 2019 Conference in Falmouth, UK, 16th-18th April 2019. Artificially intelligent machines are becoming increasingly prevalent in modern society and are likely to play an important, even ubiquitous, role in future everyday decision making. This trend is likely to accelerate as new techniques for automated reasoning and machine learning are applied to decision making in real-world domains. That these machines will have a great impact upon human society is beyond doubt.

Such machines have the potential to improve nearly every aspect of human life, particularly where artificial intelligence can overcome the well-known shortcomings in human decision making identified by behavioural economists. Insights from behavioural economics underpin the rise of 'nudge' initiatives, and are themselves subject to ethical critique. However, there is also the potential for AI machines to act to the detriment of people. For every cancer successfully detected at an early stage, there could be a bank computer denying (or approving) a mortgage, or an autonomous vehicle making a poor decision about whether to swerve around an obstacle or brake in an emergency. This is not to ascribe explicitly malicious intent, but merely to recognise that most current, and likely future, machine systems will be as imperfect as those who created them.

Additional complexities stem from the interplay between intelligent machines and human society, and a further layer of risk is added once humans with malicious intent are included. Whilst a machine can help recognise a person's poor behaviours, for example eating too much junk food, and can in turn help manage that person's behaviour in order to form better habits, such an approach could also be applied without informed consent. This raises the possibility that a sufficiently motivated organisation could attempt to manage the behaviour of whole electorates and so influence the political direction of a nation. This may sound far-fetched, but it is merely an automated version of the recent application of 'nudge' techniques to politics in the United Kingdom. Whilst machines can help people to live better lives, or reach their full potential, there is also the mirror scenario of machines being used to manage an individual's behaviour to that individual's detriment. Thus the study of how machines, in particular intelligent machines that can learn to recognise behaviours and respond accordingly, interact with humans, and how human behaviour can be directly or indirectly affected as a result, is a topic of timely and deep importance.

Submission

We welcome full papers (limited to 6 pages using the AISB format) or position statements (limited to 3 pages using the AISB format) on thematic topics in the area of intelligent machines and human behaviour. This gives participants the opportunity to discuss both more mature work, without prejudicing further conference submissions, and early-stage work or topics of interest. All submissions will be peer-reviewed by members of the programme committee. AISB uses the ECAI format and templates are available from:

http://aisb2019.falmouthgamesacademy.com/programme/submissions/

Accepted papers will be grouped into thematic sessions that incorporate extensive time for questions and discussion. The workshop will close with a town-hall discussion designed to map the papers onto the interdisciplinary landscape and to foster future collaborations between participants. Please submit to the organising committee via EasyChair:

https://easychair.org/conferences/?conf=imhb2019

by 14 January 2019. Full details are available from the workshop website at http://arg.napier.ac.uk/events/imhb

Organising Committee

  • Simon Wells s.wells@napier.ac.uk (School of Computing, Edinburgh Napier University)
  • Kate Pangbourne K.J.Pangbourne@leeds.ac.uk (Institute for Transport Studies, University of Leeds)
  • Hannah Bowden hannah.bowden.19@ucl.ac.uk (UCL Department of Behavioural Science and Health, University College London)