How AI impacts work

Professor Marleen Huysman, head of the KIN Center for Digital Innovation, is leading social science research on how Artificial Intelligence (AI) is being applied in organizations. While societal debates on AI are often characterized by either scary or optimistic projections of AI automation, Marleen fosters research that is more down-to-earth but no less impactful. Her main message is that AI is happening now and that it is not necessarily making organizations more intelligent.

Unlike many other social scientists, Marleen incorporates AI technology into the theory she develops, and she also invests much more in understanding how the technology changes practice.

“Social scientists use surveys or interviews to find out how people experience AI systems. The problem is that these methods fail to show how these systems are changing practice – especially in the long term.”

Instead, by embedding researchers in organizations, KIN research is revealing these changes. AI developers often falsely presume that intelligence can be extracted from human heads and outsourced to an AI system, which can then be used anywhere. In reality, that intelligence lies in social practices where professionals encounter problems and help each other out. Organizations therefore experience unintended ripple effects, such as having to introduce new roles to interpret the abstract output of AI systems. If organizations try to outsource the expertise of their professionals to AI, they risk losing social intelligence that they will not be able to regain.

Ph.D. student Lauren Waardenburg investigates how predictive policing works out in practice. Predictive policing relies on a smart algorithm that tries to predict, based on data about past crimes, where and when crimes will occur in the future. This system should help police officers prevent crimes, but Lauren found that it also has other implications in practice. What does happen has revealed itself over the past 2.5 years of her ongoing fieldwork. These revelations did not come for free:

“You need to have a tireless curiosity in the people and what their work is really about, as well as the technology and how it really works.”

Sometimes Lauren skypes with fellow Ph.D. student Elmira van den Broek to compare their experiences. Elmira is investigating how an AI system is developed and used to assist the internal recruitment process of a multinational. Like Lauren, Elmira has already gained many valuable insights from being embedded in the company.

Marleen is developing a collaborative model for AI development in organizations. Lauren, for example, collaborated closely with the data scientists who developed predictive policing and now shares their joint insights in an article for the entire Dutch police force leadership.

The collaborative AI model suggests that data scientists should borrow from embedded research methods, and that data science curricula should teach social science concepts. A sincere effort and genuine interest in understanding social practices in organizations are needed to bridge the gap between AI developers and users.

Author: David Passenier