Fair or Biased? Users’ Fairness Perception of Algorithmic Systems
Tsvi Kuflik - Department of Information Systems
Doron Kliger - Department of Economics
Orna Rabinovich-Einy, Avital Mentoviz and Avital Shulner Tal - Faculty of Law
Keywords: Machine Learning, Deep Learning, Neural Networks, Algorithms, Processing
Database Collection Grant 2021
Systems based on artificial intelligence and machine learning are part of our daily lives. These systems power search engines, recommendation and rating systems, and are used in a variety of areas such as filtering job and credit applications and assisting in legal and medical decision-making. Because these applications have a broad impact on our lives, it is necessary to examine the fairness and transparency of such systems.
The main problem is that such systems are generally regarded as “black boxes”: even when a system has been tested and deemed fair, its actions and results are not always clear to its users or perceived by them as fair. A lack of explanation of how the system works and of how and why its decisions were made may lead to biases, discrimination, and a sense of unfairness among users, which in turn can reduce their willingness to use the system.
Our research focuses on understanding users’ perceptions of unfairness and on reducing them. We explore how users’ demographic and personality characteristics, together with different characteristics of the system, affect users’ fairness perceptions, and how different explanations of a system can improve those perceptions. We further plan to investigate and develop personalized explanations, as well as guidelines for assessing users’ fairness perceptions of algorithmic systems. The study is multidisciplinary, combining Information Systems, Economics, Law, and Psychology. This research is partly supported by the Cyprus Center for Algorithmic Transparency and by a grant from the Data Science Research Center (DSRC) at the University of Haifa, Israel.