Tickets

Free, booking required.

Date

Mon 21 Mar
12:00 pm – 1:30 pm

Venue

Melbourne Connect - Launch Pad
700 Swanston Street, Carlton VIC, Australia

Access

Wheelchair Access

Two Can Productions

Discriminatory Algorithms

Presented by Melbourne Connect


In this presentation, the Centre for AI & Digital Ethics (CAIDE) gives an introduction to how machine learning techniques learn to make decisions that discriminate against minorities, and discusses why removing this discriminatory bias is difficult. Machine learning offers powerful techniques for discovering patterns in data, which can then be used to make new inferences. However, recent research shows that these techniques can be unreasonably biased against minority groups; for example, hiring algorithms that exclude women, and recidivism algorithms that assign higher risk scores to people who aren’t white. Such algorithms can operate at a scale not possible for human decision makers, meaning that impactful decisions harmful to people can become widespread.

In this interactive presentation, we will lead participants through a series of short exercises that demonstrate the basics of machine learning, how and why machine learning techniques learn discriminatory biases, how some biases can be discovered and mitigated, and why mitigation techniques do not entirely solve the problem.
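As one illustration of how such biases can be discovered, the minimal sketch below computes a demographic parity gap: the difference in favourable-outcome rates between two groups. This example is not drawn from the presentation itself; the group labels and model outputs are invented purely for illustration.

```python
# A minimal sketch (not part of the presentation) of one common bias check:
# demographic parity, i.e. comparing the rate of favourable predictions
# across two groups. All group labels and model outputs below are invented.

from typing import Sequence


def selection_rate(preds: Sequence[int]) -> float:
    """Fraction of individuals who received the favourable outcome (1)."""
    return sum(preds) / len(preds)


def demographic_parity_gap(preds_a: Sequence[int], preds_b: Sequence[int]) -> float:
    """Absolute difference in selection rates between two groups.

    A gap near 0 suggests parity; a large gap flags possible discrimination.
    """
    return abs(selection_rate(preds_a) - selection_rate(preds_b))


if __name__ == "__main__":
    # Hypothetical hiring-model outputs: 1 = shortlisted, 0 = rejected.
    group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # e.g. majority-group applicants
    group_b = [0, 1, 0, 0, 1, 0, 0, 0]  # e.g. minority-group applicants

    print(f"Selection rate gap: {demographic_parity_gap(group_a, group_b):.2f}")
```

A measure like this can reveal a disparity, but equalising a single metric does not by itself remove the underlying bias, which is part of why mitigation techniques do not entirely solve the problem.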

Participants

Professor Tim Miller

Professor Tim Miller is a pioneer in the field of AI-assisted decision making and explainable AI. His work focuses on human-AI interaction and collaboration, explainable AI, accountability and trust. Tim has extensive experience developing novel and innovative solutions with industry and defence collaborators. He is the Deputy Head of School (Academic) in the School of Computing and Information Systems and Co-Director of the Centre for AI and Digital Ethics. He is best known for his ability to convey complex artificial intelligence concepts to those outside his field and to ground his explanations and critiques in theories from the social sciences.

Dr Marc Cheong

Dr Marc Cheong is currently a Senior Fellow (at Melbourne Law School) and Senior Research Fellow in Digital Ethics (at the Centre for AI and Digital Ethics, CAIDE), Faculty of Engineering and Information Technology; and an Honorary Burnet Institute Senior Fellow. In 2022, Marc will take on a lecturing role in the Information Systems group in the School of Computing and Information Systems. He is interested in the intersection of technology (big data, social media, etc.) and philosophy (existentialism, ethics, epistemology, and experimental philosophy).

Gabby Bush

Gabby Bush is the program manager at the Centre for AI and Digital Ethics (CAIDE). In this role, Gabby coordinates the work of CAIDE, including engagement, research dissemination, grants and projects, with work spanning monitoring and surveillance, bias in algorithms, and the CAIDE research stream in Art, AI and Digital Ethics. Gabby joined the Centre from Canberra, where she spearheaded engagement and partnerships in technology and development. Prior to that, Gabby ran the eGovernance and Digitisation project for the United Nations Development Programme in Samoa. Gabby hails from Aotearoa New Zealand and has postgraduate qualifications in International Development and Religious Studies.