Session
Organizer 1: Alex Comninos, Association for Progressive Communications
Organizer 2: Deborah Brown, Association for Progressive Communications
Speaker 1: Jelena Jovanovic, Technical Community, Eastern European Group
Speaker 2: Vidushi Marda, Civil Society, Asia-Pacific Group
Speaker 3: Alex Comninos, Civil Society, African Group
Association for Progressive Communications
Break-out Group Discussions - 60 Min
- Alex Comninos and Deborah Brown (APC) will moderate.
- Jelena Jovanovic (cybersecurity professional) will provide an overview of the concepts of algorithmic transparency, algorithmic justice, and algorithmic bias, with real-life examples of the effects of algorithms from an information security perspective.
- Vidushi Marda (Article 19) will provide an overview of the human rights aspects of automated decision-making from a policy and human rights perspective, focusing on GDPR Article 22 and the EU guidelines on automated decision-making.
Alex Comninos, Civil Society, Male, African Group
Deborah Brown, Civil Society, Female, Western European and Others Group
Jelena Jovanovic, Technical Community, Female, Eastern European Group
Vidushi Marda, Civil Society, Female, Asia-Pacific Group
Lorena Jaume-Palasi, Civil Society, Female, Western European and Others Group
Joy Liddicoat, Academia, Female, Western European and Others Group
Karen Reilly, Technical Community, Female, Western European and Others Group
Chinmayi Arun, Academia, Female, Asia-Pacific Group
Malavika Jayaram, Civil Society, Female, Asia-Pacific Group
1. Introduction to the issues by the speakers (25 minutes)
The speakers will, in twenty minutes (five minutes per speaker), introduce the problems that automated or algorithmic decision-making poses from a human rights perspective. Algorithmic justice, algorithmic bias, and algorithmic transparency will be introduced as concepts, along with the technical, legal and human rights issues they raise.
2. Break-out group discussion I (25 minutes)
Groups will ask how algorithms affect their lives and identify problems that algorithms could cause for them (15 minutes). Groups will report back (10 minutes).
3. Break-out group discussion II (25 minutes)
Groups will discuss technical and policy solutions for ensuring that algorithms can provide a right to explanation (15 minutes). Groups will report back (10 minutes).
4. Panel discussion of the groups' responses (10 minutes)
The speakers will respond to the report-backs and the issues raised by the groups.
5. Questions from the audience to the panelists (15 minutes)
The speakers each briefly introduce their own concerns and interventions regarding AI.
Groups will ask how algorithms affect their lives and identify problems that algorithms could cause. The groups will be broken up thematically.
One person shall report back from each group
Group discussion shall feed into a final outcome document.
For more info on group discussion, see the agenda.
All breakaway groups, whether online or offline, will have to discuss and come up with an output for presentation. Online participation will be done via breakaway groups on the collaborative online notepad Etherpad, which allows participants to chat as well as to draft a document for presentation at the feedback sessions. Online participation will be advertised through Twitter and the remote participation (RP) platforms. Groups that include on-site IGF participants will also be encouraged to use Etherpad to develop and report on their discussions. An ideal online participation outcome would involve the on-site and remote participants working on the same Etherpads, thus building bridges with remote participation.
Part 1: Lightning talks - 25 minutes
- Each speaker gives a "lightning talk" of max 2 minutes on their specific area of intervention/expertise.
Part 2: Breakaway group discussion - 20 Minutes
- Breakaway groups discussing different aspects of algorithmic transparency
- The remote participants will organise an online breakaway group
- Someone from each group volunteers as rapporteur
Part 3: Report back from breakaway group discussions - 10 Minutes
- Rapporteurs report back and display their flip charts
- Remote participants report back via the internet
- Some panelists take notes and document the discussion in order to create an outcome document for the event.
Part 4: Questions - 5 - 10 minutes
Wrap up with questions and interventions from audience and remote participants.
Report
Session Type: Breakaway Sessions
Date and Time: 12 November 11:20-12:20
Organisers: Alex Comninos & Deborah Brown
Rapporteurs: Alex Comninos, Deborah Brown & Joonas Mäkinen, as well as delegated members of the breakaway groups.
- Session Type (Workshop, Open Forum, etc.): The session was meant to consist of breakaway sessions, but there were too many attendees, so we had an open discussion with the audience instead.
- Title: Algorithmic transparency and the right to explanation
- Date & Time: 12th November 11:20-12:20
- Organizer(s): Alex Comninos (VOUS.AI) & Deborah Brown (APC)
- Chair/Moderator: Deborah Brown & Alex Comninos
- Rapporteur/Notetaker: Alex Comninos & Joonas Mäkinen
- List of speakers and their institutional affiliations (Indicate male/female/ transgender male/ transgender female/gender variant/prefer not to answer):
- Please state no more than three (3) key messages of the discussion. [150 words or less]
- The terms really need to be unpacked. "AI" means many things and contains many concepts. The explainability of systems involves a lot of unpacking and addressing algorithms, code and systems, both technological and human.
- AI has a social component; the human systems that manage AI will be very important going forward.
- Code is made by people and can copy social biases from humans, whether because the training data is skewed or because of flawed coding.
- Please elaborate on the discussion held, specifically on areas of agreement and divergence.
There was broad support for the importance of protecting personal data and of meaningfully unpacking the right to explanation. There was consensus that the right to explanation is important; there was, however, less consensus on how it would be achieved in practice, and on whether it is achievable at all. Some participants focused on the importance of transparency institutions. Others focused on the challenges of explaining automated systems: even with open code, and without complex AI, a system is hard to explain without a deep understanding of the development process and documentation of that process. It was suggested that transparency is a more achievable goal.
- Please describe any policy recommendations or suggestions regarding the way forward/potential next steps. [100 words]
There needs to be more incentive to make the development process open, so that automated decisions are explainable. We need to extend explainability beyond narrow technical and mathematical explanations towards meaningful explanations that lay people can understand.
- What ideas surfaced in the discussion with respect to how the IGF ecosystem might make progress on this issue? [75 words]
We need more stakeholder discussion on AI, and more focus next year on algorithmic transparency and the intersection of AI and data protection.
- Please estimate the total number of participants.
350
- Please estimate the total number of women and gender-variant individuals present.
The panel was all women.
- To what extent did the session discuss gender issues, and if to any extent, what was the discussion? [100 words]
Gender issues were raised by panelists and participants and were linked to bias and issues of justice in decision-making.