Illustrating for algorithmic bias

September 27, 2018

As part of the UnBias project I was asked to create illustrations for the Fairness Toolkit’s Trustscape and Awareness Cards. The toolkit is designed to raise awareness and create dialogue about algorithms, trust, bias and fairness. My involvement in the project started with a series of quick sketches for stickers to be used with the Trustscape. The sketches were made in response to the results of workshops with young people, who identified issues, themes and difficulties in the networked world and described a wide range of biases in algorithmic decisions and how they impact on people’s lives.


For the UnBias Awareness Cards the brief was to create a design for each of the eight suits: Rights, Data, Factors, Values, Process, Exercise, Example and Glossary. The fronts of the cards contain examples, activities, scenarios and information about algorithmic bias and the ways prejudiced behaviours can emerge in systems. My illustrations focused on how algorithmic decisions can affect people and communities. How do we know decisions are being made fairly and are not threatening rights? How do we know decisions are not based on gender or race? How do we know when we are in a social media bubble, what is real or fake, and what to trust?

At the same time I wanted the illustrations to celebrate some of the pioneering developments in computing, often made by people who wanted to enable others, and to reference the history of communication technologies, computation devices and predicting machines.

It was important for each card to be unique but for the common themes to flow through all of them. Across the cards you will find patterns and references to computation devices and processes: QR codes, punch cards, network diagrams, server arrays, excerpts of code for sorting algorithms, circuit board diagrams, flowcharts, early devices like the Difference Engine and Tide Predicting Machine No. 2, the Mac Classic, and the handheld devices and social media apps we use today. Since algorithms work behind the scenes of the web to filter and sort data, several cards feature machines used for measuring, weighing, sorting, ranking, dividing and filtering.

The main text styles are inspired by typefaces that have a relationship to the history of computing. ‘Factors’ is based on the early font for IBM’s Selectric electric typewriter, which went on to become one of the first machines to provide word-processing capability. ‘Exercise’ and ‘Example’ were inspired by the typefaces of early forms of electronic communication: telegrams, teletext and ticker tape. The lettering of ‘Data’, ‘Values’, ‘Rights’, ‘Process’ and ‘Glossary’ was inspired by fonts I had seen on early computation devices, like Pascal’s Typewriter, Babbage’s Difference Engine, and Kelvin’s and Ferrel’s Tide Predicting Machines, and by typefaces used on mass-produced adverts and posters during the Industrial Revolution.

The edges of the main title scrolls are decorated with mathematical motifs like > <, ( ), X, etc., and the outer borders are decorated with binary. One of the simplest ways of visualising an algorithm is with a flowchart, and the centre shape of each card is inspired by the frames used in flowcharts to represent different stages of a process: ‘stop/start’, ‘database’, ‘processing’, ‘decision’, ‘repetition’, ‘connector’.

UnBias Awareness Cards – Glossary Suit Illustration

Glossary is a bit different to the other cards: there is only one Glossary card, and it holds a definition of the word ‘ALGORITHM’. The images on the back reference various storage and processing devices: a reel-to-reel tape machine, a server array, a Mac Classic, an early word processor, a tablet, ticker tape, punch cards, FORTRAN cards, a blackboard and an abacus.

The card also celebrates some pioneers in mathematics. The algorithm on the computer screen and on the blackboard is Euclid’s algorithm for finding the Greatest Common Divisor (GCD). Dating back to Ancient Greece, it is one of the oldest algorithms still in use.
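For anyone curious what Euclid’s procedure actually does, here is a minimal modern sketch in Python. This is my own illustrative rendering of the well-known algorithm, not a transcription of the code shown on the card:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero; the last
    non-zero value is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # → 21
```

The same idea can be traced step by step on a blackboard, which is part of why it has survived for over two thousand years.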

The writing around the scroll border is made up of excerpts from Ada Lovelace‘s pioneering algorithm to calculate Bernoulli numbers. Written in the early 1840s, it is considered by some to be the first computer program. Ada was an English mathematician, thought to be the first computer programmer, and the work these excerpts come from is one of the most important documents in the history of computing.
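To give a flavour of what Lovelace’s program computed, here is a short modern sketch that generates Bernoulli numbers from the standard binomial recurrence. This is not her exact procedure for the Analytical Engine, just an illustration of the same mathematical target:

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> Fraction:
    """Bernoulli numbers B_0..B_n via the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1,
    solved for B_m at each step (convention B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -Fraction(1, m + 1) * sum(
            comb(m + 1, j) * B[j] for j in range(m)
        )
    return B[n]

print(bernoulli(2))  # → 1/6
```

Exact fractions are used because Bernoulli numbers are rationals; floating point would quickly lose precision.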

Standing at the chalkboard is Dorothy Vaughan, a leading mathematician and early programmer who worked at NASA and its predecessor, NACA, from the 1940s to the 1970s. Working in a time of racial segregation, she led the West Area Computing team. She was the first African American supervisor at NASA and one of very few women at that level, but was not officially acknowledged, or paid, as such for several years. She was visionary in her realisation that electronic computers would take over much of the work of the human computers, and taught herself FORTRAN and other languages, which she then taught to the other women, to be ready for the change. Her work fed into many areas of research at the Langley Laboratory, and she paved the way for a more diverse workforce and leadership at NASA today.

Grace Hopper was a groundbreaking programmer who, in the 1950s and 60s, pioneered machine-independent programming languages and created one of the first compilers, translating English words into the machine code that computers understood. Grace was an American computer scientist who realised that people would find computers far easier to use if they could program in English words and have those translated into machine code. She created FLOW-MATIC, the first English-like programming language, and was instrumental in the development of COBOL, which is still widely used today. She did much to increase understanding of computer communications, encouraged more women to enter the field, and urged people to experiment and take chances in computing.

A raven sits on the blackboard, watching, because corvids (ravens, crows, rooks etc.) are renowned for their problem-solving skills; the Crow Search Algorithm (CSA) is based on the intelligent behaviour of crows.

UnBias Awareness Cards – Data Suit Illustration

Mapping Perception

September 16, 2018

In memoriam Dudley Sutton:

MAPPING PERCEPTION from Andrew Kötting on Vimeo.

MAPPING PERCEPTION was a four-year collaboration between Giles Lane, curator and producer (Proboscis); Andrew Kötting, the acclaimed director of Gallivant, This Filthy Earth and Ivul; and Dr Mark Lythgoe, neurophysiologist at the Institute of Child Health, London.

MAPPING PERCEPTION examines the limits of human perception through an investigation of impaired brain function, making visible the connections between scientific and artistic explorations of the human condition and probing the thin membrane between the able and the disabled.

At the heart of the project is Eden, Andrew’s daughter. She was born at Guy’s Hospital, London, in 1988 with a rare genetic disorder – Joubert Syndrome – causing cerebral vermis hypoplasia and several other neurological complications. Eden thus participates in the project as both a catalyst and a cypher for a more general investigation into how we see the world and perceive difference.

MAPPING PERCEPTION had four main outcomes:

a 37-minute 35mm film
an immersive & environmental sensory installation
a book & CD-ROM
a website

UnBias Toolkit Workshops at V&A Digital Design Weekend

September 12, 2018

I will be running four workshops with Alex Murdoch exploring the UnBias Fairness Toolkit at the V&A’s Digital Design Weekend on Saturday 22nd and Sunday 23rd September. Each workshop is intended for different audiences and contexts in which the toolkit could be used.

UnBias Fairness Toolkit Educators Workshop
Seminar Room 1, Sackler Centre for arts education
Saturday 22, 11.30-13.30
Algorithms, bias, trust and fairness: how do you engage young people in understanding and discussing these issues? How do you stimulate critical thinking skills to analyse decision-making in online and automated systems? Explore practical ideas for using the UnBias Fairness Toolkit with young people to frame conversations about how we want our future internet to be fair and free for all.

UnBias Fairness Toolkit Industry Stakeholders Workshop
Seminar Room 1, Sackler Centre for arts education
Saturday 22, 14.30-16.30
The UnBias project is initiating a “public civic dialogue” on trust, fairness and bias in algorithmic systems. This session is for people in the tech industry, activists, researchers, policymakers and regulators to explore how the Fairness Toolkit can inform them about young people’s and others’ perceptions of these issues, and how it can facilitate their responses as contributions to the dialogue.

DESIGN TAKEOVER ON EXHIBITION ROAD
Sunday 23, 10.00-17.00
Celebrate ten years of London Design Festival at the V&A with a special event on Exhibition Road. Bringing together events by the Brompton Design District, Imperial College, the Natural History Museum, the Science Museum and the V&A, this fun-filled day of design, workshops and talks will offer something for everyone, and a unique way into the many marvels of Albertopolis.

UnBias Fairness Toolkit Workshops
Young people (12-22 yrs) 12.00-13.30
Open Sessions 15.30-17.00
What is algorithmic bias and how does it affect you? How far do you trust the apps and services you use in your daily life with your data and privacy? How can we judge when an automated decision is fair or not? Take part in group activities exploring these questions using the UnBias Fairness Toolkit to stimulate and inspire your own investigations.

Download the V&A DDW Brochure

Colleagues from Oxford University and Horizon Digital Economy Institute will also be running UnBias activities as part of the event:

UnBias
The Raphael Cartoons, Room 48a
Drop-in from 12.00-16.00
How do you feel about fake news, filter bubbles, unfair or discriminatory search results and other types of online bias? How are decisions made online? What types of personal data do you share with online companies and services? Do you trust them? Explore these through a range of activities, from Being the Algorithm to Creating a Data Garden, and from Public Voting to making a TrustScape of how you feel about these issues. Suitable for families.

UnBias Fairness Toolkit

September 7, 2018


The UnBias Fairness Toolkit is now available to download and use. It aims to promote awareness and to stimulate a public civic dialogue about algorithms, trust, bias and fairness. In particular, it looks at how algorithms shape online experiences and influence our everyday lives, and invites reflection on how we want our future internet to be fair and free for all.

The tools encourage not only critical thinking but civic thinking, supporting a more collective approach to imagining the future in contrast to the individually atomising effect that such technologies often have. The toolkit has been developed by Giles Lane, with illustrations by Alice Angus and exercises devised by Alex Murdoch, alongside contributions from the UnBias team members and the input of young people and stakeholders.

The toolkit contains the following elements:

  1. Handbook
  2. Awareness Cards
  3. TrustScape
  4. MetaMap
  5. Value Perception Worksheets

All components of the Toolkit are freely available to download and print under a Creative Commons licence (CC BY-NC-SA 4.0).

Download the complete UnBias Fairness Toolkit (zip archive, 18MB)
