775 million adults worldwide are illiterate. We set out to harness the ubiquity of smartphones and internet connectivity, designing an app that teaches illiterate people how to read and write.
The thing that struck me most during my two-month stay in Abidjan, Ivory Coast, was the smartphone adoption rate of a population that is 60% illiterate. How can people who cannot read send and receive messages, manage their mobile money accounts, make calls and listen to music on their smartphones? And if the smartphone is already being used for all these tasks, could it be used as an educational tool, too?
Our research involved sitting in on literacy classes for refugees, going through academic journals on teaching methods, and taking inspiration from gamified language-learning apps and apps for children.
We challenged ourselves to design for people with complete illiteracy, believing that if we could teach them something, we would be able to teach less extreme cases, too. We started by exploring how human perception works before it is shaped by the education system. This sounds easier than it is, because our own biases run so deep that they are difficult to see through. After my relocation to Munich, we partnered with refugee centres and education centres to conduct our testing, running a series of explorations into how human perception works. I even started learning Arabic, in an attempt to understand how my mind deals with learning new symbols.
Informational hierarchies that might seem evident to the regular user are not self-evident to our users: in this case, the lower- and upper-case M are not perceived as two elements belonging to the same group.
One of the most difficult challenges was explaining a task to a user without written words (e.g. “do this, then that”) and without audio explanations (the refugees’ languages varied too much, and they were all still learning German). We were encouraged, however, by the fact that the best teachers never speak in the language of the student.
During testing, we found that hearing a word or letter before starting the task is crucial. We built in constraints that force the user to first hear the pronunciation of the word or letter, then used animations to explain how the exercise works.
We realised that with this unique user group, we had the opportunity not only to design a useful app, but also to challenge educational models and information architecture conventions.
From left to right: 1 & 2 - One of the most interesting ideas we tested was teaching the alphabet not alphabetically, but according to the structure of the keyboard; 3 - A chat-based teaching assistant, inspired by Duolingo and other chat apps.
Perhaps the most interesting learning of this project was just how complex our perception actually is. We take for granted the thousands of micro-skills we’ve acquired over our lives - from recognising symbols to identifying complex visual structures - which help us decipher the complex world we live in. Working on a project like Alo is a good reminder of the extent of our own ignorance.