Help App

The Help App is built for victims and witnesses of emergency situations, such as car accidents or heart attacks. It shortens the emergency service's response time by creating a communication channel between the call center and the user.

My role

I coordinated the human-centered design process in our team, conducted research, aggregated the results into early-stage prototypes, and designed the app's look and feel.

Visual Identity



The initial idea came about when I saw a person fall on the sidewalk and didn't know how to respond. Other people joined, but they were as afraid of doing something wrong as I was. The episode coincided with a design challenge on mobile health I was engaged in, which provided a good platform to pursue the idea.

Assessing how people behave under extreme stress proved to be a difficult task, since these are rare events. We therefore took a human-centric approach: we mapped out the involved stakeholders and their touchpoints, and gathered input from accident victims and rescuers, from experts (the Red Cross), and from analogous environments that deal with high-stress situations (the Women's Helpline).

From our research, we learned that the user interface needed to be clear and operable under extreme stress, even by elderly or injured users. We added support for speech- and hearing-impaired users through SMS communication. We used color and bold shapes to give the app a medical, highly functional look and feel, and used large, clear press areas while keeping the number of screens to a minimum. We used a mix of online and offline technologies to avoid dependency on a data connection.

Tools and methods

> Stakeholder map; customer journey; personas; interviews with users, experts; immersion in context and in analogous environments

> Sketch; Marvel Prototyping

Smart call

Imprecise positioning and mandatory questions lengthen calls to the emergency service. Here, the user calls the emergency service and the app sends over the GPS coordinates, phone number, and name of the user. A mix of online and offline technologies makes this possible in any situation.
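The transmission could be sketched roughly as follows. All names here (`EmergencyPayload`, `transmit`) and the message formats are illustrative assumptions, not the app's actual implementation; the real transport would depend on the call center's interface.

```python
# Sketch: send the caller's details over the data channel when available,
# otherwise fall back to a compact SMS. Names and formats are hypothetical.
import json
from dataclasses import dataclass

@dataclass
class EmergencyPayload:
    name: str
    phone: str
    lat: float
    lon: float

    def to_json(self) -> str:
        # Structured payload for the online (data) channel.
        return json.dumps(self.__dict__)

    def to_sms(self) -> str:
        # Compact, human-readable fallback for the offline (SMS) channel.
        return f"HELP {self.name} {self.phone} @{self.lat:.5f},{self.lon:.5f}"

def transmit(payload: EmergencyPayload, has_data_connection: bool) -> str:
    """Prefer the data channel; fall back to SMS when offline."""
    if has_data_connection:
        return "DATA:" + payload.to_json()
    return "SMS:" + payload.to_sms()
```

The point of the dual format is that the same caller data reaches the call center regardless of connectivity, which is what removes positioning and identification questions from the call.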


Smart SMS

Speech- and hearing-impaired people are currently unable to access emergency services directly, and being in a state of shock or partial disability makes typing difficult. The user completes a form, and the data is sent over as text, including the GPS coordinates.
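Serializing the form into a single text could look something like the sketch below. The field names and the 160-character single-SMS limit handling are assumptions for illustration.

```python
# Sketch: flatten the Smart SMS form into one text message.
# Coordinates are placed first so truncation can never drop them.
def form_to_sms(fields: dict, lat: float, lon: float, limit: int = 160) -> str:
    parts = [f"gps={lat:.5f},{lon:.5f}"]
    parts += [f"{k}={v}" for k, v in fields.items()]
    return ";".join(parts)[:limit]
```

Keeping everything in one message (rather than splitting across several) means the call center always receives the location even if free-text answers overflow.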

Aftercall support and guidance

The call center communicates the status of the situation and the expected rescue time. Situation-specific information is recommended to the user, pulled from the app's local storage, such as 'How to do CPR' with an accompanying acoustic tempo.
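A minimal sketch of the offline guidance lookup, assuming a bundled catalogue keyed by situation; the tags, titles, and tempo value are illustrative, not the app's real content.

```python
# Hypothetical offline guidance catalogue: (title, compression tempo in bpm).
GUIDES = {
    "cardiac_arrest": ("How to do CPR", 110),
    "bleeding": ("How to apply pressure to a wound", None),
}

def recommend(situation: str):
    """Return bundled guidance for the reported situation, if any."""
    return GUIDES.get(situation)
```

Because the catalogue ships with the app, the guidance (and the acoustic tempo) remains available without a data connection while the user waits for the rescue team.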


Visual and acoustic signals

This feature makes it easier to find the victim or the witness when the rescue team arrives at the accident site.