Philipp Steinacher

Interaction Designer and Developer

Designing for Inclusion

We as designers can help remove obstacles, enabling people to participate in society regardless of their individual challenges. Let’s imagine a future of adaptive systems that place human empathy at the core of their design.

The project „Designing for Inclusion“ was part of a class in my bachelor’s degree course in Interface Design at the Potsdam University of Applied Sciences, run in collaboration with Microsoft Research and the Microsoft Design Expo 2015.

At the beginning, the project had no real constraints in terms of an expected product or prototype. The task was to explore context-dependent disability and to understand disability not as an attribute of a person. Our design solutions were supposed to address a person’s permanent, situational and temporary constraints.

The goal was to create a product, service or solution for someone with a context-dependent disability. It had to meet a clear need, be extensible to wider applications and have a realistic chance of adoption.

Research

To familiarize ourselves with different impairments, what they mean to those affected and how people live with them on a daily basis, we did some basic, deliberately naive low-fidelity prototyping around a few topics. That gave us common ground and made sure we all felt we could discuss difficult topics in a shared, confident language.

Furthermore, we wanted to better understand the influence of design in the context of disability, so we consulted articles and books such as „Design Meets Disability“ by Graham Pullin, which argues that some impairments, combined with well-designed assistive devices, can make those affected superior to able-bodied people.

Quite quickly we settled on the topic of visual impairments, since we realized that partially sighted people form a huge group worldwide, even though there are already many assistive tools that help people in situations where they need support.

Interviews

To better understand how affected people live with constrained eyesight, we started to talk to experts and partially sighted people. We had the chance to follow and observe some of them during common daily tasks.

Soon enough we focused on a more specific area: how do visually impaired people go grocery shopping? We asked them to go grocery shopping with us and followed them to develop insights into their behaviour and understand their difficulties. In the end we could distil our findings into five main insights.

Problem

Based on our main insights, we concluded that visually impaired people face two big problems that we wanted to tackle:

First, the discovery of new products is challenging, frustrating and time-consuming. Exploration is therefore a luxury.

Second, people don’t step out of their known structures and instead buy the same things over and over again, because they know exactly where to find them.

Concept

Based on our initial research we developed a smart wearable: Polo is a discovery tool for blind and sighted people alike. It consists of a bracelet and a companion app that guide you around. The bracelet provides directional information through vibration. It is also equipped with a bone conduction hearing aid for communicating product information while keeping the user’s ears free. The slower you move, the more detailed the product information becomes.

We propose a mental model in which navigating the supermarket rests on three different knowledge layers that allow you to find different kinds of products or explore new items.

Taste Layer

Understanding the general layout of a supermarket allows you to find the area where a product is likely to be.

Architecture Layer

If you’re familiar with a specific store, you know which aisles to walk down to locate the item you’re looking for.

Product Layer

Knowing the items you buy frequently allows you to find them without even thinking about it.

While sighted people can easily use the top two layers to find new products and explore a diverse range of items, visually impaired people struggle to move knowledge up this hierarchy. Hence, it’s hard for them to find products they haven’t bought before.

To resolve this major issue, Polo guides the user through the supermarket. When the user walks at a regular pace, the bracelet announces the general contents of an aisle via the bone conduction hearing aid. As soon as the user slows down, the device announces what’s on the shelf in front of them. The moment the person reaches for a product, Polo calls out its name, and once the product is in their hand, Polo automatically tells them the price and relevant details about the ingredients.
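
As a rough illustration of this behaviour, the sketch below maps movement speed and hand state to the level of detail that gets announced. It is a minimal, hypothetical sketch: the function names, threshold values and example data are our own assumptions for illustration, not the actual Polo software.

```cpp
// Hypothetical sketch of the announcement logic described above: the level of
// detail Polo reads out depends on how fast the user moves and whether they
// are reaching for or holding a product. Names and thresholds are illustrative.
#include <iostream>
#include <string>

enum class HandState { Idle, Reaching, Holding };

struct ShelfContext {
    std::string aisleCategory;   // e.g. "breakfast cereals"
    std::string shelfContents;   // e.g. "granola, oats, muesli"
    std::string productName;     // e.g. "organic rolled oats, 500 g"
    std::string productDetails;  // e.g. "1.49 EUR, contains gluten"
};

// Map movement speed (m/s) and hand state to the announcement the bracelet
// would speak through the bone conduction element.
std::string announcementFor(double speedMetersPerSecond,
                            HandState hand,
                            const ShelfContext& ctx) {
    if (hand == HandState::Holding)
        return ctx.productName + ": " + ctx.productDetails;  // price and details
    if (hand == HandState::Reaching)
        return ctx.productName;                               // name only
    if (speedMetersPerSecond < 0.5)                           // walking slowly
        return ctx.shelfContents;                             // shelf level
    return ctx.aisleCategory;                                 // regular pace
}

int main() {
    ShelfContext ctx{"breakfast cereals", "granola, oats, muesli",
                     "organic rolled oats, 500 g", "1.49 EUR, contains gluten"};
    std::cout << announcementFor(1.2, HandState::Idle, ctx) << "\n";     // aisle
    std::cout << announcementFor(0.3, HandState::Idle, ctx) << "\n";     // shelf
    std::cout << announcementFor(0.3, HandState::Holding, ctx) << "\n";  // product
}
```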

Polo not only allows blind people to explore a grocery store with all its variety of products and tastes; it also helps a wide range of people who might need assistance in the store. We distinguish three different but overlapping user groups who would enjoy a more convenient shopping experience.

Permanent

Permanently visually impaired people can use Polo in their daily lives to reclaim the excitement of discovering different tastes on their own.

Temporary

Temporarily blind people, e.g. someone who has just had a treatment at the eye doctor, or people who have a hard time orienting themselves in a new store, can use Polo to follow their daily routine as usual without additional help.

Situational

Sometimes customers are pressed for time and need to find something quickly. Polo allows them to locate a specific product and navigate to it in no time and without obstacles.

Prototyping & Validation

To test and validate our concept we built a first prototype using tape, wires, vibration motors and a Spark Core microcontroller. The technology behind Polo isn’t too sophisticated: it uses Bluetooth localization to navigate the store and NFC tags to identify individual products.
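
To give an idea of how simple the firmware side can be, here is a minimal Wiring-style sketch in the spirit of our Spark Core prototype: four vibration motors, one per direction, pulsed to steer the wearer. The pin assignments, timings and the hard-coded direction are illustrative assumptions, not the actual prototype code.

```cpp
// Hypothetical Wiring-style firmware sketch for a Spark Core:
// four vibration motors indicate front, right, back and left.
// Pins and timings are assumptions for illustration only.

const int MOTOR_PINS[4] = {D0, D1, D2, D3};  // front, right, back, left

void setup() {
    for (int i = 0; i < 4; i++) {
        pinMode(MOTOR_PINS[i], OUTPUT);
        digitalWrite(MOTOR_PINS[i], LOW);
    }
}

// Briefly switch on the motor that points toward the next waypoint.
void pulseDirection(int direction, int durationMs) {
    digitalWrite(MOTOR_PINS[direction], HIGH);
    delay(durationMs);
    digitalWrite(MOTOR_PINS[direction], LOW);
}

void loop() {
    // In the real prototype the direction would come from the Bluetooth
    // localization; here we simply pulse "right" once per second.
    pulseDirection(1, 200);
    delay(800);
}
```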

We decided to create a low-fidelity prototype to work out interaction patterns and learn first-hand what would feel most natural when using such a device and service. With a working device on our arms, we could also test whether our assumptions about the whole experience held up.

To test the experience of navigating the store, we set up our bracelet with vibration motors and a small audio system, both of which we could control remotely. This let us check whether we could expand a person’s mental model of a grocery store by guiding a test subject around our virtual mock-up store. We tested different vibration patterns and different granularities of aisle category announcements. Because we wore the test platform on our own arms and could adjust the parameters on the fly, we could learn and improve the system while testing it with our users.

After some adjustments we realized that using four vibration motors to indicate direction works extremely well. Users could easily distinguish the different directions and follow them, even while trying to grab a small item. At all times the test subjects had a clear idea of their surroundings, even though they could see only partially or not at all. These promising results made us incredibly excited about the feasibility of such a product and about how it could make the shopping experience more convenient for everybody.

Retrospection & Learnings

This concept of contextual information, tied to movement patterns and presented through vibration and audio, adapts easily to other scenarios in which users find themselves in a more or less unknown three-dimensional information space. Imagine a library guiding visually impaired people to the book they want, or practically anyone in a museum using an audio guide: adjusting the density of audio information to the way a visitor moves through the museum, giving more detail at works they linger over and only the basics in sections they rush past, seems like a very promising basis for an audio guide, and not only for the visually impaired.

The design process itself taught us a lot about the value of good user research and of a prototype-driven approach. While we were very focused on getting the interaction details right, our advisors were very helpful in reminding us to think about the whole experience and to convey the message behind the product.

Background

Learn more about how the team created Polo in our more detailed project blog.

  1. Designing for Inclusion
  2. Research
  3. Interviews & User Studies
  4. Ideation & Prototyping
  5. Midterm Presentation
  6. Shooting a Movie
  7. Final Presentation

Polo was designed as part of the Microsoft Design Expo 2016. We are proud that the project was shortlisted for the Interaction Awards 2016 by the Interaction Design Association.


Team

Thanks to our advisors Prof. Boris Müller (FH Potsdam), Fabian Morón Zirfas (FH Potsdam), Richard Banks (Microsoft Research), Don Coyner (Skype) and Andreas Koller (Skype). Also to our partners: Microsoft Research, Institut für Unschärfe and Biocompany.