
Tour of Usability Lab at Mad*Pow

The start-up world is filled with uncertainty. When entrepreneurs design a new app, device, or software product, they can never be certain how patients or customers will interact with it. Understanding usability factors is not always straightforward, but local experience design firm Mad*Pow has a new lab space set up to help engineers and designers better understand how people interact with their products.


The new usability lab is located in Mad*Pow’s Leather District office in downtown Boston. I recently had a chance to chat with Dan Berlin, Experience Research Director at Mad*Pow, check out the lab’s tools, and try an eye tracking experiment myself.

The core goal of a usability study is to understand the user’s relationship with an interface: how does it make her feel, what are her frustrations, and how might these impact the performance of the device or software? As Dan wryly puts it, nothing proves you have a problem with your interface like watching a grandma struggle and cry while trying to use it.

Mad*Pow’s usability lab is set up to allow engineers, designers, and developers to non-intrusively monitor a user’s interactions with an interface using techniques like qualitative interviews, eye tracking, and biometrics. A two-way mirror separates observers from study participants and allows for dynamic use of the space based on the needs of the study; the lab is set up for both usability studies and focus groups. Video and audio are piped into the local observation room, or to the cloud so that clients can view the study live over the Internet.

Dan gives us a tour and explains some of the features of the Usability Lab in the video:

Eye tracking can be used to optimize visual layout of interfaces, while biometric sensors track a participant’s non-verbal response to an experience. Dan hopes to bring biometric research to the forefront of user experience (UX) research through the use of galvanic skin response (GSR) and heart rate to determine whether the user is feeling stressed, frustrated, or excited. While I didn’t get plugged in to the biometric sensors, I did get to try out an eye tracking experiment where Dan asked me to find Will Powley’s profile on the Mad*Pow website. Check out how it works:

While these tools may seem high-tech, Dan noted that they represent the next step for usability research and remain areas of active development. A typical usability study today relies on qualitative data gathered from participants in a one-on-one session with a trained facilitator. The next step for UX research, according to Dan, is to capture biometric traces and marry them to the eye tracking data, so that researchers can tell what a person was feeling while looking at something specific on the screen.
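To illustrate what "marrying" the two data streams might look like, here is a minimal sketch that aligns galvanic skin response (GSR) samples with eye-tracking fixations by timestamp. The data formats, element names, and sampling rate are assumptions for illustration, not Mad*Pow's actual pipeline.

```python
# Hypothetical sketch: for each fixation window, average the GSR samples
# whose timestamps fall inside it, linking an arousal signal to what the
# participant was looking at. Timestamps are in milliseconds.

def mean_gsr_per_fixation(fixations, gsr_samples):
    """fixations: list of (start_ms, end_ms, target) tuples.
    gsr_samples: list of (timestamp_ms, microsiemens) tuples.
    Returns (target, mean GSR) per fixation; None if no samples fall
    in the window. Windows are half-open [start, end) so a sample on a
    boundary is counted only once."""
    results = []
    for start, end, target in fixations:
        window = [value for ts, value in gsr_samples if start <= ts < end]
        avg = sum(window) / len(window) if window else None
        results.append((target, avg))
    return results

# Example: three fixations on page elements, GSR sampled every 100 ms.
fixations = [(0, 300, "nav menu"), (300, 700, "search box"),
             (700, 900, "profile photo")]
gsr = [(0, 2.1), (100, 2.2), (200, 2.3), (300, 2.9), (400, 3.4),
       (500, 3.6), (600, 3.5), (700, 2.4), (800, 2.2)]

for target, avg in mean_gsr_per_fixation(fixations, gsr):
    print(target, round(avg, 2))
```

In this toy trace, the GSR spike during the second fixation would suggest the participant was most aroused (stressed or excited) while looking at the search box; a real study would still need the qualitative interview to tell those two states apart.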


If you’re interested in using the lab for your next project, please contact Dan Berlin at dberlin@madpow.com. Many thanks to Dan and the Mad*Pow team! If you are curious to learn more about biometrics, eye tracking, and usability, Dan also teaches a two-day course at Bentley University called ‘Measuring Emotional Engagement’ as part of the User Experience Certificate Program.

Shannon Moore


    Shannon is an Associate Consultant at DRG Consulting, where she helps clients in the life sciences approach strategic problems. As a newcomer to Boston, she's very excited about all of the medical innovation happening in her neighborhood, and loves learning about the people and resources that make it so vibrant. Shannon also holds a PhD in Biomedical Engineering, for which she studied the biomechanics of bone regeneration. She can be reached at shannon@medtechboston.com.



    1. Cristin says:

      How does incorporating the biometric measurements into a study design work? If the participant has something clipped to their finger, does that require a non-typing task? Looking forward to hearing more about it.


    2. Dan Berlin says:

      Hi Cristin. Yes, when the sensor is clipped to a participant’s finger, we limit the study to non-typing tasks. Hopefully, someone will create a wireless sensor that attaches to the wrist, so we can remove this limitation. Affectiva used to make one, but they discontinued it last year.
