
Linepod
Building a mobile sensemaking platform for blind users (2016-2017)
Current displays for blind users struggle to convey spatial information. To interactively make sense of complex spatial data, users must often fall back on text-focused technology such as Braille displays or screen readers. Our lab previously presented Linespace, a sensemaking platform for the blind: a large interactive tactile display that uses lines as its primitive, on which users can feel glyphs, lines, and textures and receive audio feedback.
Our bachelor project built on Linespace's initial proof of concept with Linepod, a portable, interactive, 2-D spatial tactile display. Linepod prints high-quality tactile lines onto swell paper, senses touch input through an infrared layer above the paper, and provides interactive audio feedback via a smartphone. Linepod is self-contained and runs on battery power.
We demonstrate Linepod through several sensemaking applications: a tactile web browser that displays tactile outlines of websites to facilitate screen reading; a camera application with filter-based editing; and Minesweeper, which shows the potential for game-based problem solving.
My contribution during the project was making the device interactive and enabling application development. To that end, I
- reverse engineered the protocol of the infrared touch sensor,
- designed a communication protocol between the microcomputer in the device and an Android smartphone, and
- conceptualized and implemented a d3.js/Android framework that we used to build five applications.
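The actual protocol between the microcomputer and the smartphone is not documented here, so as an illustration only, the sketch below shows one common way such a link can be framed: a type byte, a length-prefixed JSON payload, and an XOR checksum. All names and the frame layout are hypothetical, not the protocol Linepod actually uses.

```javascript
// Hypothetical frame layout (NOT Linepod's real protocol):
//   byte 0      message type (e.g. 0x01 = touch event)
//   bytes 1-2   payload length, big-endian
//   bytes 3..   JSON payload, UTF-8
//   last byte   XOR checksum over all preceding bytes

function encodeFrame(type, payload) {
  const body = Buffer.from(JSON.stringify(payload), 'utf8');
  const frame = Buffer.alloc(4 + body.length);
  frame.writeUInt8(type, 0);
  frame.writeUInt16BE(body.length, 1);
  body.copy(frame, 3);
  let checksum = 0;
  for (let i = 0; i < 3 + body.length; i++) checksum ^= frame[i];
  frame.writeUInt8(checksum, 3 + body.length);
  return frame;
}

function decodeFrame(frame) {
  const type = frame.readUInt8(0);
  const length = frame.readUInt16BE(1);
  let checksum = 0;
  for (let i = 0; i < 3 + length; i++) checksum ^= frame[i];
  if (checksum !== frame.readUInt8(3 + length)) {
    throw new Error('checksum mismatch');
  }
  const payload = JSON.parse(frame.slice(3, 3 + length).toString('utf8'));
  return { type, payload };
}

// Example: the device reports a touch at paper coordinates (120, 85).
const touchFrame = encodeFrame(0x01, { x: 120, y: 85 });
const decoded = decodeFrame(touchFrame);
```

A checksum like this is the usual way to catch corruption on a noisy serial link; a real implementation would also need resynchronization after a bad frame.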

Linepod uses swell paper, a special paper whose surface swells up on contact with heat. By tracing this paper with a laser, the device raises tactile lines that a blind user can feel.