
Sign Access

 

Typed Signs

Typed Signs is a series of ongoing projects to improve accessibility to Sign Languages in both analogue and digital worlds.

 
 

The Why

Sign Languages like ASL (American Sign Language) are used within Deaf communities. They are distinct from English, meaning they have different grammatical structures and words that have no English equivalent (like the French term “déjà vu” or the Japanese term “komorebi”). Unfortunately, written forms of signed languages are currently unintuitive and difficult to produce. The ability to write in one’s native language is important, as it allows information to be mentally processed without constantly translating between two completely different languages, one of them subvocal (a common cognitive load for signers).

The ability to print Sign Languages would facilitate their propagation, especially to the many areas with limited deaf resources (98% of the deaf population have no access to education in sign language).

In addition, an accessible encoding for Sign Language opens it up to digital technologies. It can help computers recognise signing, which is useful for communicating with “Artificially Intelligent Assistants” (akin to how voice is used for Siri, Google Assistant, Amazon Echo, etc.). It can also be useful within VR (Virtual Reality) or MR (Mixed Reality), and for creating Animated Avatars that can sign, for mediums like Cartoons and Videogames. This in turn facilitates cultural awareness and education of the language.


Processes

The purpose of the Typed Signs Lab is to improve accessibility to Sign Languages like ASL.

Community feedback is incredibly important for these projects. Conducting in-person user research alone is insufficient given the broad range of geographies and demographics. Hence, we design and release multiple pilots at a rapid pace. These pilots offer crucial insight into the design and planning of future product iterations.

These products and experiments include apps, dictionaries, and investigations into designs built upon Computer Vision and Artificial Intelligence. All are part of this design process.


Typed Signs Lab – Portal

A preliminary website showcasing the pilots open for public testing. These are functional prototypes, i.e. ones that do not rely on wizard-of-oz or other in-person user research methods.


iMessage App (Public Pilot)

The Typed Signs iMessage App consists of animated ASL signs in addition to a finger-spelling composition area. Stickers can also be sent and received via the Apple Watch.
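For illustration only, here is a minimal sketch of how animated fingerspelling stickers could be served to an iMessage extension through Apple’s Messages framework. The one-GIF-per-letter asset layout and file names are assumptions for this sketch, not the project’s actual assets or architecture.

```swift
import Messages

// Hypothetical data source serving one animated sticker per letter of the
// ASL manual alphabet. Asset names ("a.gif" ... "z.gif") are placeholders.
final class FingerspellingStickerSource: NSObject, MSStickerBrowserViewDataSource {
    private let stickers: [MSSticker]

    override init() {
        var loaded: [MSSticker] = []
        for letter in "abcdefghijklmnopqrstuvwxyz" {
            guard let url = Bundle.main.url(forResource: String(letter),
                                            withExtension: "gif"),
                  let sticker = try? MSSticker(contentsOfFileURL: url,
                                               localizedDescription: "ASL letter \(letter)")
            else { continue }
            loaded.append(sticker)
        }
        stickers = loaded
        super.init()
    }

    func numberOfStickers(in stickerBrowserView: MSStickerBrowserView) -> Int {
        return stickers.count
    }

    func stickerBrowserView(_ stickerBrowserView: MSStickerBrowserView,
                            stickerAt index: Int) -> MSSticker {
        return stickers[index]
    }
}
```

An MSStickerBrowserView inside the extension’s view controller would then be pointed at this data source so users can compose fingerspelled words sticker by sticker.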

We suspected the increased fidelity (in animations and visuals) would increase usage over pilots reliant on Formal SignWriting. The difference in usage was night and day. Fingerspelling was also much more popular than expected (roughly a 50-50 split with natural signs).

However, there are drastically fewer signs in this pilot (compared to those that leverage the SignWriting database). Procedural animation is currently being investigated to bridge the gap between quantity and quality.


 
 

ASL Words (Public Pilot)

ASL Words is an app for looking up ASL signs by typing English words. Signs can also be found when a user searches for anything on their phone via Spotlight. The 10,187 signs here originate from the SignWriting database.
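As a rough illustration of the Spotlight integration, each sign can be registered with iOS’s Core Spotlight index so that an on-device search for its English gloss surfaces the entry. The identifiers and field values below are placeholders, not the app’s actual schema.

```swift
import CoreSpotlight
import MobileCoreServices

// Sketch: register one sign with the on-device Spotlight index so a search
// for its English gloss opens the corresponding entry in the app.
func indexSign(gloss: String, uniqueID: String) {
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeImage as String)
    attributes.title = gloss                                   // e.g. "HELLO"
    attributes.contentDescription = "ASL sign for \(gloss) (SignWriting entry)"

    let item = CSSearchableItem(uniqueIdentifier: uniqueID,
                                domainIdentifier: "asl-words",
                                attributeSet: attributes)
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error { print("Spotlight indexing failed: \(error)") }
    }
}
```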

Though there has been positive feedback from students, most users seem to find SignWriting too abstract and low in visual fidelity. Hence, this pilot and future features (e.g. categories) are currently de-prioritized.


 
 

Daily ASL Bot

Building on Giphy and SignWithRobert's release of over 2,000 ASL GIFs, an automated Twitter bot was created to drive engagement on Twitter. This was tied to preliminary research into online social networks and outreach.


 
 

ASL Lookup by English (Public Beta)

A search engine to look up existing ASL signs listed on popular sites.

After collecting various ASL links, we explicitly wanted an interactive experience that delivered instant search results as a user typed. Existing frameworks resulted in designs with delays of several hundred milliseconds. Hence, a custom search function was created to allow instant lookup of over 12,000 ASL glosses.
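A minimal sketch of the kind of in-memory prefix index that makes instant-as-you-type lookup feasible over roughly 12,000 glosses. The structure and sample data are illustrative assumptions; the actual custom search function is not shown here.

```swift
// Sketch: pre-sorted, lowercased gloss keys so that each keystroke is a
// binary search followed by a short scan, rather than a framework round trip.
struct GlossIndex {
    private let entries: [(key: String, gloss: String)]

    init(glosses: [String]) {
        entries = glosses.map { (key: $0.lowercased(), gloss: $0) }
                         .sorted { $0.key < $1.key }
    }

    /// Returns all glosses beginning with `prefix`, suitable for updating
    /// results on every keystroke.
    func matches(forPrefix prefix: String, limit: Int = 20) -> [String] {
        let p = prefix.lowercased()
        guard !p.isEmpty else { return [] }

        // Binary search for the first entry >= prefix.
        var lo = 0, hi = entries.count
        while lo < hi {
            let mid = (lo + hi) / 2
            if entries[mid].key < p { lo = mid + 1 } else { hi = mid }
        }

        // Collect consecutive entries that share the prefix.
        var results: [String] = []
        var i = lo
        while i < entries.count, results.count < limit, entries[i].key.hasPrefix(p) {
            results.append(entries[i].gloss)
            i += 1
        }
        return results
    }
}

// Usage: build once at launch, then query on each keystroke.
let index = GlossIndex(glosses: ["HELLO", "HELP", "HOUSE", "THANK-YOU"])
print(index.matches(forPrefix: "he"))   // ["HELLO", "HELP"]
```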


ASL Lookup by Sign (Public Beta)

Currently, ASL signs can only be found by their English gloss. This is a significant obstacle to language acquisition. This pilot allows signs to be looked up by their visual appearance.

The ongoing design is informed by academic and quantitative research into the distribution of signs in ASL. Keys are mapped to handshapes, motions, contacts, and locations. As informed by other pilots, the entries here need to be of a higher visual fidelity.
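A rough sketch, under assumed field names, of how a sign entry could be described by its visual parameters and filtered as keys are pressed; the pilot’s real schema and key mapping may well differ.

```swift
// Hypothetical sign descriptor; field names, values, and example data are
// illustrative assumptions rather than the pilot's actual schema.
struct SignEntry {
    let gloss: String           // English gloss, e.g. "MOTHER"
    let handshapes: Set<String> // e.g. ["5"]
    let motion: String          // e.g. "tap"
    let contact: String         // e.g. "chin"
    let location: String        // e.g. "head"
}

// Narrow the candidate set as the user selects a handshape and a location.
func lookup(byHandshape handshape: String,
            location: String,
            in entries: [SignEntry]) -> [SignEntry] {
    return entries.filter { $0.handshapes.contains(handshape) && $0.location == location }
}
```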


Research into Design possibilities (Internal Testing)

These projects are concerned with maximizing accessibility in remote areas, hence we design for digital consumer devices rather than more bespoke solutions.

We hypothesized that advances in these mediums allow for new interactions, such as the ability for mobile devices to recognize hand-signs in real time. Preliminary tests (leveraging Deep Neural Networks, Computer Vision, and the iPhone 7's A10 chip) have proved fruitful. Such investigations contribute to the overall direction as well as detailed design decisions.
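As one hedged illustration of on-device recognition (using Apple’s Core ML and Vision frameworks, which may differ from whatever the original tests used), a compiled image classifier can label the handshape visible in a single camera frame. The model file name below is a hypothetical placeholder.

```swift
import Vision
import CoreML

// Sketch, not the pilot's actual pipeline: run a bundled Core ML handshape
// classifier ("HandshapeClassifier.mlmodelc" is a placeholder name) over one
// camera frame via the Vision framework.
func classifyHandshape(in frame: CGImage,
                       completion: @escaping (String?) -> Void) {
    guard
        let url = Bundle.main.url(forResource: "HandshapeClassifier",
                                  withExtension: "mlmodelc"),
        let coreMLModel = try? MLModel(contentsOf: url),
        let visionModel = try? VNCoreMLModel(for: coreMLModel)
    else { completion(nil); return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Highest-confidence label, e.g. a handshape such as "flat-B".
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```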

(Another exploration determined that it is quick and easy to perform live translation of English words to ASL via text, speech, or camera. However, as cool as it is, this is not a priority for actual users.)


Why Now?

Principles of Human-Centered Design (HCD) and Interaction Design (IXD) have largely remained the same over the last 30 years. However, as with the advent of the Printing Press and the Graphical User Interface (GUI), the mediums in which our services reside have changed drastically. This decade has seen the rise of smartphones, crowdsourcing, and big data (artifacts which were not present in earlier attempts at writing signed languages).

Invisible leaps in Deep Learning in 2012 and improvements to mobile processors over the last couple of years have been precursors to the astronomically different services we can now design and create. In this case, it is now possible to design for an evolving, multi-dialect, 3D language like ASL. We can do things like accurately match searches to individual user intent, process vastly more feedback, transcribe signs from camera feeds in real time, and much more.


Overall

Sign Languages are 3-dimensional languages with conceptual origins; they open up dimensions of thinking that aren’t easily found in linear languages with phonetic origins. Typed Signs is an ongoing series of projects to ease communication within sign languages, in an endeavor to open up new worlds.

As Ludwig Wittgenstein puts it: "The limits of my language are the limits of my world."
