The conceptual design of an intuitive, effective, and fun search experience based on gestures.
Those with hearing impairments who use sign language as their primary language face language barriers when using text-based search. For signers, typing text is like communicating in an entirely different language, and it creates frustrating obstacles to natural, intuitive articulation.
Recent technological advancements in sign language video translation have exposed an opportunity to make search more accessible and intuitive by enabling gestural, video-based input.
How might we design an intuitive, fun, and effective search experience based on gestures to provide relief from the constraints of text-by-default search, unlock signers’ more intuitive modes of articulation, and bring joy to the experience of searching with sign?
Multi-Modal User Experience
Case Study Deck
Problem & Opportunity
Design Principles & Style
Final Design & Prototypes
Mobile Widget Experience
This project underscored the importance of empathy and sensitivity towards user needs when designing inclusive digital products. It demonstrated the value of a deep-rooted understanding of the user's perspective, and that empathy must be at the forefront of every decision made throughout the design process.
The project also highlighted the significance of contextual research in uncovering valuable and timely opportunities for innovation. Through secondary research of academic papers, I identified the precise pain points and limitations that signers face when using text-based search. I validated these findings with desk research into hearing-impaired communities, guiding the direction of the design and ensuring that the solution genuinely addressed a pressing need.
The project reinforced the importance of simple and intuitive UI design, especially when dealing with complex use cases. I discovered that a clear and straightforward user interface not only enhances usability but also empowers users to interact with confidence.
In the end, the lessons learned from this project have not only enriched my design skills but also instilled in me a commitment to creating inclusive, user-centered digital experiences.
Next, I would like to test my interactive prototypes by conducting comprehensive user testing with hearing-impaired individuals, to ensure that the design effectively addresses their needs while also uncovering any lingering pain points in the user experience. This iterative approach will allow me to refine the design, making it more intuitive and efficient based on direct feedback from users.
I also plan to enhance the visual feedback the application provides by designing fully animated hand-sign and gestural sequences. These animations will serve as a crucial component of the user interface, helping users review the accuracy of their search queries. This visual feedback will not only enhance the user experience but also boost users' confidence in the system's understanding and responsiveness.
Finally, I plan to build the app to integrate seamlessly with other devices, such as Google Nest for video capture of users' sign queries and smartwatches for initiating sign-gesture searches on the go. These improvements will expand the conceptual design's compatibility and accessibility across platforms and devices, making it a seamless, intuitive part of users' daily lives.