Crowdsourcing images of hands, GiveAHand.ai aims to bridge the AI gap for sign language

While the AI landscape is developing at breakneck speed, sign language recognition has been held back by a lack of large, well-structured data sources. To fix that problem, the American Society for Deaf Children (ASDC) has launched GiveAHand.ai, an ambitious project to create the world's largest open-source image library of hands, complete with tagged data.

Anyone can upload an image of their hand using the website and a webcam. Participants hold their hand in whichever gesture they choose, and a picture is taken and uploaded to the database. Next, the image is tagged, and keypoints are added to define the location and shape of each finger. Researchers and developers can download the full, structured data set to create more accurate hand and finger detection models.
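For readers wondering what "tagged data with keypoints" might look like in practice, here is a minimal Python sketch of a hypothetical annotation record and loader. The JSON layout and field names (image, handedness, keypoints) are illustrative assumptions for the sake of the example; GiveAHand.ai's actual schema may differ.

```python
# Minimal sketch of how a keypoint-annotated hand image could be structured
# and loaded for model training. Field names are illustrative assumptions,
# not GiveAHand.ai's published schema.
import json
from dataclasses import dataclass


@dataclass
class HandRecord:
    image_path: str                       # path to the uploaded webcam image
    handedness: str                       # "left" or "right"
    keypoints: list[tuple[float, float]]  # (x, y) pixel coordinates, one per finger joint


def load_records(annotation_file: str) -> list[HandRecord]:
    """Parse a JSON annotation file into HandRecord objects."""
    with open(annotation_file) as f:
        raw = json.load(f)
    return [
        HandRecord(
            image_path=item["image"],
            handedness=item["handedness"],
            keypoints=[(p["x"], p["y"]) for p in item["keypoints"]],
        )
        for item in raw
    ]


# Example usage with a hypothetical annotation file:
# records = load_records("giveahand_annotations.json")
# print(f"{len(records)} tagged hand images ready for training")
```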

Since the goal is to democratize access, GiveAHand.ai's database will be available for both commercial and non-commercial use. The initiative was developed in partnership with creative studio Hello Monday (part of DEPT®), which previously worked with ASDC on fingerspelling.xyz.

Trend Bite

Improvements in hand models could unlock novel learning methods and real-time translation tools for sign language, similar to those already available for spoken languages, bringing new levels of language accessibility to the deaf and hard-of-hearing community. And thanks to GiveAHand.ai, anyone can pitch in.

Advancements in technology have the potential to level the playing field for consumers like never before, connecting people no matter how they communicate. As the American Society for Deaf Children points out: 'Deafness is not a disability, but language deprivation is.' If you need a refresher on why accessibility is the new social justice frontier, be sure to (re)read our OMNIBILITY trend briefing.