
TapType: AI-Assisted Hand Motion Tracking Using Only Accelerometers

[Image: TapType, figure 4]

The team at the Sensing, Interaction and Perception Lab at ETH Zürich, Switzerland, has developed TapType, an interesting text input method that relies solely on a pair of wrist-worn devices, which sense acceleration as the wearer taps on any old surface. By feeding the acceleration streams from a pair of sensors on each wrist into a neural-network classifier, whose output is then combined with a traditional probabilistic language model (predictive text, to you and me) via Bayesian inference, text can be entered at up to 19 WPM with an average error rate of 0.6%. Expert TapTypers report speeds of up to 25 WPM, which could be quite usable.
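
The heavy lifting is in that fusion step, so here is a minimal Python sketch of the general idea: the classifier's per-tap likelihood is multiplied by the language model's prediction for the next character and renormalised, per Bayes' rule. The alphabet, probabilities, and function names are all our own illustrative assumptions; the team's actual models live in the paper, not here.

```python
import numpy as np

ALPHABET = list("abcdefghijklmnopqrstuvwxyz ")  # assumed label set

def decode_tap(classifier_probs: np.ndarray, lm_probs: np.ndarray) -> str:
    """Fuse tap-classifier evidence with a language-model prediction.

    classifier_probs: P(sensor evidence | character) for a single tap,
                      as produced by the neural network.
    lm_probs:         P(character | preceding text) from the language model.

    Bayes' rule: the posterior P(character | evidence) is proportional to
    the element-wise product, so multiply and renormalise.
    """
    posterior = classifier_probs * lm_probs
    posterior /= posterior.sum()
    return ALPHABET[int(np.argmax(posterior))]

# Toy example: the classifier slightly prefers 'q' over 'w', but the
# language model's context makes 'w' far more plausible, so 'w' wins.
clf = np.full(len(ALPHABET), 0.001)
clf[ALPHABET.index("q")] = 0.49
clf[ALPHABET.index("w")] = 0.48
lm = np.full(len(ALPHABET), 1 / len(ALPHABET))
lm[ALPHABET.index("w")] = 0.30
lm[ALPHABET.index("q")] = 0.01

print(decode_tap(clf, lm))  # -> 'w'
```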


Details are a little scarce (this is a research project, after all), but the actual hardware looks pretty simple, based on the Dialog DA14695, a nice Cortex-M33-based Bluetooth Low Energy SoC. This is an interesting device in itself, containing a “sensor node controller” block capable of managing sensor devices connected to its interfaces independently of the main processor. The sensor used is the Bosch BMA456 3-axis accelerometer, which stands out for its low power consumption of just 150 μA.
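
For the curious, talking to a BMA456 from a prototyping host is straightforward. The sketch below is our own illustration using the Python smbus2 library on a Linux I2C bus; the register addresses and chip ID are our reading of the Bosch BMA4xx register map and should be verified against the BMA456 datasheet. The real TapType firmware would of course use the DA14695's sensor node controller, not a Linux host.

```python
from smbus2 import SMBus

# Assumed values from the Bosch BMA4xx register map -- verify against the
# BMA456 datasheet before relying on them.
BMA456_ADDR  = 0x18  # 0x19 if the SDO pin is pulled high
REG_CHIP_ID  = 0x00  # should read back 0x16 on a BMA456
REG_ACC_DATA = 0x12  # X LSB .. Z MSB: six bytes, little-endian int16
REG_PWR_CTRL = 0x7D
ACC_ENABLE   = 0x04  # acc_en bit

def to_int16(lsb: int, msb: int) -> int:
    """Combine two bytes into a signed 16-bit sample."""
    v = (msb << 8) | lsb
    return v - 0x10000 if v & 0x8000 else v

def read_accel(bus: SMBus) -> tuple:
    """Read one raw (x, y, z) sample from the accelerometer."""
    raw = bus.read_i2c_block_data(BMA456_ADDR, REG_ACC_DATA, 6)
    return (to_int16(raw[0], raw[1]),
            to_int16(raw[2], raw[3]),
            to_int16(raw[4], raw[5]))

with SMBus(1) as bus:  # I2C bus 1 on a typical Raspberry Pi
    assert bus.read_byte_data(BMA456_ADDR, REG_CHIP_ID) == 0x16
    bus.write_byte_data(BMA456_ADDR, REG_PWR_CTRL, ACC_ENABLE)
    print("raw sample:", read_accel(bus))
```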

The user can “type” on any convenient surface.

The wristband units themselves appear to be a combination of a main PCB carrying the BLE chip and its support circuitry, connected to a flexible PCB with a pair of accelerometers, one at each end. The whole assembly is then slipped into a flexible strap, likely constructed from 3D-printed TPU, but we’re only guessing, as the progression from the initial development platform to the wearable prototype is unclear.

What’s clear is that the wristband itself is just a dumb data-streaming device, and all the smart processing is done on the connected device. System training (and the subsequent selection of the most accurate classifier architecture) was performed by recording volunteers “typing” on an A3-sized image of a keyboard, with finger movements followed by a motion-tracking camera while the acceleration streams from both wrists were recorded. There are some additional details in the published paper for those who wish to dig a little deeper into this research.
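
Since the paper evaluates several candidate classifiers rather than publishing one canonical network, here is an entirely assumed stand-in for what a baseline might look like: a small 1D CNN over fixed-length windows of multi-channel accelerometer data, in PyTorch. The channel count, window length, and label set are illustrative guesses, not the authors' chosen architecture.

```python
import torch
import torch.nn as nn

N_CHANNELS = 12  # assumed: 2 wrists x 2 accelerometers x 3 axes
WINDOW     = 64  # assumed samples per tap window
N_CLASSES  = 27  # assumed: 26 letters + space

class TapClassifier(nn.Module):
    """Toy 1D CNN mapping one tap window to per-character logits."""
    def __init__(self) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average over time
            nn.Flatten(),
            nn.Linear(64, N_CLASSES),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, classes)
        return self.net(x)

model = TapClassifier()
dummy = torch.randn(8, N_CHANNELS, WINDOW)  # a batch of tap windows
print(model(dummy).shape)                   # torch.Size([8, 27])
```

A softmax over these logits would give the per-character probabilities fed into the Bayesian fusion step sketched earlier.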


Eagle-eyed readers may recall something similar from last year, from the same team, which correlated bone-conduction sensing with VR-style hand tracking to generate input events in a VR environment.
