Ultrasound-Based 3D Tracking of Hand Position with AI

Embedded system for ultrasound-based 3D hand tracking with HW-accelerated AI inference, reaching a performance of 18 frames per second.

We have developed and implemented an ultrasound-based 3D hand tracking device with HW-accelerated AI inference. The device uses an ultrasound speaker to generate periodic 40 kHz chirps, and 2 × 16 microphones arranged in two orthogonal lines. The ultrasound signal arriving at the microphones is processed by a beamformer algorithm, which creates a 3D reflection space serving as input to the AI algorithm for tracking the 3D hand position.
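As an illustration of the beamforming step, the sketch below shows a minimal delay-and-sum beamformer in Python. The sampling rate, array size, and integer-sample delays are illustrative assumptions, not the parameters of the actual device.

```python
import numpy as np

def delay_and_sum(signals, delays):
    """Delay-and-sum beamformer: undo each channel's steering delay
    (integer samples here; real systems interpolate) and average."""
    out = np.zeros(signals.shape[1])
    for sig, d in zip(signals, delays):
        out += np.roll(sig, -d)
    return out / len(signals)

# Toy scene: a 40 kHz tone reaching 4 mics with known integer-sample delays.
fs = 192_000                              # assumed sampling rate
t = np.arange(512) / fs
tone = np.sin(2 * np.pi * 40_000 * t)
delays = [0, 2, 4, 6]
mics = np.stack([np.roll(tone, d) for d in delays])

steered = delay_and_sum(mics, delays)     # steered toward the true direction
unsteered = delay_and_sum(mics, [0] * 4)  # wrong steering: partial cancellation
print(np.sum(steered**2) > np.sum(unsteered**2))
```

Coherent summation boosts echoes from the steered direction while off-axis energy partially cancels; sweeping the steering over directions and ranges is what fills the 3D reflection space.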

During training of the AI model, the system generated the ultrasound chirps and collected the 3D ultrasound echo intensity data jointly with data from an infrared-camera-based Leap sensor and from a standard webcam. The Leap sensor provided the reference data for training the AI model.
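Chirp generation and echo ranging can be sketched as follows; the sweep band, sampling rate, and chirp duration are assumed values for illustration (the source only specifies 40 kHz chirps).

```python
import numpy as np

fs = 192_000                   # assumed sampling rate
f0, f1 = 38_000.0, 42_000.0    # assumed linear sweep band around 40 kHz
dur = 0.005                    # assumed chirp duration (5 ms)

t = np.arange(int(fs * dur)) / fs
k = (f1 - f0) / dur                           # sweep rate in Hz/s
chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

# A matched filter (cross-correlation with the transmitted chirp) compresses
# the echo into a sharp peak whose lag equals the round-trip delay.
delay = 100                                   # toy echo delay in samples
echo = np.concatenate([np.zeros(delay), chirp, np.zeros(50)])
corr = np.correlate(echo, chirp, mode="valid")
print(int(np.argmax(corr)))                   # → 100
```

The estimated lag, scaled by the speed of sound and the sampling rate, gives the echo intensity its range coordinate.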

The trained 3D AI model provided sufficient 3D tracking precision for both hands, and also classified the left-hand, right-hand, and no-hand cases.

The AI model was converted to a normalized model suitable for implementation in the int8 arithmetic supported by the AMD DPU accelerator. It was integrated into the programmable logic part of the AMD Zynq UltraScale+ device, together with the HW-accelerated beamformer. The system runs Linux and performs ultrasound tracking of hand position at 18 frames per second.
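The int8 conversion for DPU deployment can be illustrated with a simple symmetric per-tensor quantization scheme; this is a generic sketch, not the actual AMD toolchain flow.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: one float scale per tensor,
    codes clipped to the signed 8-bit range."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Toy tensor of float weights; quantization error is bounded by scale / 2.
w = np.array([-0.8, -0.1, 0.0, 0.4, 0.8], dtype=np.float32)
q, scale = quantize_int8(w)
err = np.max(np.abs(dequantize(q, scale) - w))
print(q.dtype, err <= scale / 2 + 1e-9)
```

Storing only int8 codes plus one scale per tensor is what lets the DPU run the multiply-accumulate operations in integer arithmetic.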

We have also implemented ultrasound-based 1D tracking of hand distance, based on Bayesian testing of hypotheses about hand presence/absence and an adaptive recursive RLS Lattice algorithm [1].
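The adaptive filtering component can be illustrated with a standard exponentially weighted RLS update. The published work uses a QRD RLS Lattice structure, so this plain transversal RLS form is only a simplified sketch; the filter order, forgetting factor, and test signal are illustrative.

```python
import numpy as np

def rls(x, d, order=4, lam=0.99):
    """Exponentially weighted recursive least squares (transversal form)."""
    w = np.zeros(order)
    P = np.eye(order) * 100.0             # inverse-correlation estimate, P(0) = delta^-1 * I
    err = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]  # regressor [x[n], x[n-1], ...]
        k = P @ u / (lam + u @ P @ u)     # gain vector
        err[n] = d[n] - w @ u             # a-priori prediction error
        w = w + k * err[n]
        P = (P - np.outer(k, u @ P)) / lam
    return w, err

# Toy check: identify a known 4-tap FIR response from clean data.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.2, 0.1])      # "unknown" system to identify
d = np.convolve(x, h)[:len(x)]
w, err = rls(x, d)
print(np.allclose(w, h, atol=1e-2))      # converged weights match the system
```

In the hand-distance application, the statistics of the prediction error feed the Bayesian presence/absence hypothesis test.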


Related publications:

  1. LIKHONINA, Raissa; UGLICKICH, Evženie. Hand detection application based on QRD RLS Lattice algorithm and its implementation on Xilinx Zynq Ultrascale+. Neural Network World, vol. 32, no. 2 (2022), pp. 73–92. DOI: 10.14311/NNW.2022.32.005.

Links

Dataset

Contact person

Zdeněk Pohl