In this paper, we propose to use two EMI-Gloves connected to an IBM-compatible PC, together with Degraded HyperEllipsoidal Composite Neural Networks (DHECNNs), to implement a prototype speaking aid for deaf or non-vocal persons. Using a special learning scheme, the DHECNNs quickly learn the complex mapping from measurements of the ten fingers' flex angles to the corresponding gesture categories. The prototype speaking aid is evaluated on the classification of 51 static hand gestures from 4 "speakers"; the recognition accuracy on the testing set is 92.3%.
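The abstract does not spell out the DHECNN learning scheme, but the idea of mapping ten flex-angle measurements to gesture categories with hyperellipsoidal units can be illustrated with a minimal sketch. The code below assumes (as a simplification, not the authors' method) that each gesture category is modeled by a single axis-aligned hyperellipsoid fitted to its training samples, with classification by nearest normalized distance; the class name, data, and thresholds are hypothetical.

```python
import numpy as np

# Illustrative sketch only: each gesture category is approximated by one
# axis-aligned hyperellipsoid over the ten finger flex angles
# (mean = center, per-dimension std = semi-axis lengths).
class EllipsoidalGestureClassifier:
    def fit(self, X, y):
        # X: (n_samples, 10) flex-angle measurements, y: gesture category labels
        self.labels_ = np.unique(y)
        self.centers_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        self.radii_ = np.array([X[y == c].std(axis=0) + 1e-6 for c in self.labels_])
        return self

    def predict(self, X):
        # Normalized squared distance to each hyperellipsoid center;
        # the closest ellipsoid determines the predicted gesture.
        d = ((X[:, None, :] - self.centers_) / self.radii_) ** 2
        return self.labels_[d.sum(axis=2).argmin(axis=1)]

# Hypothetical usage with synthetic data for 51 static gestures:
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 90.0, size=(51 * 20, 10))  # flex angles in degrees
y_train = np.repeat(np.arange(51), 20)
clf = EllipsoidalGestureClassifier().fit(X_train, y_train)
print(clf.predict(X_train[:5]))
```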
Relation:
Proceedings of the 2nd Medical Engineering Week of the World, pp. 244-249