Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network
This study proposes a data-driven control method for extra robotic fingers that assist a user in bimanual object manipulation tasks requiring two hands. The robotic system comprises two main parts: a robotic thumb (RT) and robotic fingers (RF). The RT is attached next to the user's thumb, while the RF is located next to the user's little finger. The grasp postures of the RT and RF are driven by the bending angles of flex sensors attached to the user's thumb and other fingers. A modified sensor glove is developed by attaching three flex sensors to the wearer's thumb, index, and middle fingers. Various hand gestures are then mapped using a neural network: the inputs of the robotic system are the bending angles of the thumb and index fingers, read by the flex sensors, and the outputs are the commanded servo angles for the RF and RT. The third flex sensor, attached to the middle finger, is used to hold the extra robotic fingers' posture. Two force-sensitive resistors (FSRs) are attached to the RF and RT to provide haptic feedback when the robot is worn to pick up and grasp a fragile object, such as an egg. The trained neural network is embedded in the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap.
Main Authors: | Joga Dharma Setiawan, Mochammad Ariyanto, M. Munadi, Muhammad Mutoha, Adam Glowacz, Wahyu Caesarendra |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-05-01 |
Series: | Electronics |
Subjects: | data-driven control method; extra robotic fingers; flex sensor; force-sensitive resistor; neural network; bimanual manipulation |
Online Access: | https://www.mdpi.com/2079-9292/9/6/905 |
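The abstract describes a neural network that maps glove flex-sensor bending angles (thumb and index fingers) to commanded servo angles for the RT and RF. The paper's actual glove recordings, network architecture, and training setup are not given in this record, so the following is only a minimal sketch of that kind of mapping, using synthetic stand-in data and a small one-hidden-layer network trained with plain gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for recorded glove data: two flex-sensor bend angles
# (0-90 deg, normalized to [0, 1]) mapped to two normalized servo commands
# by a smooth saturating toy function -- NOT the paper's real dataset.
X = rng.uniform(0.0, 90.0, size=(200, 2)) / 90.0
Y = np.tanh(2.0 * X)  # toy target grasp-posture map

# One hidden layer of 16 tanh units, linear output, trained on MSE.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 2)); b2 = np.zeros(2)
lr = 0.5
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # predicted normalized servo angles
    E = P - Y                         # prediction error
    # Backpropagate the MSE gradient through both layers.
    gW2 = H.T @ E / len(X); gb2 = E.mean(axis=0)
    dH = (E @ W2.T) * (1.0 - H**2)    # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Final fit quality on the training data.
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
```

In a deployed version, the learned weights would be exported to the wearable's microcontroller so that each new pair of bend angles is pushed through the same forward pass to produce servo commands in real time.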
id |
doaj-afd0ffff30db41119e48155ae9efb8d6 |
record_format |
Article |
spelling |
doaj-afd0ffff30db41119e48155ae9efb8d6 | 2020-11-25T03:07:29Z | eng | MDPI AG | Electronics | 2079-9292 | 2020-05-01 | Vol. 9, Art. 905 | 10.3390/electronics9060905
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network
Joga Dharma Setiawan, Mochammad Ariyanto, M. Munadi, Muhammad Mutoha (Department of Mechanical Engineering, Faculty of Engineering, Diponegoro University, Semarang 50275, Indonesia); Adam Glowacz (AGH University of Science and Technology, aleja Adama Mickiewicza 30, 30-059 Kraków, Poland); Wahyu Caesarendra (Department of Mechanical Engineering, Faculty of Engineering, Diponegoro University, Semarang 50275, Indonesia)
This study proposes a data-driven control method for extra robotic fingers that assist a user in bimanual object manipulation tasks requiring two hands. The robotic system comprises two main parts: a robotic thumb (RT) and robotic fingers (RF). The RT is attached next to the user's thumb, while the RF is located next to the user's little finger. The grasp postures of the RT and RF are driven by the bending angles of flex sensors attached to the user's thumb and other fingers. A modified sensor glove is developed by attaching three flex sensors to the wearer's thumb, index, and middle fingers. Various hand gestures are then mapped using a neural network: the inputs of the robotic system are the bending angles of the thumb and index fingers, read by the flex sensors, and the outputs are the commanded servo angles for the RF and RT. The third flex sensor, attached to the middle finger, is used to hold the extra robotic fingers' posture. Two force-sensitive resistors (FSRs) are attached to the RF and RT to provide haptic feedback when the robot is worn to pick up and grasp a fragile object, such as an egg. The trained neural network is embedded in the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap.
https://www.mdpi.com/2079-9292/9/6/905
Keywords: data-driven control method; extra robotic fingers; flex sensor; force-sensitive resistor; neural network; bimanual manipulation |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Joga Dharma Setiawan; Mochammad Ariyanto; M. Munadi; Muhammad Mutoha; Adam Glowacz; Wahyu Caesarendra |
spellingShingle |
Joga Dharma Setiawan; Mochammad Ariyanto; M. Munadi; Muhammad Mutoha; Adam Glowacz; Wahyu Caesarendra
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network
Electronics
data-driven control method; extra robotic fingers; flex sensor; force-sensitive resistor; neural network; bimanual manipulation |
author_facet |
Joga Dharma Setiawan; Mochammad Ariyanto; M. Munadi; Muhammad Mutoha; Adam Glowacz; Wahyu Caesarendra |
author_sort |
Joga Dharma Setiawan |
title |
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network |
title_short |
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network |
title_full |
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network |
title_fullStr |
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network |
title_full_unstemmed |
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network |
title_sort |
grasp posture control of wearable extra robotic fingers with flex sensors based on neural network |
publisher |
MDPI AG |
series |
Electronics |
issn |
2079-9292 |
publishDate |
2020-05-01 |
description |
This study proposes a data-driven control method for extra robotic fingers that assist a user in bimanual object manipulation tasks requiring two hands. The robotic system comprises two main parts: a robotic thumb (RT) and robotic fingers (RF). The RT is attached next to the user's thumb, while the RF is located next to the user's little finger. The grasp postures of the RT and RF are driven by the bending angles of flex sensors attached to the user's thumb and other fingers. A modified sensor glove is developed by attaching three flex sensors to the wearer's thumb, index, and middle fingers. Various hand gestures are then mapped using a neural network: the inputs of the robotic system are the bending angles of the thumb and index fingers, read by the flex sensors, and the outputs are the commanded servo angles for the RF and RT. The third flex sensor, attached to the middle finger, is used to hold the extra robotic fingers' posture. Two force-sensitive resistors (FSRs) are attached to the RF and RT to provide haptic feedback when the robot is worn to pick up and grasp a fragile object, such as an egg. The trained neural network is embedded in the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap. |
topic |
data-driven control method; extra robotic fingers; flex sensor; force-sensitive resistor; neural network; bimanual manipulation |
url |
https://www.mdpi.com/2079-9292/9/6/905 |
work_keys_str_mv |
AT jogadharmasetiawan graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork AT mochammadariyanto graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork AT mmunadi graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork AT muhammadmutoha graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork AT adamglowacz graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork AT wahyucaesarendra graspposturecontrolofwearableextraroboticfingerswithflexsensorsbasedonneuralnetwork |
_version_ |
1724670202585546752 |