
Human Robot Interface and Visual Compressive Detecting on Robot Arm

Zaiba, Ambika Sekhar

Abstract


Vision compressive sensing, brain-machine reference commands, and an adaptive controller are integrated to enable a robot to perform manipulation tasks guided by the human operator's mind and informed by the robot's own visual input. The proposed approach consists of two main phases: (1) action recognition and (2) hardware interfacing. First, the action recognition phase captures video of the movements enacted in front of the recognition system. Then, in the hardware interfacing phase, the required commands are passed to a sensor attached to a microcontroller, which senses the input, determines where the arm should move, and improves the accuracy of the movements to be performed. The system thereby maintains robust communication that can convey both manipulation actions and cognitive states. On the IoT side, the server publishes data to clients over the MQTT protocol, so the robot can be accessed from anywhere in the world.
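The command path described above (a recognized action is translated into an arm command and delivered over MQTT-style topic/payload messages) can be sketched as follows. This is a minimal illustration only: the action names, topic layout, and servo angles are assumptions for the sake of the example, not details taken from the paper, and a real deployment would publish the messages with an MQTT client library.

```python
# Hypothetical sketch of the recognition-to-command pipeline: a recognized
# action is mapped to a joint command and formatted as an MQTT-style
# (topic, payload) pair. All names and angles here are illustrative.

ACTION_TO_COMMAND = {
    "raise_hand": {"joint": "shoulder", "angle": 90},
    "lower_hand": {"joint": "shoulder", "angle": 0},
    "grip":       {"joint": "gripper",  "angle": 45},
    "release":    {"joint": "gripper",  "angle": 0},
}

def to_mqtt_message(action):
    """Translate a recognized action into an MQTT (topic, payload) pair."""
    cmd = ACTION_TO_COMMAND.get(action)
    if cmd is None:
        raise ValueError(f"unknown action: {action}")
    topic = f"robot/arm/{cmd['joint']}"   # e.g. robot/arm/shoulder
    payload = str(cmd["angle"])           # target angle in degrees
    return topic, payload

if __name__ == "__main__":
    # The microcontroller side would subscribe to robot/arm/# and drive
    # the corresponding servo to the published angle.
    for action in ("raise_hand", "grip"):
        topic, payload = to_mqtt_message(action)
        print(topic, payload)
```

In a real system the `(topic, payload)` pair would be handed to an MQTT client's publish call, and the robot-side subscriber would parse the payload and actuate the named joint.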

Cite this Article

Zaiba, Ambika Sekhar. Human Robot Interface and Visual Compressive Detecting on Robot Arm. Current Trends in Information Technology. 2018; 8(3): 5–12p.


Keywords


Action recognition, brain-computer interaction, human-robot interfacing, Internet of Things, sensory perception.


References


Lu R, Li Z, Su C-Y, et al. Development and Learning Control of a Human Limb with a Rehabilitation Exoskeleton. IEEE Trans Ind Electron. Jul 2014; 61(7): 3776–3785p.

Abiri R, Heise G, Zhao X, et al. Brain Computer Interface for Gesture Control of a Social Robot: An Offline Study. arXiv:1707.07233 [cs.RO]. 2017.

Qiu S, Li Z, He W, et al. Brain-Machine Interface and Visual Compressive Sensing-Based Teleoperation Control of an Exoskeleton Robot. IEEE Trans Fuzzy Syst. Feb 2017; 25(1): 58–69p.

Kiguchi K, Tanaka T, Fukuda T. Neuro-Fuzzy Control of a Robotic Exoskeleton with EMG Signals. IEEE Trans Fuzzy Syst. Aug 2004; 12(4): 481–490p.

Huang J, Tu X, He J. Design and Evaluation of the RUPERT Wearable Upper Extremity Exoskeleton Robot for Clinical and In-Home Therapies. IEEE Trans Syst Man Cybern Syst. Jul 2016; 46(7): 926–935p, doi: 10.1109/TSMC.2015.2497205.

He W, Zhao Y, Tang H, et al. A Wireless BCI and BMI System for Wearable Robots. IEEE Trans Syst Man Cybern Syst. Jul 2016; 46(7): 936–946p.

Nicolelis MAL. Actions from Thoughts. Nature. 2001; 409(6818): 403–407p.

Kiguchi K, Kariya S, Watanabe K, et al. An Exoskeletal Robot for Human Elbow Motion Support: Sensor Fusion, Adaptation, and Control. IEEE Trans Syst Man Cybern B. Jun 2001; 31(3): 353–361p.




Copyright (c) 2018 Current Trends in Information Technology

  • eISSN: 2249-4707
  • ISSN: 2348-7895