

Arduino Controlled Robotic Arm

2018, International Journal of Trend in Scientific Research and Development

https://doi.org/10.31142/IJTSRD11512

Abstract

The evolution of the Internet of Things (IoT) has been tremendous in today's generation, and a promising yet challenging mission is the use of autonomous robot systems to automate tasks in the field of maintenance. A major concern driving the evolution of robotics is reducing the human burden. This project addresses critical environments where human presence is currently mandatory because the tasks demand accuracy, such as bomb defusal and chemical or radiation containment. A core concept of the Internet of Things is that each device is uniquely identifiable by the controlling software. Because a robot is operated remotely, it lags behind human precision and adaptability. This project proposes a novel idea: imitating human hand movement with a virtually controlled robotic arm, which can be utilized in situations that exceed human endurance. A video and audio interface plug-in will be developed to recognize audio commands as well as video gestures. Every method is experimentally validated and discussed.
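Although the paper does not reproduce its firmware, a minimal sketch of the Arduino side of such a system might look like the following. It assumes a 4-DOF arm on hobby servos and a host PC (running the gesture/voice plug-in) streaming comma-separated joint angles over USB serial; the pin numbers, baud rate, and message format are all illustrative assumptions, not taken from the paper.

```cpp
// Hypothetical Arduino sketch: drives a 4-DOF arm whose joint angles
// arrive over serial from a host PC running the gesture/voice recognizer.
#include <Servo.h>

Servo base, shoulder, elbow, gripper;

void setup() {
  Serial.begin(9600);     // must match the host-side baud rate
  base.attach(3);
  shoulder.attach(5);
  elbow.attach(6);
  gripper.attach(9);
}

void loop() {
  // Expect one command frame per line, e.g. "90,45,120,10\n" (degrees).
  if (Serial.available()) {
    int b = Serial.parseInt();
    int s = Serial.parseInt();
    int e = Serial.parseInt();
    int g = Serial.parseInt();
    if (Serial.read() == '\n') {           // end of one command frame
      base.write(constrain(b, 0, 180));    // clamp to the servo range
      shoulder.write(constrain(s, 0, 180));
      elbow.write(constrain(e, 0, 180));
      gripper.write(constrain(g, 0, 180));
    }
  }
}
```

A host-side recognizer would then only need to write lines such as `90,45,120,10` to the serial port at whatever rate the gesture or voice tracker runs.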

FAQs


What are the advantages of using a camera in controlling robotic arms?

The study finds that the camera-controlled robotic arm model reduces costs and improves accuracy compared to traditional IR sensor models. This approach allows for better replication of human mobility and enhances operational efficiency in critical tasks.
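As a concrete illustration of the camera-based approach, a host-side tracker might threshold a coloured marker on the operator's hand and convert its image position into a servo angle. The OpenCV (C++) sketch below is an assumption-laden stand-in: the HSV range, the single-marker scheme, and the linear pixel-to-angle mapping are not from the paper, which only states that a camera replaces the IR sensors.

```cpp
// Hypothetical host-side tracker: finds a coloured marker on the operator's
// hand and maps its centroid to a base-servo angle.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
  cv::VideoCapture cam(0);                 // default webcam
  if (!cam.isOpened()) return 1;

  cv::Mat frame, hsv, mask;
  while (cam.read(frame)) {
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    // Threshold a red-ish marker (range chosen arbitrarily for this sketch).
    cv::inRange(hsv, cv::Scalar(0, 120, 70), cv::Scalar(10, 255, 255), mask);

    cv::Moments m = cv::moments(mask, true);
    if (m.m00 > 500) {                     // enough marker pixels found
      double cx = m.m10 / m.m00;           // marker centroid, x in pixels
      // Map image x (0..width) linearly onto a 0..180 degree servo angle.
      int angle = static_cast<int>(cx / frame.cols * 180.0);
      std::cout << angle << "\n";          // would be written to the serial port
    }
  }
}
```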

How does the brain-computer interface enhance robotic arm functionality?

The research demonstrates that the brain-computer interface (BCI) infers user intentions through EEG signals, allowing for direct control of the robotic arm. By incorporating visual servoing techniques, users can manipulate objects with greater precision and safety.

What role did the Amazon Robotic Challenge play in robotic arm development?

Introduced in 2015, the Amazon Robotic Challenge encouraged advancements in robotic manipulation for tasks such as picking and stocking. This initiative significantly stimulated research into automation technologies and their practical applications in robotics.

Why do robotic arms require modularity versus integration considerations in design?

The paper indicates that balancing modularity and integration is crucial for optimizing robotic systems' flexibility and performance. This consideration enhances adaptability in various applications while maintaining operational efficiency.
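One way to picture the modularity side of this trade-off: if the arm controller is written against an abstract input interface, gesture, voice, or BCI front-ends become interchangeable without touching the servo-driving code. The sketch below is purely illustrative; every class and method name in it is invented for this example.

```cpp
// Hypothetical illustration of a modular input boundary for the arm.
#include <array>

struct JointAngles { std::array<int, 4> deg; };  // base, shoulder, elbow, gripper

class InputModule {                              // the modular boundary
public:
  virtual ~InputModule() = default;
  virtual JointAngles read() = 0;                // each front-end maps its own
};                                               // signal type to joint angles

class GestureInput : public InputModule {
public:
  JointAngles read() override { return {{90, 45, 120, 10}}; }  // stub values
};

void driveArm(InputModule& input) {
  JointAngles target = input.read();
  (void)target;  // placeholder: would serialize target.deg to the Arduino
}

int main() {
  GestureInput gesture;
  driveArm(gesture);   // swapping in a VoiceInput would need no other changes
}
```

A more integrated design would fuse sensing and actuation in one loop for speed; the interface boundary here trades some of that performance for the flexibility the answer above describes.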

What experimental methods were used to validate the tracking algorithm of the robotic arm?

The experiments involved designing three distinct object trajectories, capturing 100 images per trajectory to assess tracking efficacy. Comparison between adaptive and non-adaptive tracking algorithms revealed critical insights into the system's accuracy.
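For concreteness, the per-trajectory accuracy comparison reduces to an error statistic over the captured frames. Here is a minimal sketch assuming ground-truth and tracked pixel positions per image and a root-mean-square metric; the data layout and the choice of RMS are assumptions, since the paper does not specify its exact error measure.

```cpp
// Hypothetical validation arithmetic: RMS pixel error between ground-truth
// and tracked positions over the ~100 images of one trajectory.
#include <cmath>
#include <cstdio>
#include <vector>

struct Point { double x, y; };

double rmsError(const std::vector<Point>& truth,
                const std::vector<Point>& tracked) {
  double sum = 0.0;
  for (size_t i = 0; i < truth.size(); ++i) {
    double dx = truth[i].x - tracked[i].x;
    double dy = truth[i].y - tracked[i].y;
    sum += dx * dx + dy * dy;              // squared Euclidean error per frame
  }
  return std::sqrt(sum / truth.size());
}

int main() {
  // Toy data standing in for one captured trajectory.
  std::vector<Point> truth    = {{0, 0}, {10, 5}, {20, 11}};
  std::vector<Point> adaptive = {{1, 0}, {10, 6}, {19, 11}};
  std::printf("adaptive tracker RMS error: %.2f px\n",
              rmsError(truth, adaptive));
}
```

Running the same computation for the adaptive and non-adaptive trackers on each of the three trajectories would yield the accuracy comparison the answer refers to.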
