editor@ijmra.in
ISSN (Online): 2643-9875 || ISSN (Print): 2643-9840

Volume 07 Issue 12 December 2024

Smart Robotic Surgical Assistant Using Voice Command and Image Processing
¹Shreyas J, ²Jyothi AP, ³Mallaradhya H M
¹Department of Robotics and Automation, Ramaiah University of Applied Sciences
²Department of Computer Sciences, Ramaiah University of Applied Sciences
³Department of Mechanical and Manufacturing Engineering, Ramaiah University of Applied Sciences
DOI : https://doi.org/10.47191/ijmra/v7-i12-12

ABSTRACT:

This paper presents the development of a smart robotic surgical assistant that uses voice commands and image processing to aid in surgical instrument handling. The system integrates a Dobot robotic arm, controlled through verbal instructions, to retrieve and position surgical tools, while an image recognition model based on VGG16 identifies instruments in real time from camera feeds. This automation enables hands-free operation, supporting sterile conditions and enhancing efficiency in the operating room (OR). A dataset of high-resolution surgical tool images was curated to fine-tune the VGG16 model, which achieved over 95% classification accuracy. Voice recognition, working alongside the OpenCV-based image pipeline, reached 92.5% accuracy in interpreting commands. The system addresses challenges in surgical tool management, offering an efficient and reliable alternative that reduces human error and improves workflow, marking a major step toward integrated AI-robotics applications in healthcare.
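The abstract does not include implementation details, but the pipeline it describes (a fine-tuned VGG16 classifier plus spoken-command interpretation driving the Dobot arm) can be sketched roughly as below. This is a minimal illustration under stated assumptions: the TOOL_CLASSES labels, the classifier head, the Google speech backend, and the send_to_dobot placeholder are hypothetical and do not represent the authors' actual code, dataset, or parameters.

```python
# Illustrative sketch only (not the authors' implementation): a VGG16-based
# surgical-tool classifier and a voice-command -> pick-request mapping.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models
import speech_recognition as sr

TOOL_CLASSES = ["scalpel", "forceps", "scissors", "needle holder"]  # assumed labels

def build_tool_classifier(num_classes=len(TOOL_CLASSES)):
    # Frozen ImageNet VGG16 backbone with a small classification head,
    # to be fine-tuned on a curated surgical-tool image dataset.
    base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    base.trainable = False
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)
    out = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(base.input, out)
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def listen_for_tool(recognizer, microphone):
    # Capture one utterance and return the first known tool name it mentions.
    with microphone as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return None
    return next((tool for tool in TOOL_CLASSES if tool in text), None)

def send_to_dobot(tool_name):
    # Hypothetical placeholder: the real system would command the Dobot arm
    # to retrieve and hand over the requested instrument.
    print(f"[dobot] pick and hand over: {tool_name}")

if __name__ == "__main__":
    model = build_tool_classifier()  # would be fine-tuned before deployment
    requested = listen_for_tool(sr.Recognizer(), sr.Microphone())
    if requested:
        send_to_dobot(requested)
```

In such a setup the speech step supplies the requested tool name and the vision model confirms which instrument the camera actually sees before the arm is commanded; the exact coordination logic used by the authors is not described in this abstract.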

KEYWORDS:

Robotic surgical assistant, voice command, image processing, surgical tool handling, VGG16, Dobot robotic arm, real-time recognition, robotic scrub nurse.


This is an Open Access article, distributed under the terms of the Creative Commons Attribution – Non Commercial 4.0 International (CC BY-NC 4.0) license (https://creativecommons.org/licenses/by-nc/4.0/), which permits remixing, adapting, and building upon the work for non-commercial use, provided the original work is properly cited.


Our Services and Policies

Authors should prepare their manuscripts according to the instructions given in the authors' guidelines. Manuscripts which do not conform to the format and style of the Journal may be returned to the authors for revision or rejected.

The Journal reserves the right to make any further formal changes and language corrections necessary in a manuscript accepted for publication so that it conforms to the formatting requirements of the Journal.

International Journal of Multidisciplinary Research and Analysis publishes 12 monthly online issues per year; IJMRA publishes articles as soon as the final copy-edited version is approved. IJMRA publishes articles and review papers in all subject areas.

Open access is a mechanism by which research outputs are distributed online. Hybrid open access journals contain a mixture of open access and closed access articles.

International Journal of Multidisciplinary Research and Analysis announces a call for research papers for Volume 07 Issue 12 (December 2024).

PUBLICATION DATES:
1) Last date of submission: 26 December 2024.
2) Articles are published within a week.
3) Submit articles to editor@ijmra.in or online.

Why with us

International Journal of Multidisciplinary Research and Analysis is better than other journals because:
1) IJMRA only accepts original and high-quality research and technical papers.
2) Papers are published immediately in the current issue after registration.
3) Authors can download their full papers at any time with a digital certificate.

The Editors reserve the right to reject papers without sending them out for review.

