Implementation of an Indoor Navigation Robot for Visually Impaired and New Visitors

Gopalakrishna Murthy C. R.*, Rahul Patil**, Swaroop Dixit S.***, Ujwal L.****, Hemanth M. V.*****
* Assistant Professor, Department of Electronics and Communication Engineering, KSSEM, Bengaluru, India.
**-***** UG Students, Department of Electronics and Communication Engineering, KSSEM, Bengaluru, India.
Periodicity: July - December 2017
DOI: https://doi.org/10.26634/jes.6.1.13890

Abstract

People often face difficulty in navigating to a desired location in an unfamiliar area. To help such visitors (pedestrians), an effective, user-friendly method is proposed in which the developed system guides them with proper instructions, either by playing audio route information or by displaying a route map (Azuma, 1997). The system is developed to behave as an intelligent robot and can be used in shopping malls, colleges, hospitals, museums, and industries as a simple and easily approachable method of Human-Robot Interaction (HRI). The Indoor Navigation Robot (INR) is developed from a college point of view, where a new visitor can find the respective departments, canteen, library, etc. The robot recognizes the input voice command with the help of a Voice Recognition Module. Visitors are guided in two ways. The first method uses an audio player consisting of the Voice Recognition Module and an Audio Player/Recorder, both of which operate on predefined voice data. The second method displays a route map, achieved by interfacing the robot to the MATLAB software tool on a computer over a wireless link (ZigBee module). The robot also includes a security feature that captures a new visitor's face and stores the captured image in a database on the hard disk of the system/laptop for further processing; the stored image can be retrieved as and when required. In addition, the robot can display flash news, alert messages, and greeting messages on an LCD screen through the ZigBee module (range of about 10 m), allowing the administrator to update the displayed information whenever required. The robot takes voice commands from visitors, navigates by giving voice instructions as well as by displaying the route map on the laptop screen, and moves with the visitor to the desired destination. This work is useful for people who are unfamiliar with a new location and its surroundings.
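The article itself does not include source code, and the authors' route-map display is built in MATLAB. The following is only a minimal, hypothetical Python sketch of the laptop-side logic described above: a listener on the ZigBee link (seen by the laptop as a serial port) receives a destination code from the robot and responds by showing the corresponding predefined route map. The port name, baud rate, destination codes, map file names, and helper function are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical laptop-side route-map display for the Indoor Navigation Robot.
# The robot sends a one-byte destination code over ZigBee (a serial port on
# the laptop); the script displays the matching predefined route map.
# Port name, codes, and file names below are assumptions for illustration.

import serial                      # pyserial: pip install pyserial
import matplotlib.pyplot as plt    # used to display the predefined route maps

# Assumed mapping of destination codes to stored route-map images.
ROUTE_MAPS = {
    b"1": ("Library", "maps/library.png"),
    b"2": ("Canteen", "maps/canteen.png"),
    b"3": ("ECE Department", "maps/ece_department.png"),
}

def show_route(name, image_path):
    """Display the stored route map for the requested destination."""
    img = plt.imread(image_path)
    plt.figure(f"Route to {name}")
    plt.imshow(img)
    plt.axis("off")
    plt.show()

def main():
    # The ZigBee module enumerates as a serial port (port name assumed).
    with serial.Serial("COM3", baudrate=9600, timeout=1) as zigbee:
        print("Waiting for destination codes from the robot...")
        while True:
            code = zigbee.read(1)          # one-byte destination code
            if code in ROUTE_MAPS:
                name, path = ROUTE_MAPS[code]
                print(f"Visitor requested: {name}")
                show_route(name, path)

if __name__ == "__main__":
    main()
```

A companion routine on the robot side would simply transmit the code that the voice module assigns to the recognized destination; the same serial link could carry the flash news and greeting strings for the LCD in the reverse direction.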

Keywords

Indoor Navigation, Voice Module, ZigBee, Predefined Maps, Route Display.

How to Cite this Article?

Murthy, G. C. R., Patil, R., Dixit, S. S., Ujwal, L., and Hemanth, M. V. (2017). Implementation of an Indoor Navigation Robot for Visually Impaired and New Visitors. i-manager's Journal on Embedded Systems, 6(1), 6-9. https://doi.org/10.26634/jes.6.1.13890

References

[1]. Alam, M. S., Jamil, I. A., Mahmud, K., & Islam, N. (2014, March). Design and implementation of a RF controlled robotic environmental survey assistant system. In Computer and Information Technology (ICCIT), 2013 16th International Conference on (pp. 438-442). IEEE.
[2]. Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4), 355-385.
[3]. Ha, E. T., Kim, T. K., Ahn, D. K., Jeong, S. H., Yoon, I. R., & Han, S. H. (2015, October). A stable control of legged robot based on ultrasonic sensor. In Control, Automation and Systems (ICCAS), 2015 15th International Conference on (pp. 1256-1258). IEEE.
[4]. Kersten-Oertel, M., Jannin, P., & Collins, D. L. (2013). The state of the art of visualization in mixed reality image guided surgery. Computerized Medical Imaging and Graphics, 37(2), 98-112.
[5]. Li, G., Wang, H., Ying, X., & Liu, J. (2015). A proxy-based cloud infrastructure for home service robots. In Control and Decision Conference (CCDC), 2015 27th Chinese (pp. 5718-5723). IEEE.
[6]. Peters, T. M. (2006). Image-guidance for surgical procedures. Physics in Medicine and Biology, 51(14), 505-540.
[7]. Shah, M. S., & Borole, P. B. (2016, April). Surveillance and rescue robot using Android smartphone and the Internet. In Communication and Signal Processing (ICCSP), 2016 International Conference on (pp. 1526-1530). IEEE.
[8]. Wu, C. M., Chen, Y. J., Zaeni, I. A., & Chen, S. C. (2016, November). A new SSVEP based BCI application on the mobile robot in a maze game. In Advanced Materials for Science and Engineering (ICAMSE), International Conference on (pp. 550-553). IEEE.
[9]. Yabuta, Y. (2014, November). An autonomous robot controlled by stereovision system using two-layer LCD for tracking object. In Mecatronics (MECATRONICS), 2014 10th France-Japan/8th Europe-Asia Congress on (pp. 308-312). IEEE.
[10]. Yaniv, Z., & Linte, C. A. (2016). Applications of augmented reality in the operating room. Fundamentals of Wearable Computers and Augmented Reality, 485-518.