Project Info

Autonomous and Interactive Robot Tour Guide

Kaveh Fathian
kaveh.fathian@mines.edu

Project Goals and Description:

The goal of the project is to create an autonomous robot dog tour guide (using our Boston Dynamics Spot robot in the ARIA Lab) that takes visitors around campus while providing relevant information about points of interest around Mines. We aim to make user input as simple and intuitive as possible, with all commands given through natural speech. Our project is built around large language models (LLMs), such as ChatGPT, which process user speech along with additional context we supply and convert these high-level, often vague goals into precise movement and navigation commands that the robot can execute.

This project builds on our previous MURF project from the 2024-2025 school year. Last year, we developed a simulation environment (in RViz and Gazebo) in which the robot can autonomously navigate around a map given only the user's speech as input. In the coming year, we would like to deploy our system on the physical robot, Spot. This will involve creating a map of campus using SLAM algorithms developed and tested by other students in the lab, as well as writing a ROS node that translates the movement and navigation commands produced by Nav2 into commands executable through the Spot SDK. We would also like to improve the robot's autonomy by giving it more information about its environment: feeding Spot's onboard camera streams into computer vision models that classify objects in the image, then passing information about those objects to the LLM we use for high-level path planning. This would allow the robot to complete more complicated goals, such as navigating to or finding physical objects it had no prior knowledge of.

Beyond our goal of an automated campus tour guide, our research has broader implications in fields where the user experience needs to be simple and intuitive. One potential application is healthcare, where the robot could assist individuals with limited mobility by responding to voice commands and performing tasks autonomously, such as delivering medicine, supplies, or personal items directly to a patient without requiring human intervention. Other promising applications extend to hospitality, interactive learning, and assistive robotics, all of which could benefit from natural, speech-based human-robot interfaces.
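As a rough illustration of the speech-to-navigation step, the sketch below shows one way an LLM's structured reply could be turned into a pose that Nav2's NavigateToPose action would accept. This is a minimal sketch under stated assumptions: the LLM is prompted to answer in JSON, and the landmark names, coordinates, JSON schema, and `parse_llm_goal` function are all hypothetical, not the lab's actual interface.

```python
import json

# Assumed lookup table of campus points of interest, mapping each landmark
# to a map-frame pose (x, y, yaw). Names and coordinates are illustrative.
LANDMARKS = {
    "library": (12.4, -3.1, 0.0),
    "student center": (40.2, 8.7, 1.57),
}

def parse_llm_goal(llm_reply: str):
    """Parse an assumed LLM JSON reply such as
    {"action": "navigate", "target": "library"}
    into an (x, y, yaw) pose. In the full system this pose would be wrapped
    in a geometry_msgs/PoseStamped and sent as a Nav2 NavigateToPose goal."""
    cmd = json.loads(llm_reply)
    if cmd.get("action") != "navigate":
        raise ValueError(f"unsupported action: {cmd.get('action')}")
    target = cmd.get("target", "").lower()
    if target not in LANDMARKS:
        raise KeyError(f"unknown landmark: {target}")
    return LANDMARKS[target]

# Example: the LLM has resolved a vague request ("take me somewhere I can
# grab a coffee") to the student center landmark.
pose = parse_llm_goal('{"action": "navigate", "target": "Student Center"}')
print(pose)  # (40.2, 8.7, 1.57)
```

Keeping the LLM's output constrained to a small JSON schema like this makes its replies easy to validate before any command reaches the robot; anything that fails to parse or names an unknown landmark can be rejected and re-prompted rather than executed.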

More Information:

Grand Challenge: Engineer the tools of scientific discovery.
Lab website: https://www.ariarobotics.com/
Autonomous navigation (Nav2): https://navigation.ros.org
Community-made plugins for Nav2: https://navigation.ros.org/plugins/index.html
ROS documentation: https://docs.ros.org/en/humble
Computer vision: https://en.wikipedia.org/wiki/Computer_vision
Human-robot interaction: https://en.wikipedia.org/wiki/Human%E2%80%93robot_interaction
Spot SDK: https://dev.bostondynamics.com/

Primary Contacts:

Kaveh Fathian
Assistant Professor
Computer Science Department
Colorado School of Mines
Personal website: https://sites.google.com/view/kavehfathian/
Lab website: https://www.ariarobotics.com/
Email: kaveh.fathian@mines.edu

Student Preparation

Qualifications

The barrier to entry is low, but experience with Python, ROS, Nav2, and our simulation tools (Gazebo and RViz) is very helpful. Experience with computer vision and human-robot interaction is also a plus. An interest in robotics is a must!

TIME COMMITMENT (HRS/WK)

5 hours/week

SKILLS/TECHNIQUES GAINED

Experience with Python, ROS, autonomous navigation (Nav2), computer vision, simulation, LLMs, human-robot interaction, and Git; good software engineering practices; and working in a team on a large project.

MENTORING PLAN

The student working on this project will be assigned a mentor (a senior lab member) and will meet with the faculty advisor on a weekly basis. The mentor and the advisor will onboard the new student by sharing resources and providing training as needed. The student will join weekly work meetings alongside other lab members working on this topic, creating an opportunity to learn from current members, ask questions, and participate in development.

Preferred Student Status

Freshman
Sophomore
Junior
Senior