AUSTIN (KXAN) — The use of artificial intelligence, or AI, has expanded across various industries over the last couple of years.
A group of engineering students at Texas A&M is experimenting with incorporating AI into emergency response through an AI-powered robotic dog.
The robot dog not only follows commands, but it also sees, remembers, and “thinks,” according to a press release from Texas A&M University’s College of Engineering.
A&M students Sandun Vitharana and Sanjaya Mallikarachchi spearheaded the invention of the dog, which “never forgets where it’s been and what it’s seen.” According to the release, Vitharana is an engineering technology master’s student, and Mallikarachchi is an interdisciplinary engineering doctoral student.
The robot dog has a memory-based system that allows it to recall and reuse previously traveled paths, reducing repeated exploration and making navigation more efficient, the release said. That capability can be critical in search-and-rescue missions, especially in unmapped areas and GPS-denied environments.
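The release does not spell out how that memory works, but in its simplest form such a system pairs recognized visual landmarks with the routes recorded through them. Here is a minimal sketch in Python; every name in it is hypothetical, not taken from the team's code.

```python
# A minimal, hypothetical sketch (not the team's code): a visual-memory
# store that lets the robot reuse previously traveled paths instead of
# re-exploring them.
from dataclasses import dataclass, field

@dataclass
class PathMemory:
    """Maps visual landmark signatures to routes recorded through them."""
    routes: dict[str, list[tuple[float, float]]] = field(default_factory=dict)

    def remember(self, landmark_id: str, waypoints: list[tuple[float, float]]) -> None:
        # Store the (x, y) waypoints logged while exploring past this landmark.
        self.routes[landmark_id] = waypoints

    def recall(self, landmark_id: str) -> list[tuple[float, float]] | None:
        # A recognized landmark returns its stored route, so the robot
        # can skip redundant exploration; unknown landmarks return None.
        return self.routes.get(landmark_id)

memory = PathMemory()
memory.remember("collapsed_doorway_03", [(0.0, 0.0), (1.5, 0.2), (3.0, 1.1)])
if (route := memory.recall("collapsed_doorway_03")) is not None:
    print(f"Reusing a known route with {len(route)} waypoints")
```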

Photo Credit: Logan Jinks/Texas A&M University College of Engineering
It also understands voice commands and uses AI and camera input to plan paths and identify objects.
“A roboticist would describe it as a terrestrial robot that uses a memory-driven navigation system powered by a multimodal large language model (MLLM),” A&M’s press release explained. “This system interprets visual inputs and generates routing decisions, integrating environmental image capture, high-level reasoning, and path optimization, combined with a hybrid control architecture that enables both strategic planning and real-time adjustments.”
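The release does not publish the system's code, but the loop it describes could be sketched roughly like this in Python, with every name standing in as an assumption rather than the team's actual interfaces:

```python
# Hypothetical sketch of the loop described above; camera, planner and
# query_mllm are placeholders, not the team's actual interfaces.
def navigation_step(camera, planner, query_mllm):
    frame = camera.capture()              # environmental image capture
    decision = query_mllm(                # high-level reasoning by the MLLM
        image=frame,
        prompt="List obstacles ahead, then choose: FORWARD, LEFT, or RIGHT.",
    )
    return planner.optimize(decision)     # path optimization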
The press release noted that navigating unpredictable, unstructured environments has long been a challenge in autonomous exploration, where efficiency and adaptability are critical.
But the robot dog stands out because it combines a custom MLLM with a visual memory-based system, making its behavior more humanlike, the release said.
“Like humans, the robot uses reactive and deliberative behaviors and thoughtful decision-making,” per the release. “It quickly responds to avoid a collision and handles high-level planning by using the custom MLLM to analyze its current view and plan how best to proceed.”
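The release does not say how those two layers are scheduled, but a common hybrid pattern, sketched here with hypothetical robot methods, runs a fast reactive check every cycle and calls the slower MLLM planner only when needed:

```python
import time

# Hypothetical hybrid control loop: a fast reactive layer handles
# collision avoidance every cycle, while the slower deliberative layer
# (an MLLM call here) replans only when the plan is stale or invalidated.
def control_loop(robot, query_mllm, replan_every=5.0):
    plan = []
    last_replan = 0.0
    while robot.active():
        if robot.obstacle_too_close():    # reactive: immediate response
            robot.emergency_stop()
            plan = []                     # invalidate the plan, force a replan
        now = time.monotonic()
        if not plan or now - last_replan > replan_every:
            plan = query_mllm(image=robot.camera_frame())  # deliberative
            last_replan = now
        robot.follow(plan)
```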
“Some academic and commercial systems have integrated language or vision models into robotics,” Vitharana was quoted as saying in the press release. “However, we haven’t seen an approach that leverages MLLM-based memory navigation in the structured way we describe, especially with custom pseudocode guiding decision logic.”
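Vitharana does not share that pseudocode, but the idea, rules embedded in the model's prompt so its answers follow fixed decision logic, might look something like this purely illustrative example:

```python
# Purely illustrative: decision rules embedded in the MLLM prompt so the
# model's answer is constrained to a fixed, machine-parseable choice.
DECISION_PROMPT = """
You control a quadruped robot. Follow this pseudocode exactly:
  IF a traversable opening is visible THEN answer FORWARD
  ELSE IF the left side looks clearer THEN answer LEFT
  ELSE IF the right side looks clearer THEN answer RIGHT
  ELSE answer BACKTRACK
Reply with exactly one word.
"""
```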
The release said Mallikarachchi and Vitharana began by exploring how an MLLM could interpret visual data from a camera in a robotic system. They received support from the National Science Foundation and combined that idea with voice commands to build a “natural and intuitive system to show how vision, memory and language can come together interactively.”
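The release offers no implementation detail on that integration, but one hypothetical way voice, vision and memory could meet in a single handler:

```python
# Hypothetical handler combining voice, vision and memory; speech_to_text
# and query_mllm stand in for whichever ASR and multimodal models are used.
def handle_voice_command(audio, camera, memory, speech_to_text, query_mllm):
    command = speech_to_text(audio)       # e.g., "go back to the entrance"
    known_route = memory.recall(command)  # check visual-path memory first
    if known_route is not None:
        return known_route                # reuse a remembered path
    frame = camera.capture()              # otherwise plan from the current view
    return query_mllm(image=frame, prompt=command)
```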
“Moving forward, this kind of control structure will likely become a common standard for human-like robots,” Mallikarachchi said in the release.
The university said in the release that the robot dog could be applied to situations beyond disaster response, like improving efficiency in hospitals, warehouses and other large facilities.
Its advanced navigation system could also help people with visual impairments, and the dog could explore minefields or perform reconnaissance in hazardous areas.
“The core of our vision is deploying MLLM at the edge, which gives our robotic dog the immediate, high-level situational awareness and emotional intelligence previously impossible,” said Dr. Isuru Godage, assistant professor in the Department of Engineering Technology and Industrial Distribution, who advised the project. “This allows the system to bridge the interaction gap between humans and machines seamlessly. Our goal is to ensure this technology is not just a tool, but a truly empathetic partner, making it the most sophisticated and first responder-ready system for any unmapped environment.”
More details about the robot can be found on Texas A&M’s website.