Why scientists are teaching AI to think like a dog

Image: Researchers studied dog behavior to teach AI how to act and "plan" more like man's best friend. Copyright arXiv
By Denise Chow with NBC News Tech and Science News

Robotic canines could assist elderly people and those with disabilities.


Dogs may be our best friends, but they're also our hard-working colleagues — tasked with everything from guarding our homes to guiding visually impaired people to sniffing out bombs. And now researchers have enlisted the help of an Alaskan Malamute named Kelp to develop an artificial intelligence system that thinks just like a dog, in hopes of creating canine-like robots.

To build a database of dog behavior, a team of scientists led by Kiana Ehsani, a Ph.D. student at the University of Washington, attached sensors to Kelp's paws, torso, and tail to capture her movements for a couple of hours a day as she ate, played fetch, and walked around various indoor and outdoor environments. A camera affixed to Kelp's head recorded what she saw as she went about her everyday activities.


Over the course of several weeks, the researchers amassed more than 24,000 video frames — all associated with particular body movements.
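To make that concrete, here is a rough sketch of what one record in such a dataset might look like. The field names and structure are illustrative assumptions, not the researchers' actual schema.

```python
# Illustrative sketch only: one dataset record pairing a frame from the
# head-mounted camera with synchronized readings from the body-worn sensors.
from dataclasses import dataclass
import numpy as np

@dataclass
class DogFrame:
    image: np.ndarray            # H x W x 3 frame from the head-mounted camera
    body_readings: np.ndarray    # movement readings from the paw, torso and tail sensors
    timestamp: float             # seconds since the start of the recording session
    activity: str                # e.g. "fetch", "walk", "eat" (hypothetical label)
```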

The scientists then used machine learning to comb through the data and identify patterns in Kelp's behavior. Those patterns were used to train an AI system to understand the behavior well enough to predict how Kelp — and dogs in general — might react under a variety of circumstances.

Ehsani and her colleagues found that the system could make accurate predictions, but only for relatively short time frames. For instance, the system could see a sequence of five images and then accurately predict Kelp's next five movements.
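For readers curious what that kind of model looks like in practice, below is a minimal sketch in Python with PyTorch of the task described above: encode a short sequence of first-person frames and predict the dog's next few movements. The architecture, the number of tracked body parts and the movement classes are assumptions for illustration, not the published model.

```python
# Minimal sketch (not the authors' code): predict the dog's next movements from
# a short clip of frames seen through the head-mounted camera. Movement is
# discretised into a handful of classes per tracked body part, purely for illustration.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class DogMovementPredictor(nn.Module):
    def __init__(self, num_joints=4, num_classes=8, future_steps=5, hidden=256):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()                 # keep the 512-d pooled features
        self.encoder = backbone                     # per-frame visual features
        self.seq = nn.LSTM(input_size=512, hidden_size=hidden, batch_first=True)
        # One movement-class distribution per body part per future time step.
        self.head = nn.Linear(hidden, future_steps * num_joints * num_classes)
        self.future_steps = future_steps
        self.num_joints = num_joints
        self.num_classes = num_classes

    def forward(self, frames):
        # frames: (batch, time, 3, H, W), e.g. the five observed frames
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (hidden, _) = self.seq(feats)            # summary of what the dog has seen
        logits = self.head(hidden[-1])
        return logits.reshape(b, self.future_steps, self.num_joints, self.num_classes)

# Five observed 224x224 frames in, predicted movement classes for the next five steps out.
model = DogMovementPredictor()
clip = torch.randn(1, 5, 3, 224, 224)
print(model(clip).argmax(dim=-1).shape)             # torch.Size([1, 5, 4])
```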

But what surprised the researchers was what the system had learned from the dog beyond what it had been trained to predict.

The scientists found that the AI system could accurately identify "walkable" surfaces because Kelp intuitively knew if a path was too rocky, for instance, or if she wasn't allowed there. The AI system could also distinguish between different environments — a park, a street, a stadium or an alleyway — based on the dog's movement in those spaces.
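One way to picture that transfer, sketched below under assumed names and class lists, is to freeze the frame encoder trained on the movement-prediction task and attach two small heads: one judging whether the surface ahead is walkable, the other guessing which kind of environment the dog is in.

```python
# Illustrative sketch of reusing the learned representation for the two tasks
# mentioned above. The scene list and the two-way "walkable" label are assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18

SCENES = ["park", "street", "stadium", "alleyway", "living_room"]

class TransferHeads(nn.Module):
    def __init__(self, encoder, feat_dim=512):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False                     # keep the dog-trained features fixed
        self.walkable = nn.Linear(feat_dim, 2)          # walkable vs. not walkable
        self.scene = nn.Linear(feat_dim, len(SCENES))   # which environment the frame shows

    def forward(self, frame):
        with torch.no_grad():
            feats = self.encoder(frame)                 # (batch, feat_dim)
        return self.walkable(feats), self.scene(feats)

# Here a plain ResNet stands in for the encoder trained on the dog data.
backbone = resnet18(weights=None)
backbone.fc = nn.Identity()
heads = TransferHeads(backbone)
walk_logits, scene_logits = heads(torch.randn(2, 3, 224, 224))
print(walk_logits.shape, scene_logits.shape)            # torch.Size([2, 2]) torch.Size([2, 5])
```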

"The dog would react and move her joints differently when she's outside in a dog park versus times when she's inside in a living room," Ehsani said. "When we used our dataset to solve these problems, surprisingly we were able to do it."

The research is in its early stages, but Ehsani said she plans to expand the database by monitoring interactions between dogs and by studying the behavior of different breeds.

Eventually, she said, the researchers want to build a four-legged robot that could act as a service dog. "The cost of training service dogs is really high, so it could be easier for us to just train and work on a [robotic dog] and then have this dog be helpful in assisting elderly or assisting people with disabilities," Ehsani said.

But is that a realistic goal? Marc Bekoff, a professor emeritus of ecology and evolutionary biology at the University of Colorado and an expert on dog behavior, called the research "very interesting" but said AI systems are not yet sophisticated enough to replicate the complex interactions between dogs and humans.

"The dog-human interaction — especially among service dogs — is very nuanced," Bekoff said. "There's just an incredible variability among the dogs themselves, among the people, and among the dog-human relationships."

Bekoff, who was not involved in the new research, said AI may mature to the point that it can mimic dogs but "we're light-years away from that."

Ehsani agrees that robots won't replace service dogs anytime soon but said the research is a step in the right direction.

"Service dogs are trained for very specific kinds of actions, like helping the people with visual impairment getting on the bus or passing the street," she told NBC News MACH in an email. "These are the tasks that AI is getting close to solving. Replicating all the emotions is for sure a much harder task."

The research was posted online March 28 on the preprint server arXiv.
