
Will robots do our dishes by 2030? | Euronews Tech Talks

Copyright Canva

By Alice Carnevali
Robots are slowly making their way into agriculture, healthcare, and our homes. Should we be worried?

In a room at the University of Oslo in Norway, Tiago moves around, looking people straight in the eye with a smiling face. Sometimes Tiago gets agitated, like when it plays Connect Four, a board game where players need to line up yellow and blue discs.

“Sometimes when Tiago does this [Connect Four], he will misunderstand something and sort of smash the board instead of continuing to play,” professor Kai Olav Ellefsen told Euronews Next.

An overreaction from an extremely competitive person, you might think. But Tiago is not an exchange student from Spain. It's a robot developed by the company PAL Robotics, equipped with a mobile base, a mechanical arm, and a gripper for grasping objects. It looks a bit like Spielberg's alien E.T., with a friendly smile on its face.

Researchers at the Norwegian university use Tiago to test how robots move in the environment, interact with people and make decisions.

Given Tiago's reaction to Connect Four, the robot clearly still has some way to go before it fully understands the game, and human interactions more broadly.

To better comprehend where robotics stands today and where it is heading, Euronews Tech Talks went to Oslo during the Oslo Innovation Week. There, we collected questions from students and young professionals and posed them to Kai Olav Ellefsen, associate professor at the University of Oslo and leader of the university’s research group on robotics and intelligent systems.


What exactly is a robot?

The term robot appeared for the first time in the play Rossum's Universal Robots, or R.U.R., written by the Czech intellectual Karel Čapek in the 1920s. Even then, the author imagined robots as machines revolting against humans.

More than a century later, that fear still lingers, and even today, academia has not settled on a single definition of what a robot actually is.

Although the debate continues, Kai Olav Ellefsen offers a possible definition.

“Robots don’t have to be humanoids,” he said. According to him, robots need to be able to sense their environment, process what they sense, make intelligent decisions, and then act to change the world around them.

“A clothes washing machine could be a robot by that definition, because it perceives how dirty the water is and decides how to continue the cleaning programme depending on that,” he added.
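Ellefsen's definition — sense, process, decide, act — can be sketched as a simple control loop. The code below is a toy illustration of his washing machine example, not anything from the article: the sensor readings, threshold, and function names are all invented for the sketch.

```python
def make_fake_sensor(readings):
    """A simulated turbidity sensor that returns pre-set readings in order
    (0.0 = clean water, 1.0 = very dirty)."""
    readings = iter(readings)
    return lambda: next(readings)

def decide(turbidity, threshold=0.1):
    """The 'intelligent decision': keep washing while the water is dirty."""
    return "continue_wash" if turbidity > threshold else "stop"

def run_wash_cycle(sensor):
    """Sense -> decide -> act loop; returns the list of actions taken."""
    actions = []
    while True:
        turbidity = sensor()        # sense the environment
        action = decide(turbidity)  # process and decide
        actions.append(action)      # act (here we just record the action)
        if action == "stop":
            return actions

# The water gets cleaner with each rinse until the machine decides to stop.
print(run_wash_cycle(make_fake_sensor([0.8, 0.4, 0.15, 0.05])))
# prints ['continue_wash', 'continue_wash', 'continue_wash', 'stop']
```

By this reading, what makes the machine robot-like is the closed loop: its next action depends on what it just sensed, not on a fixed schedule.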

Beyond our homes, robots are found in many other places. Historically, they have been present in manufacturing, but they are now increasingly used in agriculture and healthcare.

For instance, in April, the United Kingdom’s National Institute for Health and Care Excellence approved the use of 11 robotic surgery systems to help patients undergoing soft tissue and orthopaedic procedures. Some of these technologies involve precise mechanical arms controlled by a surgeon from a console, while others are handheld.

Despite the steps forward, according to Ellefsen, robots will not completely replace doctors. “I don’t think they [robots] will really take over these kinds of jobs, but maybe assist more so that doctors and surgeons can focus more on what’s important for a human contact,” he said.

AI’s impact on robotics

Just like many other fields, robotics is deeply impacted by the development of artificial intelligence (AI).

“AI and robotics have been quite a strong combination ever since the field of AI was started in the 1950s,” Ellefsen said.

Today, AI is used to train robots, help them perceive and navigate different environments, and interact with people.

“More people are relying on using what's called machine learning for making robots, where they [robots] learn from examples instead of being programmed for each specific case,” Ellefsen explained.

Machine learning is a subfield of AI focused on training systems to learn patterns from data and make decisions without being explicitly programmed. Just as ChatGPT is trained on online examples, robots can be trained in a similar way, with language models embedded in them.
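The contrast Ellefsen draws — learning from examples rather than being programmed for each specific case — can be shown with a deliberately tiny example. The scenario below (a robot choosing how firmly to grip an object based on its weight) is invented for illustration; it uses a one-nearest-neighbour rule, one of the simplest forms of learning from labelled examples.

```python
def nearest_neighbour(examples, query):
    """Return the label of the training example whose feature value
    is closest to the query — no hand-written rule for each case."""
    return min(examples, key=lambda ex: abs(ex[0] - query))[1]

# Training examples: (object weight in kg, how the robot should grip it).
examples = [(0.1, "gentle"), (0.2, "gentle"), (2.0, "firm"), (3.5, "firm")]

# New, unseen objects are handled by generalising from the examples.
print(nearest_neighbour(examples, 0.15))  # prints gentle
print(nearest_neighbour(examples, 2.8))   # prints firm
```

Nothing in the code says "objects under half a kilogram get a gentle grip"; that behaviour emerges from the examples, which is the essence of the machine learning approach described above.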

“But then a robot doesn't just need language, it also needs vision, sensing, and acting. And so you can supplement this language understanding with what's called a vision model,” Ellefsen said.

Vision models are AI models that can comprehend visual data. Just as ChatGPT can analyse an image and describe it, AI now enables robots to do the same.

And, last but not least, AI can also help robots take action. “You can also train up what's called a vision language action model, where, based on examples, it [the robot] can learn how actions influence the world around it”.

Despite the rise of AI in robotics and the sci-fi fear of robots taking over, Ellefsen does not believe in an apocalyptic future.

“I think that by 2030, we’ll see some of the earliest adopters with these kinds of robots [humanoids] in their homes to help with daily tasks.

“But they’ll probably be a bit too expensive, and they may break a bit too often for the average user to have them at home”.

Additional sources • James Thomas, host; Johan Breton, sound editor and mixer.
