Anticipating how others will act is a critical human skill that researchers are working to transfer to robots.
There are a few skills we pick up early in life — action anticipation, or the ability to predict how others will act, is a critical one. As toddlers, we quickly realise that colouring on walls, rather than paper, is a sure-fire way to invoke punishment from a parent. As new drivers, we learn to preempt how pedestrians and other road users will behave. As spouses, we figure out when to speak our minds and when to hold our tongues.
This ability to anticipate actions just before they happen offers us numerous advantages: it minimises surprises, lets us react quickly to threats, and helps us gain acceptance in our social groups.
“Action anticipation is something that’s been around since our species first evolved, and has been very important to our survival for millions of years,” says Basura Fernando, a researcher at A*STAR’s Institute of High Performance Computing (IHPC). “It helps us to be safe and to be more socially aware.”
In recent years, researchers such as Fernando have been figuring out how to transfer this critical skill to artificial intelligence (AI) systems. “If you’re building AI and robots to help humans, they should be able to anticipate the actions of humans, otherwise they won’t be well-received when they are deployed in real-world applications,” says Fernando, whose work focuses on machine learning and how computers extract information from videos or images.
He points to driverless cars as an example. “It’s really important for an autonomous vehicle to anticipate the actions of drivers, cyclists, and pedestrians, so it can plan how to move accordingly,” explains Fernando. “Early recognition is really important.”
In 2017, Fernando was working as a research fellow in Canberra, Australia’s capital, when he heard the news that Singapore was about to launch a S$150 million programme — called AI Singapore — dedicated to advancing AI research and development, as well as industry adoption of AI in the nation.
It was exciting news. “Back then, it wasn’t very typical for governments to invest so much money in AI projects,” recalls the Sri Lankan native. “Nowadays, everybody understands the value of AI, and governments are looking at how they can invest in it and harness its competitive advantages for their countries. But Singapore was one of the first to recognise this.”
He adds: “AISG has this objective of creating social and economic impact, growing local talent, building an AI ecosystem and putting Singapore on the world map. This was fascinating, and it sounded like there were a lot of opportunities within the AI landscape in Singapore.”
Attracted to the island state’s foresight and commitment to AI research, Fernando packed his bags and moved to Singapore in September 2018. A year later, he was awarded a research grant from AI Singapore and began working to develop AIs capable of action anticipation.
To tackle this challenge, Fernando and his lab first went back to the drawing board. Existing methods, he believed, were somewhat lacking — most machine learning models represented the world, along with the people in it, in a manner that was too static.
“Instead, what we really need is a dynamic representation to handle things like uncertainty in behaviour,” says Fernando. “Because we know that humans don’t always behave in the same way.”
“The same person, for instance, may behave in two different ways in the same scenario, depending on his mental state, time of day, or other factors,” he says.
Building a mental picture
To create the more dynamic model of action anticipation that he envisioned, Fernando took a two-step approach. In the first step, he trained his model to recognise that individuals can alter their behaviour and to identify the factors contributing to such a shift. After that, he tinkered with the model so that it could learn how to assign probabilities to these various actions in order to anticipate them accurately.
This was done by feeding the new model large volumes of data — specifically, video footage from CCTVs of people doing a variety of things in a particular scenario.
“Basically, what the model learns is why a particular person, in a given mental state or activity, behaves in a certain way,” explains Fernando. “So it tries to map this relationship and then, based on that, predicts what the human may do in the future, even if there are variations.”
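The mapping Fernando describes — from a person's context to a probability over their possible next actions — can be illustrated with a minimal sketch. Everything below is invented for illustration: the context and action labels are hypothetical, and a real system like Fernando's would extract them from CCTV footage with a vision model rather than receive them as strings.

```python
from collections import Counter, defaultdict

# Hypothetical observed (context, action) pairs. Note the same context
# can lead to different actions, capturing the behavioural uncertainty
# Fernando mentions.
observations = [
    ("kitchen_morning", "make_coffee"),
    ("kitchen_morning", "make_coffee"),
    ("kitchen_morning", "make_pancakes"),
    ("kitchen_evening", "make_tea"),
    ("kitchen_evening", "make_tea"),
]

# Step 1: count how often each action was seen in each context.
counts = defaultdict(Counter)
for context, action in observations:
    counts[context][action] += 1

# Step 2: turn counts into probabilities over possible actions.
def action_probabilities(context):
    total = sum(counts[context].values())
    return {action: n / total for action, n in counts[context].items()}

def anticipate(context):
    """Return the most probable next action for a given context."""
    probs = action_probabilities(context)
    return max(probs, key=probs.get)
```

Rather than committing to a single prediction, the model keeps a distribution — `action_probabilities("kitchen_morning")` assigns weight to both coffee and pancakes — which is what lets it handle people who behave differently in the same scenario.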
Take, for example, a man venturing into his kitchen. He opens up his fridge and pulls out a carton of milk, then reaches for the sugar canister on a nearby shelf. His virtual assistant wants to help by offering up a useful recipe, but isn’t quite sure of his intentions based on the two ingredients he’s gathered so far: is the man hungry and hoping to make some pancakes, or is he after a pick-me-up coffee?
However, the man soon goes into his pantry and pulls out a bag of flour and some eggs, along with a large mixing bowl, and the answer becomes clear. “By looking at the type of objects the person interacts with, the model can infer that his goal is to make pancakes,” says Fernando.
“Based on this observation, and by using other metadata such as the time the video was taken, the model can form a representation of the person’s mental state and predict his actions,” he says.
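The pancake scenario can be read as a belief update: each object the man touches shifts the assistant's belief about his goal. The sketch below is a toy Bayesian version of that idea — the goals, objects, and likelihood numbers are all invented for illustration, whereas the model described in the article learns such relationships from video data.

```python
# Prior belief over the two hypothetical goals.
priors = {"pancakes": 0.5, "coffee": 0.5}

# Hypothetical P(object is used | goal) values.
likelihoods = {
    "pancakes": {"milk": 0.9, "sugar": 0.6, "flour": 0.9, "eggs": 0.8},
    "coffee":   {"milk": 0.8, "sugar": 0.9, "flour": 0.05, "eggs": 0.05},
}

def update_beliefs(beliefs, obj):
    """Bayesian update: weight each goal by P(obj | goal), then renormalise."""
    updated = {g: p * likelihoods[g].get(obj, 0.01) for g, p in beliefs.items()}
    total = sum(updated.values())
    return {g: p / total for g, p in updated.items()}

beliefs = dict(priors)
for obj in ["milk", "sugar"]:   # both goals remain plausible
    beliefs = update_beliefs(beliefs, obj)
ambiguous = beliefs             # belief is still split between the goals

for obj in ["flour", "eggs"]:   # flour and eggs strongly favour pancakes
    beliefs = update_beliefs(beliefs, obj)
```

After milk and sugar the belief is roughly split, mirroring the assistant's uncertainty in the story; once flour and eggs are observed, nearly all the probability mass lands on pancakes.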
It’s a simple example, but one that illustrates the benefits of having an AI that is capable of anticipating human behaviour. Fernando envisions that in the future, such technology could be applied to driverless vehicles, as well as early warning systems at high-risk workplaces that can predict when a person might be unwittingly headed to a hazardous zone.
Since embarking on this project in 2019, Fernando and his team have published prolifically. Last year, alongside his other research, he also began work on a parallel AI Singapore-funded project focused on improving virtual assistants.
At the end of the day, Fernando remains fascinated by understanding human behaviour through computer vision and machine learning. “Understanding people is a huge, challenging question in science,” he says. “It’s the motivation behind my work.”
Check out other AI Research awarded projects here: https://aisingapore.org/research/grant-call-awardees/