Associate Professor Brinkworth says the problem with robots right now is they can’t make sandwiches.
Of course, lunch-making robots aren’t a priority for his Autonomous Systems research team at Flinders University.
But the analogy of a robot constructing a meal is a useful one. Putting a sandwich together requires a way to see the world (vision), the ability to handle soft materials that come in irregular shapes, the dexterity to perform complex physical manoeuvres, and the capacity to make on-the-spot decisions about how to proceed.
This is what makes constructing sandwiches more difficult than manufacturing cars, where the parts are rigid and consistent.
“I want to bring robots out of the lab and into the real world,” Associate Professor Brinkworth says. “This means creating robotic systems that can operate in a dynamic environment, that can be flexible and that can respond to what’s going on around them.”
For defence applications, for surveillance, for self-driving cars and for tracking real-world natural phenomena, Associate Professor Brinkworth and his colleagues are creating new technologies and robotic systems that are autonomous and adaptable.
Their recent work has focused on creating systems that can detect changes in the environment, such as scanning for unauthorised drones at airports and military sites.
Current surveillance systems typically consist of high-grade cameras that collect visual information. Images are analysed by artificial intelligence trained to classify objects based on their appearance – distinguishing a drone from a bird, for example.
“But it’s really resource intensive and relatively slow to scan the entire environment at high resolution all the time,” Associate Professor Brinkworth says. “We’ve developed a system that operates much faster.”
The key is an approach based on how biological eyes actually work. Low-resolution vision is applied as a first-pass scanning tool (equivalent to the peripheral vision that animals and insects use), and the system then shifts to higher-resolution vision (known to biologists as foveal vision) once something new is detected.
“It’s a much more efficient way to do surveillance,” Associate Professor Brinkworth says.
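The two-stage idea can be sketched in a few lines of code. The block sizes, thresholds, and function names below are illustrative assumptions, not details of the published system: a cheap coarse pass flags regions of change, and only those regions are handed to full-resolution processing.

```python
import numpy as np

def coarse_to_fine_detect(frame, background, block=16, threshold=30.0):
    """Two-stage scan inspired by peripheral/foveal vision.

    Stage 1 ("peripheral"): compare coarse block averages of the current
    frame against a background model to find regions that changed.
    Stage 2 ("foveal"): return the full-resolution pixels of only those
    regions for expensive downstream analysis (e.g. a classifier).
    """
    h, w = frame.shape
    # Stage 1: downsample both images by averaging non-overlapping blocks.
    coarse = frame[:h - h % block, :w - w % block].reshape(
        h // block, block, w // block, block).mean(axis=(1, 3))
    coarse_bg = background[:h - h % block, :w - w % block].reshape(
        h // block, block, w // block, block).mean(axis=(1, 3))
    changed = np.abs(coarse - coarse_bg) > threshold  # cheap first pass

    # Stage 2: crop full-resolution patches only where change was flagged.
    patches = []
    for by, bx in zip(*np.nonzero(changed)):
        y, x = by * block, bx * block
        patches.append(((y, x), frame[y:y + block, x:x + block]))
    return patches

# Toy example: a static scene with one bright "intruder" region.
bg = np.zeros((64, 64))
scene = bg.copy()
scene[32:40, 16:24] = 255.0
hits = coarse_to_fine_detect(scene, bg)  # only one block needs a close look
```

The efficiency win is that the expensive per-pixel work is done on a handful of flagged patches rather than the whole frame.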
Putting this new technology into action, Associate Professor Brinkworth has developed software that can be retrofitted to existing high-grade detection cameras to extend their capabilities.
“We’ve done simulations and real-world trials at Woomera in regional South Australia that show we’re able to track drones out 50% further than other systems are able to do – so a detection system that worked over 2km can now operate at 3km,” Associate Professor Brinkworth says. “This has real-world applications for perimeter defence and airport monitoring, where unwanted drone activity needs to be detected.”
The system is incredibly robust, with three modes of detection built in.
“We can track drones visually, by their heat signature and by their sound, meaning it’s very hard to evade this system,” says Associate Professor Brinkworth.
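The article does not describe how the three channels are actually combined, but a minimal sketch of the principle is a weighted vote across modalities, so that a drone suppressed in one channel still trips the others. The weights, threshold, and fusion rule here are purely illustrative assumptions.

```python
def fuse(scores, weights=(0.4, 0.3, 0.3), threshold=0.5):
    """Weighted-sum fusion of per-modality confidences in [0, 1].

    `scores` is (visual, thermal, acoustic). The weights and decision
    rule are illustrative, not the real system's fusion logic.
    """
    combined = sum(w * s for w, s in zip(weights, scores))
    return combined >= threshold, combined

# A drone that is visually obscured but warm and loud still raises the alarm.
alarm, score = fuse((0.1, 0.8, 0.9))
```

This is why multi-modal systems are hard to evade: defeating any single sensor is not enough to stay below the combined threshold.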
Associate Professor Brinkworth’s work is funded by multiple sources, including the Australian Research Council and the Department of Defence, with industry funding also on the horizon.
“I’m really excited about the possibilities of our scanning technology being applied to self-driving cars,” Associate Professor Brinkworth says. “The ability to detect something moving in the shadows, and for the car to respond appropriately without needing the threat to be completely in front of the car in full light, that’s what I think will really improve the safety of those vehicles.”
Associate Professor Brinkworth is unusual in computer science circles, as he looks to biology for inspiration. Many robotics researchers shy away from biology, as it’s seen as being too variable, with too many uncontrollable factors.
“Standard engineering approaches to autonomous systems involve making everything as static as possible,” Associate Professor Brinkworth says. “Whereas biology is dynamic and highly non-linear – and these characteristics are actually really useful, and efficient if you can apply them appropriately.”
Associate Professor Brinkworth came to autonomous systems research after a PhD in neuroscience, with in-depth knowledge of how nerves work. His research to understand the intricacies of the visual system of flies has been instrumental in setting up his lab’s current capabilities.
“Insect vision is relatively simple, and we can study flies to track the way visual information is processed from the eyes to the brain,” Associate Professor Brinkworth says. “We’ve now applied this knowledge to build automated visual detection systems that have even higher resolution than fly eyes – we’re building on millions of years of evolution but without the same biological limitations.”
Understanding how eyes process visual information can also be applied to design detection systems that pick up anomalous sounds in a landscape, whether on land or underwater.
The work has also led to the development of an automated system for tracking clouds in the sky.
Clouds are constantly morphing and changing, appearing and disappearing based on fluxes in atmospheric conditions. It’s a characteristic that makes predicting cloud cover difficult, a big issue for solar energy providers who grapple with making power capacity and storage calculations based on varied exposure of solar panels to sunlight.
“Predicting cloud cover is important for improving the efficiency of our solar power generation systems – but currently cloud tracking is completely ignored or at best calculated at a low resolution based on average data, which is often inaccurate for specific time frames,” says Associate Professor Brinkworth. “In our system, we don’t actually track the clouds; instead, we track optic flow, which is the movement of motion energy across a scene.”
In the same way that you use peripheral vision to pick up the movement of a cat running across a road out of the 'corner of your eye', the automated cloud detection system picks up clouds as sudden changes in motion energy.
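A crude way to visualise "motion energy" is a per-block average of the absolute difference between consecutive frames. True optic-flow estimation recovers a velocity field and is considerably more involved; this frame-difference proxy is an illustrative assumption, not the lab's actual algorithm, but it shows how a moving cloud edge stands out against a static scene.

```python
import numpy as np

def motion_energy(prev_frame, frame, block=8):
    """Coarse motion-energy map: mean absolute frame difference per block.

    Moving structure (a cloud edge, a cat at the corner of your eye)
    produces high values; static background produces zeros.
    """
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    h, w = diff.shape
    return diff[:h - h % block, :w - w % block].reshape(
        h // block, block, w // block, block).mean(axis=(1, 3))

# Toy scene: a bright "cloud" shifts 4 pixels to the right between frames.
f0 = np.zeros((32, 32))
f0[8:16, 8:16] = 1.0
f1 = np.roll(f0, 4, axis=1)
energy = motion_energy(f0, f1)  # nonzero only where the cloud edge moved
```

Tracking the energy map rather than the cloud itself sidesteps the problem that clouds constantly change shape: only the movement of structure across the scene matters.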
For all the projects Associate Professor Brinkworth and his team work on, one core task dominates: extracting useful information from noisy background data.
“For visual information, acoustic data and energy data, our job is to work out how to design a system that can detect meaningful changes,” he says.
It’s the same challenge our brains face every day.
“For us as humans, it's the unexpected thing that gets our attention,” Associate Professor Brinkworth says. “Then we collect more information to decide whether we can safely ignore the change, or take some kind of action to address a threat. That’s the kind of capability I’m aiming to replicate.”