Credit: Defence

The Jericho Smart Sensing Lab at the University of Sydney Nano Institute has developed a prototype sensor system for the RAAF that mimics the brain’s neural architecture to deliver high-speed sensing.

Dubbed MANTIS, the system under development integrates a traditional camera with a visual system inspired by neurobiological architectures. This ‘neuromorphic’ sensor works at incredibly high speeds, allowing aircraft, ships and vehicles to identify fast-moving objects such as drones.

“Combining traditional visual input with the neuromorphic sensor is inspired by nature. The praying mantis has five eyes – three simple eyes and two compound. Our prototype works a bit like this, too,” Professor Ben Eggleton, Director of Sydney Nano, said.

MANTIS combines two visual sensing modes in a lightweight, portable unit, operated via a dashboard interface, with a processing system on board the aircraft, ship or vehicle. The aim is to enable direct comparison of images and rapid exploration of the neuromorphic sensor’s capabilities.

Sydney Nano worked with the School of Architecture, Design and Planning to develop the prototype in just three months.

“There are many things that excite me about MANTIS. The level of detail that it provides and its ability to track high-speed events are very impressive,” Air Vice-Marshal Cath Roberts, Head of Air Force Capability, said. “It’s a promising sensor fusion that has really strong potential across Defence.”

Professor Eggleton leads the Jericho Lab team that delivered the prototype. The four-kilogram, small-form design will allow the camera to be used easily on aircraft, ships and vehicles to detect challenging targets.

“The neuromorphic sensor has exquisite sensing capabilities and can see what can’t be seen with traditional cameras,” he said. “It invokes the idea of the eye in animals but has leading-edge technology built into it.”

Whereas a traditional camera is constrained by frame rates, each pixel in a neuromorphic camera functions independently and is always ‘on’. This means the imaging system is triggered by events. If it’s monitoring a static scene, the sensor sees nothing and no data is generated.
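To illustrate the event-based principle described above, the following Python sketch models pixels that report only when their own brightness changes. This is a toy example, not MANTIS code; the function name, threshold and array shapes are assumptions made for illustration.

```python
# Illustrative sketch of event-based (neuromorphic) pixel behaviour.
# Not MANTIS code: names, threshold and shapes are assumptions.
import numpy as np

def generate_events(prev_frame, new_frame, threshold=0.15):
    """Emit (x, y, polarity) events where per-pixel brightness changes.

    Each pixel is treated independently: it only reports when its own
    log intensity changes by more than `threshold`, so a static scene
    produces no events and no data.
    """
    delta = np.log1p(new_frame.astype(float)) - np.log1p(prev_frame.astype(float))
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A static scene yields no events; a small moving bright spot yields a few.
static = np.full((64, 64), 100, dtype=np.uint8)
print(len(generate_events(static, static)))    # 0 events

moving = static.copy()
moving[30:34, 30:34] = 200                     # an object enters the scene
print(len(generate_events(static, moving)))    # events only at those pixels
```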

“When there is an event, the sensor has incredible sensitivity, dynamic range and speed,” Professor Eggleton said. “The data generated is elegantly interfaced with an IT platform allowing us to extract features using machine-learning artificial intelligence.”

“We look forward to developing this device further and collaborating with other experts in this area, including Western Sydney University’s International Centre for Neuromorphic Systems, which is the leader in neuromorphic research in Australia.”

MANTIS is the result of the partnership between the University of Sydney Nano Institute and Air Force’s Jericho Disruptive Innovation.

“A rapid prototype of this type and scale happening in three months, during COVID, is remarkable,” said Wing Commander Paul Hay, head of advanced sensing at RAAF Jericho Disruptive Innovation.

The Defence Science and Technology Group (DSTG) was also involved in the collaboration, providing early guidance and input.
