Soldiers Control Robotic Dog by Thought

New technology is making mind reading possible, with positive implications for the fields of healthcare, aerospace and advanced manufacturing.

Photo: soldiers operate a Ghost Robotics quadruped robot using the brain-machine interface. Supplied by the Australian Army.

Researchers from the University of Technology Sydney (UTS) have developed biosensor technology that allows users to operate devices, such as robots and machines, solely through thought control. The advanced brain-computer interface was developed by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi, from the UTS Faculty of Engineering and IT, in collaboration with the Australian Army and the Defence Innovation Hub. As well as defence applications, the technology has significant potential in fields such as advanced manufacturing, aerospace and healthcare – for example, allowing people with a disability to control a wheelchair or operate prosthetics.

“The hands-free, voice-free technology works outside laboratory settings, anytime, anywhere. It makes interfaces such as consoles, keyboards, touchscreens and hand-gesture recognition redundant,” said Professor Iacopi. “By using cutting-edge graphene material, combined with silicon, we were able to overcome issues of corrosion, durability and skin contact resistance, to develop the wearable dry sensors,” she said.

A new study shows that the graphene sensors developed at UTS are highly conductive, easy to use and robust. The hexagon-patterned sensors are positioned over the back of the scalp to detect brainwaves from the visual cortex. The sensors are resilient to harsh conditions, so they can be used in extreme operating environments. The user wears a head-mounted augmented reality lens which displays white flickering squares. When the operator concentrates on a particular square, their brainwaves are picked up by the biosensor, and a decoder translates the signal into commands.
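The UTS announcement does not spell out the decoding step, but flicker-based interfaces of this kind commonly assign each on-screen square its own flicker rate and identify which rate dominates the recorded brainwaves (a steady-state visual evoked potential, or SSVEP, approach). The sketch below is a minimal illustration of that idea, assuming hypothetical flicker frequencies, a single-channel signal and a 256 Hz sampling rate; it is not the UTS decoder.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz), one per on-screen square.
SQUARE_FREQS = [7.0, 9.0, 11.0, 13.0]
FS = 256  # assumed sampling rate of the sensor, Hz

def decode_square(eeg: np.ndarray) -> int:
    """Return the index of the square whose flicker frequency shows
    the strongest response in the recorded signal's spectrum."""
    windowed = eeg * np.hanning(len(eeg))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    powers = []
    for f in SQUARE_FREQS:
        band = (freqs > f - 0.5) & (freqs < f + 0.5)  # narrow band around f
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))

# Example: a noisy 2-second recording dominated by a 9 Hz response.
t = np.arange(2 * FS) / FS
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.8 * np.random.randn(t.size)
print(decode_square(eeg))  # -> 1 (the 9 Hz square), with high probability
```

With nine squares on screen, a decoder of this kind maps naturally onto the nine commands Professor Lin describes below.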

The technology was recently demonstrated by the Australian Army, where soldiers operated a Ghost Robotics quadruped robot using the brain-machine interface. The device allowed hands-free command of the robotic dog with up to 94% accuracy. “Our technology can issue at least nine commands in two seconds. This means we have nine different kinds of commands and the operator can select one from those nine within that time period,” Professor Lin said. “We have also explored how to minimise noise from the body and environment to get a clearer signal from an operator’s brain,” he said.
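Professor Lin does not detail the noise-handling pipeline, but a standard first step for cleaning brain signals of body and environmental interference is to band-pass the recording to the band of interest and notch out mains hum. A minimal sketch, assuming a 256 Hz sampling rate, 50 Hz mains power and SciPy's stock filter-design helpers:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 256  # assumed sampling rate, Hz

def clean_eeg(raw: np.ndarray) -> np.ndarray:
    """Suppress slow drift, muscle artefacts and mains hum in a raw trace."""
    # Notch out 50 Hz mains interference (use 60 Hz where applicable).
    bn, an = iirnotch(w0=50.0, Q=30.0, fs=FS)
    x = filtfilt(bn, an, raw)
    # Band-pass 1-45 Hz: removes DC drift below, and muscle noise above,
    # the band where visual-cortex responses live.
    b, a = butter(4, [1.0, 45.0], btype="bandpass", fs=FS)
    return filtfilt(b, a, x)
```

Zero-phase filtering (`filtfilt`) is used here so the cleanup does not shift the signal in time, which matters when a command must be selected within a two-second window.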

The researchers believe the technology will be of interest to the scientific community, industry and government, and hope to continue making advances in brain-computer interface systems.

Source: https://www.uts.edu.au/

X-Ray AR Glasses to See Hidden Objects

MIT researchers have built an augmented reality headset that gives the wearer X-ray vision. The headset combines computer vision and wireless perception to automatically locate a specific item that is hidden from view, perhaps inside a box or under a pile, and then guide the user to retrieve it.

The system utilizes radio frequency (RF) signals, which can pass through common materials like cardboard boxes, plastic containers, or wooden dividers, to find hidden items that have been labeled with RFID tags, which reflect signals sent by an RF antenna. The headset directs the wearer as they walk through a room toward the location of the item, which shows up as a transparent sphere in the augmented reality (AR) interface. Once the item is in the user’s hand, the headset, called X-AR, verifies that they have picked up the correct object.

When the researchers tested X-AR in a warehouse-like environment, the headset could localize hidden items to within 9.8 centimeters, on average, and it verified that users picked up the correct item with 96 percent accuracy. X-AR could aid e-commerce warehouse workers in quickly finding items on cluttered shelves or buried in boxes, or in identifying the exact item for an order when many similar objects are in the same bin. It could also be used in a manufacturing facility to help technicians locate the correct parts to assemble a product.
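MIT's announcement doesn't give the localization math, but one common way to pin down a tag from RF measurements gathered as the wearer walks is multilateration: take distance estimates from several headset positions and solve for the tag's position by least squares. The sketch below illustrates only that geometric idea, with made-up positions and noiseless ranges; the actual X-AR system is considerably more sophisticated.

```python
import numpy as np

def multilaterate(positions: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a tag's 3-D position from headset positions and matching
    distance measurements, via linearized least squares (subtracting the
    first range equation from the rest makes the problem linear)."""
    p0, r0 = positions[0], ranges[0]
    # For each i > 0: |x - p_i|^2 - |x - p0|^2 = r_i^2 - r0^2
    # expands to the linear system A x = b.
    A = 2.0 * (positions[1:] - p0)
    b = (np.sum(positions[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         + r0 ** 2 - ranges[1:] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Made-up example: a tag at (2, 1, 0.5) m observed from four headset
# positions at slightly different heights as the wearer moves around.
tag = np.array([2.0, 1.0, 0.5])
headset = np.array([[0.0, 0.0, 1.5], [1.0, 0.0, 1.7],
                    [0.0, 1.0, 1.6], [1.5, 1.5, 1.4]])
dists = np.linalg.norm(headset - tag, axis=1)   # noiseless ranges
print(multilaterate(headset, dists))            # -> approx. [2.0 1.0 0.5]
```

In practice the range measurements are noisy, so more positions than the minimum are used and the estimate is refined as the wearer keeps moving.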

“Our whole goal with this project was to build an augmented reality system that allows you to see things that are invisible — things that are in boxes or around corners — and in doing so, it can guide you toward them and truly allow you to see the physical world in ways that were not possible before,” says Fadel Adib, who is an associate professor in the Department of Electrical Engineering and Computer Science, the director of the Signal Kinetics group in the Media Lab, and the senior author of a paper on X-AR.

Adib’s co-authors are research assistants Tara Boroushaki, who is the paper’s lead author; Maisy Lam; Laura Dodds; and former postdoc Aline Eid, who is now an assistant professor at the University of Michigan. The research will be presented at the USENIX Symposium on Networked Systems Design and Implementation.

Source: https://news.mit.edu/