Graphene Headphones With Superior Sound Quality

When Andre Geim and Kostya Novoselov first isolated graphene in 2004, it opened the door for a wave of innovation based on the wonder material’s jaw-dropping properties. Until now, however, it’s fair to say the Nobel-winning discovery has had limited impact on the average consumer. But that could all be about to change.

Wearable electronics, sports equipment and smartphones have all hitched their wagons to the graphene hype train in recent times, playing on the material’s incredible strength and conductivity. In many cases, its inclusion is probably more beneficial to marketing departments than actual end users, although we’re also finally seeing some products that are genuinely tapping into graphene’s enormous potential.

Canadian startup Ora’s GrapheneQ™ (GQ) headphones fall squarely into the latter category. GrapheneQ is the company’s proprietary composite material, used to make the 40 mm acoustic drivers that actually deliver sound to the ear. Consisting of more than 95 per cent graphene, it retains most of the material’s mechanical properties while being easier to shape and less expensive to produce. Rather than recreating graphene’s single layer of carbon atoms, GQ consists of flakes of graphene deposited in thousands of layers bonded together with proprietary cross-linking agents. The result is stiff yet very light, an ideal combination for loudspeaker membranes.

“Without a doubt, the most exciting aspect of the technology is the unique mechanical properties it holds,” says Ari Pinkas, Ora’s co-founder and business lead. “It is very uncommon for such a rigid material to be so lightweight. This rare combination of high stiffness and low density allows for some pretty cool things in the audio world. To start with, acoustic transducers are already notoriously inefficient: less than 10 per cent of the energy that goes into a loudspeaker gets translated to sound; over 90 per cent simply turns into unwanted heat.”

Having an ultra-lightweight speaker membrane results in a considerable power saving, something that’s particularly desirable for wireless consumer electronics such as smartphones, portable speakers and Bluetooth headphones.

“The fact that GrapheneQ is so lightweight means that it takes considerably less energy to move than other materials,” says Pinkas. “More concretely, Ora has observed up to a 70 per cent extension in the battery life of an audio-dedicated device when doing physical A/B comparison measurements, replacing a loudspeaker’s original membrane with a GrapheneQ cone.”
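A simple way to see why a lighter membrane saves power is to treat the diaphragm as a sinusoidally driven mass: its peak kinetic energy scales linearly with the moving mass, so halving the mass roughly halves the inertial energy the amplifier must supply each cycle. The sketch below is a back-of-envelope illustration only; the masses, excursion and frequency are assumed values chosen for demonstration, not Ora’s measured figures, and the model ignores suspension stiffness, damping and voice-coil mass.

```python
import math

def peak_kinetic_energy(moving_mass_kg, excursion_m, freq_hz):
    """Peak kinetic energy of a sinusoidally driven diaphragm:
    x(t) = x0 * sin(2*pi*f*t)  ->  v_peak = 2*pi*f*x0,  E = 0.5*m*v_peak**2."""
    v_peak = 2 * math.pi * freq_hz * excursion_m
    return 0.5 * moving_mass_kg * v_peak ** 2

# Assumed, illustrative numbers for a 40 mm headphone driver at 1 kHz;
# these are NOT Ora's published figures.
freq_hz = 1000.0
excursion_m = 50e-6              # 50 micrometres of cone excursion (assumed)
conventional_mass_kg = 0.40e-3   # ~0.4 g polymer diaphragm (assumed)
grapheneq_mass_kg = 0.15e-3      # hypothetical lighter GrapheneQ diaphragm (assumed)

e_conventional = peak_kinetic_energy(conventional_mass_kg, excursion_m, freq_hz)
e_grapheneq = peak_kinetic_energy(grapheneq_mass_kg, excursion_m, freq_hz)

print(f"Peak kinetic energy, conventional cone: {e_conventional * 1e6:.1f} uJ")
print(f"Peak kinetic energy, lighter cone:      {e_grapheneq * 1e6:.1f} uJ")
print(f"Inertial energy reduction: {(1 - e_grapheneq / e_conventional) * 100:.0f}%")
```

Under this toy model the saving scales directly with the mass ratio; the 70 per cent battery-life figure quoted above depends on the full amplifier and driver chain, which the sketch does not capture.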

Source: https://statnano.com/

Human Internal Verbalizations Understood Instantly By Computers

MIT researchers have developed a computer interface that can transcribe words that the user verbalizes internally but does not actually speak aloud. The system consists of a wearable device and an associated computing system. Electrodes in the device pick up neuromuscular signals in the jaw and face that are triggered by internal verbalizations — saying words “in your head” — but are undetectable to the human eye. The signals are fed to a machine-learning system that has been trained to correlate particular signals with particular words. The device also includes a pair of bone-conduction headphones, which transmit vibrations through the bones of the face to the inner ear. Because they don’t obstruct the ear canal, the headphones enable the system to convey information to the user without interrupting conversation or otherwise interfering with the user’s auditory experience.
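The article does not detail MIT’s model, but the general pattern it describes, correlating windows of neuromuscular signal with particular words, can be sketched with a generic classifier. The channel count, window length, feature choice, synthetic data and the use of scikit-learn’s RandomForestClassifier below are all assumptions for illustration, not the researchers’ actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Assumed setup: a handful of electrode channels sampled in fixed-length windows,
# each window labelled with the silently articulated word.
N_CHANNELS = 7
WINDOW_SAMPLES = 250
VOCAB = ["up", "down", "left", "right", "select"]   # illustrative vocabulary

def extract_features(window):
    """Simple per-channel statistics as features: mean, std and peak-to-peak."""
    return np.concatenate([
        window.mean(axis=1),
        window.std(axis=1),
        window.max(axis=1) - window.min(axis=1),
    ])

# Synthetic stand-in data; a real system would record labelled signal windows.
rng = np.random.default_rng(0)
n_examples = 500
labels = rng.integers(len(VOCAB), size=n_examples)
windows = rng.normal(size=(n_examples, N_CHANNELS, WINDOW_SAMPLES))
windows += labels[:, None, None] * 0.1    # weak class-dependent offset so the demo learns something

X = np.stack([extract_features(w) for w in windows])
y = labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("predicted word:", VOCAB[clf.predict(X_test[:1])[0]])
```

In the device described above, such a classifier would run continuously on streaming signal windows, and the recognized words would drive an application whose responses are returned through the bone-conduction headphones.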

The device is thus part of a complete silent-computing system that lets the user undetectably pose and receive answers to difficult computational problems. In one of the researchers’ experiments, for instance, subjects used the system to silently report opponents’ moves in a chess game and just as silently receive computer-recommended responses.

“The motivation for this was to build an IA device — an intelligence-augmentation device,” says Arnav Kapur, a graduate student at the MIT Media Lab, who led the development of the new system. “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”

“We basically can’t live without our cellphones, our digital devices,” adds Pattie Maes, a professor of media arts and sciences and Kapur’s thesis advisor. “But at the moment, the use of those devices is very disruptive. If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself. So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present.”

Source: http://news.mit.edu/