How to Write Words in the Air

Scientists at Hongtuo Joint Laboratory in Wuhan, China, have invented what sounds like a mysterious yet fascinating laser pen that can write in mid-air — an intriguing approach that could, theoretically, be an onramp to “Star Wars”-esque hologram technology.

The South China Morning Post (SCMP) reported yesterday that the pen uses ultra-short laser pulses to strip the electrons from air particles and turn them into light-emitting plasma with sufficient precision to form words in mid-air.

“With the brand new device, we can draw in the air without using paper and ink,” lab lead scientist Cao Xiangdong told the state-affiliated Science and Technology Daily this week, as reported by the SCMP.

The SCMP reported that the scientists said they used 3D scanning to arrange pixels and form Chinese characters, but didn’t completely explain how the process works. Long story short, it sounds awesome, but we’re gonna want to see more in the way of a demo.

The pen reportedly works in incredibly short laser bursts lasting just a few quadrillionths of a second. At the same time, its power output is nearly incomprehensible.

The laser pen can reach one million megawatts, according to the SCMP, which isn’t too far off from the total amount of power the United States can generate. However, because the bursts are so short, the device doesn’t draw an immense amount of power, making it — the scientists say — relatively safe to use.
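The safety claim follows from simple arithmetic: energy is power multiplied by time, so an enormous peak power over a femtosecond-scale burst delivers only a tiny amount of energy. A back-of-envelope check, using the article's figures and assuming a one-femtosecond pulse for illustration:

```python
# Back-of-envelope energy check. The peak power comes from the article
# ("one million megawatts"); the one-femtosecond pulse duration is an
# illustrative assumption for "a few quadrillionths of a second".
peak_power_watts = 1e6 * 1e6       # one million megawatts = 1e12 W
pulse_duration_s = 1e-15           # ~1 femtosecond

energy_per_pulse_joules = peak_power_watts * pulse_duration_s
print(energy_per_pulse_joules)     # 0.001 J, i.e. about a millijoule per pulse
```

A millijoule per pulse is modest, which is consistent with the scientists' claim that the device is relatively safe despite its headline peak power.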

The team is hoping the pen could someday be used in quantum computing, brain imaging and other advanced tech. Or maybe we’ll even see some awesome new holographic technology.


Home-grown Semiconductors Ideal for Quantum Computing

Growing electronic components directly onto a semiconductor block avoids the messy, noisy oxidation and scattering that slow and impede electronic operation. A UNSW (Australia) study out this month shows that the resulting high-mobility components are ideal candidates for high-frequency, ultra-small electronic devices, quantum dots, and for qubit applications in quantum computing.

Making computers faster requires ever-smaller transistors, with these electronic components now only a handful of nanometres in size. (There are around 12 billion transistors in the postage-stamp sized central chip of modern smartphones.)

However, in even smaller devices, the channel that the electrons flow through has to be very close to the interface between the semiconductor and the metallic gate used to turn the transistor on and off. Unavoidable surface oxidation and other surface contaminants cause unwanted scattering of electrons flowing through the channel, and also lead to instabilities and noise that are particularly problematic for quantum devices.

“In the new work we create transistors in which an ultra-thin metal gate is grown as part of the semiconductor crystal, preventing problems associated with oxidation of the semiconductor surface,” says lead author Yonatan Ashlea Alava.

“We have demonstrated that this new design dramatically reduces unwanted effects from surface imperfections, and shown that nanoscale quantum point contacts exhibit significantly lower noise than devices fabricated using conventional approaches,” says Yonatan, who is a FLEET PhD student.

“This new all single-crystal design will be ideal for making ultra-small electronic devices, quantum dots, and for qubit applications,” comments group leader Prof Alex Hamilton at UNSW.

Collaborating with wafer growers at Cambridge University, the team at UNSW Sydney showed that the problem associated with surface charge can be eliminated by growing an epitaxial aluminium gate before removing the wafer from the growth chamber.

“We confirmed the performance improvement via characterisation measurements in the lab at UNSW,” says co-author Dr Daisy Wang.

The high conductivity in ultra-shallow wafers, and the compatibility of the structure with reproducible nano-device fabrication, suggest that MBE-grown aluminium-gated wafers are ideal candidates for making ultra-small electronic devices, quantum dots, and for qubit applications.


Quantum computing comes to Google Cloud

Google Cloud has tied up with quantum computing startup IonQ to make its quantum hardware accessible through its cloud computing platform. The company’s 11-qubit quantum hardware is available to Google Cloud Platform (GCP) customers, and the company expects to make its 32-qubit system available later this year. Explaining the significance of the announcement in a conversation with Google Cloud, IonQ CEO and president Peter Chapman suggests that the offering will ensure “democratized access to quantum systems.”

“Making quantum computers easily available to anyone via the cloud demonstrates that quantum is real because now anyone can run a quantum program with a few minutes and a credit card,” says Chapman.

IonQ’s quantum computers are available in the GCP Marketplace and can be immediately provisioned by users. IonQ shares that developers, researchers, and businesses can access IonQ’s platform with just a few clicks, just like any other platform available on GCP. The company adds that GCP users will be able to program IonQ’s systems using a number of software development kits (SDKs), including Cirq, Qiskit, PennyLane, and tket, or through a custom integration with IonQ’s APIs.

Notably, IonQ’s quantum hardware is also available on Microsoft Azure and AWS.


Quantum Supremacy

Researchers in UC Santa Barbara/Google scientist John Martinis’ group have made good on their claim to quantum supremacy. Using 53 entangled quantum bits (“qubits”), their Sycamore computer has taken on — and solved — a problem considered intractable for classical computers.

Google’s quantum supreme cryostat with Sycamore inside

“A computation that would take 10,000 years on a classical supercomputer took 200 seconds on our quantum computer,” said Brooks Foxen, a graduate student researcher in the Martinis Group. “It is likely that the classical simulation time, currently estimated at 10,000 years, will be reduced by improved classical hardware and algorithms, but, since we are currently 1.5 billion times faster, we feel comfortable laying claim to this achievement.”
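The quoted "1.5 billion times faster" follows directly from the two runtimes. A quick sanity check of the arithmetic (ignoring leap years):

```python
# Sanity check of the quoted speedup: seconds in 10,000 years vs. a 200-second run.
seconds_per_year = 365 * 24 * 60 * 60            # 31,536,000
classical_estimate_s = 10_000 * seconds_per_year # ~3.15e11 seconds
quantum_runtime_s = 200

speedup = classical_estimate_s / quantum_runtime_s
print(f"{speedup:.2e}")   # 1.58e+09, roughly the 1.5 billion quoted
```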

The feat is outlined in a paper in the journal Nature.

The milestone comes after roughly two decades of quantum computing research conducted by Martinis and his group, from the development of a single superconducting qubit to systems including architectures of 72 and, with Sycamore, 54 qubits (one didn’t perform) that take advantage of both the awe-inspiring and bizarre properties of quantum mechanics.

“The algorithm was chosen to emphasize the strengths of the quantum computer by leveraging the natural dynamics of the device,” said Ben Chiaro, another graduate student researcher in the Martinis Group. That is, the researchers wanted to test the computer’s ability to hold and rapidly manipulate a vast amount of complex, unstructured data.

“We basically wanted to produce an entangled state involving all of our qubits as quickly as we can,” Foxen said, “and so we settled on a sequence of operations that produced a complicated superposition state that, when measured, returned output (a ‘bitstring’) with a probability determined by the specific sequence of operations used to prepare that particular superposition.” The exercise, which was to verify that the circuit’s output corresponds to the sequence used to prepare the state, sampled the quantum circuit a million times in just a few minutes, exploring all possibilities — before the system could lose its quantum coherence.

“We performed a fixed set of operations that entangles 53 qubits into a complex superposition state,” Chiaro explained. “This superposition state encodes the probability distribution. For the quantum computer, preparing this superposition state is accomplished by applying a sequence of tens of control pulses to each qubit in a matter of microseconds. We can prepare and then sample from this distribution by measuring the qubits a million times in 200 seconds.”

“For classical computers, it is much more difficult to compute the outcome of these operations because it requires computing the probability of being in any one of the 2^53 possible states, where the 53 comes from the number of qubits — the exponential scaling is why people are interested in quantum computing to begin with,” Foxen said. “This is done by matrix multiplication, which is expensive for classical computers as the matrices become large.”
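To make the 2^53 figure concrete: a classical simulator that tracks the full quantum state needs one complex amplitude per basis state, so memory alone becomes prohibitive long before the matrix arithmetic does. A quick illustration (assuming the common convention of 16 bytes per double-precision complex amplitude):

```python
# Why 53 qubits overwhelm a brute-force classical simulation: the state vector
# needs one complex amplitude for every one of the 2**53 basis states.
n_qubits = 53
n_states = 2 ** n_qubits          # 9,007,199,254,740,992 basis states

bytes_per_amplitude = 16          # one double-precision complex value
memory_bytes = n_states * bytes_per_amplitude
print(memory_bytes / 2 ** 50)     # 128.0 pebibytes just to hold the state
```

Around 128 pebibytes dwarfs the memory of any existing supercomputer, which is why classical approaches must fall back on much slower techniques than direct state-vector simulation.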

According to the new paper, the researchers used a method called cross-entropy benchmarking to compare the quantum circuit’s bitstring to its “corresponding ideal probability computed via simulation on a classical computer” to ascertain that the quantum computer was working correctly. “We made a lot of design choices in the development of our processor that are really advantageous,” said Chiaro. Among these advantages, he said, are the ability to experimentally tune the parameters of the individual qubits as well as their interactions.
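The linear variant of cross-entropy benchmarking reduces to a simple statistic: sample bitstrings from the device, look up each one's classically simulated ideal probability, and compute F = 2^n · mean(P) − 1. A perfectly working device scores well above zero; a device outputting uniform noise scores near zero. A toy sketch (the "ideal" distribution here is fabricated for illustration, not simulated from a real circuit):

```python
import random

# Toy sketch of linear cross-entropy benchmarking (XEB). In the real experiment
# the ideal probabilities come from classical simulation of the circuit; here we
# fabricate a normalized distribution over 2**n bitstrings for illustration.
def linear_xeb(samples, ideal_probs, n_qubits):
    # F_XEB = 2^n * mean(P_ideal(sampled bitstring)) - 1
    mean_p = sum(ideal_probs[s] for s in samples) / len(samples)
    return (2 ** n_qubits) * mean_p - 1

n = 3
random.seed(0)
weights = [random.random() for _ in range(2 ** n)]
total = sum(weights)
ideal = [w / total for w in weights]   # stand-in for simulated probabilities

# A working device samples from the ideal distribution; a broken (decohered)
# device samples uniformly and scores near zero.
good = random.choices(range(2 ** n), weights=ideal, k=100_000)
bad = random.choices(range(2 ** n), k=100_000)
print(linear_xeb(good, ideal, n) > linear_xeb(bad, ideal, n))   # True
```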


Quantum Computer Can See 16 Different Futures Simultaneously

When Mile Gu boots up his new computer, he can see the future. At least, 16 possible versions of it — all at the same time. Gu, an assistant professor of physics at Nanyang Technological University in Singapore, works in quantum computing. This branch of science uses the weird laws that govern the universe’s smallest particles to help computers calculate more efficiently.

Tiny particles of light can travel in a superposition of many different states at the same time. Researchers used this quantum quirk to design a prototype computer that can predict 16 different futures at once.

Unlike classical computers, which store information as bits (binary digits of either 0 or 1), quantum computers code information into quantum bits, or qubits. These subatomic particles, thanks to the weird laws of quantum mechanics, can exist in a superposition of two different states at the same time.

Just as Schrödinger‘s hypothetical cat was simultaneously dead and alive until someone opened the box, a qubit in a superposition can equal both 0 and 1 until it’s measured. Storing multiple different outcomes into a single qubit could save a ton of memory compared to traditional computers, especially when it comes to making complicated predictions.

In a study published April 9 in the journal Nature Communications, Gu and his colleagues demonstrated this idea using a new quantum simulator that can predict the outcomes of 16 different futures (the equivalent of, say, flipping a coin four times in a row) in a quantum superposition. These possible futures were encoded in a single photon (a quantum particle of light) which moved down multiple paths simultaneously while passing through several sensors. Then, the researchers went one step further, firing two photons side-by-side and tracking how each photon’s potential futures diverged under slightly different conditions.
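The "16 futures" figure is just the number of outcomes of four binary steps, like four coin flips: 2^4 = 16. A classical simulator must enumerate every branch explicitly, whereas the photonic processor holds all of them in one superposition. A small illustration:

```python
from itertools import product

# Four binary steps (coin flips) give 2**4 = 16 possible futures.
futures = [''.join(flips) for flips in product('HT', repeat=4)]
print(len(futures))                 # 16

# For a fair coin, each branch carries amplitude (1/sqrt(2))**4, so each of the
# 16 futures is measured with probability 1/16.
amplitude = (1 / 2 ** 0.5) ** 4
print(round(amplitude ** 2, 6))     # 0.0625, i.e. 1/16
```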

“It’s sort of like Doctor Strange in the ‘Avengers: Infinity War’ movie,” Gu told Live Science. Before a climactic battle in that film, the clairvoyant doctor looks forward in time to see 14 million different futures, hoping to find the one where the heroes defeat the big baddie. “He does a combined computation of all these possibilities to say, ‘OK, if I changed my decision in this small way, how much will the future change?’ This is the direction our simulation is moving forwards to.”


Quantum Computer Controls One Billion Electrons Per Second, One-by-One

University of Adelaide-led research in Australia has moved the world one step closer to reliable, high-performance quantum computing. An international team has developed a ground-breaking “single-electron pump”. The electron pump device developed by the researchers can produce one billion electrons per second and uses quantum mechanics to control them one-by-one. And it’s so precise they have been able to use this device to measure the limitations of current electronics equipment. This paves the way for future quantum information processing applications, including in defence, cybersecurity and encryption, and big data analysis.

“This research puts us one step closer to the holy grail – reliable, high-performance quantum computing,” says project leader Dr Giuseppe C. Tettamanzi, Senior Research Fellow, at the University of Adelaide’s Institute for Photonics and Advanced Sensing.

Published in the journal Nano Letters, the researchers also report observations of electron behaviour that’s never been seen before – a key finding for those around the world working on quantum computing.

“Quantum computing, or more broadly quantum information processing, will allow us to solve problems that just won’t be possible under classical computing systems,” says Dr Tettamanzi. “It operates at a scale that’s close to an atom and, at this scale, normal physics goes out the window and quantum mechanics comes into play. To indicate its potential computational power, conventional computing works on instructions and data written in a series of 1s and 0s – think about it as a series of on and off switches; in quantum computing every possible value between 0 and 1 is available. We can then increase exponentially the number of calculations that can be done simultaneously.”
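The exponential scaling Dr Tettamanzi describes can be seen in miniature with a tiny state-vector simulation: n classical bits store one n-bit value, while an n-qubit state carries an amplitude for all 2^n values at once. Putting every qubit of |0000⟩ into an equal superposition (a Hadamard gate on each qubit, shown here as a minimal hand-rolled sketch) spreads the weight over all 16 basis states simultaneously:

```python
# Minimal state-vector sketch of the exponential scaling: 4 qubits need 2**4
# amplitudes, and a Hadamard on every qubit of |0000> puts equal weight on all
# 16 basis states at once.
n = 4
state = [1.0] + [0.0] * (2 ** n - 1)   # |0000>: all weight on index 0

def hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit of a state vector, in place."""
    h = 1 / 2 ** 0.5
    step = 2 ** qubit
    for i in range(len(state)):
        if i & step == 0:              # visit each (|..0..>, |..1..>) pair once
            a, b = state[i], state[i | step]
            state[i], state[i | step] = h * (a + b), h * (a - b)

for q in range(n):
    hadamard(state, q)

# Each of the 16 basis states now has probability (1/sqrt(2))**8 = 1/16.
print(len(state), round(state[0] ** 2, 6))   # 16 0.0625
```

Doubling the qubit count squares the number of amplitudes being tracked, which is exactly the exponential growth in simultaneous computation the quote refers to.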