AI Designs Quantum Physics Beyond What Any Human Has Conceived

Quantum physicist Mario Krenn remembers sitting in a café in Vienna in early 2016, poring over computer printouts, trying to make sense of what MELVIN had found. MELVIN was a machine-learning algorithm Krenn had built, a kind of artificial intelligence. Its job was to mix and match the building blocks of standard quantum experiments and find solutions to new problems. And it did find many interesting ones. But there was one that made no sense.

“The first thing I thought was, ‘My program has a bug, because the solution cannot exist,’” Krenn says. MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons (entangled states being those that once made Albert Einstein invoke the specter of “spooky action at a distance”). Krenn, Anton Zeilinger of the University of Vienna and their colleagues had not explicitly provided MELVIN the rules needed to generate such complex states, yet it had found a way. Eventually, he realized that the algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s. But those experiments had been much simpler. MELVIN had cracked a far more complex puzzle.

“When we understood what was going on, we were immediately able to generalize [the solution],” says Krenn, who is now at the University of Toronto. Since then, other teams have started performing the experiments identified by MELVIN, allowing them to test the conceptual underpinnings of quantum mechanics in new ways. Meanwhile, Krenn, working with colleagues in Toronto, has refined their machine-learning algorithms. Their latest effort, an AI called THESEUS, has upped the ante: it is orders of magnitude faster than MELVIN, and humans can readily parse its output. While it would take Krenn and his colleagues days or even weeks to understand MELVIN’s meanderings, they can almost immediately figure out what THESEUS is saying.
“It is amazing work,” says theoretical quantum physicist Renato Renner of the Institute for Theoretical Physics at the Swiss Federal Institute of Technology Zurich, who reviewed a 2020 study about THESEUS but was not directly involved in these efforts.


Nanorobots Probe Into Cells

U of T Engineering researchers have built a set of ‘magnetic tweezers’ that can position a nano-scale bead inside a human cell in three dimensions with unprecedented precision. The nano-bot has already been used to study the properties of cancer cells, and could point the way toward enhanced diagnosis and treatment.

Professor Yu Sun (MIE, IBBME, ECE) and his team have been building robots that can manipulate individual cells for two decades. Their creations have the ability to manipulate and measure single cells — useful in procedures such as in vitro fertilization and personalized medicine. Their latest study, published today in Science Robotics, takes the technology one step further.

The magnetic bead is introduced into the cell and then navigated onto the nuclear envelope.

“So far, our robot has been exploring outside a building, touching the brick wall, and trying to figure out what’s going on inside,” says Sun. “We wanted to deploy a robot in the building and probe all the rooms and structures.” The team has created robotic systems that can manipulate sub-cellular structures inside electron microscopes, but that requires freeze-drying the cells and cutting them into tiny slices. To probe live cells, other teams have used techniques such as lasers or acoustics.

“Optical tweezers — using lasers to probe cells — is a popular approach,” says Xian Wang (MIE), the PhD candidate who conducted the research. The technology was honoured with the 2018 Nobel Prize in Physics, but Wang says the force it can generate is not large enough for the mechanical manipulation and measurement he wanted to do. “You can try to increase the power to generate higher force, but you run the risk of damaging the sub-cellular components you’re trying to measure,” says Wang.

The system Wang designed uses six magnetic coils placed in different planes around a microscope coverslip seeded with live cancer cells. A magnetic iron bead about 700 nanometres in diameter — about 100 times smaller than the thickness of a human hair — is placed on the coverslip, where the cancer cells easily take it up inside their membranes. Once the bead is inside, Wang controls its position using real-time feedback from confocal microscopy imaging. He uses a computer-controlled algorithm to vary the electrical current through each of the coils, shaping the magnetic field in three dimensions and coaxing the bead into any desired position within the cell.
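The closed-loop idea described above — image the bead, compare its position to a target, and adjust the coil currents to nudge it closer — can be sketched as a simple proportional feedback controller. This is an illustrative toy model, not the authors’ actual control algorithm: the coil calibration matrix, gain, and noise level below are invented for demonstration.

```python
import numpy as np

N_COILS = 6
rng = np.random.default_rng(0)

# Hypothetical calibration: force on the bead (3-D) per unit current in each coil.
COIL_FORCE = rng.normal(size=(3, N_COILS))

def coil_currents_for_force(desired_force):
    """Least-squares currents that best produce the desired 3-D force."""
    currents, *_ = np.linalg.lstsq(COIL_FORCE, desired_force, rcond=None)
    return currents

def control_loop(start, target, gain=0.5, steps=200, noise=0.005):
    """Proportional feedback: steer the bead from start toward target (microns)."""
    pos = np.asarray(start, dtype=float)
    for _ in range(steps):
        error = np.asarray(target) - pos                  # from imaging feedback
        currents = coil_currents_for_force(gain * error)  # command the six coils
        force = COIL_FORCE @ currents                     # resulting magnetic force
        pos += force + rng.normal(scale=noise, size=3)    # motion + Brownian jitter
    return pos

final = control_loop(start=[0.0, 0.0, 0.0], target=[2.0, -1.0, 0.5])
print(np.round(final, 2))
```

In this sketch the feedback loop drives the bead to within the noise floor of the target, which mirrors why the real system’s precision bottoms out at the Brownian motion limit rather than at the controller.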

“We can control the position to within a couple of hundred nanometres, down to the Brownian motion limit,” says Wang. “We can exert forces an order of magnitude higher than would be possible with lasers.”

In collaboration with Dr. Helen McNeil and Yonit Tsatskis at Mount Sinai Hospital and Dr. Sevan Hopyan at The Hospital for Sick Children (SickKids), the team used their robotic system to study early-stage and later-stage bladder cancer cells. Previous studies on cell nuclei required their extraction from cells. Wang and Sun measured cell nuclei in intact cells without the need to break apart the cell membrane or cytoskeleton. They were able to show that the nucleus is not equally stiff in all directions. “It’s a bit like a football in shape — mechanically, it’s stiffer along one axis than the other,” says Sun. “We wouldn’t have known that without this new technique.”

They were also able to measure exactly how much stiffer the nucleus got when prodded repeatedly, and determine which cell protein or proteins may play a role in controlling this response. This knowledge could point the way toward new methods of diagnosing cancer. “We know that in the later-stage cells, the stiffening response is not as strong,” says Wang. “In situations where early-stage cancer cells and later-stage cells don’t look very different morphologically, this provides another way of telling them apart.”

According to Sun, the research could go even further. “You could imagine bringing in whole swarms of these nano-bots, and using them to either starve a tumour by blocking the blood vessels into the tumor, or destroy it directly via mechanical ablation,” says Sun. “This would offer a way to treat cancers that are resistant to chemotherapy, radiotherapy and immunotherapy.”


Artificial Skin Opens Superhuman Perception

A new type of sensor could lead to artificial skin that someday helps burn victims ‘feel’ and safeguards the rest of us, University of Connecticut (UConn) researchers suggest in a paper in Advanced Materials.

Our skin’s ability to perceive pressure, heat, cold, and vibration is a critical safety function that most people take for granted. But burn victims, those with prosthetic limbs, and others who have lost skin sensitivity for one reason or another, can’t take it for granted, and often injure themselves unintentionally. Chemists Islam Mosa from UConn, and James Rusling from UConn and UConn Health, along with University of Toronto engineer Abdelsalam Ahmed, wanted to create a sensor that can mimic the sensing properties of skin. Such a sensor would need to be able to detect pressure, temperature, and vibration. But perhaps it could do other things too, the researchers thought.

“It would be very cool if it had abilities human skin does not; for example, the ability to detect magnetic fields, sound waves, and abnormal behaviors,” said Mosa.

Mosa and his colleagues created such a sensor with a silicone tube wrapped in a copper wire and filled with a special fluid made of tiny particles of iron oxide just one billionth of a meter long, called nanoparticles. The nanoparticles rub around the inside of the silicone tube and create an electric current. The copper wire surrounding the silicone tube picks up the current as a signal. When the tube is bumped or experiences pressure, the nanoparticles move and the electric signal changes. Sound waves also create waves in the nanoparticle fluid, and the electric signal changes in a different way than when the tube is bumped.

The researchers found that magnetic fields alter the signal too, in a way distinct from pressure or sound waves. Even a person moving around while carrying the sensor changes the electrical current, and the team found they could distinguish between the electrical signals caused by walking, running, jumping, and swimming.

Metal skin might sound like a superhero power, but this skin wouldn’t make the wearer Colossus from the X-Men. Rather, Mosa and his colleagues hope it could help burn victims ‘feel’ again, and perhaps act as an early warning for workers exposed to dangerously high magnetic fields. Because the rubber exterior is completely sealed and waterproof, it could also serve as a wearable monitor to alert parents if their child fell into deep water in a pool, for example.


How To Recreate Memories Of Faces From Brain Data

A new technique developed by neuroscientists at the University of Toronto can reconstruct images of what people perceive based on their brain activity. Developed by Dan Nemrodov, a postdoctoral fellow in Assistant Professor Adrian Nestor’s lab at U of T Scarborough, the technique digitally reconstructs images seen by test subjects from electroencephalography (EEG) data.


“When we see something, our brain creates a mental percept, which is essentially a mental impression of that thing. We were able to capture this percept using EEG to get a direct illustration of what’s happening in the brain during this process,” says Nemrodov.

For the study, test subjects hooked up to EEG equipment were shown images of faces. Their brain activity was recorded and then used to digitally recreate the image in the subject’s mind using a technique based on machine learning algorithms. It’s not the first time researchers have been able to reconstruct images based on visual stimuli using neuroimaging techniques. The current method was pioneered by Nestor, who successfully reconstructed facial images from functional magnetic resonance imaging (fMRI) data in the past, but this is the first time EEG has been used.
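A common machine-learning recipe for this kind of decoding is to learn a regression from brain-signal features to a low-dimensional image representation, then invert it on held-out trials. The sketch below shows that recipe on entirely synthetic data standing in for EEG recordings and face images; the array sizes, noise level, and ridge penalty are invented, and the study’s actual pipeline may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_eeg, n_latent = 400, 50, 64, 10

# Synthetic ground truth: each "image" lives in a 10-D latent space, and the
# "EEG" response is a noisy linear function of that latent representation.
latents = rng.normal(size=(n_train + n_test, n_latent))
mixing = rng.normal(size=(n_latent, n_eeg))
eeg = latents @ mixing + 0.3 * rng.normal(size=(n_train + n_test, n_eeg))

X_train, X_test = eeg[:n_train], eeg[n_train:]
Y_train, Y_test = latents[:n_train], latents[n_train:]

# Ridge regression from EEG features to image latents:
# W = (X'X + lambda * I)^-1 X'Y
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_eeg),
                    X_train.T @ Y_train)

# Decode held-out trials and score how well the latents are recovered.
Y_pred = X_test @ W
corr = np.corrcoef(Y_pred.ravel(), Y_test.ravel())[0, 1]
print(f"held-out reconstruction correlation: {corr:.2f}")
```

In a real pipeline the latent representation would come from a face model (for example, principal components of face images), so the decoded latents could be rendered back into a recognizable face.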