AI-designed Antibody Enters Clinical Trials

The Israeli company Biolojic Design will conduct a trial for cancer patients in Australia with a new type of drug. Aulos Biosciences is now recruiting cancer patients to try the world’s first antibody drug designed by a computer. The computationally designed antibody, known as AU-007, was planned by the artificial intelligence platform of Israeli biotech company Biolojic Design, based in Rehovot, to target a protein in the human body known as interleukin-2 (IL-2). The goal is for the IL-2 pathway to activate the body’s immune system and attack the tumors.

The clinical trial will be conducted on patients with late-stage solid tumors and will last about a year, but the company hopes to present interim results during 2022. The trial has raised great hopes: if it is successful, it will pave the way for the development of a new type of drug using computational biology and “big data.” Aulos presented pre-clinical data from a study of 19 mice, all of which responded positively to the treatment. Over the 17-day study period, the antibody led to the complete elimination of the tumors in 10 of the mice and to a significant delay in tumor development in the other nine.

Aulos was founded in Boston as a spin-off of Biolojic and venture capital firm Apple Tree Partners, which invested $40 million in the company to advance the antibody project and prove its clinical feasibility. Drugs based on antibodies are considered one of the greatest hopes for anti-cancer solutions; among the best known are Keytruda, mostly used to treat melanomas and lung cancer, and Herceptin for breast cancer. But the antibodies given to cancer patients today are created by a method that has its disadvantages: most are produced by the immune systems of mice and then replicated to enable mass production.

Source: https://www.haaretz.com/

World’s Smallest Atom-Memory Unit Created

Faster, smaller, smarter and more energy-efficient chips for everything from consumer electronics to big data to brain-inspired computing could soon be on the way after engineers at The University of Texas at Austin created the smallest memory device yet. And in the process, they figured out the physics dynamic that unlocks dense memory storage capabilities for these tiny devices.

The research published recently in Nature Nanotechnology builds on a discovery from two years ago, when the researchers created what was then the thinnest memory storage device. In this new work, the researchers reduced the size even further, shrinking the cross-sectional area down to just a single square nanometer. Getting a handle on the physics that packs dense memory storage capability into these devices allowed the team to make them much smaller. Defects, or holes in the material, provide the key to unlocking this high-density memory storage capability.

“When a single additional metal atom goes into that nanoscale hole and fills it, it confers some of its conductivity into the material, and this leads to a change or memory effect,” said Deji Akinwande, professor in the Department of Electrical and Computer Engineering.

Though they used molybdenum disulfide – also known as MoS2 – as the primary nanomaterial in their study, the researchers think the discovery could apply to hundreds of related atomically thin materials.

The race to make smaller chips and components is all about power and convenience. With smaller processors, you can make more compact computers and phones. But shrinking down chips also decreases their energy demands and increases capacity, which means faster, smarter devices that take less power to operate.

“The results obtained in this work pave the way for developing future generation applications that are of interest to the Department of Defense, such as ultra-dense storage, neuromorphic computing systems, radio-frequency communication systems and more,” said Pani Varanasi, program manager for the U.S. Army Research Office, which funded the research.

The original device – dubbed “atomristor” by the research team – was at the time the thinnest memory storage device ever recorded, with a single atomic layer of thickness. But shrinking a memory device is not just about making it thinner but also building it with a smaller cross-sectional area. “The scientific holy grail for scaling is going down to a level where a single atom controls the memory function, and this is what we accomplished in the new study,” Akinwande said.
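For a sense of what a one-square-nanometer cell could mean for storage density, the rough Python estimate below assumes cells packed edge to edge with one bit per cell and ignores the spacing, wiring and addressing overhead that any real chip needs; it is an idealized upper bound of my own, not a figure from the study.

    # Idealized back-of-the-envelope density for a memory cell with a ~1 nm^2
    # cross-section, assuming edge-to-edge packing and one bit per cell.
    # Real devices need spacing, wiring and addressing, so practical density is lower.

    cell_area_nm2 = 1.0               # one square nanometer per cell (from the article)
    nm2_per_cm2 = (1e7) ** 2          # 1 cm = 1e7 nm, so 1 cm^2 = 1e14 nm^2

    cells_per_cm2 = nm2_per_cm2 / cell_area_nm2
    terabytes_per_cm2 = cells_per_cm2 / 8 / 1e12

    print(f"Idealized cells per cm^2: {cells_per_cm2:.2e}")        # 1.00e+14
    print(f"Idealized capacity: {terabytes_per_cm2:.1f} TB/cm^2")  # about 12.5 TB per square centimeter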

Source: https://news.utexas.edu/

China: Explosive Growth In The Digital Economy

China has over 110 million 5G users and is expected to have more than 600,000 5G base stations by the end of this year, covering all cities at prefecture level and above, according to the 5G Innovation and Development Forum held on Sept 15 during the Smart China Expo Online in southwest China’s Chongqing municipality.

In the more than one year since 5G licenses for commercial use were issued, the country has made steady progress in the construction of its 5G network infrastructure, said Han Xia, director of the telecom department at the Ministry of Industry and Information Technology, adding that Chinese telecommunications companies have already built over 500,000 5G base stations serving over 100 million 5G internet terminals.

So far, 5G has been deployed in sectors and fields including ports, machinery, automobiles, steel, mining and energy, while 5G applications have been accelerated in key areas such as the industrial internet, the Internet of Vehicles, medical care and education, Han noted.

The value of the country’s industrial internet hit 2.13 trillion yuan last year, Yin Hao, an academician from the Chinese Academy of Sciences, said at the forum, adding that the figure is expected to exceed 5 trillion yuan in 2025.
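Those two figures imply a brisk compound annual growth rate; the short Python sketch below works it out, under the assumption (not stated in the article) that the 2.13 trillion yuan value refers to 2019 and the 5 trillion yuan figure to 2025.

    # Illustrative only: compound annual growth rate implied by the quoted
    # industrial-internet figures, assuming a 2019 baseline and a 2025 target.

    start_value = 2.13   # trillion yuan, "last year" (assumed to be 2019)
    end_value = 5.0      # trillion yuan, expected in 2025
    years = 6            # assumed span between the two figures

    cagr = (end_value / start_value) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {cagr:.1%}")   # roughly 15% per year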

The integrated development of “5G plus industrial internet” can create new products, generate new models and new forms of business, reduce enterprises’ operating costs, improve their production efficiency, and optimize their resource allocation, Yin noted.

According to Chen Shanzhi, vice president of the China Information and Communication Technologies Group Corporation (CICT), the combination of 5G and other emerging information technologies, including artificial intelligence, cloud computing and big data, will help accelerate the integrated development and innovation of other sectors and bring about explosive growth in the digital economy.

http://global.chinadaily.com.cn/

AI and Big Data To Fight Eye Diseases

“In future, it will be possible to diagnose diabetes from the eye using automatic digital retinal screening, without the assistance of an ophthalmologist”: these were the words used by Ursula Schmidt-Erfurth, Head of MedUni Vienna’s Department of Ophthalmology and Optometrics, as she opened the press conference for the ART-2018 Specialist Meeting on new developments in retinal therapy. Automatic diabetes screening has recently been implemented at MedUni Vienna. Patients flock to the Department to undergo this retinal examination, which detects any diabetic changes; it takes just a few minutes and is completely non-invasive.

Essentially, this technique can detect all stages of diabetic retinal disease: high-resolution digital retinal images with two million pixels are taken and analyzed within seconds. But Big Data offers even more potential: it is already possible to diagnose some 50 other diseases in this way. Diabetes is just the start, and MedUni Vienna is among the global leaders in this digital revolution.
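As a rough illustration of what such automated screening involves, here is a minimal, hypothetical Python sketch of an image-classification pipeline of the kind used on fundus photographs. The backbone, class labels and file name are placeholders of my own; the article does not describe MedUni Vienna’s actual system in technical detail.

    # Hypothetical sketch of an automated retinal-screening pipeline:
    # a fundus photograph is preprocessed and passed through a trained
    # image classifier that outputs disease probabilities.
    # All names below are illustrative placeholders.

    import torch
    from torchvision import models, transforms
    from PIL import Image

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),          # classifiers typically expect a fixed input size
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Stand-in backbone; a real screening model would be trained on labelled fundus images.
    model = models.resnet18(weights=None)
    model.fc = torch.nn.Linear(model.fc.in_features, 2)    # e.g. "no retinopathy" vs "retinopathy"
    model.eval()

    image = Image.open("fundus_photo.jpg").convert("RGB")  # placeholder file name
    with torch.no_grad():
        logits = model(preprocess(image).unsqueeze(0))
        probs = torch.softmax(logits, dim=1).squeeze(0)

    print({"no_retinopathy": float(probs[0]), "retinopathy": float(probs[1])})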

The Division of Cardiology led by Christian Hengstenberg within the Department of Medicine II is working on how digital retinal analysis can also be used in future for the early diagnosis of cardiovascular diseases.

“This AI medicine is ‘super human’,” emphasizes Schmidt-Erfurth. “The algorithms are quicker and more accurate. They can analyze things that an expert cannot detect with the naked eye.” And yet the commitment to Big Data and Artificial Intelligence is not a plea for medicine without doctors, which some experts predict for the not-too-distant future. “What we want are ‘super doctors’, who are able to use the high-tech findings to make the correct, individualized therapeutic decision for their patients, in the spirit of precision medicine, rather than leaving patients on their own.”

However, it is not only in the diagnosis of diseases that Artificial Intelligence and Big Data, plus virtual reality, provide better results. “We are already performing digitized operations with support from Artificial Intelligence. This involves projecting a virtual and precise image of the area of the eye being operated on onto a huge screen, and the surgeon then performs the operation with a perfect view ‘on screen’, as it were, while actually operating on the patient with a scalpel.”

Source: https://www.news-medical.net/

Quantum Computer Controls One Billion Electrons Per Second One-by-One

University of Adelaide-led research in Australia has moved the world one step closer to reliable, high-performance quantum computing. An international team has developed a ground-breaking single-electron “pump”. The electron pump device developed by the researchers can produce one billion electrons per second and uses quantum mechanics to control them one by one. It is so precise that the researchers have been able to use the device to measure the limitations of current electronics equipment. This paves the way for future quantum information processing applications, including in defence, cybersecurity and encryption, and big data analysis.
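For a sense of scale, the current such a pump carries can be estimated from the elementary relation I = e × f. The short Python sketch below assumes the pump transfers exactly one electron per cycle, which is the idealized textbook picture rather than a figure taken from the study.

    # Back-of-the-envelope estimate (not from the article): current delivered by a
    # single-electron pump transferring exactly one electron per cycle, I = e * f.

    e = 1.602176634e-19   # elementary charge in coulombs
    f = 1e9               # one billion electrons per second (from the article)

    current_amps = e * f
    print(f"Pumped current: {current_amps:.3e} A (about {current_amps * 1e9:.2f} nA)")
    # -> roughly 1.6e-10 A, i.e. about 0.16 nA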

“This research puts us one step closer to the holy grail – reliable, high-performance quantum computing,” says project leader Dr Giuseppe C. Tettamanzi, Senior Research Fellow at the University of Adelaide’s Institute for Photonics and Advanced Sensing.

Published in the journal Nano Letters, the researchers also report observations of electron behaviour that’s never been seen before – a key finding for those around the world working on quantum computing.

“Quantum computing, or more broadly quantum information processing, will allow us to solve problems that just won’t be possible under classical computing systems,” says Dr Tettamanzi. “It operates at a scale that’s close to an atom and, at this scale, normal physics goes out the window and quantum mechanics comes into play. To indicate its potential computational power, conventional computing works on instructions and data written in a series of 1s and 0s – think about it as a series of on and off switches; in quantum computing every possible value between 0 and 1 is available. We can then increase exponentially the number of calculations that can be done simultaneously.”
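One common way to make that exponential claim concrete is to note that simulating n qubits on a classical machine means tracking 2^n complex amplitudes at once, whereas a classical n-bit register holds just one of its 2^n values at a time. The small Python sketch below illustrates this scaling; it is a generic teaching example, not code connected to the Adelaide experiment.

    # Illustrative scaling example (not from the article): a classical register of
    # n bits holds one of 2**n values at a time, while a state vector describing
    # n qubits carries 2**n complex amplitudes simultaneously.

    import numpy as np

    def n_qubit_state(n):
        """Return the |00...0> state vector for n qubits (length 2**n amplitudes)."""
        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1.0
        return state

    print(n_qubit_state(2))   # 4 amplitudes: [1.+0.j 0.+0.j 0.+0.j 0.+0.j]

    for n in (1, 2, 10, 30, 50):
        print(f"{n:>2} qubits -> {2 ** n:,} complex amplitudes to track classically")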

Source: https://www.adelaide.edu.au/