The Bright Future of the Hydrogen Economy

The U.S. is counting on hydrogen to play a significant role in the low-carbon economy of the future, but fundamental questions about transportation, storage and cost need to be addressed in order to integrate hydrogen gas into the nation’s existing infrastructure, according to a preliminary study from a new research program at The University of Texas at Austin. That’s because although hydrogen gas burns carbon free, it delivers only about a third of the energy of natural gas per unit volume. That means the U.S. will need to make and store much more of it for heating, transportation, power generation and industrial uses.

The research offers a framework for solving these issues, presenting an initial goal of replacing 10% of the nation’s natural gas supply with hydrogen as a reasonable first target. That move could reduce U.S. greenhouse gas emissions by 3.2%, based on 2019 emissions, and help meet the Department of Energy’s goal of enabling a low-carbon economy in the U.S. The analysis considers what it would take to scale up the use of hydrogen, including integrating hydrogen into the country’s natural gas system, which is probably the most robust in the world, said lead author Mark Shuster, associate director of energy at the Bureau of Economic Geology in the UT Jackson School of Geosciences.
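To see why the volumes involved are so large and where a figure like 3.2% comes from, here is a rough back-of-the-envelope sketch. The heating values and the assumed share of U.S. emissions attributable to natural gas below are illustrative round numbers, not figures from the study.

```python
# Rough back-of-the-envelope sketch; the heating values and the emission
# share below are illustrative assumptions, not numbers from the study.
LHV_H2_MJ_PER_M3 = 10.8   # hydrogen, lower heating value at standard conditions
LHV_NG_MJ_PER_M3 = 35.8   # pipeline natural gas (mostly methane)

# Hydrogen carries roughly a third of the energy per unit volume, so replacing
# natural gas on an energy basis takes about three times the gas volume.
volume_ratio = LHV_NG_MJ_PER_M3 / LHV_H2_MJ_PER_M3
print(f"~{volume_ratio:.1f} m^3 of H2 to replace 1 m^3 of natural gas")

# If natural gas combustion accounts for roughly a third of U.S. emissions
# (assumed here purely for illustration), displacing 10% of it with
# carbon-free hydrogen avoids on the order of 3% of total emissions.
ng_share_of_emissions = 0.32
blend_fraction = 0.10
print(f"Avoided emissions ~ {blend_fraction * ng_share_of_emissions:.1%} of the U.S. total")
```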

“We know how to move gas. We’re very experienced in it, particularly in the U.S., so it makes sense,” he said. “You have a whole suite of potential uses for the hydrogen, but it’s going to take some work, some research, and I think it’s going to take probably some targeted incentives.”

The paper, authored by scientists and economists at the bureau, was published in the Oil & Gas Journal. It came out as Secretary of Energy Jennifer M. Granholm announced the goal of reducing the cost of clean hydrogen from $5 a kilogram to $1 a kilogram in a decade.

Bureau Chief Economist Ning Lin, a study co-author, said that hydrogen projects will have to quickly become reality for the Department of Energy’s goal to be met.

“There is a lot of research being done, but not enough demonstration,” she said. “In order to achieve the goal of having hydrogen as a meaningful sector in our current energy system with competitive cost, we need to see material progress in scaling up to pilot test capacity and strong cost reduction evidence in the next five years.”

Source: https://news.utexas.edu/

Clean Water At Low Cost

Producing clean water at a lower cost could be on the horizon after researchers from The University of Texas at Austin (UT Austin) and Penn State solved a complex problem that had baffled scientists for decades. Desalination membranes remove salt and other chemicals from water, a process critical to the health of society, cleaning billions of gallons of water for agriculture, energy production and drinking. The idea seems simple — push salty water through and clean water comes out the other side — but it contains complex intricacies that scientists are still trying to understand.

The research team, in partnership with DuPont Water Solutions, solved an important aspect of this mystery, opening the door to reduce costs of clean water production. The researchers determined desalination membranes are inconsistent in density and mass distribution, which can hold back their performance. Uniform density at the nanoscale is the key to increasing how much clean water these membranes can create.

“Reverse osmosis membranes are widely used for cleaning water, but there’s still a lot we don’t know about them,” said Manish Kumar, an associate professor in the Department of Civil, Architectural and Environmental Engineering at UT Austin, who co-led the research. “We couldn’t really say how water moves through them, so all the improvements over the past 40 years have essentially been done in the dark.”

The paper documents a 30%-40% increase in the efficiency of the membranes tested, meaning they can clean more water while using significantly less energy. That could lead to increased access to clean water and lower water bills for individual homes and large users alike.

Reverse osmosis membranes work by applying pressure to the salty feed solution on one side; the dissolved salts and minerals stay behind while the water passes through. Although more efficient than non-membrane desalination processes, reverse osmosis still takes a large amount of energy, the researchers said, and improving the efficiency of the membranes could reduce that burden.
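For a sense of scale, the pressure an RO membrane must overcome can be estimated from the osmotic pressure of the feed. The sketch below uses the idealized van 't Hoff relation with illustrative salt concentrations; these are textbook approximations, not figures from the study.

```python
# Estimate the osmotic pressure an RO membrane must overcome, using the
# idealized van 't Hoff relation pi = i * c * R * T. The concentrations
# below are illustrative, not taken from the study.
R = 0.08314   # L·bar / (mol·K)
T = 298.0     # K, roughly room temperature
i = 2         # van 't Hoff factor for NaCl (dissociates into Na+ and Cl-)

for label, molarity in [("brackish water (~0.1 M NaCl)", 0.1),
                        ("seawater (~0.6 M NaCl)", 0.6)]:
    pi_bar = i * molarity * R * T
    print(f"{label}: ~{pi_bar:.0f} bar of osmotic pressure")
# Applied pressures in practice exceed these values, which is why even a
# 30%-40% gain in membrane efficiency translates into real energy savings.
```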

“Fresh water management is becoming a crucial challenge throughout the world,” said Enrique Gomez, a professor of chemical engineering at Penn State who co-led the research. “Shortages, droughts — with increasingly severe weather patterns, it is expected this problem will become even more significant. It’s critically important to have clean water availability, especially in low-resource areas.”

The findings have been published in Science.

Source: https://news.utexas.edu/

World’s Smallest Atom-Memory Unit Created

Faster, smaller, smarter and more energy-efficient chips for everything from consumer electronics to big data to brain-inspired computing could soon be on the way after engineers at The University of Texas at Austin created the smallest memory device yet. And in the process, they figured out the physics dynamic that unlocks dense memory storage capabilities for these tiny devices.

The research published recently in Nature Nanotechnology builds on a discovery from two years ago, when the researchers created what was then the thinnest memory storage device. In this new work, the researchers reduced the size even further, shrinking the cross-sectional area down to just a single square nanometer. Getting a handle on the physics that packs dense memory storage capability into these devices is what allowed the researchers to make them so much smaller. Defects, or holes in the material, provide the key to unlocking the high-density memory storage capability.
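To put a single square nanometer per cell into perspective, here is a quick, hypothetical calculation of the theoretical areal density such a cell size would imply, ignoring the interconnects, addressing circuitry and spacing that any real chip would need.

```python
# Theoretical ceiling on areal density for ~1 nm^2 memory cells; this is
# an illustrative upper bound that ignores wiring and addressing overhead.
cell_area_nm2 = 1.0
nm2_per_cm2 = 1e14                       # 1 cm = 1e7 nm, so 1 cm^2 = 1e14 nm^2

cells_per_cm2 = nm2_per_cm2 / cell_area_nm2
terabits_per_cm2 = cells_per_cm2 / 1e12  # assuming one bit stored per cell
print(f"~{terabits_per_cm2:.0f} terabits per square centimeter (theoretical ceiling)")
```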

“When a single additional metal atom goes into that nanoscale hole and fills it, it confers some of its conductivity into the material, and this leads to a change or memory effect,” said Deji Akinwande, professor in the Department of Electrical and Computer Engineering.

Though they used molybdenum disulfide – also known as MoS2 – as the primary nanomaterial in their study, the researchers think the discovery could apply to hundreds of related atomically thin materials.

The race to make smaller chips and components is all about power and convenience. With smaller processors, you can make more compact computers and phones. But shrinking down chips also decreases their energy demands and increases capacity, which means faster, smarter devices that take less power to operate.

“The results obtained in this work pave the way for developing future generation applications that are of interest to the Department of Defense, such as ultra-dense storage, neuromorphic computing systems, radio-frequency communication systems and more,” said Pani Varanasi, program manager for the U.S. Army Research Office, which funded the research.

The original device – dubbed “atomristor” by the research team – was at the time the thinnest memory storage device ever recorded, with a single atomic layer of thickness. But shrinking a memory device is not just about making it thinner but also building it with a smaller cross-sectional area. “The scientific holy grail for scaling is going down to a level where a single atom controls the memory function, and this is what we accomplished in the new study,” Akinwande said.

Source: https://news.utexas.edu/

New Powerful Quantum Computer

Honeywell, a company best known for making control systems for homes, businesses and planes, claims to have built the most powerful quantum computer ever. Other researchers are sceptical about its power, but for the company, it is a step towards integrating quantum computing into its everyday operations.

Honeywell measured its computer’s capabilities using a metric invented by IBM called quantum volume. It takes into account the number of quantum bits – or qubits – the computer has, their error rate, how long the system can spend calculating before the qubits stop working and a few other key properties.

“Measuring quantum volume involves running about 220 different algorithms on the computer,” says Tony Uttley, the president of Honeywell Quantum Solutions. Honeywell’s quantum computer has a quantum volume of 64, twice as high as the next highest value recorded, which was measured on an IBM quantum computer.
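Quantum volume is reported as a power of two, so the headline numbers can be unpacked directly. The sketch below only restates the figures quoted here (64 for Honeywell and, by implication, 32 for the previous IBM record) in terms of the size of the "square" random circuits a machine must run successfully under IBM's benchmark protocol.

```python
import math

# Quantum volume is quoted as 2**n, where n is the largest "square" random
# circuit (n qubits, n layers of gates) the machine can run while still
# producing the statistically expected ("heavy") outputs often enough.
def circuit_width_from_qv(quantum_volume: int) -> int:
    return int(math.log2(quantum_volume))

for machine, qv in [("Honeywell", 64), ("IBM (previous record)", 32)]:
    n = circuit_width_from_qv(qv)
    print(f"{machine}: quantum volume {qv} -> square circuits of width and depth {n}")
```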

Like other quantum computers, this one may eventually be useful for calculations that deal with huge amounts of data. “There are three classes of problems that we are focused on right now: optimization, machine learning, and chemistry and material science,” says Uttley. “We can do those problems shrunk down to a size that fits our quantum computer today and then, as we increase the quantum volume, we’ll be able to do those problems on bigger scales.” However, this quantum computer isn’t yet able to perform calculations that would give a classical computer trouble, a feat called quantum supremacy, which was first claimed by Google in October. “While it’s cool that the company that made my thermostat is now building quantum computers, claiming it’s the most powerful one isn’t really substantiated,” says Ciarán Gilligan-Lee at University College London.

Google’s Sycamore quantum computer used 53 qubits to achieve quantum supremacy, while Honeywell’s machine has only six qubits so far. “We know that anything less than around 50 or 60 qubits can be simulated on a classical computer relatively easily,” says Gilligan-Lee. “A six-qubit quantum computer can probably be simulated by your laptop, and a supercomputer could definitely do it.” Having the highest quantum volume may mean that Honeywell’s qubits are remarkably accurate and can calculate for a long time, but it doesn’t necessarily make it the most powerful quantum computer out there, he says.
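Gilligan-Lee's point about classical simulation follows from how the memory cost of simulating a quantum state grows. A brute-force state-vector simulation stores 2^n complex amplitudes for n qubits; the sketch below shows how quickly that gets out of hand (the byte sizes reflect standard double-precision complex numbers, not figures from the article).

```python
# Memory needed for a brute-force state-vector simulation of n qubits:
# 2**n complex amplitudes at 16 bytes each (complex128).
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

def human(nbytes: float) -> str:
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if nbytes < 1024:
            return f"{nbytes:.1f} {unit}"
        nbytes /= 1024
    return f"{nbytes:.1f} EiB"

for n in (6, 30, 53):
    print(f"{n:2d} qubits -> {human(state_vector_bytes(n))} of amplitudes")
# Six qubits fit in about a kilobyte, easily within reach of a laptop;
# 53 qubits (Sycamore's count) would need on the order of 128 PiB.
```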

Scott Aaronson at the University of Texas at Austin agrees. “Quantum volume is not the worst measure, but what I personally care about, much more than that or any other invented measure, is what you can actually do with the device that’s hard for a classical computer to simulate,” he says. “By the latter measure, the Honeywell device is not even close to the best out there.”

Source: https://www.newscientist.com/

Electric Cars Soon Less Expensive Than Petrol Vehicles

An international research team has pioneered and is about to patent a new filtration technique that could one day slash lithium extraction times and change the way the future is powered. The world-first study, published today in the journal Nature Materials, presents findings that demonstrate the way in which Metal-Organic Framework (MOF) channels can mimic the filtering function, or ‘ion selectivity’, of biological ion channels embedded within a cell membrane.

Inspired by the precise filtering capabilities of a living cell, the research team has developed a synthetic MOF-based ion channel membrane that is precisely tuned, in both size and chemistry, to filter lithium ions in an ultra-fast, one-directional and highly selective manner. This discovery, developed by researchers at Monash University, CSIRO, the University of Melbourne and the University of Texas at Austin, opens up the possibility of creating a revolutionary filtering technology that could substantially change the way in which lithium-from-brine extraction is undertaken. This technology is the subject of a worldwide patent application filed in 2019. Energy Exploration Technologies, Inc. (EnergyX) has since executed a worldwide exclusive licence to commercialise the technology.

“Based on this new research, we could one day have the capability to produce simple filters that will take hours to extract lithium from brine, rather than several months to years,” said Professor Huanting Wang, co-lead research author and Professor of Chemical Engineering at Monash University. “Preliminary studies have shown that this technology has a lithium recovery rate of approximately 90 percent – a substantial improvement on the 30 percent recovery rate achieved through the current solar evaporation process.”

Professor Benny Freeman from the McKetta Department of Chemical Engineering at The University of Texas at Austin commented: “Thanks to the international, interdisciplinary and collaborative team involved in this research, we are discovering new routes to very selective separation membranes. We are both enthusiastic and hopeful that the strategy outlined in this paper will provide a clear roadmap for resource recovery and low energy water purification of many different molecular species.”

Associate Professor (Jefferson) Zhe Liu from The University of Melbourne explained: “The working mechanism of the new MOF-based filtration membrane is particularly interesting, and is a delicate competition between ion partial dehydration and ion affinitive interaction with the functional groups distributed along the MOF nanochannels. There is significant potential in designing our MOF-based membrane systems for different types of filtration applications, including for use in lithium-from-brine extraction.”

Source: https://www.monash.edu/

Artificial Intelligence Revolutionizes Farming

Researchers at MIT have used AI to improve the flavor of basil. It’s part of a trend that is seeing artificial intelligence revolutionize farming.

What makes basil so good? In some cases, it’s AI. Machine learning has been used to create basil plants that are extra-delicious. While we sadly cannot report firsthand on the herb’s taste, the effort reflects a broader trend that involves using data science and machine learning to improve agriculture.

The researchers behind the AI-optimized basil used machine learning to determine the growing conditions that would maximize the concentration of the volatile compounds responsible for basil’s flavor. The basil was grown in hydroponic units within modified shipping containers in Middleton, Massachusetts. Temperature, light, humidity, and other environmental factors inside the containers could be controlled automatically. The researchers tested the taste of the plants by looking for certain compounds using gas chromatography and mass spectrometry. And they fed the resulting data into machine-learning algorithms developed at MIT and a company called Cognizant.
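As a rough illustration of the general approach (not the MIT/Cognizant pipeline itself, which is not detailed here), one common pattern is to fit a surrogate model on past grow-cycle data and then search it for promising growing conditions. All data, variable ranges and the model choice below are hypothetical placeholders.

```python
# A minimal surrogate-model optimization sketch in the spirit of the study.
# The data, variable ranges and model choice are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical past experiments: hours of light per day and temperature (°C)
# versus the measured concentration of a flavor volatile (arbitrary units).
X = rng.uniform([12.0, 18.0], [24.0, 30.0], size=(60, 2))
y = 0.5 * X[:, 0] - 0.1 * (X[:, 1] - 24.0) ** 2 + rng.normal(0.0, 0.5, 60)

# Fit a surrogate that maps growing conditions to predicted flavor.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Search a grid of candidate conditions and pick the best predicted recipe.
light = np.linspace(12.0, 24.0, 25)
temp = np.linspace(18.0, 30.0, 25)
grid = np.array([[l, t] for l in light for t in temp])
best = grid[np.argmax(surrogate.predict(grid))]
print(f"Predicted best recipe: {best[0]:.1f} h of light/day at {best[1]:.1f} °C")
```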

The research showed, counterintuitively, that exposing plants to light 24 hours a day generated the best taste. The research group plans to study how the technology might improve the disease-fighting capabilities of plants as well as how different flora may respond to the effects of climate change.

“We’re really interested in building networked tools that can take a plant’s experience, its phenotype, the set of stresses it encounters, and its genetics, and digitize that to allow us to understand the plant-environment interaction,” said Caleb Harper, head of the MIT Media Lab’s OpenAg group, in a press release. His lab worked with colleagues from the University of Texas at Austin on the paper.

The idea of using machine learning to optimize plant yield and properties is rapidly taking off in agriculture. Last year, Wageningen University in the Netherlands organized an “Autonomous Greenhouse” contest, in which different teams competed to develop algorithms that increased the yield of cucumber plants while minimizing the resources required. They worked with greenhouses where a variety of factors were controlled by computer systems.

The study has appeared in the journal PLOS ONE.

Source: https://www.technologyreview.com/