French company Carmat sells first artificial heart to Italian patient

The French prosthetics company Carmat announced on July 19 that it had sold its first artificial heart, which was implanted in an Italian patient.


Carmat stated that the procedure “was performed by the team headed by heart surgeon Dr Ciro Maiello at the Azienda Ospedaliera dei Colli hospital in Naples, one of the centres with the greatest experience in the field of artificial hearts in Italy.”
The artificial heart, marketed as the Aeson prosthetic heart, received certification for sale in the European Union in December 2020. The certification for the company, which was founded in 2008, was based on the PIVOTAL study, which began in 2016 and is still ongoing.
Carmat said the sale of the Aeson heart signified “a major milestone that opens up a new chapter in the company’s development.” The company added that it aims to treat more patients in France and Germany by the end of the year. The artificial heart is intended for patients who need an immediate transplant but must wait for a donor organ. According to a 2019 study, 73 percent of patients survived with the device for six months or until a successful permanent transplant within that period.
While the treatment may prove lifesaving, the costs may be prohibitive. The surgical operation cost over 150,000 euros, which was paid by the regional health system. Italy’s national health system will not cover the procedure until it has been proven safe over several years.
Source: https://www.carmatsa.com/
AND
https://www.cnbctv18.com/

New Electronic Skin Reacts To Pain Like Human Skin

Researchers have developed electronic artificial skin that reacts to pain just like real skin, opening the way to better prosthetics, smarter robotics and non-invasive alternatives to skin grafts. The prototype device, developed by a team at RMIT University (Australia), can electronically replicate the way human skin senses pain. The device mimics the body’s near-instant feedback response and can react to painful sensations with the same lightning speed at which nerve signals travel to the brain.

Lead researcher Professor Madhu Bhaskaran said the pain-sensing prototype was a significant advance towards next-generation biomedical technologies and intelligent robotics.

“Skin is our body’s largest sensory organ, with complex features designed to send rapid-fire warning signals when anything hurts,” Bhaskaran said. “We’re sensing things all the time through the skin, but our pain response only kicks in at a certain point, like when we touch something too hot or too sharp. No electronic technologies have been able to realistically mimic that very human feeling of pain – until now. Our artificial skin reacts instantly when pressure, heat or cold reaches a painful threshold. It’s a critical step forward in the future development of the sophisticated feedback systems that we need to deliver truly smart prosthetics and intelligent robotics.”
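
As a rough illustration of the threshold behaviour described above, the short Python sketch below fires a “pain” signal only when pressure or temperature crosses a set limit; the threshold values and function are invented for the example and are not RMIT’s actual device parameters.

```python
# Illustrative sketch only: a toy threshold model of the pain-like response
# described above. The thresholds and readings are invented for the example
# and do not reflect RMIT's actual device parameters.

PRESSURE_PAIN_KPA = 300.0   # assumed pressure threshold (kPa)
HOT_PAIN_C = 45.0           # assumed heat threshold (deg C)
COLD_PAIN_C = 5.0           # assumed cold threshold (deg C)

def pain_signal(pressure_kpa: float, temperature_c: float) -> bool:
    """Return True only when a stimulus crosses a 'painful' threshold,
    mimicking how the artificial skin stays quiet under normal touch."""
    too_much_pressure = pressure_kpa >= PRESSURE_PAIN_KPA
    too_hot = temperature_c >= HOT_PAIN_C
    too_cold = temperature_c <= COLD_PAIN_C
    return too_much_pressure or too_hot or too_cold

# Normal touch at room temperature: no pain response.
print(pain_signal(pressure_kpa=50.0, temperature_c=22.0))   # False
# Touching something too hot: the response fires.
print(pain_signal(pressure_kpa=50.0, temperature_c=60.0))   # True
```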

As well as the pain-sensing prototype, the research team has also developed devices made with stretchable electronics that can sense and respond to changes in temperature and pressure. Bhaskaran, co-leader of the Functional Materials and Microsystems group at RMIT, said the three functional prototypes were designed to deliver key features of the skin’s sensing capability in electronic form.

With further development, the stretchable artificial skin could also be a future option for non-invasive skin grafts, where the traditional approach is not viable or not working. “We need further development to integrate this technology into biomedical applications but the fundamentals – biocompatibility, skin-like stretchability – are already there,” Bhaskaran added.

Source: https://www.rmit.edu.au/

Artificial Skin Recreates The Human Sense Of Pain

Prosthetic technology has taken huge strides in the last decade, but accurately simulating human-like sensation is a difficult task. New “electronic skin” technology developed at the Daegu Gyeongbuk Institute of Science and Technology (DGIST) in Korea could help replicate advanced “pain” sensations in prosthetics, and enable robots to understand tactile feedback, like the feeling of being pricked, or that of heat on skin.

Trying to recreate the human senses has been a driver of technologies throughout the 20th century, like TV or audio playback. Mimicry of tactile sensing has been a focus of several research groups in recent years, but advances have mainly improved the sense of pressure and strength in prosthetics. Human sensation, however, can detect much subtler cues. The DGIST researchers, led by Department of Information and Communication Engineering Professor Jae Eun Jang, brought together expertise from several different fields to begin the arduous task of replicating these more complex sensations in their electronic skin, working with colleagues in DGIST’s Robotics and Brain Sciences departments.

“We have developed a core base technology that can effectively detect pain, which is necessary for developing future-type tactile sensors. As an achievement of convergence research by experts in nano engineering, electronic engineering, robotics engineering, and brain sciences, it will be widely applied on electronic skin that feels various senses as well as new human-machine interactions,” Jang explained.

The DGIST team has created a more efficient sensor technology, able to detect pressure and heat simultaneously. They also developed a signal processing system that adjusts pain responses depending on pressure, contact area, and temperature.
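
To make the idea of such a signal processing step more concrete, here is a minimal Python sketch that grades a “pain” score from pressure, contact area and temperature; the weighting scheme and constants are assumptions for illustration, not the DGIST team’s actual algorithm.

```python
# Hypothetical sketch of the kind of signal processing described above:
# grading a "pain" output from pressure, contact area and temperature.
# The weighting scheme and constants are assumptions for illustration.

def pain_level(pressure_kpa: float, contact_area_mm2: float,
               temperature_c: float) -> float:
    """Combine the three cues into a 0..1 pain score."""
    # Sharper contacts (high pressure over a small area) feel more painful.
    sharpness = pressure_kpa / max(contact_area_mm2, 1.0)
    pressure_term = min(sharpness / 50.0, 1.0)          # assumed scaling
    # Heat only contributes above an assumed nociceptive threshold (~43 deg C).
    heat_term = min(max(temperature_c - 43.0, 0.0) / 10.0, 1.0)
    return min(pressure_term + heat_term, 1.0)

print(pain_level(pressure_kpa=20, contact_area_mm2=100, temperature_c=25))  # gentle touch, near 0
print(pain_level(pressure_kpa=200, contact_area_mm2=2, temperature_c=25))   # pin prick, 1.0
print(pain_level(pressure_kpa=20, contact_area_mm2=100, temperature_c=55))  # hot surface, 1.0
```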

Source: https://www.dgist.ac.kr/
AND
https://www.technologynetworks.com/

Artificial Muscle

Wearing a flower brooch that blooms before your eyes sounds like magic; KAIST researchers have made it real with robotic muscles. The team has developed an ultrathin artificial muscle for soft robotics. The advance, recently reported in the journal Science Robotics, was demonstrated with a robotic blooming flower brooch, dancing robotic butterflies and fluttering tree leaves on a kinetic art piece.

The robotic equivalent of a muscle that can move is called an actuator. The actuator expands, contracts or rotates like muscle fibers using a stimulus such as electricity. Engineers around the world are striving to develop more dynamic actuators that respond quickly, can bend without breaking, and are very durable. Soft, robotic muscles could have a wide variety of applications, from wearable electronics to advanced prosthetics.

The team from KAIST’s Creative Research Initiative Center for Functionally Antagonistic Nano-Engineering developed a very thin, responsive, flexible and durable artificial muscle. The actuator looks like a skinny strip of paper about an inch long. They used a particular type of material called MXene, a class of compounds whose layers are only a few atoms thick.

Their chosen MXene material (Ti3C2Tx) is made of thin layers of titanium and carbon compounds. It was not flexible by itself; sheets of the material would flake off the actuator when it was bent in a loop. That changed when the MXene was “ionically cross-linked” – connected through an ionic bond – to a synthetic polymer. The combination of materials made the actuator flexible while still maintaining strength and conductivity, which is critical for movements driven by electricity.

Their particular combination performed better than others previously reported. The actuator responded very quickly to low voltage and lasted for more than five hours of continuous movement. To prove the tiny robotic muscle works, the team incorporated the actuator into wearable art: an origami-inspired brooch that mimics how a narcissus flower unfolds its petals when a small amount of electricity is applied. They also designed robotic butterflies that move their wings up and down, and made the leaves of a tree sculpture flutter.

“Wearable robotics and kinetic art demonstrate how robotic muscles can have fun and beautiful applications,” said Il-Kwon Oh, lead paper author and professor of mechanical engineering. “It also shows the enormous potential for small, artificial muscles for a variety of uses, such as haptic feedback systems and active biomedical devices.”

Source: https://www.kaist.ac.kr/

Open Bionics Releases Affordable 3D Printed Bionic Arm

Back in January 2019, the UK company Open Bionics announced it had raised about £5 million from investors to continue developing not just simple 3D printed prosthetics but bionic devices. Last year, the company released its first medically approved 3D printed bionic arm. The prosthetic costs about £10,000 ($13,060), roughly a third of the cost of traditionally manufactured equivalents. By using 3D technologies, the company has reduced its costs and can offer customisable 3D printed bionic devices to clinics.

Until now the devices were exclusively available in the UK and France. However, the company has announced a new partnership with Hanger Clinic to bring its products to the US. One of its key products is the 3D printed Hero Arm, which offers multi-grip functionality and empowering aesthetics for below-elbow amputee adults and children aged 8 and above. The Hero Arm, as its name implies, can be personalised to resemble a superhero’s arm, with designs inspired by Frozen, Marvel Comics or even Star Wars. The prosthetic can perform a wide range of actions such as gripping, giving an OK sign, high-fiving, fist-bumping, or picking up a small object. The company stated, “Special sensors within the Hero Arm detect muscle movements, meaning you can effortlessly control your bionic hand with intuitive life-like precision. Also, haptic vibrations, beepers, buttons and lights provide you with intuitive notifications.”
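
To illustrate the general idea behind muscle-signal control of this kind, the following Python sketch maps two normalised muscle-sensor readings to simple hand commands; the channel names, threshold and actions are assumptions for the example and do not describe Open Bionics’ actual control scheme.

```python
# Illustrative sketch only: a simplified two-channel myoelectric control loop
# of the general kind the quote describes (muscle sensors driving the hand).
# Channel names, threshold and grip actions are assumptions for the example.

GRIP_THRESHOLD = 0.6   # assumed normalised muscle-activation level

def select_action(flexor_emg: float, extensor_emg: float) -> str:
    """Map two normalised muscle signals (0..1) to a hand command."""
    if flexor_emg > GRIP_THRESHOLD and flexor_emg > extensor_emg:
        return "close_grip"
    if extensor_emg > GRIP_THRESHOLD and extensor_emg > flexor_emg:
        return "open_hand"
    return "hold"   # below threshold: keep the current posture

print(select_action(0.8, 0.1))  # close_grip
print(select_action(0.2, 0.7))  # open_hand
print(select_action(0.3, 0.2))  # hold
```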

The first US recipients of the Hero Arm include 14-year-old Hanger Clinic patient Meredith Gross, a high school freshman who was born missing part of her lower left arm. A competitive golfer and volleyball player, she previously had to use sport-specific prostheses; for the first time, she is considering using the 3D printed device for everyday tasks as well. Her mom said, “The Hero Arm has opened up a whole new world for Meredith. She found success from the moment she put it on, and has been able to do things for the first time in her life. This device allows people like Meredith to own their differences with more confidence.”

Source: https://openbionics.com/
AND
https://www.3dnatives.com/

Sensor-packed Glove Coupled With AI

Wearing a sensor-packed glove while handling a variety of objects, MIT researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.

The researchers developed a low-cost knitted glove, called “scalable tactile glove” (STAG), equipped with about 550 tiny sensors across nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes the signals to “learn” a dataset of pressure-signal patterns related to specific objects. Then, the system uses that dataset to classify the objects and predict their weights by feel alone, with no visual input needed.

In a paper published in Nature, the researchers describe a dataset they compiled using STAG for 26 common objects – including a soda can, scissors, tennis ball, spoon, pen, and mug. Using the dataset, the system predicted the objects’ identities with up to 76 percent accuracy. The system can also predict the correct weights of most objects to within about 60 grams.
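
As a rough sketch of the kind of model described here, the Python (PyTorch) example below takes a glove pressure map and jointly predicts an object class and a weight estimate; the architecture, input size and layer dimensions are assumptions for illustration, not the published STAG model.

```python
# Hypothetical sketch: a small network that takes a glove pressure map and
# jointly predicts an object class (26 classes, matching the dataset above)
# and a weight estimate. The 32x32 input grid and layer sizes are assumed.

import torch
import torch.nn as nn

class TactileNet(nn.Module):
    def __init__(self, num_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_classes)  # which object
        self.weight_head = nn.Linear(32 * 4 * 4, 1)           # weight in grams

    def forward(self, pressure_map: torch.Tensor):
        x = self.features(pressure_map).flatten(1)
        return self.classifier(x), self.weight_head(x)

# One fake pressure frame from an assumed 32x32 sensor layout.
frame = torch.rand(1, 1, 32, 32)
logits, weight = TactileNet()(frame)
print(logits.shape, weight.shape)  # torch.Size([1, 26]) torch.Size([1, 1])
```

The two output heads simply mirror the two tasks mentioned in the article: classifying the object and estimating its weight from the same pressure-signal features.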

Similar sensor-based gloves used today run thousands of dollars and often contain only around 50 sensors that capture less information. Even though STAG produces very high-resolution data, it’s made from commercially available materials totaling around $10.

The tactile sensing system could be used in combination with traditional computer vision and image-based datasets to give robots a more human-like understanding of interacting with objects.

“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback,” says Subramanian Sundaram PhD ’18, a former graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”

The researchers also used the dataset to measure the cooperation between regions of the hand during object interactions. For example, when someone uses the middle joint of their index finger, they rarely use their thumb. But the tips of the index and middle fingers always correspond to thumb usage. “We quantifiably show, for the first time, that, if I’m using one part of my hand, how likely I am to use another part of my hand,” he says.
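
A hypothetical Python sketch of this kind of analysis is shown below: it estimates how often one hand region is active together with another from per-region pressure readings. The region names, activity threshold and method are illustrative assumptions, not the study’s actual procedure.

```python
# Hypothetical sketch of the co-use analysis mentioned above: for each pair of
# hand regions, estimate how often region B is active when region A is active.
# Region names, the threshold and the fake data are assumptions for the example.

import numpy as np

regions = ["thumb_tip", "index_tip", "index_middle_joint", "middle_tip"]
# Fake data: rows are grasp frames, columns are per-region mean pressures.
pressure = np.random.rand(1000, len(regions))
active = pressure > 0.5   # a region counts as "used" above an assumed threshold

# co_use[a, b] = P(region b active | region a active)
co_use = np.zeros((len(regions), len(regions)))
for a in range(len(regions)):
    frames_a = active[:, a]
    if frames_a.any():
        co_use[a] = active[frames_a].mean(axis=0)

print(np.round(co_use, 2))  # row a, column b: how often b is used when a is
```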

Source: http://news.mit.edu/