Tag Archives: robot
A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Getting a drug directly to its target site could reduce side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.
The study, published in the journal Micromachines, is the first demonstration of a microrobot tumbling through a biological system in vivo. Since it is too small to carry a battery, the microrobot is powered and wirelessly controlled from the outside by a magnetic field.
“When we apply a rotating external magnetic field to these robots, they rotate just like a car tire would to go over rough terrain,” said David Cappelleri, a Purdue associate professor of mechanical engineering. “The magnetic field also safely penetrates different types of mediums, which is important for using these robots in the human body.”
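The rotating-field drive can be sketched in a few lines. This is an illustrative model only — the field magnitude, rotation frequency, and robot length below are made-up values, not Purdue's parameters. A field vector rotating in a vertical plane makes a magnetized robot tumble end over end, and rolling without slip bounds how far each rotation carries it:

```python
import math

def rotating_field(t, b0=10e-3, freq=2.0):
    """Field vector (tesla) rotating in the x-z plane at `freq` Hz.

    A microrobot with a fixed magnetic dipole tends to align with the
    applied field, so a field rotating in a vertical plane makes it
    tumble end over end, like a tire rolling over rough terrain.
    """
    angle = 2.0 * math.pi * freq * t
    return (b0 * math.cos(angle), 0.0, b0 * math.sin(angle))

def ideal_tumble_speed(length_m, freq):
    """Upper-bound translation speed for tumbling without slip:
    each half-rotation advances the robot by one body length."""
    return 2.0 * length_m * freq
```

For example, a hypothetical 800-micron robot tumbling at 2 Hz could cover at most a few millimetres per second; in the colon, slip against fluids and tissue would reduce that considerably.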
The researchers chose the colon for in vivo experiments because it has an easy point of entry – and it’s very messy. “Moving a robot around the colon is like using the people-walker at an airport to get to a terminal faster. Not only is the floor moving, but also the people around you,” said Luis Solorio, an assistant professor in Purdue’s Weldon School of Biomedical Engineering. “In the colon, you have all these fluids and materials that are following along the path, but the robot is moving in the opposite direction. It’s just not an easy voyage.”
But this magnetic microrobot can successfully tumble throughout the colon despite these rough conditions, the researchers’ experiments showed. The team conducted the in vivo experiments in the colons of live mice under anesthesia, inserting the microrobot in a saline solution through the rectum. They used ultrasound equipment to observe in real time how well the microrobot moved around.
In an effort to make robots more effective and versatile teammates for Soldiers in combat, Army researchers are on a mission to understand the value of the molecular living functionality of muscle, and the fundamental mechanics that would need to be replicated in order to artificially achieve the capabilities arising from the proteins responsible for muscle contraction.
Bionanomotors, like myosins that move along actin networks, are responsible for most methods of motion in all life forms. Thus, the development of artificial nanomotors could be game-changing in the field of robotics research.
Researchers from the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory (CCDC ARL) have been looking to identify a design that would allow an artificial nanomotor to take advantage of Brownian motion, the tendency of particles to move agitatedly simply because they are warm.
The CCDC ARL researchers believe understanding and developing these fundamental mechanics are a necessary foundational step toward making informed decisions on the viability of new directions in robotics involving the blending of synthetic biology, robotics, and dynamics and controls engineering.
“By controlling the stiffness of different geometrical features of a simple lever-arm design, we found that we could use Brownian motion to make the nanomotor more capable of reaching desirable positions for creating linear motion,” said Dean Culver, a researcher in CCDC ARL’s Vehicle Technology Directorate. “This nano-scale feature translates to more energetically efficient actuation at a macro scale, meaning robots that can do more for the warfighter over a longer amount of time.”
“These widely accepted muscle contraction models are akin to a black-box understanding of a car engine,” Culver explained. “More gas, more power. It weighs this much and takes up this much space. Combustion is involved. But, you can’t design a car engine with that kind of surface-level information. You need to understand how the pistons work, and how finely injection needs to be tuned. That’s a component-level understanding of the engine. We dive into the component-level mechanics of the built-up protein system and show the design and control value of living functionality as well as a clearer understanding of design parameters that would be key to synthetically reproducing such living functionality.”
Culver stated that the capacity for Brownian motion to kick a tethered particle from a disadvantageous elastic position to an advantageous one, in terms of energy production for a molecular motor, has been illustrated by ARL at a component level, a crucial step in the design of artificial nanomotors that offer the same performance capabilities as biological ones.
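The component-level idea — a thermal kick carrying a tethered particle from a disadvantageous elastic position to an advantageous one — can be illustrated with a toy overdamped Langevin simulation. This is not ARL's model: the double-well potential, barrier height, and temperature below are arbitrary stand-ins chosen for illustration.

```python
import math
import random

def brownian_kick(barrier=2.0, kT=1.0, dt=1e-3, steps=200_000, seed=0):
    """Overdamped Langevin dynamics in the double-well potential
    U(x) = barrier * (x**2 - 1)**2. The wells at x = -1 and x = +1
    stand in for the 'disadvantageous' and 'advantageous' elastic
    positions of a tethered particle. Thermal (Brownian) forcing
    alone can kick it over the barrier; returns True on a crossing.
    """
    rng = random.Random(seed)
    x = -1.0                              # start in the bad well
    kick = math.sqrt(2.0 * kT * dt)       # fluctuation-dissipation scaling
    for _ in range(steps):
        force = -4.0 * barrier * x * (x * x - 1.0)   # -dU/dx
        x += force * dt + kick * rng.gauss(0.0, 1.0)
        if x >= 1.0:
            return True
    return False
```

With kT = 0 the particle sits at the bottom of its well forever; any finite temperature gives it a nonzero chance of escaping — the "potential escape via Brownian motion" the researchers describe.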
“This research adds a key piece of the puzzle for fast, versatile robots that can perform autonomous tactical maneuver and reconnaissance functions,” Culver said. “These models will be integral to the design of distributed actuators that are silent, low thermal signature and efficient – features that will make these robots more impactful in the field.”
Culver noted that they are silent because the muscles don’t make a lot of noise when they actuate, especially compared to motors or servos, cold because the amount of heat generation in a muscle is far less than a comparable motor, and efficient because of the advantages of the distributed chemical energy model and potential escape via Brownian motion.
According to Culver, the breadth of applications for actuators inspired by the biomolecular machines in animal muscles is still unknown, but many of the existing application spaces have clear Army applications such as bio-inspired robotics, nanomachines and energy harvesting.
The Journal of Biomechanical Engineering recently featured their research.
Fieldwork Robotics, a University of Plymouth spin-off company, is developing an autonomous harvesting robot platform. A number of flexible robot arms attached to the platform will be able to pick raspberries, tomatoes, and other crops without crushing them or destroying the plant.
Fieldwork Robotics has completed initial field trials of its robot raspberry harvesting system. The tests took place at a West Sussex farm owned by Fieldwork’s industry partner, leading UK soft-fruit grower Hall Hunter Partnership, which supplies Marks & Spencer, Tesco and Waitrose. Data from the trials will be used to refine and improve the prototype system before further field trials are held later this year. If they are successful, manufacturing of a commercial system is expected to begin in 2020.
Fieldwork Robotics was incorporated to develop and commercialise the work of Dr Martin Stoelen, Lecturer in Robotics at the University’s School of Computing, Electronics and Mathematics.
“Starting the field testing at Hall Hunter Partnership is a major milestone for us, and will give us invaluable feedback to keep developing the system towards commercialisation, as part of our Innovate UK funding. I am very proud of the achievements of the team, at Fieldwork Robotics Ltd and across my different research projects on robotic harvesting here at the University of Plymouth,” says Dr Martin Stoelen.
Farmers around the world are increasingly interested in robot technology to address the long-term structural decline in labour. Fieldwork is focusing initially on raspberries because they are hard to pick, are more delicate and easily damaged than other soft fruits, and grow on bushes with complex foliage and berry distribution. Once the system is proved to work with raspberries, it can be adapted readily for other soft fruits and vegetables, with the same researchers also developing proof-of-concept robots for other crops following interest from leading agribusinesses.
Wearing a sensor-packed glove while handling a variety of objects, MIT researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.
The researchers developed a low-cost knitted glove, called “scalable tactile glove” (STAG), equipped with about 550 tiny sensors across nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes the signals to “learn” a dataset of pressure-signal patterns related to specific objects. Then, the system uses that dataset to classify the objects and predict their weights by feel alone, with no visual input needed.
In a paper published today in Nature, the researchers describe a dataset they compiled using STAG for 26 common objects — including a soda can, scissors, tennis ball, spoon, pen, and mug. Using the dataset, the system predicted the objects’ identities with up to 76 percent accuracy. The system can also predict the correct weights of most objects within about 60 grams.
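To make the pipeline concrete, here is a heavily simplified stand-in for the paper's classifier. The signatures, noise level, and nearest-centroid rule below are invented for illustration; the actual system trains a neural network on recorded pressure frames.

```python
import random

N_SENSORS = 550   # roughly the sensor count reported for STAG

def make_frame(signature, rng, noise=0.05):
    """Fake pressure frame: an object's mean pattern plus sensor noise,
    clipped at zero because pressure sensors cannot read negative."""
    return [max(0.0, s + rng.gauss(0.0, noise)) for s in signature]

def classify(frame, centroids):
    """Label a frame with the object whose mean pressure pattern is
    nearest in squared Euclidean distance."""
    def dist(name):
        return sum((a - b) ** 2 for a, b in zip(frame, centroids[name]))
    return min(centroids, key=dist)
```

In this toy setup a "mug" might press on palm-side sensors while a "pen" presses near the fingertips; with patterns that distinct, even nearest-centroid matching identifies the object reliably, which is why the real work lies in the 26-object, many-grasp setting the paper tackles.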
Similar sensor-based gloves used today run thousands of dollars and often contain only around 50 sensors that capture less information. Even though STAG produces very high-resolution data, it’s made from commercially available materials totaling around $10.
The tactile sensing system could be used in combination with traditional computer vision and image-based datasets to give robots a more human-like understanding of interacting with objects.
“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback,” says Subramanian Sundaram PhD ’18, a former graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”
The researchers also used the dataset to measure the cooperation between regions of the hand during object interactions. For example, when someone uses the middle joint of their index finger, they rarely use their thumb. But the tips of the index and middle fingers always correspond to thumb usage. “We quantifiably show, for the first time, that, if I’m using one part of my hand, how likely I am to use another part of my hand,” he says.
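That co-usage statistic is easy to state precisely: it is the conditional probability that region j carries pressure given that region i does, estimated over many frames. A minimal sketch (the 0.5 activation threshold and the per-region frame layout are assumptions, not the paper's choices):

```python
def co_usage(frames, i, j, threshold=0.5):
    """Estimate P(region j active | region i active) from a list of
    frames, each a list of pressure readings, one per hand region."""
    both = active_i = 0
    for frame in frames:
        if frame[i] > threshold:
            active_i += 1
            if frame[j] > threshold:
                both += 1
    return both / active_i if active_i else 0.0
```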
The New York-based ‘multi-planetary’ design agency AI SpaceFactory gives us a closer look at what life on Mars might actually be like, having taken first place in the finale of NASA‘s 3D-Printed Habitat Challenge. In the final phase, structures were built head to head over a duration of 30 hours spread across three days. The winning 15-foot-tall prototype, called ‘MARSHA’, prevailed due to its level of autonomy and material performance, earning the team a prize of $500,000.
In addition to being built with nearly no human assistance, MARSHA earned AI SpaceFactory top marks for its innovative biopolymer basalt composite – a biodegradable and recyclable material derived from natural materials found on Mars. After withstanding NASA’s pressure, smoke, and impact testing, this material was found to be stronger and more durable than its concrete competitors.
‘It’s light, and it’s strong, like an airplane. That’s going to be very important for these types of habitats,’ comments Lex Akers, dean of the Caterpillar College of Engineering and Technology at Bradley University.
AI SpaceFactory constructed its Mars prototype entirely in situ and nearly autonomously, lifting an industrial robot 13 feet into the air on a forklift to 3D print the vertical, egg-shaped habitat. After spending two years developing construction technologies for Mars, AI SpaceFactory plans to bring its space-driven technologies back to Earth this year. Demonstrating the sustainable nature of its biopolymer composite, the company will recycle the materials from MARSHA and reuse them to 3D print TERA – the first-ever space-tech eco habitat on Earth.
‘We developed these technologies for space, but they have the potential to transform the way we build on Earth,’ said David Malott, CEO and founder of AI SpaceFactory. ‘By using natural, biodegradable materials grown from crops, we could eliminate the building industry’s massive waste of unrecyclable concrete and restore our planet.’
Computers and artificial intelligence continue to usher in major changes in the way people shop. It is relatively easy to train a robot’s brain to create a shopping list, but what about ensuring that the robotic shopper can easily tell the difference between the thousands of products in the store?
Purdue University researchers and experts in brain-inspired computing think part of the answer may be found in magnets. The researchers have developed a process to use magnetics with brain-like networks to program and teach devices such as personal robots, self-driving cars and drones to better generalize about different objects.
“Our stochastic neural networks try to mimic certain activities of the human brain and compute through a connection of neurons and synapses,” said Kaushik Roy, Purdue’s Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. “This allows the computer brain to not only store information but also to generalize well about objects and then make inferences to perform better at distinguishing between objects.”
The stochastic switching behavior is representative of the sigmoid switching behavior of a neuron, and such magnetic tunnel junctions can also be used to store synaptic weights. Roy presented the technology during the annual German Physical Sciences Conference earlier this month in Germany. The work also appeared in Frontiers in Neuroscience.
The switching dynamics of a nano-magnet are similar to the electrical dynamics of neurons. Magnetic tunnel junction devices show switching behavior, which is stochastic in nature. The Purdue group proposed a new stochastic training algorithm for synapses using spike timing dependent plasticity (STDP), termed Stochastic-STDP, which has been experimentally observed in the rat’s hippocampus. The inherent stochastic behavior of the magnet was used to switch the magnetization states stochastically based on the proposed algorithm for learning different object representations. “The big advantage with the magnet technology we have developed is that it is very energy-efficient,” said Roy, who leads Purdue’s Center for Brain-inspired Computing Enabling Autonomous Intelligence. “We have created a simpler network that represents the neurons and synapses while compressing the amount of memory and energy needed to perform functions similar to brain computations.”
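The flavor of the scheme can be sketched as follows. This toy version is not the published Stochastic-STDP rule — the sigmoid shape, time constant, and binary-weight update are illustrative choices — but it captures the two ingredients described above: a switching probability shaped by pre/post spike timing, and a flip that is left to chance, mirroring the magnet's inherently stochastic switching.

```python
import math
import random

def flip_probability(delta_t_ms, tau=20.0):
    """Sigmoid-shaped switching probability for the magnetic tunnel
    junction: tightly timed pre/post spike pairs (small |delta_t|)
    make a state flip very likely; widely spaced pairs make it rare."""
    return 2.0 / (1.0 + math.exp(abs(delta_t_ms) / tau))

def update_synapse(state, delta_t_ms, rng):
    """Binary synapse stored in an MTJ: flip stochastically, toward
    the potentiated state for causal pairs (pre before post,
    delta_t > 0) and toward the depressed state otherwise."""
    if rng.random() < flip_probability(delta_t_ms):
        return 1 if delta_t_ms > 0 else 0
    return state   # the magnet did not switch this time
```

Because each synapse is a single binary device updated probabilistically, no analog weight storage or dense multiply-accumulate hardware is needed, which is one route to the energy savings Roy describes.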
A British arts engineering company says it has created the world’s first AI robot capable of drawing people who pose for it. The humanoid, called Ai-Da, can sketch subjects using a microchip in her eye and a pencil in her robotic hand – coordinated by AI processes and algorithms. Ai-Da‘s ability as a life-like robot to draw and paint ultra-realistic portraits from sight has never been achieved before, according to the designers in Cornwall. She is the brainchild of art impresario and gallerist Aidan Meller.
Named after Ada Lovelace, the first female computer programmer in the world, Ai-Da the robot has been designed and built by Cornish robotics company Engineered Arts, which makes robots for communication and entertainment.
In April 2018, Engineered Arts created an ultra-realistic robot to promote the Westworld TV show.
“Pioneering a new AI art movement, we are excited to present Ai-Da, the first professional humanoid artist, who creates her own art, as well as being a performance artist,” explains Aidan Meller. “As an AI robot, her artwork uses AI processes and algorithms. The work engages us to think about AI and technological uses and abuses in the world today.”
Professors and post-PhD students at Oxford University and Goldsmiths are providing Ai-Da with the programming and creative design for her artwork, while students at Leeds University are custom-designing and programming a bionic arm to create it.
Ai-Da has a “RoboThespian” body, featuring an expressive range of movements, and she has the ability to talk and respond to questions. The robot also has a “Mesmer” head, featuring realistic silicone skin, 3D-printed teeth and gums, integrated eye cameras, as well as hair.
China’s state news agency Xinhua this week introduced the newest members of its newsroom: AI anchors who will report “tirelessly” all day every day, from anywhere in the country. Chinese viewers were greeted with a digital version of a regular Xinhua news anchor named Qiu Hao. The anchor, wearing a red tie and pin-striped suit, nods his head in emphasis, blinking and raising his eyebrows slightly.
“Not only can I accompany you 24 hours a day, 365 days a year. I can be endlessly copied and present at different scenes to bring you the news,” he says. Xinhua also presented an English-speaking AI, based on another presenter, who adds: “The development of the media industry calls for continuous innovation and deep integration with the international advanced technologies … I look forward to bringing you brand new news experiences.”
Developed by Xinhua and the Chinese search engine Sogou, the anchors were created through machine learning to simulate the voice, facial movements, and gestures of real-life broadcasters, presenting “a lifelike image instead of a cold robot,” according to Xinhua.
If your robot doesn’t weigh anything, you don’t have to worry about it falling over. Researchers from the University of Tokyo have developed a quadrotor with legs called Aerial-Biped. Designed primarily for entertainment, Aerial-Biped enables “a richer physical expression” by automatically generating walking gaits in sync with its quadrotor body.
Until someone invents a robot that can moonwalk, you can model a gait that appears normal by simply making sure that the velocity of a foot is zero as long as it’s in contact with the ground. The Aerial-Biped robot learns how to do this through reinforcement learning in a physics simulator, and the policy transfers to the robot well enough that the legs can appear to walk as the quadrotor moves.
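That zero-velocity constraint fully determines a simple gait generator. The sketch below is not the paper's learned policy — it is a hand-written one-foot trajectory under assumed speed, period, and duty-factor values — but it satisfies the same rule: the foot's world-frame position is constant (zero velocity) whenever it is in stance.

```python
def foot_x(t, v=0.2, period=1.0, duty=0.5):
    """World-frame x position of one foot under a body translating
    at speed v (m/s).

    Stance (first `duty` fraction of each cycle): the foot is pinned
    at its foothold, so its world velocity is exactly zero.
    Swing: it glides linearly to the next foothold, one stride ahead.
    """
    n, phase = divmod(t / period, 1.0)
    hold = v * period * n                 # foothold for this cycle
    if phase < duty:                      # stance: zero foot velocity
        return hold
    s = (phase - duty) / (1.0 - duty)     # swing progress in [0, 1)
    return hold + s * v * period          # travel one stride forward
```

A learned policy like Aerial-Biped's replaces this hand-written interpolation with joint commands discovered in simulation, but it is rewarded for producing exactly this kind of trajectory.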
The lead author, Azumi Maekawa from the University of Tokyo, explained: “We were inspired by bipedal robots that use invisible force to get stability, such as Magdan, created by Tomotaka Takahashi (an electromagnet on the bottom of its feet lets it walk on a metal plate), and BALLU (which uses the buoyancy of a helium-filled balloon). The foot trajectory generation method is based on the assumption that one of the key features of walking (or at least the appearance of walking) is that the velocity of the foot in contact with the ground is zero. The goal is to develop a robot that has the ability to display the appearance of bipedal walking with dynamic mobility, and to provide a new visual experience.”