It scampers across the floor, raising and lowering its six little legs as its eyes inspect the world around it. Two thin antennae project from its head; a gold-coloured pattern runs along its black body. Current flows through two tiny wires as if through an open nervous system. This small, artificial ant looks intelligent. You almost expect it to smile.
The bionic insect is controlled by algorithms. Stereo cameras are concealed in its head, sensors in its belly, radio antennae in its interior. Piezoceramic bending transducers (more on those later) move its legs; three-dimensional conductors are attached to its body. All this allows the ant to see, walk, pull and grab. It can call for help and communicate with others of its kind. The BionicAnt functions autonomously. It can make decisions and engage in co-operative behaviour. And thanks to the latest technologies and its diminutive size (just 14 centimetres long), this tiny robot, which imitates its natural counterpart, lends itself to deployment in the tightest of spaces.
Created by the German company Festo, the BionicAnt is a particularly vivid illustration of the revolutionary times in which we live. Never before have so many different fields of research intersected, wide-ranging innovations complemented one another so effectively, or globally networked companies been able to bring their ideas to fruition so quickly. The astonishing result is exponential growth in the rate of technologization, which has an established hold in the field of mobility and will continue to shape its future.
The keywords for this new era? Robotics, sensor technology, automation. Connectivity, 3-D printing, lightweight construction. Kinematics, adaptivity, miniaturization, integration of multiple functions in a tiny component. Then, add the power of algorithms. Artificial intelligence. Self-learning systems.
Confused? Many of us are. The artificial ant is a project that provides concrete illustrations of these concepts and can help us understand them. It was developed by the Bionic Learning Network, a research partnership under the direction of Festo, a company based in Esslingen, Germany. With its 18,800 employees, the company is one of the world’s leading specialists in automation technology and a long-standing Daimler AG supplier. In its bionics team, Festo has a very special think tank where engineers and designers, biologists and software specialists develop the “concepts of the future.” The team draws inspiration from a model that is millions of years old: nature itself.
Sebastian Schrof, an industrial designer who specializes in robotics and helped develop the ant for Festo, takes the bionic insect and puts it back into its plastic case. It is one of 12 artificial ants that demonstrate what they can do at trade exhibitions and technology shows all over the world. And the spectators are usually wide-eyed in amazement. Not only do the BionicAnts copy the delicately intricate anatomy of real ants, but algorithms allow them to imitate ants’ cognitive functions.
Once on its legs, each ant first makes a map of its surroundings and communicates it to the other ants. The robots soon “know” where they themselves and all of their companions are. If an ant wishes to move an object, it radios its colleagues, and the others quickly come crawling over to pitch in. Their knowledge has been stored: They are stronger in a swarm, and with joined forces they haul the object away – without the help of anyone sitting at a remote-control unit.
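The broadcast-and-recruit behaviour described above can be sketched in a few lines of code. This is a toy model, not Festo's actual software; the class, the shared position map and the recruitment rule are all illustrative assumptions.

```python
class Ant:
    """Toy swarm member: knows its own position and builds a map of the others."""
    def __init__(self, ant_id, position):
        self.ant_id = ant_id
        self.position = position          # (x, y) on a simple grid
        self.known_positions = {}         # map of the swarm, built from broadcasts

ants = [Ant(i, (i, 0)) for i in range(3)]

# 1. Each ant broadcasts its position; every member updates its internal map,
#    so the robots soon "know" where all of their companions are.
for sender in ants:
    for receiver in ants:
        receiver.known_positions[sender.ant_id] = sender.position

# 2. An ant that finds an object too heavy to move alone "radios" for help:
#    the nearest helpers are picked from the shared map.
def recruit(caller, swarm, helpers_needed=2):
    others = [a for a in swarm if a.ant_id != caller.ant_id]
    others.sort(key=lambda a: abs(a.position[0] - caller.position[0]))
    return others[:helpers_needed]

team = recruit(ants[0], ants)
print([a.ant_id for a in team])   # the two closest ants come to pitch in
```

The point of the sketch is that no central controller appears anywhere: each ant acts only on its own copy of the shared map, which is what lets the real robots cooperate without anyone at a remote-control unit.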
A brain consisting of numbers
Nadine Kärcher, a member of Festo’s bionics team for the last six years, develops software for the artificial creatures. She collaborated with IT experts at Germany’s University of Ulm to write, among other things, the algorithms needed to transform the ants into acting entities: complex equations based on the most minutely defined steps. Kärcher explains how we should imagine the process: “We teach the ant that, when this occurs, you do that. If your sensors detect something on the right, then you go left to avoid it. And if your battery is empty, you go to the charging station.” The algorithms themselves reach the processor as virtually endless strings of numbers.
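Kärcher's "when this occurs, you do that" rules can be sketched as a simple priority-ordered decision function. The function name, sensor fields and battery threshold below are illustrative assumptions, not Festo's code.

```python
def decide(sensors):
    """One rule-based decision step, paraphrasing Kärcher's examples:
    if the battery is empty, go to the charging station;
    if something is detected on the right, go left to avoid it."""
    if sensors["battery"] <= 0.1:      # battery (nearly) empty: highest priority
        return "go_to_charging_station"
    if sensors["obstacle_right"]:      # something detected on the right
        return "turn_left"
    return "walk_forward"              # default behaviour

print(decide({"battery": 0.05, "obstacle_right": False}))  # battery rule wins
print(decide({"battery": 0.80, "obstacle_right": True}))   # avoidance rule fires
```

Ordering the rules by priority is the design choice that matters here: an ant with an empty battery heads for the charger even if there is also an obstacle to avoid.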
The processor, a sort of brain that processes and distributes signals and controls the legs and grippers, is located in the ant’s posterior. The ant’s most mind-boggling ability is that it can simulate a collective “swarm intelligence.” Explains Kärcher: “Individual systems coordinate with one another. Collectively, they take on tasks that a single system would not be able to manage by itself.” To move the insect’s legs, Festo uses something known as piezo technology. If a voltage is applied to a special crystal, the crystal reacts mechanically: The surface changes its shape, stretching or contracting. Conversely, mechanical tension induces the piezo crystal to produce a voltage – an extremely effective reciprocal effect.
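The voltage-to-motion side of the piezo effect can be put into numbers. For a single piezo layer the thickness change is roughly the charge coefficient times the applied voltage; the coefficient below is a typical textbook figure for PZT ceramics, not a Festo specification.

```python
# Converse piezoelectric effect: an applied voltage strains the crystal.
# For one layer, delta_t = d33 * V, where d33 is the piezoelectric
# charge coefficient (assumed PZT value below).
d33 = 500e-12      # m/V, typical for PZT ceramics
voltage = 100.0    # V applied across the layer

delta_t = d33 * voltage
print(f"{delta_t * 1e9:.0f} nm")   # tens of nanometres per layer
```

The raw stroke is tiny, which is why the ant uses bending transducers: they lever these nanometre-scale deformations into visible leg motion.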
The ant also makes use of the “moulded interconnected device” (MID) method, which is used to attach visible three-dimensional conductors to the surfaces of the 3-D-printed components. The result is a contoured component that performs mechanical and electronic functions simultaneously.
Such new technologies inspire the imagination. At Festo, even an artificial kangaroo has been seen hopping around the office. The bionic engineers had observed something amazing in real kangaroos: Much like a rubber ball, these animals recapture energy with each landing, using it for the next hop. Kangaroos also shift their centres of gravity to jump in different parabolas. Elias Knubben, who has headed the bionics team at Festo since 2012, explains: “We took a close look at what was happening – the linear axes and impulses that have to be set in motion and braked. And we learned how to handle energy in an extremely economical way. With each hop, the kangaroo regains 80 percent of its energy. You only have to apply another 20 percent to arrive at full jumping power again. An ingenious principle.”
The bionic engineers at Festo have also emulated the tongue of a chameleon, a project that ultimately produced the FlexShapeGripper. Like a real chameleon’s tongue, this gripper extends over objects to snatch them with the help of an elastic silicone cap. They have also copied the tentacles of an octopus. The artificial octopus arm is flexible and is operated pneumatically. It wraps around objects, and suction cups allow it to create a vacuum and grip smooth surfaces, including panes of glass. Elephant trunks and fish fins have also served as blueprints for extremely capable gripping arms. These robots can pick up tomatoes, apples, and even raw eggs.
“With each new project, we learn an enormous amount from nature,” says Knubben, “and much of that has found its way into production.” Basic research is, after all, no end in itself. The aim is to use new materials, test the use of sensors in a new context, or to ask: What happens when information technology and biology meet? The ultimate goal is to identify and exploit the possibilities of the dawning age.
The future in motion
It is during the production process that robotics can most effectively streamline the automobility of tomorrow. The robots of the future could leave their cages and work alongside people. They are able to apply fine motor skills to their movements, and their sensors enable them to determine when they need to stop.
Prototypes such as the BionicCobot dispense with steel and electric motors entirely. Compressed air moves their joints, the axes in their elbows, lower arms and wrists. They can exert a strong grip or gently lift objects, firmly press something shut or even tap an employee on the shoulder – as if to say, perhaps, we still have to insert this screw or tighten that bolt.
At Daimler AG, such developments are being followed with keen interest. “We want intelligent robots that detect mistakes, that can follow along with what we are doing and react accordingly, and that can be operated as intuitively as a smartphone,” says Simon Klumpp, a process developer at Daimler AG in charge of assembly, robotics and automation.
With the aim of providing the customer with as personalized a product as possible, the various steps in the production process must be engineered with ever greater variability and individuality. “Collaboration between people and robots will assume greater importance in the future,” says Klumpp. “For that, however, sensors must be fused, and machines must become more sensitive: better able to see and to grasp, and ideally able to continue learning in the process.”
Development will proceed in the direction of smart systems that can provide people with ever greater assistance. And with the further development of artificial intelligence, even more extensive networking and a continuous flow of exchanged data, many of the innovations will end up being part of our everyday lives. The vision of automated driving, and a resulting reduction in accidents, is drawing ever closer thanks to sophisticated sensor technology. Intelligent parking searches (especially necessary in major cities) could likewise be organized with considerably more efficiency.
But nature is the measure of things in a higher sense as well. Elias Knubben of Festo talks of neuronal networks and swarm intelligence: “That would be a further technological leap.” His most recent bionic project shows quite vividly what is in store for us.
This morning, the team leader is standing in the glass entrance hall of the company headquarters – releasing flying butterflies. They flutter about nimbly, playfully, nearly poetically. There is something almost miraculous about the spectacle. It is like a feather-light example of the smart mobility of tomorrow. The eMotionButterfly is a bionic butterfly weighing a mere 27 grams and possessing complete command of the beating-wing principle. Its wings create thrust and lift at the same time. Its minimalistic body is the product of a 3-D printer, a wafer-thin film stretched over a carbon frame. Motors and electronics are so minimally present that they are barely detectable from a metre away. The bionic butterfly requires very little energy to stay airborne. Ten infrared cameras locate every single insect, capturing 160 images per second (billions of pixels) and tracking minute markers mounted on the butterflies. The butterflies do not collide, but execute masterful evasive manoeuvres, abruptly changing their direction in a swarm. And they finish by gently landing on the bionic engineer’s outstretched hand.
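The "billions of pixels per second" claim is easy to sanity-check from the other figures. The per-camera resolution below is an assumption; the article gives only the camera count and the frame rate.

```python
# Back-of-envelope check of the butterfly-tracking data rate.
cameras = 10                   # infrared cameras, per the article
frames_per_second = 160        # images per second, per the article
pixels_per_frame = 4_000_000   # assumed 4-megapixel sensors (not stated)

pixels_per_second = cameras * frames_per_second * pixels_per_frame
print(f"{pixels_per_second / 1e9:.1f} billion pixels per second")
```

Even with a modest assumed resolution, the system has to digest several billion pixels every second to keep the swarm from colliding, which is why the butterflies carry reflective markers rather than doing the vision work themselves.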