Tag Archives: Robotics

Six-Legged Robot Is Faster than Insects

Evolution arrives at remarkably effective designs, filtering out failures by trial and error. However, evolution in nature plays out over billions of years, a span of time not available to designers of robots. Biologically inspired robotics is usually about the designer figuring out the clever tricks that evolution has perfected and applying them to a robot, sometimes beating nature at her own game.

For instance, studies have shown that most six-legged insects move with a tripedal gait, swinging three legs at a time while the other three stay planted. On the other hand, EPFL researchers from Lausanne, Switzerland, have reported in Nature Communications that a bipedal gait, using only two legs at once, is a faster and more efficient way for a hexapod to move.

When moving, especially when moving fast, animals with legs tend to minimize the time their legs remain in contact with the ground. Therefore, fast moving mammals prioritize flight phases, in which their motion seems more like a sequence of jumps rather than fast walking. However, for hexapedal insects, whether they are moving slowly or fast, movement consists of keeping at least three legs in contact with the ground at all times.

Mathematically, the tripedal gait is less efficient than a gait involving two legs. This is simple to calculate, as a hexapod using three legs at a time gets two power strokes per gait cycle, whereas, if it used two legs at a time, it would instead get three. The EPFL researchers tested this theory on hexapedal robots. They conclusively proved that by using two legs at once instead of three, hexapedal robots could move 25% faster. Therefore, rather than use the natural tripedal gait of insects, a hexapedal walking robot, with a bipedal gait, could be more dynamic, although statically not so stable. That brought the investigators to an interesting question: why are insects using a slower gait, when they could be moving faster?
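The stroke-counting argument can be sketched in a few lines of Python (a toy calculation for illustration, not the EPFL team's model):

```python
def power_strokes_per_cycle(total_legs, legs_per_group):
    """Number of propulsive strokes in one full gait cycle,
    assuming the legs swing together in equal-sized groups."""
    if total_legs % legs_per_group != 0:
        raise ValueError("legs must divide into equal groups")
    return total_legs // legs_per_group

# Tripedal gait: two groups of three legs -> 2 power strokes per cycle
tripod = power_strokes_per_cycle(6, 3)
# Bipedal gait: three groups of two legs -> 3 power strokes per cycle
bipod = power_strokes_per_cycle(6, 2)
print(tripod, bipod)  # 2 3
```

The extra power stroke per cycle is why the bipedal gait comes out ahead on paper, even before measuring real robots.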

The researchers found that insects also needed to move on places that are not always upright and horizontal, such as walls and ceilings. Walking on walls and ceilings requires feet that stick or grab to surfaces—most flying insects have this capability. They concluded that for walking while clinging to surfaces, it is best to follow a tripedal gait, but when running on the ground, a bipedal gait is faster.

The researchers tested their theory further by negating the adhesive property of insects' feet, fitting flies with tiny polymer boots. The flies responded by shifting from a tripedal gait toward a bipedal one. Flies on a very slippery surface, by contrast, did not change their behavior, suggesting the tripedal gait stems from the structures that produce adhesion in the legs, or from the sensory feedback those structures generate. The experiment indicates that as long as their feet stick, insects cannot profit from a bipedal gait: with adhesive feet, they need the leverage of three planted legs to unstick the other three.

Such biorobotics helps us in two ways. On one hand, it explains why nature works the way it does, and on the other, it shows how we can make faster and better robots.

An Autonomous Robot Called Bat Bot or B2

Although detested and at the same time revered by people all over the world, bats are undoubtedly remarkable creatures when it comes to their ability to fly. While birds do perform the most nimble aerobatics, and most fishes swim superbly in water, bats possess the most refined powered flight mechanism, unmatched in the animal kingdom. Now a team of scientists has studied the way bats fly, and have built the first robot to mimic their flight mechanism. They have named the robot Bat Bot, or B2.

The scientists had a tough time trying to imitate the natural flight of a bat. Bats have flexible membranes on their wings, and use more than 40 active and passive joints with each flap. Moreover, their bones deform each time the bat beats its wings. The scientists found it very difficult to replicate the complete suite of biological tricks that bats use routinely.

In creating the Bat Bot, the scientists have achieved an engineering marvel. The Bat Bot weighs only about 94 grams, about as heavy as two golf balls. It has a carbon-fiber skeleton with a head that houses its on-board computer and sensors. Five micro-sized motors are strung along its backbone, and a silicone membrane is stretched over the entire skeletal structure. A trio of roboticists at Caltech, led by Soon-Jo Chung, designed the Bat Bot for autonomous flapping flight, and unveiled it in the journal Science Robotics. At present, Bat Bot can perform only four main components of a bat's wing movement: the shoulder motion, the elbow motion, the wrist bend, and the side-to-side tail swish.

According to Chung, his team had to give up the idea of simply mechanizing the flapping wings of a bat, joint by joint. They quickly understood that incorporating all forty joints into the design was impossible; it would only have resulted in a heavy robot incapable of any type of flight.

After a careful study of a bat's flight mechanism, including the biological studies documented by Dan Riskin of the Discovery Channel, the team tried to identify which of the 40 joints were absolutely vital for flight. Finally, they settled on a total of nine joints for the Bat Bot.

Although the Bat Bot is a sophisticated and advanced piece of machinery, it is still a very simple bat compared to the real animal. For instance, Bat Bot has no knuckles or joints in its carbon-fiber fingers, and it cannot actively twist its wrists the way real bats do naturally.

Chung’s team had to make additional simplifications as well. For instance, the hyper-thin silicone membrane of Bat Bot has uniform flexibility, whereas the wing membrane of an actual bat has variable stiffness in different places.

In spite of the above differences, Bat Bot does make elegant flights, almost indistinguishable from that of its biological cousin. While gliding through the air, Bat Bot has grace and fluidity, independently tucking and extending its wrists, shoulders, elbows, and legs.

Soft Robots Mimic Biological Movements

At Harvard University, researchers have developed a model for designing soft robots. Powered by a single pressure source, these robots can bend as a human index finger does and twist like a thumb.

For a long time, scientists followed a process of trial and error to design a soft robot that moves organically, twisting as a human wrist does or bending just like a finger. Now, at the Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), researchers have developed a method for automatically designing soft actuators based on the desired movement. They have published their findings in the Proceedings of the National Academy of Sciences.

To produce the biologically inspired motions, the researchers turned to mathematical modeling to optimize the design of the actuator. According to Katia Bertoldi, Associate Professor and coauthor of the paper, they no longer design the actuators empirically. The new method allows them to plug in a motion, and the model gives them the design of the actuator that will achieve that motion.

Although designing a robot that bends as a finger or knee does may seem simple, it is actually an incredibly complex process in practice. The complications stem from the fact that a single actuator cannot produce the complex motions necessary. According to the paper's first author, Fionnuala Connolly, a graduate student at SEAS, the design requires sequencing actuator segments, each performing a different motion, with only a single input actuating them all.

The team uses fiber-reinforced, fluid-powered actuators. Their method uses mathematical modeling for optimizing the design of the actuators, which perform a certain motion. With their method, the team was able to design soft robots that bend and twist just as human fingers and thumbs do.
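As an illustration of the kind of model involved, the classic rule of thumb for fiber-wound pneumatic actuators says that pressurization drives the fiber angle toward about 54.7 degrees, the angle that maximizes enclosed volume, so the winding angle determines whether the actuator contracts or extends. The sketch below encodes that rule of thumb; it is a simplified stand-in for illustration, not the actual model from the paper:

```python
import math

# Braid/fiber angle is measured from the actuator's long axis.
# Pressurization drives the angle toward atan(sqrt(2)) ~ 54.7 degrees,
# the angle at which the enclosed volume is maximized.
MAGIC_ANGLE_DEG = math.degrees(math.atan(math.sqrt(2)))  # ~54.74

def axial_response(braid_angle_deg):
    """Qualitative axial response of a McKibben-style fiber-wound
    actuator when pressurized (rule of thumb, not the paper's model)."""
    if braid_angle_deg < MAGIC_ANGLE_DEG:
        return "contract"  # angle grows toward 54.7, so length shrinks
    if braid_angle_deg > MAGIC_ANGLE_DEG:
        return "extend"    # angle falls toward 54.7, so length grows
    return "neutral"

print(axial_response(30.0))  # contract
print(axial_response(75.0))  # extend
```

The real design method inverts a far richer model, solving for fiber layouts and segment sequences that produce a whole desired motion rather than a single contract-or-extend choice.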

SEAS has developed an online, open-source resource, the Soft Robotic Toolkit, that provides the new methodology. It will assist educators, researchers, and budding innovators in designing, fabricating, modeling, characterizing, and controlling their own soft robots.

The robotics community has long been interested in soft fluidic actuators, which consist of elastomeric matrices embedding flexible materials such as cloth, paper, fiber, and particles. These actuators are lightweight, affordable, and easily customizable to a given application.

These multi-material fluidic actuators are interesting because the robotics community can rapidly fabricate them in a multi-step molding process. A simple control input, such as a pressurized fluid, achieves combinations of extension, contraction, twisting, and bending. New design concepts use soft materials and novel fabrication approaches to improve the performance of these actuators over existing designs.

For instance, motivating applications such as heart-assist devices and soft robotic gloves define the required motion and force profiles. It is possible to embed mechanical intelligence within these soft actuators to achieve such performance requirements with simple control inputs. The challenge lies in the nonlinear nature of the large bending motions that hyper-elastic materials produce, which makes their behavior difficult to characterize and predict.

The Raspberry Pi MeARM

Arms are a favorite with robotics enthusiasts, and the number of joints in an arm ensures this. For instance, an arm can rotate a full circle and bend almost to a right angle. Each finger on a hand can be manipulated independently, and each finger can have at least three joints. Therefore, an arm with even two fingers and an opposing thumb can pick up objects, given pressure sensing. A simple project such as an arm can become as complicated as one cares to make it.

The above reasons made the original MeARM kit a veritable success. It was a pocket-sized robot arm, and budding Raspberry Pi (RBPi) enthusiasts quickly latched on to it. The design was simple and open-source, needing only three kinds of parts: servomotors, screws, and laser-cut pieces. This simplicity spread the design around the world, making it massively successful. Although parents were skeptical of its complexity, children loved it. Its makers, the Bens, are now back with a new project, the MeARM Pi.

The new MeARM Pi, like its predecessor, is simple enough for children to build themselves. The RBPi gives the arm its hardware and processing power, making the whole project a pleasant, fun, and simple experience. In just thirty minutes, you can build the new MeARM, connect it to your RBPi, add the Wi-Fi, connect it to your network, and start programming it in your favorite language: JavaScript, Python, Snap, or Scratch. Now, isn't that a fun way to start learning to code?

The workings of the MeARM Pi are straightforward and simple. The GPIO pins on the RBPi drive the servos directly. The RBPi communicates directly with the joysticks using an I2C ADC. Even the on-board RGB LED gets its power directly from the GPIO pins, so playing around with colors is simplified. Although the regular 2 Amp RBPi power supply delivers all this power without any issues, you may consider using an upgraded power supply rated at 2.5 Amps, if you are planning to plug in some more devices.
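For a sense of what driving the servos directly involves, a standard hobby servo expects a pulse of roughly 1 to 2 ms repeating every 20 ms (50 Hz). The sketch below shows only the angle-to-pulse arithmetic; the actual pin assignments and library used by the MeARM Pi software are not reproduced here:

```python
def servo_pulse_ms(angle_deg, min_ms=1.0, max_ms=2.0, max_angle=180.0):
    """Map a servo angle (0..max_angle degrees) to the pulse width,
    in milliseconds, of a standard 50 Hz hobby-servo PWM signal."""
    if not 0.0 <= angle_deg <= max_angle:
        raise ValueError("angle out of range")
    return min_ms + (max_ms - min_ms) * angle_deg / max_angle

def duty_cycle_percent(angle_deg, period_ms=20.0):
    """Duty cycle to hand to a PWM driver for the requested angle."""
    return 100.0 * servo_pulse_ms(angle_deg) / period_ms

# Mid-position (90 degrees) is a 1.5 ms pulse, i.e. 7.5% duty at 50 Hz
print(servo_pulse_ms(90))      # 1.5
print(duty_cycle_percent(90))  # 7.5
```

On the Pi itself, a figure like this duty cycle would be fed to whatever PWM facility the software uses on the GPIO pin wired to each servo.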

The HAT with the kit has its own power supply, which will comfortably power both the arm and the RBPi. As the HAT follows the reference design for all RBPi HAT designs, the accompanying open-source Node.js app performs a few key tasks that include controlling the servos in the arm via the GPIO pins. It also reads the state of the joysticks via the ADC.

This great kit is just right for any budding programmer stepping into the world of digital electronics. The kit contains everything needed (except the RBPi): all plastic parts, Allen key screws and Allen key, four metal-gear servos, RBPi HAT with two on-board joysticks.

To improve quality, the kit comes with metal-gear servos rather than the usual plastic ones. Moreover, children's small fingers are not well suited to handling screwdrivers, which is why the kit uses Allen key screws instead; they are more reliable.

Depending on preference, you can go for either the blue color kit or the orange one. The programming languages are already available on the RBPi, so as soon as you have assembled the arm, it is ready to pick up things.

What Are Cobots?

When inventors Joseph Engelberger and George Devol were discussing science fiction novels in 1954, they hit upon the idea of industrial robots. It took them six years to give shape to their idea, and Unimate earned a secure place in the robotic hall of fame as the world's first industrial robot. In 1961, Unimate began working on the assembly lines of General Motors.

At first, people regarded Unimate with suspicion over safety issues. At the time, the only reference people had for robots was the laser-firing robot from “The Day the Earth Stood Still,” a thriller from the 1950s. Now, more than 50 years later, industrial robots are far less scary.

Traditionally, robots were built to work under restriction inside robotic work cells, with physical barriers for the safety of human workers. However, many modern robots work entirely outside any cage. On factory floors today, working safely alongside their human counterparts, you will find unfettered machines termed collaborative robots, or cobots. Nevertheless, no robot is entirely devoid of health and safety features.

Unlike in the past, today’s industrial robots or cobots are designed specifically to work safely around humans. In fact, now robots work hand-in-hand with humans on the same assembly tasks and it has been independently certified that this is safe. The two-armed collaborative robot from ABB Robotics, YuMi, contributed largely to this certification.

To prevent accidents with human workers, cobots rely on sensors installed on them. The sensors monitor the location of humans around them on the factory floor and react to human contact. Therefore, if a person comes too close, the machine simply shuts down automatically. Moreover, cobots operate with their strength, speed, and force limited, to avoid causing serious injury to humans if contact does occur.
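The logic behind slowing and stopping as people approach is often called speed and separation monitoring. The sketch below illustrates the idea with made-up thresholds; the numbers are not values from any standard or product:

```python
def allowed_speed(distance_m, stop_dist=0.5, full_speed_dist=2.0,
                  max_speed=1.0):
    """Speed limit (m/s) for a cobot given the nearest human's distance.
    Thresholds here are illustrative only."""
    if distance_m <= stop_dist:
        return 0.0            # protective stop: human too close
    if distance_m >= full_speed_dist:
        return max_speed      # nobody nearby, run at full speed
    # Scale linearly between the stop and full-speed distances
    frac = (distance_m - stop_dist) / (full_speed_dist - stop_dist)
    return max_speed * frac

print(allowed_speed(0.3))   # 0.0
print(allowed_speed(1.25))  # 0.5
```

A real cobot controller layers this kind of rule on top of force and torque limits, so that even unexpected contact stays below injury thresholds.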

Most cobots are simple enough to require practically no skill in programming them. Anyone, who can operate a smartphone, can program them to operate. In contrast, complex robots of about a decade ago needed a host of highly skilled technicians to program and monitor them while in operation.

Among the industries that are being transformed by such collaborative machinery, the most to benefit is the automotive industry. As such, this sector has always been at the forefront of industrial robotics. Automotive manufacturers have been using robots and robotic equipment since the 1960s, but a lot has changed since then. The competitive nature of the industry forces manufacturing lines to be highly efficient, flexible and more productive than ever before.

None of this means that advancements in robotics threaten human jobs on the production line. For instance, builders use a concrete mixer to help the bricklayer, not to replace him. In the same way, collaborative robots assist workers on the assembly line rather than replace them. According to some experts, production line workers will ultimately use collaborative robots as helpers, much as engineers use computers to further their own work and make their jobs easier.

What are Light-Emitting Capacitors good for?

HLECs, or Hyper-elastic Light-Emitting Capacitors, are good for making electroluminescent skin for robotics, and you can do a lot with them. That is according to Dr. Rob Shepherd of Cornell University and his team of graduate students, who have published a paper on the electroluminescent skin they recently developed.

The team was inspired to develop the electroluminescent skin by observing cephalopods such as the octopus. According to the team, their material can change its color, just as an octopus can, and can also change its size to fit into areas that more rigid structures cannot. For instance, the skin continues to emit light even when stretched to about six times its original size.

Layers of transparent hydrogel electrodes, separated by elastomer sheets acting as the dielectric, make up these HLECs. Panels of these capacitors, integrated into robotic systems and outfitted with sensors, are well suited to health-monitoring applications in wearables. The team at Cornell has fabricated one robotic system from three panels, and it is capable of crawling. With each panel consisting of six layers, the robot crawls along with worm-like movements, using two pneumatic actuators that alternately inflate and deflate. You can see the stretchable skin and its crawling action in the accompanying video.
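The alternating actuation that produces the worm-like crawl can be pictured as a simple repeating cycle. The generator below is only an illustration of that sequencing, not the Cornell team's controller:

```python
from itertools import islice

def crawl_cycle():
    """Endless alternating actuation pattern for a two-actuator
    crawler: inflate the front while the rear deflates, then swap."""
    while True:
        yield {"front": "inflate", "rear": "deflate"}
        yield {"front": "deflate", "rear": "inflate"}

# Take the first few steps of the gait
steps = list(islice(crawl_cycle(), 4))
print(steps[0])  # {'front': 'inflate', 'rear': 'deflate'}
```

Each pair of steps anchors one end of the body while the other end stretches forward, which is what gives the panel robot its inchworm motion.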

Although the team is in raptures over how well the HLEC panels function, their next step is to convert the material into practical devices with applications, that is, to find a reason to use it, as they say.

The team expects the development of uses for these new panels to lead to some innovative applications. Although the devices speculated about at present range primarily from health care to transportation, there is significant interest in future robotic applications as well, based on the desire to advance the way robots interact with humans.

For instance, the robot Atlas, from Boston Dynamics, looks formidable enough to crush you were you to give it a hug accidentally. Humans prefer soft and puffy robots, and in the future, robots may even be able to change color based on the mood of the person in front of them. People often develop a fear of robots after watching the T-800 in movies such as ‘The Terminator’. However, future robots such as Baymax should help change that thinking. According to Professor Shepherd, HLEC panels can be part of the breakthrough.

It is important to get the human-robot interactions right. Simple things such as the ability to change their color can let robots make emotional connections with humans. This could be in response to the tone of the room or the mood of the humans in it.

This new electroluminescent skin has huge potential for all kinds of new devices, though it will take other engineers as well to discover new applications for the technology. The material scientists who developed the skin plan primarily to use it in life-saving wearable health monitors, but it could just as easily go into a robot that fits into tight spaces. Once the HLEC panels are commercially available, many people will surely think of additional innovative applications.

Build a Humanoid Robot from Raspberry Pi

The Raspberry Pi, or RBPi, is the ubiquitous low-cost, credit-card-sized single board computer with huge potential, from teaching youngsters computer programming to driving robots. When Tyler Spadgenske tried his hand at the RBPi, he used the SBC to create Andy, a completely open-source humanoid robot.

Tyler has tried to make Andy a connected robot. Andy can interact with humans through speech, using language as humans do to answer questions. With access to the Internet, he (Tyler assures us Andy is male) can also talk to client programs over the Web. With the ability to connect via Bluetooth, Andy communicates with other robots such as the Mindstorms NXT.

Using a bipedal mechanism that gives him mobility, Andy can do additional tasks such as moving objects. Of course, Andy has his limitations, but he can collaborate with other robots to get done those things he cannot do himself. Tyler has given Andy only speech as the user interface, since he feels a humanoid should have no other. That does not, however, prevent Andy from interfacing with other computers over the Internet, because he is basically a computer himself.

Initially, Tyler used a Robosapien for Andy's bipedal movement, but that did not work out satisfactorily. He is now designing a new bipedal system in SolidWorks. Later, Tyler plans to add a torso, a head, and arms for Andy, again using SolidWorks and 3D printing.

Starting up Andy is very simple: flip the switch on his back to the on position. Andy runs on LiPo batteries rated at 11.1 V and 1300 mAh. These power his motors through L298 motor drivers, which the RBPi controls. As soon as the RBPi receives power, regulated by a UBEC, it starts executing Andy's software. This begins with some configuration checks, such as starting the server and setting run modes. Then Andy settles down and prepares to listen to his microphones.

Now, Andy is up and running as a state machine. He will listen to commands from either his microphones or his server – first converting any command received from either to text and then executing it.

After converting the command to text form, Andy interprets it by comparing it to the command set in his repertoire. That gives him the correct function he must execute for a specific command. For example, for a shutdown command, Andy initiates a complete sequential software and hardware shutdown, ultimately switching himself off. For any other command, however, Andy executes it and then goes back to wait for commands from his microphone or server.
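In other words, Andy's command handling amounts to a dispatch table mapping recognized text to handler functions. The sketch below illustrates the pattern; the command names and handlers are hypothetical, not Tyler's actual code:

```python
def shutdown():
    # Stand-in for Andy's sequential software/hardware shutdown
    return "shutting down"

def wave():
    # Stand-in for a motor-driven gesture
    return "waving arm"

# The command set: recognized text mapped to handler functions
COMMANDS = {
    "shutdown": shutdown,
    "wave": wave,
}

def handle(text):
    """Look up the recognized text in the command set and run the
    matching handler; unknown commands are reported, and the robot
    goes back to listening."""
    action = COMMANDS.get(text.strip().lower())
    if action is None:
        return "unknown command"
    return action()

print(handle("Wave"))   # waving arm
print(handle("dance"))  # unknown command
```

The real robot gets `text` from speech recognition or its server, and a handler like `shutdown` switches the hardware off rather than returning a string.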

Andy’s brain, the RBPi, controls almost everything for him, from speech recognition and motor control to Andy’s software. Andy has three L298 motor drivers, each capable of driving two motors, so Andy can drive a total of six motors. As the RBPi has only a limited number of GPIO pins, Tyler expanded them using an MCP23017 I/O expander chip.

Tyler plans to give Andy 10 degrees of freedom with the new SolidWorks hardware. New features will include battery voltage monitoring, a power-on LED, a five-segment LED output, and ten servos: six for the arms and four for the legs.

The GoPiGo Robot Kit for the Raspberry Pi

Making a robot work with the tiny Raspberry Pi (RBPi) computer has never been so easy. If you use the GoPiGo robot kit for the RBPi, all you need is a small Phillips-head screwdriver. The GoPiGo kit comes in a box that contains a battery box for six or eight AA batteries, two bags of hardware, two bags of acrylic parts, two motors, the GoPiGo board, and a pair of wheels. To assemble all this into a working robot, follow these step-by-step instructions.

You start with the biggest acrylic part in the kit, the body plate or chassis of the GoPiGo. Lay the plate on the GoPiGo circuit board and align its two holes with those on the circuit board. Place two short hex spacers in the holes below the body plate so you can tell which way is up.

Next, you must attach the motors to the chassis. Use the four acrylic Ts in the kit for attaching two motors. Do not over tighten the bolts while attaching the motors, as this may crack the acrylic.

With the motors in place, it is time to attach the two encoders, one for each motor. These fit on the inside of the motors and poke through the acrylic chassis of the GoPiGo. Encoders are an important part, providing feedback on the speed and direction of rotation of each motor. If the encoders will not stay on, use a bit of adhesive putty to hold them in place.

Now it is time to attach the GoPiGo board to the chassis. Place the GoPiGo board on the spacers and align its holes with those in the chassis before fastening them together with screws. Two hex supports on the back of the GoPiGo board allow you to attach the castor wheel.

That brings us to attaching the wheels to the GoPiGo. Do this gently, backing the wheels off slightly so they do not touch or rub against the screws. The battery box comes next, placed as far back on the chassis as possible. This gives it extra space and prevents the box from hitting the SD card on the RBPi.

This completes the mechanical assembly of the GoPiGo robot; only the RBPi remains to be attached. Locate the black plastic female connector on the GoPiGo and slide the GPIO pins of the RBPi into it. The RBPi is protected by a plate, or canopy, which is screwed onto the chassis.

Make the electrical connections according to the instructions. Be careful while flashing the GoPiGo hardware and leave the motors unconnected during the flashing. After connecting the GoPiGo for the first time, if you find any motor running backwards, simply reverse its connector.

The GoPiGo comes with an ATmega328 microcontroller operating on 7-12 VDC. SN754410 ICs handle motor control, and the two optical encoders produce 18 pulse counts per rotation on wheels of 65 mm diameter. External interfaces include one port each of I2C, serial, analog, and digital/PWM. The idle current draw is about 300-500 mA, and the full-load current is 800 mA to 2 A with both motors, the servo, and the camera running alongside the RBPi Model B+.
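From the figures quoted, 18 pulses per rotation on a 65 mm wheel, the travel per encoder pulse follows directly:

```python
import math

PULSES_PER_ROTATION = 18
WHEEL_DIAMETER_MM = 65.0

def distance_per_pulse_mm():
    """Wheel travel per encoder pulse: circumference / pulse count."""
    return math.pi * WHEEL_DIAMETER_MM / PULSES_PER_ROTATION

def pulses_for_distance(distance_mm):
    """Encoder pulses to count for the robot to travel distance_mm."""
    return round(distance_mm / distance_per_pulse_mm())

print(round(distance_per_pulse_mm(), 2))  # 11.34
print(pulses_for_distance(500))           # 44
```

So each encoder tick corresponds to a little over 11 mm of travel, which is the resolution available for distance-based moves.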

What Can the Raspberry Pi Do After Dark?

A lot more goes on in the museums of the world at night, after everyone has vacated the premises and the guards have locked up the place, than one can imagine. The situation may not be as dramatic as what Ben Stiller shows us in the movie, “Night at the Museum,” but still, it does warrant a serious investigation. This is what Tate Britain has done with its After Dark project with help from the inaugural IK Prize.

Tate Britain holds one of the largest art collections in the world. In August 2014, it organized a project called After Dark, in which visitors could experience the thrill of a forbidden voyage without once stepping into the museum. For 25 hours, more than 100,000 viewers across the globe watched live streaming video over the Internet from four robots let loose in the darkness of the museum. Additionally, 500 people took control of the robots for approximately 12 minutes each, guiding them as they liked and seeing what the robots were witnessing.

RAL Space has engineered the robots, which are based on the tiny single board computer, the Raspberry Pi or RBPi. Working alongside the UK Space Agency or UKSA, RAL Space is one of the world’s leading centers for the research and development of space exploration technologies.

RAL Space worked in close collaboration with Tate Britain, and the team behind the After Dark project combined the latest software with bespoke RBPi hardware. They designed and engineered the robots, creating a world-first, one-of-a-kind experience that attracted audiences from all over. The Workers, a digital product design studio, designed the Web interface for After Dark.

For the late-night explorations within the museum, people from all over the world could guide the four robots by taking control of any one of them. RAL Space designed the robots to select new operators every few minutes. While the event was live, people could request control of a robot from the project website. The robots know you are waiting, and as soon as a slot frees up, one will try to take you on a ride. Even while you wait, you can watch the video of the event streamed live on the project website and on Livestream.com.

You can use the on-screen buttons on the web-based control interface or the arrow keys on your keyboard for controlling the robot. You can make the robot move forward or turn, and even make it look up or look down. The robot senses obstacles around it, feeding this information back to you. Therefore, even though it is nearly dark, you, the navigator, can operate the robot easily.

If you take the robot too close to an object, it will stop moving and inform you through the web-based control interface. Once that happens, you still have control over the robot, as you can make it turn on the spot and let it move forward, continuing with the journey, provided the path ahead is clear.

A Portable Raspberry Pi Powered Display

If you have a motor to control, the RasPiRobot Board is a very good fit. Apart from controlling motors, you can also use its switch mode voltage supply to power your RBPi or Raspberry Pi using a large range of battery types. Therefore, with a pack of AA type batteries and the RasPiRobot shield, you can make a very convenient and portable RBPi powered display.

To make an RBPi display that shows the current time as scrolling text, you need to collect a few parts: the Adafruit Bicolor Square Pixel LED Matrix along with its I2C backpack, a RasPiRobot Board version 2, a battery holder with an on/off switch for 4 AA batteries, and the RBPi Model B+ with 512MB RAM.

Not much wiring is involved in putting the parts together. The only soldering needed is on the LED Matrix display, as it comes in kit form. This is not too difficult, as all the instructions are included in the kit. Once the soldering is done, fit the LED Matrix display into the I2C socket of the RasPiRobot Board.

If you are using the latest version 2 of the RasPiRobot board, you have to be careful its extended header pins do not reach up to the bare connections on the underside of the LED Matrix module. In case they do, you will need to insulate the module by covering the header pins with a layer or two of electrical insulating tape.

Next, plug in the RasPiRobot Board on top of the RBPi. Just make sure the RasPiRobot board fits over all the GPIO pins on the right hand side of the RBPi. The RasPiRobot Board has two screw terminals marked GND and Vin. From the battery box, attach the flying leads to these screw terminals taking care of the correct polarity.

Fit four rechargeable AA batteries to the battery holder. Make sure they are fully charged and fitted with the correct polarity. When you turn on the switch on the battery holder, you should see the RBPi light up its power LED as well as the two LEDs on the RasPiRobot Board.

To operate the LED Matrix board from the RBPi, you will need to install the Adafruit I2C and the Python Imaging Libraries – follow the instructions here. The guide also has a few examples to allow you to check the working of your I2C interface and consequently the LED Matrix display. For example, you can have a slow display scrolling text on the LED Matrix, showing the current time.

The LED Backpack library has a number of sub-libraries that handle the low-level interface to the matrix display. The Python Imaging Library handles the job of writing text onto the display as an image. The example uses the TrueType font FreeSansBold at size 9, although you can use other fonts that look good. You may need to experiment with fonts, as they are not primarily intended for display on the 8×8 pixels the matrix uses. You can also select the display color.
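Stripped of the library specifics, scrolling text is just sliding an 8-column window across the wider image rendered from the text. A minimal sketch of that windowing logic, independent of the Adafruit libraries:

```python
def scroll_frames(columns, width=8, pad=True):
    """Yield successive width-column windows across a rendered text
    image, padding with blank columns so the text scrolls fully on
    and off the matrix."""
    blank = [0] * width if pad else []
    cols = blank + list(columns) + blank
    for start in range(len(cols) - width + 1):
        yield cols[start:start + width]

# A fake 12-column "image": each value stands in for an 8-bit column
image = list(range(1, 13))
frames = list(scroll_frames(image))
print(len(frames))  # 21
```

In the real display script, each frame would be drawn to the matrix via the backpack library, with a short delay between frames setting the scroll speed.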