Tag Archives: Robotics

Computer Vision & Robotics in Farming

Robots are helping several industries ease labor concerns. This is increasingly true in today’s industrial environment, where the workforce is aging and output is falling. Agriculture in both the US and Europe is a prime example: introducing robots into fieldwork is helping to reduce current labor shortages. New farming standards, driven by constraints on natural and chemical resources, are also increasing the need for precision work, which robots and new technology help to deliver.

According to Dan Harburg of Anterra Capital, an agriculture technology venture capital firm based in Amsterdam, traditional robots were designed to perform only specific tasks repeatedly. Robots for agricultural applications must be more flexible than those in automotive manufacturing plants, as they need to deal with the natural variation in outdoor environments and food products.

Accordingly, Anterra Capital considers investing in three major categories in agriculture: seeding and weeding, harvesting, and environmental control. Harburg envisages each of these categories benefiting from the introduction of advanced technology and robotics.

For instance, farmers get a two-fold benefit from spraying and weeding robots. First, the robots eliminate mundane tasks and the labor they require. Second, by targeting crops precisely, the robots bring down the quantity of pesticides that must be sprayed. This allows farmers to save on labor costs and produce safer, healthier crops at the same time. See and spray robots from Blue River, for example, reduce agrochemical use by about 90%, as they use computer vision to target weeds.

Several technology companies have developed spray robotics. One of them is Blue River Technology, a farm robotics start-up specializing in spray and weeding robots. According to Blue River, its tractors operate at 6-8 mph, covering 8-12 rows of crops simultaneously. Advanced vision systems enable the tractors to differentiate weeds from crops and target them directly as they pass over.

Autonomous robots from Naio Technologies use laser and camera guidance systems to navigate between rows of fruits and vegetables autonomously, identifying different types of weeds. Oz, the robot from Naio, runs on four electric motors, working continuously for three hours before needing a battery recharge. Without human supervision, Oz follows the rows of crops on a plot autonomously, removing all weeds.

PlantTape, from Spain, offers a transplanting robot that integrates sowing, germination, and nursery care into a single system, bringing much higher efficiency than conventional transplanting methods. The system uses trays of tape that hold soil and seeds, with each tray holding nearly 900 plants. While pulling the tape from the tray, the automated tractor cuts the tape around each plant and places it accurately in the soil. Farmers use PlantTape robots for planting tomatoes, onions, celery, cauliflower, broccoli, and lettuce.

Although automation in harvesting crops is common, the variation in size, height, and color of the plants complicates it. Many crops also require light pressure and touch for delicate picking. Vacuum suction robotic arms help in this area.

What is a Hygrobot?

In the future, tiny robots such as the hygrobot will avoid the need for batteries and electricity to power them. Moisture will power these tiny robots, which wriggle like a worm or a snake.

Hygrobots inch forward by absorbing humidity from their surroundings. Created by researchers at Seoul National University, South Korea, these tiny robots can twist, wriggle forward and back, and crawl just as snakes or worms do. The researchers envisage hygrobots being useful for a variety of future applications, including delivering drugs within the human body.

According to the researchers, the inspiration for hygrobots came from observing plants, and they have described their findings in the journal Science Robotics. Through hygroexpansion, plants change their shape and size when they absorb water from the air or ground. Pinecones, for instance, open and close depending on whether the air is wet or dry, which helps them disperse seeds more effectively. Plants have inspired robots before; researchers earlier created robots in imitation of algae.

Although hygrobots are not made of plant cellulose, they mimic the mechanism plants use. As moisture is available almost everywhere, using it as a power source for robots makes sense. Unlike batteries, moisture is non-toxic and has no tendency to explode. This is an important consideration, as microbots, such as the spermbot, are often required to operate within the human body.

One can visualize the motion of hygrobots by observing the seed bristles of Pelargonium carnosum, a shrub-like plant found in Africa. The hygrobot mimics the motion of the bristles with two layers made of nanofibers: one layer absorbs moisture, the other does not.

Placing the bot on a wet surface causes the humidity-absorbing layer to swell, bending the bot up and away from the wet surface. The raised layer then loses moisture and dries, and the bot comes back down, with the cycle repeating to move the bot along. The researchers demonstrated a hygrobot coated with antibodies crawling across a bacteria-filled culture plate; it sterilized the entire plate without requiring any artificial power source.
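The swell-and-dry ratchet cycle can be sketched as a toy model. This is purely illustrative and not from the Seoul team’s paper; the 2 mm stride is a hypothetical figure.

```python
# Toy model of the hygrobot's ratchet locomotion. Purely illustrative;
# the stride length is a made-up figure, not from the actual paper.
def hygrobot_position(cycles, stride=0.002):
    """Each humidity cycle: swelling arches the body, dragging the rear
    foot forward while the ratchet anchors the front foot; drying then
    flattens the body, pushing the front foot forward while the rear
    foot anchors. The ratchets prevent any backward slip."""
    rear, front = 0.0, stride   # initial foot positions, in metres
    for _ in range(cycles):
        rear = front            # swelling phase: rear foot catches up
        front = rear + stride   # drying phase: front foot extends
    return front                # net forward displacement

# Ten humidity cycles with a 2 mm stride move the bot about 22 mm.
```

The net speed of such a ratchet is simply the stride times the humidity-cycle frequency, which is why the film’s quick swelling and shrinking matters.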

This is how the researchers imagine the bots of the future delivering drugs within the human body, propelling themselves using only the moisture of the skin. Beyond responding to water vapor, the researchers say they could equip the bots with sensors that respond to other gases as well.

However, this is not the first instance of scientists working with tiny robots. Last year, researchers created a hydrogel bot for biomedical applications that a magnet could activate; it was able to release localized chemo doses for treating tumors.

Not only medical but also military and industrial applications will benefit from light, agile microbots that require no additional power input to operate. The hygrobot, a biologically inspired bilayer structure harvesting energy from environmental humidity, uses ratchets to move forward. Its hygroscopically responsive film quickly swells and shrinks lengthwise in response to changes in humidity.

SALTO the Agile Jumping Robot

In the Biomimetic Millisystems Lab at UC Berkeley, Duncan Haldane is responsible for numerous bite-sized bio-inspired robots—robots with hairs, robots with tails, robots with wings, and running robots. Haldane and the other members of the lab look at the most talented and capable animals for inspiration for their robotic designs.

The African galago, or bushbaby, is one such animal: a fluffy, cute, talented, and capable little jumper weighing only a few kilos. This little creature can clear bushes nearly two meters tall in a single bound. Biologists have discovered that galagos’ legs are specially structured to amplify the power of their tendons and muscles. Haldane and his team accordingly built Salto, a legged robot weighing only a hundred grams, endowed with agility and the most impressive jumping skills. Salto features in a paper in the new journal Science Robotics.

Jumping is not only about how high a robot can jump; how frequently it can jump also matters. Haldane and his team have coined the term agility to refer to how far upward a jumper can travel while jumping repeatedly. Technically, they define it as the maximum average vertical velocity achieved while performing repeated jumps. A galago can jump 1.7 m in height every 0.78 s, an agility of 2.2 m/s.

Therefore, to be called agile, a jumper not only has to jump high, it also has to jump frequently. For instance, although EPFL’s Jumper can reach an impressive height of 1.3 m, it can only do so every four seconds, so its agility is a measly 0.325 m/s. Minitaur, on the other hand, can jump only 0.48 m, but does so every 0.43 s, giving it a much better agility of 1.1 m/s despite its lower jump height.
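The agility figures quoted above follow directly from the definition, jump height divided by the time between jumps, as a quick check shows:

```python
def agility(jump_height_m, period_s):
    """Agility as defined by Haldane's team: maximum average vertical
    velocity over repeated jumps, i.e. jump height / time per jump."""
    return jump_height_m / period_s

print(round(agility(1.7, 0.78), 1))   # Galago: 2.2 m/s
print(round(agility(1.3, 4.0), 3))    # EPFL Jumper: 0.325 m/s
print(round(agility(0.48, 0.43), 1))  # Minitaur: 1.1 m/s
```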

That means improving agility involves jumping higher, more frequently, or both. Galagos are agile not only because they jump high, but because they can do so repeatedly. Most jumping robots have low agility because they need to spend time winding a spring to store enough energy for the next jump, which reduces their jump frequency. The researchers at Berkeley wanted a robot with agility matching the galago’s. With Salto, they have come close: Salto can jump 1 m every 0.58 seconds, an agility of 1.7 m/s.

As with many jumping robots, Salto uses an elastic element as its starting point. In Salto’s case, the spring, actually a piece of twistable rubber, is placed in series between a motor and the environment, making a series elastic actuator (SEA). Apart from protecting the motor, SEAs allow force control and power modulation, and let the robot recover some energy passively.

Galagos have such springs in the form of their tendons and muscles. Moreover, the galago’s leg is structured so that it can output nearly 15 times more force than its muscles could by themselves. Haldane used this same design principle for Salto.

Six-Legged Robot Is Faster than Insects

Evolution produces very effective designs, filtering out failures by trial and error. However, evolution in nature takes place over billions of years, a span of time unavailable to designers of robots. Biologically inspired robotics design is usually about figuring out the clever tricks evolution has perfected and applying them to a robot, to beat nature at her own game.

For instance, studies have shown that most six-legged insects move with a tripedal gait, moving at least three legs at a time. However, EPFL researchers from Lausanne, Switzerland, have reported in Nature Communications that a bipedal gait, using two legs at once, is a more efficient and faster way for a hexapod to move.

When moving, especially at speed, legged animals tend to minimize the time their legs remain in contact with the ground. Fast-moving mammals therefore prioritize flight phases, so their motion looks more like a sequence of jumps than fast walking. For hexapedal insects, however, whether moving slowly or fast, movement consists of keeping at least three legs in contact with the ground at all times.

Mathematically, the tripedal gait is less efficient than a gait involving two legs. This is simple to calculate: a hexapod using three legs at a time gets two power strokes per gait cycle, whereas using two legs at a time would give it three. The EPFL researchers tested this theory on hexapedal robots and showed that by using two legs at once instead of three, the robots could move 25% faster. Therefore, rather than using the natural tripedal gait of insects, a hexapedal walking robot with a bipedal gait can be more dynamic, although less statically stable. That brought the investigators to an interesting question: why do insects use a slower gait when they could be moving faster?
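The power-stroke arithmetic above is easy to verify: a gait cycle moves every leg once, so grouping six legs three at a time gives two synchronized steps per cycle, and two at a time gives three.

```python
def power_strokes_per_cycle(total_legs, legs_per_step):
    """One gait cycle moves every leg exactly once; grouping
    legs_per_step legs into each synchronized step yields this
    many power strokes per cycle."""
    return total_legs // legs_per_step

print(power_strokes_per_cycle(6, 3))  # tripedal gait: 2 strokes/cycle
print(power_strokes_per_cycle(6, 2))  # bipedal gait: 3 strokes/cycle
```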

The researchers noted that insects also need to move on surfaces that are not horizontal, such as walls and ceilings. Walking on walls and ceilings requires feet that stick or grab onto surfaces, a capability most flying insects have. They concluded that a tripedal gait is best for walking while clinging to surfaces, but a bipedal gait is faster for running on the ground.

The researchers tested their theory further by negating the adhesive property of insects’ feet, giving flies tiny polymer boots. The flies responded by shifting from a tripedal toward a bipedal gait. Their behavior did not change even on a very slippery surface, suggesting the tripedal gait stems from the structures causing the adhesion in the legs, or from the sensory feedback the legs generate. The experiment suggested that sticky feet are what keep insects on a tripedal gait: with adhesive feet, an insect needs the leverage of three planted legs to unstick the other three.

Such biorobotics helps us in two ways. On one hand, it explains why nature works the way it does, and on the other, it shows how we can make faster and better robots.

An Autonomous Robot Called Bat Bot or B2

Although detested by some and revered by others all over the world, bats are undoubtedly remarkable creatures when it comes to flight. Birds perform the most nimble aerobatics, and most fish swim superbly in water, but bats possess the most refined powered-flight mechanism, unmatched in the animal kingdom. Now a team of scientists has studied the way bats fly and built the first robot to mimic their flight mechanism. They have named the robot Bat Bot, or B2.

The scientists had a tough time imitating the natural flight of a bat. Bats have flexible membranes on their wings and use more than 40 active and passive joints with each flap. Moreover, their bones can deform each time they beat their wings. The scientists found it very difficult to replicate the complete suite of biological tricks that bats use routinely.

In creating the Bat Bot, the scientists have achieved an engineering marvel. The Bat Bot weighs only about 94 grams, roughly as much as two golf balls. It has a carbon-fiber skeleton, with a head housing its on-board computer and sensors. Five micro-sized motors are strung along its backbone, and a silicone membrane is stretched over the entire skeletal structure. A trio of roboticists at Caltech, led by Soon-Jo Chung, designed the Bat Bot for autonomous flapping flight and unveiled it in the journal Science Robotics. At present, Bat Bot can perform only four main components of a bat’s wing movement: the shoulder, elbow, and wrist bends, and the side-to-side tail swish.

According to Chung, his team had to give up the idea of simply mechanizing a bat’s flapping wings joint by joint. They quickly understood that incorporating all forty joints in the design was impossible; it would only have resulted in a heavy robot incapable of any type of flight.

After a careful study of a bat’s flight mechanism, including the biological studies documented by Dan Riskin of the Discovery Channel, the team tried to identify which of the 40 joints are absolutely vital for flight. They finally settled on a total of nine joints for the Bat Bot.

Although the Bat Bot is a sophisticated and advanced piece of machinery, it is still a very simple bat compared to the natural animal. For instance, Bat Bot has no knuckles or joints in its carbon-fiber fingers, and it cannot actively twist its wrists the way real bats do naturally.

Chung’s team had to make additional simplifications as well. For instance, Bat Bot’s hyper-thin silicone membrane has uniform flexibility, whereas the wing membrane of an actual bat has different levels of stiffness in different places.

In spite of these differences, Bat Bot makes elegant flights, almost indistinguishable from those of its biological cousin. While gliding through the air, Bat Bot moves with grace and fluidity, independently tucking and extending its wrists, shoulders, elbows, and legs.

Soft Robots Mimic Biological Movements

At Harvard University, researchers have developed a model for designing soft robots. These robots can bend like a human index finger or twist like a thumb, powered by a single pressure source.

For a long time, scientists followed a process of trial and error to design a soft robot that moves organically, twisting like a human wrist or bending like a finger. Now, researchers at the Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences have developed a method for automatically designing soft actuators based on the desired movement. They have published their findings in the Proceedings of the National Academy of Sciences.

To achieve the biologically inspired motions, the researchers turned to mathematical modeling to optimize the design of the actuator. According to Katia Bertoldi, Associate Professor and coauthor of the paper, they no longer design the actuators empirically: the new method lets them plug in a motion, and the model gives them the design of the actuator that will achieve it.

Although designing a robot that bends like a finger or knee may seem simple, it is an incredibly complex process in practice. The complication stems from the fact that a single actuator cannot produce the complex motions necessary. According to the paper’s first author, Fionnuala Connolly, a graduate student at SEAS, the design requires sequencing actuator segments, each performing a different motion, with a single input actuating them all.

The team uses fiber-reinforced, fluid-powered actuators, and their method uses mathematical modeling to optimize the actuator design for a given motion. With it, the team designed soft robots that bend and twist just as human fingers and thumbs do.

SEAS has developed an online, open-source resource, the Soft Robotic Toolkit, that provides the new methodology. It will assist educators, researchers, and budding innovators in designing, fabricating, modeling, characterizing, and controlling their own soft robots.

The robotics community has long been interested in soft fluidic actuators, which consist of elastomeric matrices with embedded flexible materials such as cloth, paper, fiber, and particles. These are lightweight, affordable, and easily customizable to a given application.

These multi-material fluidic actuators are interesting because they can be rapidly fabricated in a multi-step molding process, and because a simple control input such as a pressurized fluid achieves combinations of extension, contraction, twisting, and bending. New design concepts, fabrication approaches, and soft materials are improving the performance of these actuators compared to existing designs.

Motivating applications such as heart-assist devices and soft robotic gloves define the motion and force-profile requirements for these actuators. It is possible to embed mechanical intelligence within soft actuators to achieve those performance requirements with simple control inputs. The challenge lies in the nonlinear nature of the large bending motions that hyper-elastic materials produce, which makes their behavior difficult to characterize and predict.

The Raspberry Pi MeARM

Arms are a favorite with robotics enthusiasts, thanks to the number of joints an arm has. For instance, an arm can be made to rotate a full circle and bend almost to a right angle. Each finger on an arm can be manipulated independently, and each finger can have at least three joints. An arm with even two fingers and an opposing thumb can therefore pick up objects, given pressure sensing. A simple project such as an arm can become as complicated as one cares to make it.

These reasons made the original MeARM kit a veritable success. It was a pocket-sized robot arm, and budding Raspberry Pi (RBPi) enthusiasts quickly latched on to it. The design was simple and open-source, needing only three kinds of parts: servomotors, screws, and laser-cut parts. This simplicity spread the design around the world, making it massively successful. Although parents were skeptical of its complexity, children loved it. Its makers, the Bens, are now back with a new project, the MeARM Pi.

The new MeARM Pi, like its predecessor, is simple enough for children to build themselves. The RBPi gives the arm its hardware and processing power, making the whole project a pleasant, fun, and simple experience. In just thirty minutes, you can build the new MeARM, connect it to your RBPi, add Wi-Fi, connect it to your network, and start programming it in your favorite language: JavaScript, Python, Snap, or Scratch. Now, isn’t that a fun way to start learning to code?

The workings of the MeARM Pi are straightforward. The GPIO pins on the RBPi drive the servos directly, and the RBPi reads the joysticks through an I2C ADC. Even the on-board RGB LED gets its power directly from the GPIO pins, so playing around with colors is simple. Although the regular 2 A RBPi power supply delivers all this power without issues, consider an upgraded 2.5 A supply if you plan to plug in more devices.
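Driving a hobby servo from a GPIO pin comes down to PWM timing: a 50 Hz frame whose pulse width sets the angle. The kit’s exact calibration is not documented here, so the 1 ms to 2 ms pulse limits below are the conventional assumed values, not MeARM-specific figures:

```python
FRAME_HZ = 50          # standard hobby-servo refresh rate
MIN_PULSE_MS = 1.0     # assumed pulse width at 0 degrees
MAX_PULSE_MS = 2.0     # assumed pulse width at 180 degrees

def duty_cycle_percent(angle_deg):
    """Convert a servo angle into the PWM duty cycle (in percent)
    that produces the corresponding pulse width."""
    pulse_ms = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle_deg / 180.0
    frame_ms = 1000.0 / FRAME_HZ   # 20 ms frame at 50 Hz
    return 100.0 * pulse_ms / frame_ms

print(duty_cycle_percent(0))    # 5.0  (1 ms of a 20 ms frame)
print(duty_cycle_percent(90))   # 7.5  (1.5 ms, servo centred)
print(duty_cycle_percent(180))  # 10.0 (2 ms)
```

On the Pi itself, a value like this would typically be fed to a PWM driver such as RPi.GPIO’s ChangeDutyCycle().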

The HAT in the kit has its own power supply, which comfortably powers both the arm and the RBPi, and it follows the reference design for RBPi HATs. The accompanying open-source Node.js app performs a few key tasks, including controlling the arm’s servos via the GPIO pins and reading the state of the joysticks via the ADC.

This great kit is just right for any budding programmer stepping into the world of digital electronics. It contains everything needed except the RBPi: all the plastic parts, Allen-key screws and an Allen key, four metal-gear servos, and an RBPi HAT with two on-board joysticks.

To improve quality, the kit comes with metal-gear servos rather than the usual plastic ones. Moreover, children’s small fingers are not well equipped to handle screwdrivers, which is why the kit uses Allen-key fasteners instead; they are more reliable.

Depending on preference, you can go for either the blue color kit or the orange one. The programming languages are already available on the RBPi, so as soon as you have assembled the arm, it is ready to pick up things.

What Are Cobots?

When inventors Joseph Engelberger and George Devol began discussing science fiction novels in 1954, they hit upon the idea of industrial robots. It took them six years to give shape to their idea, and Unimate earned a secure place in the robot hall of fame as the world’s first industrial robot. In 1961, Unimate began working on the assembly lines of General Motors.

At first, people looked on Unimate with suspicion over safety. At the time, the only reference people had for robots was the laser-firing robot from “The Day the Earth Stood Still,” a thriller from the 1950s. More than 50 years on, industrial robots are far less scary.

Traditionally, robots were constructed to work inside restricted robotic work cells, with physical barriers for the safety of human workers. Many modern robots, however, work entirely outside any cage. On factory floors today you will find unfettered robots, termed collaborative robots or cobots, working safely alongside their human counterparts. Nevertheless, no robot is entirely devoid of health and safety features.

Unlike their predecessors, today’s industrial cobots are designed specifically to work safely around humans. Robots now work hand-in-hand with humans on the same assembly tasks, and this has been independently certified as safe. YuMi, the two-armed collaborative robot from ABB Robotics, contributed largely to this certification.

To prevent accidents with human workers, cobots rely on sensors installed on them. The sensors monitor the location of people on the factory floor and react to human contact: if a person comes too close, the machine simply and automatically shuts down. Moreover, cobots operate with their strength, speed, and force limited, to avoid causing serious injury if contact does occur.
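The behavior described can be sketched as a simple speed governor. The thresholds here are hypothetical, chosen purely for illustration; real cobots derive their limits from safety standards such as ISO/TS 15066.

```python
# Illustrative cobot speed governor; both thresholds are hypothetical.
SHUTDOWN_DISTANCE_M = 0.5   # full stop when a person is inside this radius
MAX_SAFE_SPEED_MPS = 0.25   # speed cap whenever humans share the floor

def safe_speed(requested_mps, nearest_human_m):
    """Return the speed the cobot is allowed to command, given the
    distance to the nearest detected person."""
    if nearest_human_m < SHUTDOWN_DISTANCE_M:
        return 0.0                          # automatic shutdown
    return min(requested_mps, MAX_SAFE_SPEED_MPS)

print(safe_speed(1.0, 0.3))  # 0.0  -- person too close, shut down
print(safe_speed(1.0, 2.0))  # 0.25 -- capped to the safe limit
```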

Most cobots are simple enough to require practically no skill in programming them. Anyone, who can operate a smartphone, can program them to operate. In contrast, complex robots of about a decade ago needed a host of highly skilled technicians to program and monitor them while in operation.

Of the industries being transformed by such collaborative machinery, the automotive industry stands to benefit most. This sector has always been at the forefront of industrial robotics: automotive manufacturers have been using robots and robotic equipment since the 1960s, but a lot has changed since then. The competitive nature of the industry forces manufacturing lines to be highly efficient, flexible, and more productive than ever before.

None of this means that advances in robotics threaten human jobs on the production line. Builders use a concrete mixer to help the bricklayer, not to replace him; in the same way, collaborative robots assist workers on the assembly line rather than replace them. According to some experts, production-line workers will ultimately use collaborative robots as helpers in the same way engineers use computers: to further their own work and make their jobs easier.

What are Light-Emitting Capacitors good for?

HLECs, or hyper-elastic light-emitting capacitors, are good for making electroluminescent skin for robots, and you can do a lot with them. That is according to Dr. Robert Shepherd of Cornell University and his team of graduate students, who have published a paper on the electroluminescent skin they recently developed.

The team was inspired to develop the electroluminescent skin by observing cephalopods such as the octopus. According to the team, their material can change its color, just as an octopus can, and can stretch to fit into areas that more rigid structures cannot. The skin continues to emit light even when stretched to about six times its original size.

These HLECs are made of layers of transparent hydrogel electrodes separated by elastomer sheets acting as the dielectric. Panels of these capacitors, integrated into robotic systems and outfitted with sensors, are ideal for health-based wearable sensor applications. The team at Cornell fabricated one robotic system from three panels, and it is capable of crawling. With each panel consisting of six layers, the robot crawls along with worm-like movements, using two pneumatic actuators that alternately inflate and deflate.

Although the team is in raptures over how well the HLEC panels function, their next step is to turn the material into practical devices with applications; to find a reason to use it, as they say.

The team expects the development of uses for these new panels to lead to some innovative applications. Although the devices currently envisaged range primarily from health care to transportation, there is significant interest in future robotic applications as well, driven by the desire to advance the way robots interact with humans.

For instance, the robot Atlas from Boston Dynamics looks formidable enough to crush you were you to give it an accidental hug. Humans prefer soft and puffy robots, and in the future, robots may even be able to change color based on the mood of the person in front of them. People tend to develop an innate fear of robots after watching the T-800 in movies such as ‘The Terminator’; future robots such as Baymax should help change that thinking. According to Professor Shepherd, HLEC panels can be part of the breakthrough.

It is important to get human-robot interactions right. Simple things, such as the ability to change color in response to the tone of the room or the mood of the humans in it, can let robots make emotional connections with people.

This new electroluminescent skin has huge potential for all kinds of new devices, but discovering new applications will take the help of other engineers as well. The material scientists who developed the skin primarily plan to use it in life-saving wearable health monitors, though it could as easily serve in robots that fit into tight spaces. Once HLEC panels are commercially available, plenty of people will surely think of additional innovative applications.

Build a Humanoid Robot from Raspberry Pi

The Raspberry Pi, or RBPi, is the ubiquitous low-cost, credit-card-sized single-board computer with huge potential, from teaching youngsters computer programming to driving robots on Mars. When Tyler Spadgenske tried his hand at the RBPi, he used the SBC to create Andy, a completely open-source humanoid robot.

Tyler has tried to make Andy a connected robot. Andy can interact with humans through speech, using language as humans do to answer questions. With access to the Internet, he (Tyler assures us Andy is male) can also talk to client programs over the Web, and with Bluetooth, Andy communicates with other robots such as the Mindstorms NXT.

A bipedal mechanism gives Andy mobility, so he can do additional tasks such as moving things around. Of course, Andy has his limitations, but he can collaborate with other robots to get done the things he cannot. Tyler has given Andy only speech as the user interface, since he feels a humanoid should have no other. That does not prevent Andy from interfacing with other computers over the Internet, because he is basically a computer himself.

Initially, Tyler used a Robosapien for Andy’s bipedal movement, but that did not work out satisfactorily. He is now designing a new bipedal system in SolidWorks. Later, Tyler plans to add a torso, a head, and arms for Andy, again using SolidWorks and 3D printing.

Starting up Andy is very simple: flip the switch on his back to the on position. Andy has LiPo batteries rated at 11.1 V and 1300 mAh. These power his motors through L298 motor drivers, which the RBPi controls. As soon as the RBPi receives power, regulated through a UBEC, it starts executing Andy’s software. This begins with configuration checks, such as starting the server and running some modes. Then Andy settles down and prepares to listen to his microphones.

Andy then runs as a state machine. He listens for commands from either his microphones or his server, first converting any command received to text and then executing it.

After converting a command to text, Andy interprets it by comparing it against the command set in his repertoire. That gives him the function he must execute for the specific command. For a shutdown command, for example, Andy initiates a complete sequential software and hardware shutdown, ultimately switching himself off. For any other command, Andy executes it and then goes back to waiting for commands from his microphone or server.
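The listen, convert-to-text, match, and execute loop can be sketched as a dispatch table. The command names and responses below are illustrative stand-ins, not Andy’s actual command set:

```python
# Sketch of a command interpreter like Andy's; the commands and
# responses are hypothetical stand-ins for the real project's set.
def make_dispatcher(commands):
    """commands maps normalized command text to a handler function."""
    def handle(text):
        action = commands.get(text.strip().lower())
        if action is None:
            return "unknown command"    # the robot would resume listening
        return action()
    return handle

handle = make_dispatcher({
    "shutdown": lambda: "shutting down",   # would start the shutdown sequence
    "forward": lambda: "walking forward",  # would drive the leg motors
})

print(handle("Shutdown"))   # shutting down
print(handle("dance"))      # unknown command
```

Normalizing the recognized text before lookup makes the matching tolerant of the capitalization and stray whitespace that speech-to-text output tends to contain.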

Andy’s brain, the RBPi, controls almost everything, from speech recognition and motor control to the rest of Andy’s software. Andy has three L298 motor drivers, each capable of driving two motors, so Andy can drive a total of six motors. As the RBPi has only a limited number of GPIO pins, Tyler had to expand them using an MCP23017 I/O-expander chip.

Tyler plans to give Andy 10 degrees of freedom with the new SolidWorks hardware. New features will include battery-voltage monitoring, a power-on LED, a five-segment LED output, and ten servos: six for the arms and four for the legs.