

The pi4-workerbot – more adaptable than the average industrial robot

Tuesday, December 14, 2010


Industrial robots are generally programmed to carry out one task and one task only. While they are extremely quick and efficient at performing their assigned task, adapting them to other tasks can be a time-consuming and expensive endeavor. In an effort to introduce robots with greater flexibility into industrial inspection and assembly systems, the EU-funded PISA research project has developed the pi4-workerbot. The multi-tasking robot is similar in size to a human being, features two arms, three cameras and fingertip sensitivity, and can even produce a variety of facial expressions.

The robot’s size allows it to be employed at pretty much any modern standing or sitting workstation found in an industrial manufacturing environment. Whereas most conventional industrial robotic arms have six degrees of freedom with one swivel point at the shoulder, each of the workerbot’s two arms has seven degrees of freedom, with an additional rotation joint that corresponds to the human wrist. This gives the robot the ability to transfer a workpiece from one hand to the other, allowing it to view a complex component from all angles.
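To get a feel for what that extra joint means, here is a minimal sketch in Python; the joint names are made up for illustration and are not pi4_robotics’ actual kinematic layout.

```python
# Illustrative only: joint names are assumptions, not the workerbot's real specification.

# A conventional 6-DOF industrial arm: one chain from the shoulder swivel to the wrist.
six_dof_arm = [
    "shoulder_swivel", "shoulder_pitch", "elbow_pitch",
    "wrist_roll", "wrist_pitch", "wrist_yaw",
]

# Each workerbot arm adds a seventh, redundant rotation joint near the wrist.
# The redundancy lets the arm reach the same tool pose in many elbow configurations,
# which is what makes passing a workpiece from one hand to the other practical.
workerbot_arm = six_dof_arm + ["wrist_rotation_extra"]

print(len(six_dof_arm), "vs", len(workerbot_arm), "degrees of freedom per arm")
```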

The pi4-workerbot’s three cameras include a forehead-mounted 3D camera to capture its general surroundings and two other cameras that are used for inspection purposes. This allows it to inspect one aspect of an object with its left eye and another aspect with its right eye at the same time. It can, for example, measure an object with one eye while checking with the other whether a coating has been applied perfectly.

The researchers also provided the robot with fingertip sensitivity. If the strength of the grip is set correctly, the robot is able to hold an egg without cracking it, says Dr.-Ing. Dragoljub Surdilovic, head of the working group at the Fraunhofer Institute for Production Systems and Design Technology IPK in Berlin where the robot was developed.
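As a rough illustration of the “set the grip strength correctly” idea, here is a toy force-limited gripper loop. The force values, step size and sensor interface are my own assumptions, not the IPK’s actual control code.

```python
def close_gripper(target_force_n, read_fingertip_force, step_n=0.1):
    """Tighten the grip until the fingertip sensor reports the target force.

    `read_fingertip_force(cmd)` is a hypothetical stand-in for whatever
    sensor/actuator API a real robot would expose.
    """
    commanded = 0.0
    while read_fingertip_force(commanded) < target_force_n:
        commanded += step_n          # tighten in small increments
    return commanded                 # stop as soon as the target force is reached

# Toy stand-in for the sensor: the measured force simply tracks the commanded force.
grip = close_gripper(target_force_n=1.5, read_fingertip_force=lambda cmd: cmd)
print(f"grip held at roughly {grip:.1f} N (an assumed egg-safe limit)")
```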

The team even gave the workerbot a variety of facial expressions. When work is going smoothly it will smile, while a bored look indicates it is waiting for work, letting the production manager know the production process can be sped up.

The robot was designed to give German manufacturers the technology needed to adapt to a variety of different product versions and fluctuating volumes. Since workforce requirements change in line with the orders on a company’s books, the idea is for manufacturers to lease the workerbots from pi4_robotics as needed.

This Sarcoman robot can juggle

Monday, December 13, 2010


We have seen some interesting robots in our time, and this Sarcoman has the ability to do something that I can’t: juggle.

I’m pretty certain that Asimo isn’t capable of doing that. Sure, it may be able to walk up a flight of stairs, but as far as I’m concerned, if a robot can’t do circus stunts, then it isn’t a robot.

Apparently, the robot has a vision system that can somehow make “dynamic adjustments to its movements based on the paths of the balls”. I’ve got a video of it after the jump so you can see it for yourself.
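For what it’s worth, those “dynamic adjustments” presumably boil down to predicting where each ball will come down. Here is a small sketch of that idea under simple ballistic assumptions (no air drag, invented numbers); it has nothing to do with Sarcos’ actual vision system.

```python
G = 9.81  # gravitational acceleration, m/s^2

def predict_catch(x0, z0, vx, vz, catch_height):
    """Predict where and when a ball thrown from (x0, z0) with velocity (vx, vz)
    falls back to catch_height, assuming plain projectile motion."""
    # Solve z0 + vz*t - 0.5*G*t^2 = catch_height and take the later root.
    a, b, c = -0.5 * G, vz, z0 - catch_height
    t = (-b - (b * b - 4 * a * c) ** 0.5) / (2 * a)
    return x0 + vx * t, t

x_catch, t = predict_catch(x0=0.0, z0=1.2, vx=0.3, vz=3.0, catch_height=1.0)
print(f"move the hand to x = {x_catch:.2f} m; the ball arrives in {t:.2f} s")
```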

Believe it or not, this video is over four years old. I wouldn’t be surprised if it was older. In fact, I checked the website of Sarcos Robotics, a self-described “research and development leader in designing and building”, and the latest news is from the nineties. I’m serious.

Okay, I’m going to be that guy who suggests this: imagine if you could make several robots that could juggle like this. My question is whether a juggling robot would be impressive at all.

After all, a human learns how to juggle, and it takes some discipline. In fact, it is more discipline than I have ever mustered. However, a robot that juggles is just doing what it is programmed to do. I’ll bet that makes you think twice about starting a robotic circus.

Robotic road train could change commuting as we know it


I think we all know that most of us waste a lot of quality work time driving to and from work each day. To make matters worse, some of us decide to “use the time better” by talking on the cell phone or using other mobile applications, which leads to distracted-driving accidents.

There is a lot of work going into making a driverless car, but the image that you see here is a different idea. What you are looking at is a robotic road train, a way to group ordinary cars together wirelessly like the cars of a train.

I know that you can’t really read the fine print on the image, but the concept works by having a lead vehicle travel down the highway while other cars join up by entering their destination. From there, six to eight cars are guided wirelessly by the leader, and their drivers can sit back comfortably and do anything they want. Heck, they could probably even sleep. Once a driver approaches their destination, they can simply break off from the train and go on their way.
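Here is a toy sketch of that join-and-leave flow, just to make the concept concrete. The class, the eight-car cap and the exit logic are my own assumptions rather than any actual road-train protocol.

```python
class RoadTrain:
    """Toy model of the platooning idea described above, not a real protocol."""
    MAX_FOLLOWERS = 8   # the concept calls for six to eight cars per lead vehicle

    def __init__(self, lead_vehicle):
        self.lead = lead_vehicle
        self.followers = []          # cars currently steered by the lead vehicle

    def join(self, car, destination):
        if len(self.followers) >= self.MAX_FOLLOWERS:
            return False             # train is full; the car keeps driving itself
        car["destination"] = destination
        self.followers.append(car)   # the lead vehicle now guides this car wirelessly
        return True

    def update(self, upcoming_exit):
        """Release any car whose destination matches the upcoming exit."""
        leaving = [c for c in self.followers if c["destination"] == upcoming_exit]
        self.followers = [c for c in self.followers if c not in leaving]
        return leaving               # these drivers resume manual control

train = RoadTrain(lead_vehicle={"id": "bus-1"})
train.join({"id": "car-7"}, destination="exit 23")
print([c["id"] for c in train.update(upcoming_exit="exit 23")])
```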

I’m sure that you can see a lot of drawbacks to this. What if the wireless driving program malfunctions, or what if a car in the train runs out of gas? Yes, we are a long way from establishing one of these convoy trains on the road, but I would like to see a dry run on a major highway.

Robot is programmed to pick strawberries

Thursday, December 2, 2010


This posting should be a warning to anyone who has a career in harvesting fruit. You could be replaced by a robot.

This robot has been developed by Japan’s National Agriculture and Food Research Organization, and it has the ability to detect how ripe a strawberry is, and then can cut the stalk without damaging the sweet fruit itself. There is a video of it after the jump so you can see the strawberry-picking robot in action.

The robot uses two cameras so that it has a 3D image of the fruit. It can then detect how ripe the strawberry is by its color and pick it from there. The whole process takes about 9 seconds per strawberry.
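As a back-of-the-envelope illustration of the color check, here is a tiny sketch that calls a berry ripe when red dominates its average color. The 0.6 threshold is my assumption, not the research organization’s actual criterion.

```python
def looks_ripe(rgb, red_threshold=0.6):
    """Crude stand-in for a ripeness check: ripe when red dominates the average color."""
    r, g, b = rgb
    total = (r + g + b) or 1         # avoid dividing by zero on a black pixel
    return r / total >= red_threshold

print(looks_ripe((200, 40, 30)))     # deep red berry  -> True
print(looks_ripe((90, 160, 60)))     # still greenish  -> False
```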

Just to give you an idea about how much time this saves, my source says that it takes about 500 hours to pick all the strawberries in a 1,000 square meter field, but this robot can beat that by 200 hours. I am assuming that these robots would be on some sort of tank treads or all-terrain wheels in order to pluck all the strawberries in a field that size.
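For the curious, the quick arithmetic behind those numbers looks like this; the berry count at the end is my own rough inference, not a figure from the source.

```python
manual_hours = 500                      # hand-picking a 1,000 square meter field
robot_hours = manual_hours - 200        # "beats that by 200 hours" -> about 300 hours
seconds_per_berry = 9                   # time for one detect-and-cut cycle

# Back-of-the-envelope inference (my assumption): how many picks 300 robot-hours
# would correspond to at 9 seconds per berry.
berries = robot_hours * 3600 // seconds_per_berry
print(robot_hours, "robot-hours is roughly", berries, "berries picked")
```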

The designers believe that they can possibly build robots that can pick other fruits besides strawberries. It is definitely the future of farming, complete with robot workers.

Battlefield Extraction-Assist Robot to ferry wounded to safety

Friday, November 26, 2010


The U.S. Army is currently testing a robot designed to locate, lift and carry wounded soldiers out of harm’s way without risking additional lives. With feedback from its onboard sensors and cameras, the Battlefield Extraction-Assist Robot (BEAR) can be remotely controlled through a special M-4 rifle grip controller or by hand gestures using an AnthroTronix iGlove motion glove. This equipment would allow a soldier to direct BEAR to a wounded comrade and have it carry them to safety, where they can be assessed by a combat medic.

Built by Vecna Robotics, BEAR maneuvers via two independent sets of tracked “legs” and is able to stand up and dynamically balance on the balls of its ankles, knees or hips while carrying a load. At full height BEAR stands 1.8 m (6 ft) tall, allowing it to look over walls or to lift its cargo onto a raised surface. To ensure it can handle a fully equipped soldier, BEAR’s hydraulic arms are capable of carrying up to 500 pounds (227 kg), while its hands and fingers allow it to carry out fine motor tasks. It also has a “teddy bear” face that is designed to be reassuring.

BEAR has been undergoing tests over the past year in simulations and live exercises by soldiers at the U.S. Army Infantry Center Maneuver Battle Lab at Fort Benning, Georgia. These tests are designed to provide BEAR’s developers with feedback on the real-world operational capabilities and requirements for the robot.

AnthroTronix, the maker of the iGlove (available commercially as the AcceleGlove), plans to develop a new glove for controlling the robot that will include more accelerometers and a digital compass, allowing for greater control using only hand gestures, for example to instruct the robot to disable an improvised explosive device or travel exactly 500 meters east.
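To see why a digital compass helps with a command like “travel exactly 500 meters east”, here is a minimal sketch that turns a distance and compass bearing into a local offset. It is purely illustrative and not AnthroTronix’s actual software.

```python
import math

def offset_from_command(distance_m, bearing_deg):
    """Convert "travel <distance> toward <compass bearing>" into a local
    (east, north) offset in meters. Bearing 0 = north, 90 = east."""
    rad = math.radians(bearing_deg)
    return distance_m * math.sin(rad), distance_m * math.cos(rad)

# "travel exactly 500 meters east" -> bearing of 90 degrees
print(offset_from_command(500, 90))   # -> (500.0, ~0.0)
```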

The alternative method of remote control, a "Mounted Force Controller" which is mounted on the grip of an M-4 rifle, allows a user to control BEAR without having to put down their weapon.

Currently all of BEAR’s actions are controlled by a human user, but the developers are working to give BEAR more sophisticated semi-autonomous behaviors that will allow it to understand and carry out increasingly complex commands.

Vecna Robotics says BEAR could also be used for search and rescue, handling hazardous materials, surveillance and reconnaissance, mine inspection, lifting hospital patients, or even warehouse automation. However, the battlefield is where we’re probably most likely to first see BEAR.

“If robots could be used in the face of threats such as urban combat, booby-trapped IEDs, and chemical and biological weapons, it could save medics’ and fellow soldiers’ lives,” says Gary Gilbert of the U.S. Army Medical Research and Materiel Command’s Telemedicine and Advanced Technology Research Center (TATRC), which helped fund BEAR’s development.

Eyes, ears and brains being developed for underwater robots


Engineers from Germany’s Fraunhofer Institute for Optronics are working on an autonomous underwater vehicle (AUV) that would be inexpensive enough to use for industrial applications such as hull and dam inspection, yet independent enough that it wouldn’t require any kind of human control. Typically, more cumbersome but less costly remotely operated vehicles (ROVs) are used for grunt work – they are connected by a tether to a ship on the surface, from which a human operator controls them. The more technologically advanced AUVs tend to be used for well-funded research, but according to the engineers, one of the keys to creating “blue collar” AUVs is to overhaul the ways that they see, hear and think.

The project is being led by Dr. Thomas Rauschenbach. His team wants to build autonomous robots that are not only less expensive, but that are also smaller and tougher than their predecessors, and that can be used in pretty much any underwater setting.

The AUVs will reportedly be able to see even in turbid water, thanks to a laser imaging system. An onboard camera will emit laser pulses, which will be reflected by underwater objects. As the camera receives and processes these waves of reflected light, it will build up a picture of its surroundings.

Hearing, so to speak, will be accomplished via high-frequency sound waves. As with the laser pulses, these ultrasound waves will bounce off of objects and be registered by a sensor, allowing the AUV to inspect those objects. Fraunhofer believes that it is a step up from the sonar technology that similar vehicles currently use.
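Both the laser imaging and the ultrasound “hearing” come down to timing reflections. Here is the basic round-trip range arithmetic with rough, illustrative numbers; these are not measurements from the Fraunhofer prototype.

```python
def echo_range(round_trip_s, speed_m_per_s):
    """Range to a reflecting object: the echo covers the distance out and back."""
    return speed_m_per_s * round_trip_s / 2

SPEED_OF_LIGHT_IN_WATER = 2.25e8   # m/s, roughly c divided by water's refractive index
SPEED_OF_SOUND_IN_WATER = 1500.0   # m/s, a typical seawater value

print(echo_range(67e-9, SPEED_OF_LIGHT_IN_WATER))   # ~7.5 m from a 67 ns laser echo
print(echo_range(0.01, SPEED_OF_SOUND_IN_WATER))    # 7.5 m from a 10 ms ultrasound echo
```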

A control program will keep the AUV on course and out of harm’s way, even allowing for underwater currents. Its pressure-tolerant electronics will be encapsulated in silicone, as will the vehicle’s lithium batteries. The program also features an energy management system, which will conserve power and save data in the event of an outage.

A prototype will be tested in a water tank this year, with sea trial dives of up to 6,000 meters (19,685 feet) scheduled for Q3 of 2011.

Tony Sale’s 60-year-old Robot, George

Thursday, November 25, 2010


The guy on the left is Tony Sale, and he is the inventor of George, the guy on the right.

George is a robot built sixty years ago from aluminum and other scrap salvaged from a crashed bomber. Apparently, he has spent a large percentage of the time since in a storage shed.

However, he had his fifteen minutes of fame around 1950, when he was shown pushing a vacuum cleaner. In reality, he wasn’t able to push a vacuum cleaner, but he could pull it. I suppose that the photo was taken to show the world that the future was full of robots doing all of our housework. What wife in the fifties wouldn’t want one of those?

George is going to have a permanent home at the National Museum of Computing in Bletchley Park, Buckinghamshire, England. This is the same place where Tony Sale’s rebuilt Colossus computer is also on display. For those who don’t know about the Colossus, it is the “world’s first recognizably programmable computer” and it is most famous for breaking a complicated Nazi code known as the Lorenz Cipher.

I’m certain that we could build a more advanced version of George today, and it probably wouldn’t look like a cross between the Tin Man from The Wizard of Oz and a human-sized version of The Iron Giant.