Cobots are the latest breed of robots, so named because, unlike the robots of yesteryear, they are “collaborative” robots. What makes them unique is their ability to “sense” their environment.
We like to romanticize robots. To wit, the likes of Robby (of Forbidden Planet fame), Rosie (from The Jetsons), the nameless Class M-3, Model B9, General Utility Non-Theorizing Environmental Robot of Lost in Space, R2-D2 and C-3PO of Star Wars, The Terminator, and so many others of sci-fi fame are what we would like robots to be.
But in reality, the current crop of robots is simply a mass of mechanized, contained, autonomic devices with particular functions – some preprogrammed, some controlled in real time. Many are quite sophisticated mechanical devices, but whatever they do, they do it regardless of their surroundings. So if you got into a robot’s work envelope, it would happily fuse your arm to a circuit board or drive a rivet into your ear.
Of course, safety nets have been built around robots to prevent these types of accidents. But the point is that such robotic devices need restraints to avoid harming humans – and failures do occur.
Cobots, though still in their early stages, make the leap to a level of “self-awareness”: they are capable of sensing their surroundings. This is made possible by advances in deep-neural-network (DNN) machine-learning technology.
Cobots use DNNs to analyze the data streaming in from an array of onboard sensors. Add some advanced silicon and software, and cobots become capable of maneuvering among obstacles and responding to environmental conditions. Wireless networks serve as their primary communications link.
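To make the idea concrete, here is a toy sketch of how a network might turn sensor readings into a maneuvering decision. This is purely illustrative: a real cobot’s DNN is vastly larger and trained on recorded sensor data, whereas the weights below are hand-picked just to make the example run.

```python
# Hypothetical illustration: a tiny two-layer feed-forward network mapping
# normalized sensor readings to a maneuvering decision. The architecture,
# inputs, and weights are invented for this sketch, not taken from any
# actual cobot.

ACTIONS = ["continue", "slow_down", "stop"]

# Input vector: [proximity (0=far, 1=touching), relative speed (0..1), contact (0/1)]
W1 = [[2.0, 1.0, 4.0],   # hidden unit 1: reacts to proximity and contact
      [1.0, 2.0, 0.0]]   # hidden unit 2: reacts to closing speed
B1 = [-1.5, -1.0]
W2 = [[-2.0, -2.0],      # "continue" scores high when hidden units are quiet
      [1.0, 0.5],        # "slow_down"
      [0.5, 3.0]]        # "stop" fires on strong proximity/contact signals
B2 = [1.0, -0.5, -1.0]

def relu(x):
    return max(0.0, x)

def forward(sensors):
    """Run the toy network and return the highest-scoring action."""
    hidden = [relu(sum(w * s for w, s in zip(row, sensors)) + b)
              for row, b in zip(W1, B1)]
    scores = [sum(w * h for w, h in zip(row, hidden)) + b
              for row, b in zip(W2, B2)]
    return ACTIONS[scores.index(max(scores))]

print(forward([0.1, 0.2, 0.0]))  # nothing nearby → continue
print(forward([0.9, 0.8, 1.0]))  # obstacle in contact → stop
```

The point of the sketch is only the shape of the computation: raw sensor values go in, a learned function maps them to scores, and the cobot acts on the winning score, all within a control-loop cycle.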
Now, one can argue that versions of cobots have been used for years in applications such as hospitals, ferrying around bedpans and blood samples, delivering supplies, and the like. But a hospital is a fairly well-regulated, controlled environment, so adding them there is relatively safe and predictable. And there is no real need for a DNN in well-regulated routines; after all, it doesn’t matter who is in room 205 or what they order for dinner – the robot just delivers it.
What sets this next generation apart is that these cobots can work close to humans and be aware of them, using a collection of cameras; proximity, pressure, temperature, and contact sensors; radar/LiDAR; and more. DNNs allow them to learn from their surroundings, so “reprogramming” happens in real time and on site, by the cobot itself.
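Combining those overlapping sensors into one trustworthy picture is itself a problem, usually called sensor fusion. A minimal sketch, assuming invented sensor names and a simple conservative rule (real systems calibrate per-sensor trust and do far more):

```python
import statistics

# Hypothetical sketch: fusing distance estimates from several sensor types
# into a single conservative clearance figure. The sensor names and the
# 0.1 m safety margin are invented for illustration.

def fuse_clearance(readings):
    """readings: dict of sensor name -> estimated distance to obstacle (m).
    Returns a conservative fused clearance in metres."""
    values = list(readings.values())
    # The median rejects a single wildly wrong sensor, while taking the
    # smaller of median and (minimum + margin) keeps the estimate cautious:
    # a close reading is never ignored outright.
    return min(statistics.median(values), min(values) + 0.1)

readings = {"camera": 1.9, "lidar": 2.0, "radar": 2.1, "ultrasonic": 0.3}
clearance = fuse_clearance(readings)
print(f"fused clearance: {clearance:.1f} m")  # the low ultrasonic reading dominates
if clearance < 0.5:
    print("slowing down")
```

The design choice here is to err on the side of safety: when sensors disagree, the cobot treats the obstacle as closer than the consensus suggests rather than farther.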
One application where these advanced cobots can already be found is restaurant food delivery. A pilot project from a company called Marble has developed what might be called miniature self-driving vehicles (see picture below) that crawl along San Francisco’s Mission and Potrero Hill districts at a couple of miles per hour, delivering food ordered via the Yelp EAT24 app. These cobots are programmed to avoid human and vehicular traffic, and they use DNNs and their sensors to navigate city streets and the interiors of buildings. They are also connected to the wireless network, so their location is always known and, if necessary, they can be given instructions.
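That always-on wireless link typically carries two kinds of traffic: periodic status reports from the cobot and occasional commands back to it. The sketch below imagines what such messages might look like; the field names and the “reroute” command are invented for illustration – Marble has not published its actual protocol.

```python
import json
import time

# Hypothetical sketch of a delivery cobot's wireless telemetry: a status
# message pushed upstream and a command received back. All field names and
# the "reroute" command type are invented for this example.

def make_status(robot_id, lat, lon, battery_pct):
    """Build a JSON status report the cobot might send over its radio link."""
    return json.dumps({
        "robot_id": robot_id,
        "position": {"lat": lat, "lon": lon},
        "battery_pct": battery_pct,
        "timestamp": int(time.time()),
    })

def handle_command(raw):
    """Dispatch a JSON command received from the operations center."""
    cmd = json.loads(raw)
    if cmd["type"] == "reroute":
        return f"rerouting to {cmd['waypoint']}"
    return "unknown command"

print(make_status("marble-07", 37.7599, -122.4148, 82))
print(handle_command('{"type": "reroute", "waypoint": "Mission & 24th"}'))
```

Even this toy version shows why the wireless link matters: the fleet operator always knows where each cobot is, and can intervene without touching the robot itself.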
Other applications and renditions of cobots that work in close proximity to humans are on the drawing board, such as smart robotic arms that assist in production and design environments. These are truly the next iteration of “an extra set of hands.”
These are exciting developments, and they are inextricably interwoven with wireless. 5G, HetNets, and other next-generation networks and technologies will offer a much wider and deeper ecosystem in which intelligent robots can grow and thrive. Perhaps we are not as far from those sci-fi robots after all.